Am J Community Psychol (2012) 50:497–517 DOI 10.1007/s10464-012-9518-6

ORIGINAL PAPER

Using the Interactive Systems Framework to Support a Quality Improvement Approach to Dissemination of Evidence-Based Strategies to Promote Early Detection of Breast Cancer: Planning a Comprehensive Dynamic Trial

Bruce D. Rapkin • Elisa S. Weiss • David W. Lounsbury • Hayley S. Thompson • Robert M. Goodman • Clyde B. Schechter • Cheryl Merzel • Rachel C. Shelton • Arthur E. Blank • Jennifer Erb-Downward • Abigail Williams • Pamela Valera • Deborah K. Padgett

Published online: 23 May 2012
© Society for Community Research and Action 2012

Abstract  Dissemination efforts must optimize interventions for new settings and populations. As such, dissemination research should incorporate principles of quality improvement. Comprehensive Dynamic Trial (CDT) designs examine how information gained during dissemination may be used to modify interventions and improve performance. Although CDT may offer distinct advantages over static designs, organizing the many necessary roles and activities is a significant challenge. In this article, we discuss use of the Interactive Systems Framework for Dissemination and Implementation to systematically implement a CDT. Specifically, we describe "Bronx ACCESS", a program designed to disseminate evidence-based strategies to promote adherence to mammography guidelines. In Bronx ACCESS, the Intervention Delivery System will elicit information needed to adapt strategies to specific settings and circumstances. The Intervention Synthesis and Translation System will use this information to test changes to strategies through "embedded experiments". The Intervention Support System will build local capacities found to be necessary for intervention institutionalization. Simulation modeling will be used to integrate findings across systems. Results will inform on-going policy debate about interventions needed to promote population-level screening. More generally, this project is intended to advance understanding of research paradigms necessary to study dissemination.

Keywords  Comprehensive dynamic trials · Interactive systems framework · Dissemination research · Quality improvement · Mammography · Intervention adaptation

Author affiliations:
B. D. Rapkin (corresponding author), E. S. Weiss, D. W. Lounsbury, C. B. Schechter, C. Merzel, A. E. Blank, J. Erb-Downward, A. Williams, P. Valera: Department of Epidemiology and Population Health, Albert Einstein College of Medicine, 3300 Kossuth Avenue, Bronx, NY 10467, USA (e-mail: [email protected])
H. S. Thompson: Wayne State University, Detroit, MI, USA
R. M. Goodman: Indiana University Bloomington, Bloomington, IN, USA
R. C. Shelton: Columbia University, New York, NY, USA
D. K. Padgett: New York University, New York, NY, USA

Dissemination of evidence-based behavioral interventions to underserved communities is an overarching priority in public health, with the potential to overcome health disparities and improve outcomes (Kreuter et al. 1999a, b; Glasgow et al. 2003, 2004). Unfortunately, no matter how promising the results of original efficacy studies, we cannot state that an intervention will be optimal (or even efficacious) in a new setting without first experimenting to find out whether and how its performance might be enhanced. There is often so much uncertainty about the generalizability of findings from efficacy studies that it is not warranted to represent behavioral interventions as well-established solutions or best practices (Bull et al. 2003; Kerner 2002). Rather, Sandler (2007) argues that we must move away from top-down dissemination of standardized programs, to engage communities in critical thinking and reflection about what programs they need and how they are working. Similarly, Green and Ottoson (2004) discuss dissemination as a joint process of problem solving, involving key stakeholders in making decisions about proper approaches to implement in different contexts. Dissemination
research must determine how to adapt and ultimately optimize interventions for new settings and populations (McKleroy et al. 2006). Given the central importance of optimizing evidence-based interventions for settings and communities, it seems logical that dissemination efforts should encompass strategies for quality improvement. Rapkin and Trickett (2005) proposed the Comprehensive Dynamic Trial (CDT) paradigm to increase the public health impact of evidence-based interventions by incorporating procedures for quality improvement. Fundamental to CDT is the notion that information gained during the course of conducting an intervention must be fed back into the process, to dynamically refine and improve the intervention's performance. As such, dissemination is viewed as an overarching process encompassing iterative cycles of implementation and adaptation. In a CDT, intervention implementation must be examined and tracked continuously, to provide decision-makers with data about intervention performance in terms of intervention reach, acceptability, fidelity, efficacy, and efficiency. Decision-makers systematically use these data to identify possible modifications to improve intervention performance, which are provisionally implemented as "embedded experiments" (Collins et al. 2004). Experiments that lead to improved intervention performance are incorporated into the protocol. The CDT approach offers several advantages for community dissemination. As Green and Glasgow (2006) discuss, organizational leaders and policy makers must be concerned with more than the efficacy of interventions. They have to contend with intervention costs, congruence with other agency programs and priorities, and staff willingness and ability to conduct the program. Continued evolution of interventions through CDT makes it possible to move beyond effectiveness to address optimal performance.
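The monitor-modify-adopt cycle just described can be sketched as a minimal loop. This is our illustration only, not the authors' implementation: the function names, adherence rates, and the adoption threshold below are all hypothetical.

```python
# Minimal sketch of the CDT quality-improvement cycle: monitor a performance
# indicator, trial a candidate modification as an "embedded experiment", and
# fold it into the protocol only if performance improves. All names, rates,
# and the 2-percentage-point threshold are hypothetical assumptions.

def embedded_experiment_adopts(baseline_rate, modified_rate, min_gain=0.02):
    """Assumed decision rule: adopt a provisional modification only if it
    beats the baseline adherence rate by at least `min_gain`."""
    return modified_rate - baseline_rate >= min_gain

def update_protocol(protocol, modification, adopted):
    """Incorporate an adopted modification into the working protocol."""
    return protocol + [modification] if adopted else list(protocol)

# One hypothetical iteration of the cycle
protocol = ["mailed reminders", "one-on-one education"]
adopted = embedded_experiment_adopts(baseline_rate=0.41, modified_rate=0.47)
protocol = update_protocol(protocol, "telephone decisional support", adopted)
```

In practice the "decision rule" is of course a deliberation by decision-makers informed by data on reach, acceptability, fidelity, efficacy, and efficiency, not a single numeric threshold; the sketch only conveys the iterative structure.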
Further, CDT gives community stakeholders a central role in providing the experience and expertise needed to revise interventions to best meet local needs. CDT places no inherent limitations on the kinds of modifications that may be made to an intervention. Decision-makers may enhance, modify, and delete core intervention elements and combine elements from multiple interventions, as long as these changes are warranted by results of embedded experiments. Changes may be global or they may be targeted to ensure that all subgroups are able to benefit from the intervention. The clear intent of CDT is to foster the evolution of strategies and practices to achieve optimal performance in particular environments. Thus, dissemination of an initial evidence-based intervention to multiple settings may lead to numerous evidence-adapted variations, each better suited to its host setting and population than the original. Beyond the potential for better meeting public health needs, CDT also provides a context
for basic discovery concerning the nature of modifications and adaptations needed to make interventions work better. Recently, members of our Cancer Prevention and Control Research program at the Albert Einstein Cancer Center, in conjunction with collaborators at several other institutions, decided to undertake a major project to disseminate evidence-based strategies to promote adherence to mammography screening guidelines. We entered into collaboration with ten community-based organizations offering multiple programs to large numbers of diverse, low-income women in the Bronx, NY. These organizations were intentionally selected to be diverse, varying in size, staffing and mission, and including multisite housing and social service organizations, social service providers focusing on specific ethnic communities, immigrant resettlement groups, faith-based organizations, and federally qualified clinics. Although we had previously worked with two organizations, we reached the remainder through networking and "cold calls." In fact, through our efforts to recruit agencies, we discovered mutual interests that have resulted in collaborations with a number of these organizations on smaller projects unrelated to Bronx ACCESS. As we shall discuss below, evidence from both population surveillance and efficacy trials supported the rationale for conducting a CDT. After consultation with the National Cancer Institute, we decided to develop a program project application to support this dissemination research. This grant proposal is currently pending resubmission. In developing this application, we were confronted with an important problem: how best to organize the many functions necessary to carry out dissemination according to the CDT paradigm into coherent activities that would be amenable to research, meet the needs of our community, and serve as a model for future dissemination efforts.
Our efforts to solve this problem soon led us to recognize the relevance of the Interactive Systems Framework for Dissemination and Implementation (ISF) as an organizing scheme for our work (Wandersman et al. 2008). The ISF model includes a "Prevention Synthesis and Translation System" that provides communities with information about interventions, a "Prevention Support System" that provides training and technical assistance, and a "Prevention Delivery System" to carry out innovative interventions in practice. What is particularly intriguing about this framework is the notion that these interactive systems represent the collaborative infrastructure needed to bring new strategies into practice across multiple projects and populations (Berry et al. 2005; Matson et al. 2008; Rogers 2003). In this article, we describe the components of the Bronx ACCESS program project as an example of how the ISF can be used to support the dynamic process of implementation and adaptation envisioned by the CDT paradigm. Our emphasis on on-going quality improvement
determined the specific functions provided by each system, as well as the flow of information and services across systems. Although Bronx ACCESS is still in the development phase, we offer our proposed approach as an example, in order to flesh out issues involved in using the ISF to carry out dissemination of evidence-based interventions in a manner intended to lead to optimal performance and sustainability.

The Need to Promote Adherence to Mammography Screening Guidelines

A primary goal of the Healthy People 2010 initiative has been to increase to 70 % the proportion of women in the United States aged 40 years and older who received a mammogram within the preceding 2 years. Despite notable gains in use of mammography, many underserved communities, including parts of the Bronx, lag behind national trends (American Cancer Society 2010; National Center for Health Statistics 2010). Public health initiatives have helped to close racial and ethnic disparities in screening, but those women who remain unscreened may be particularly vulnerable or cut off from care (Chlebowski et al. 2005; Cosp et al. 2009; Lannin et al. 1998; Newman 2005). Recent public controversy regarding screening guidelines has left many women and providers uncertain about mammography (Jørgensen and Gøtzsche 2009; Kolata 2009; Sargenti and Dresdale 2009). In late 2009, the US Preventive Services Task Force downgraded its recommendation of routine mammography for women aged 40–49 as well as for women over age 74, ultimately suggesting that they discuss this decision with their physicians (recommendations differ for women with a family history of breast cancer or other risk factors). Other organizations have vociferously argued against these changes. Airing this controversy in the press has understandably led to concern and confusion. Some women who have screened annually may want to stop or reduce frequency. Others may be unsure about postponing screening until age 50 (Esserman et al. 2009). Women may be concerned about their physician's recommendation to discontinue screening (Gemignani 2010). Still others may lack a regular provider with whom they can discuss these decisions. Pending federal health insurance legislation and recent cuts in state support for screening may also affect access to services and make women's screening decisions more difficult.
Several decades of research have examined behavioral and educational interventions to encourage women to obtain routine mammograms. These studies demonstrate that there is much that we can do to help women make decisions and connect them to appropriate care (Airhihenbuwa 1992; Costanza et al. 2000; Mandelblatt et al.
2009; Muss et al. 1992; Yabroff 2008). However, efficacy studies in and of themselves offer little guidance to practitioners and policy makers about what to do, when and for whom (Tunis et al. 2003; Glasgow et al. 2004; Glasgow and Emmons 2007). A meta-analysis by Vernon et al. (2010) highlights the difficulty of determining which strategies are most effective to promote screening. These authors examined the results of 25 studies evaluating interventions to promote mammography in previously screened women who are out of adherence. Studies were classified in terms of types of strategies employed, theoretical foundation, nature of the population, and frequency of contact. They found substantial heterogeneity of effects for most strategies in most contexts, with the exception of more consistent efficacy associated with the use of reminders. Vernon and colleagues' findings have profound implications for dissemination efforts. Based on their review of the best traditional trials, it is fair to say that we have efficacious techniques that work for some people in some circumstances, but no "magical" fixes, and little guidance about how to promote adherence to mammography screening guidelines in different situations and settings (Glasgow et al. 2006; Klesges et al. 2006; Kreuter et al. 1999a, b; Lipkus et al. 2000; Mandelblatt et al. 2004; Oxman et al. 1995; Rennert 2007). The prevailing assumption in public health is that we must have conclusive findings from efficacy research before dissemination can proceed. After all, what would we disseminate? Yet, the heterogeneity of effects from study to study means that there are unlikely ever to be definitive trials that would conclusively show which strategies are universally "best" (Baum 2009; Green and Glasgow 2006; Hawe et al. 2009; Ratner et al. 2001). We cannot say with certainty that an intervention found to be effective in one setting will be equally efficacious in another setting (Bonfill et al. 
2001; Green 2001; Green and Glasgow 2006; Greenhalgh et al. 2004). Rather, as Vernon and colleagues note, future research must provide information on the external validity of findings in order to better understand sources of variation in screening effects. Thus, dissemination of interventions to promote adherence to mammography screening guidelines cannot be boiled down to direct ‘‘technology transfer’’ (Glasgow 2003; Glasgow et al. 2004). Rather, dissemination requires problem solving to see what works where, when and for whom.

A Stepped Intervention to Promote Adherence to Screening Guidelines

Age- and risk-related differences in recommendations, as well as the pervasiveness of barriers to screening and the
many potential opportunities to facilitate access, all suggest the need for an intervention that can respond to women's specific needs and circumstances. At the same time, in order to improve population rates of adherence, it is necessary that an intervention be designed to make best use of resources. Thus, as a starting point, we decided to weave together a stepped continuum of evidence-based strategies relevant to the broadest cross-section of women. Several studies have reported that women assigned to stepped interventions demonstrate greater adherence to mammography screening guidelines compared to women in control conditions (Bowen and Powers 2010; Janz et al. 1997). The Task Force on Community Preventive Services, an independent, nonfederal, volunteer body of public health and prevention experts, has made recommendations for specific categories of strategies to increase breast cancer screening, based on a systematic review of studies that provide strong or sufficient evidence that a strategy is effective (The Task Force on Community Preventive Services 2010). These intervention strategies include client reminders; one-on-one education; small media, including videos and printed material; and reduction of structural barriers. The National Cancer Institute's Research-Tested Intervention Programs and the Community Guide developed by the Task Force serve as important scientific syntheses from which evidence-based strategies
can be identified, disseminated and adapted in the context of community-based settings. The stepped approach in Bronx ACCESS will include approaches recommended by the Task Force and specific evidence-based strategies from the National Cancer Institute and the Community Guide. Strategies at each step will be delivered by lay health advisors (LHAs). LHAs have received a great deal of attention as interventionists to promote breast cancer screening (Derose et al. 2000; Duan et al. 2000; Sung et al. 1997; Paskett et al. 1999; Eng 1993; Eng and Smith 1995; Earp et al. 1995; Zhu et al. 2002; Erwin et al. 1996; Irwin et al. 1999) and serve as a bridge between the health care system and community. LHAs have been integral to breast cancer screening interventions and are thought to be advantaged as interventionists due to (1) their "insider knowledge" of a specific setting or group, (2) their ability to incorporate cultural values in health promotion efforts, and (3) their high level of skill in communicating effectively within a setting or to a group (Eng et al. 2009).

Fig. 1  A stepped intervention to support adherence to breast screening guidelines

Figure 1 presents the coordinated, stepped intervention model. Briefly, our plan is for LHAs to conduct outreach through our collaborating community sites to identify women eligible for the intervention (Step 0), including women who are out of adherence with relevant age- and risk-based screening guidelines, and/or who express confusion about screening or a desire to better understand guidelines. All eligible women will be sent information and prompts by mail (Step 1). After 8 weeks, LHAs will follow up with these women by telephone to see if they have completed an appointment for a mammogram or spoken to a provider about this decision. If not, or if they are still uncertain, the LHA will provide decisional support (Step 2). For those who decide they want to obtain a mammogram or a referral to discuss mammography but are unable to obtain this on their own, LHAs will provide barriers counseling (Step 3). LHAs will proactively check in with women who have reached Step 3 for up to four follow-up calls, to determine whether or not they have come into age- and risk-appropriate adherence. Thus, at any point during work with LHAs, women who have received Step 1, 2 or 3 may complete a screening mammogram, seek assistance from providers to better understand screening options, take no action, or remain undecided. Of course, some women who complete screening will require follow-up evaluation and some of them will require subsequent cancer treatment. They may seek LHA assistance in finding these services (Step 4). Finally, LHAs will attempt to re-contact all women who enter Step 1 at 24 months after resolution of their initial contact, to assess adherence to rescreening (Step 5).

Each step of our intervention to promote adherence to mammography screening guidelines is made up of one or more evidence-based strategies that were previously tested as free-standing interventions in earlier efficacy trials. The stepped intervention starts with low-intensity and low-cost interventions such as group education and reminders, and then provides successively more intensive interventions that target narrower groups of women for whom adherence is more complicated, requiring greater assistance. At any point along this continuum, women may decide to pursue screening or decide against screening.
Information and support provided will differ according to women's age and risk, consistent with US Preventive Services Task Force screening recommendations at the time of an encounter, as well as the individual's level of need. Although the sequential steps of this intervention will not change, we intend to adapt and tailor the evidence-based strategies at every step, to increase the reach, efficacy and efficiency of the overall intervention in our 10 participating settings. As the CDT progresses, we expect that a greater proportion of women will make decisions and achieve adherence at earlier steps and that more intensive services offered at later steps will be necessary only for women facing the most difficult circumstances and the greatest barriers to screening. Thus, we expect that changes in how women progress through the stepped intervention over time will be related to performance improvements instituted by our CDT.
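As a rough illustration only, the Step 0-5 progression and its escalation logic might be sketched as follows. The function, field names, and escalation rule are our assumptions: in Bronx ACCESS these steps are carried out by LHAs in conversation with each woman, not by software.

```python
# Hypothetical sketch of the stepped continuum: each woman escalates one
# step at a time only while she remains out of adherence and undecided,
# so more intensive (and costly) services reach only those who need them.
# Labels paraphrase the steps described in the text.

STEPS = {
    0: "outreach and eligibility screening",
    1: "mailed information and reminder prompts",
    2: "telephone decisional support",
    3: "barriers counseling with follow-up calls",
    4: "navigation to follow-up evaluation or treatment",
    5: "re-contact at 24 months for rescreening",
}

def next_step(current, completed_screening, still_undecided):
    """Assumed escalation rule: escalate undecided women one step at a
    time (never past Step 3); women who complete screening may move to
    Step 4 for help finding follow-up services."""
    if completed_screening:
        return 4 if current >= 1 else current
    if still_undecided and current < 3:
        return current + 1
    return current
```

Note how the sketch mirrors the resource logic of the text: a woman at Step 3 who remains undecided stays at Step 3 (up to four follow-up calls) rather than escalating further.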


Interactive Systems to Carry Out the Bronx ACCESS Comprehensive Dynamic Trial

Dissemination necessarily involves many moving parts: the intervention, the population and the host settings, all affected by concurrent and sometimes unpredictable social and medical developments in real time (Ellis et al. 2005; Grimshaw et al. 2006; Simpson 2002). The overarching goal of Bronx ACCESS is to learn how to optimize and sustain the evidence-based stepped intervention for particular service delivery organizations and the communities they serve. In order to achieve this goal, the CDT design systematically applies knowledge and methods of behavioral and social science at the local level, to examine and address factors that affect intervention performance within particular organizational and community contexts. Given the many possible ways to modify an intervention, LHAs and other knowledgeable members of organizations and communities are needed to identify unanticipated influences and formulate hypotheses about ways to improve intervention performance. At the same time, we expect that key organizational factors that can affect the intervention's ultimate success for good or ill will become apparent during the process of implementation and optimization, and must be addressed to ensure intervention institutionalization. In order to achieve these objectives, we developed four distinct research projects that address different aspects of dissemination. In addition, we proposed three shared research core facilities to provide technical support and facilitate exchange of information among these projects. Although there is substantial integration among all cores and projects, as Fig. 2 depicts, one core and one project map directly onto each of the three systems that comprise the ISF, while the fourth project addresses broader policy implications.
Fig. 2  Bronx ACCESS interactive systems

This organization served the requirements of the program project format, with substantive questions about the functioning of each system addressed by the research projects and repetitive operations and administrative functions relegated to the cores. For purposes of this discussion, we will focus on how the projects and cores combine to establish the interactive systems needed to conduct this CDT. Note that we have relabeled ISF components to emphasize that breast cancer screening offers early detection but not cancer prevention. Despite this relabeling, the systems otherwise serve the same functions as originally envisioned by Wandersman and colleagues (2008). For purposes of orientation, we will provide a brief overview of Bronx ACCESS interactive systems before discussing each component in detail.

Functions of the Intervention Delivery System are embodied in a research project to examine the implementation of the individual-level stepped adherence intervention to help women understand mammography guidelines, make screening decisions, and obtain care. This study is specifically designed to identify and confirm individual and interpersonal variables that must be taken into account in adapting and tailoring the stepped adherence intervention. In addition, the Intervention Delivery System includes a Lay Health Advisor Core to select, train, and support LHAs' work in a program that requires continuous evaluation and quality improvement. Key personnel involved with this Core include a faculty-level director and co-director who oversee core activities, a masters-level LHA Core Coordinator who will provide supervision, and a research assistant. Note that we chose to place the LHA Core in the Delivery System (rather than in the Intervention Support System) because the functions of this core are necessary for carrying out the stepped adherence intervention on a day-to-day basis. In contrast, efforts to help each agency sustain the stepped adherence intervention are relegated to the Intervention Support System (discussed below). The Intervention Synthesis and Translation System is represented by a community-level participatory action research project to facilitate the formation of local Cancer Action Councils. Ten councils (one for each participating organization) will be made up of about 8–10 members, including two agency staff, current and former agency
clients, local providers, collaborators from other agencies, and research staff. Each council will consider data on intervention delivery and performance, data on agency challenges, and other sources of information, to guide continued improvement of stepped-intervention performance. These activities are supported by an Intervention Adaptation and Tailoring Core that will provide necessary information and technical support to adapt interventions. This core will also continually update stepped adherence intervention logic models to support the Intervention Delivery System. Intervention Adaptation and Tailoring Core personnel include a faculty-level director and co-director, as well as a national panel of scientific experts in screening and behavioral interventions to provide advice to Councils about intervention adaptation. This Core also includes a doctoral-level coordinator who will oversee on-going development and adaptation of logic models, two research assistants who will assist the coordinator with programming and analytic tasks, and two masters-level field specialists charged with organizing Cancer Action Councils and helping to facilitate communication of scientific information, as well as providing programming and data analytic support. We designed our Intervention Support System to foster institutionalization of the stepped adherence intervention
and sustained community-academic collaboration. Research will examine an organizational-level process consultation intervention to build agency capacity and engage in the problem-solving needed to transfer and sustain the tailored and optimized stepped adherence intervention, as well as the infrastructure for future research. We also established an Administration and Organizational Management Core to oversee all aspects of this program project, including collaborations with community organizations, communication and data-sharing among projects and cores, and coordination of timelines and activities across different studies and cores. In addition to administrative staff, the Intervention Support System includes percent effort from four faculty members as well as two full-time doctoral-level staff members who will track problems and issues that community agencies experience over the course of program implementation and provide process consultation to help agencies develop capacities necessary to resolve these issues. Our fourth research project is designed to interact with policy and funding macro-systems, as anticipated by the ISF. This project will develop policy recommendations, formed through participatory modeling of CDT decisions, resources, and population-level effects, drawing upon public health policy makers, researchers, service providers and advocates, as well as stakeholders and researchers working in our three interactive systems. Staff members include two faculty investigators plus research analysts and programmers. In the following sections of this paper, we will discuss how we have designed each of these systems to incorporate the continuous quality improvement logic of the CDT paradigm. In each instance, we have had to make specific provisions for the complex and synergistic nature of changes that can occur over time and across levels of analysis.
As we shall discuss, conducting research in this context opens the door to questions that cannot be addressed in more static and top-down models of research.

The Bronx ACCESS Intervention Delivery System

In public health research, the usual assumption is that the best way to understand individual-level interventions is through a randomized controlled trial. However, this approach is wholly inadequate as a paradigm for dissemination research. Traditional trials to establish gross evidence of intervention efficacy are designed to statistically control many sources of variance to isolate the effect of a specific intervention. In dissemination using CDT, by contrast, it is essential to observe how an intervention performs when offered to a highly heterogeneous (but typical) population across a variety of different service settings. By capturing data on intervention
performance in context, it is possible to speak to issues of external validity, specifically, which individual and contextual variables affect performance of the intervention, and which modifications to the intervention make the most difference in different situations. The Bronx ACCESS Intervention Delivery System is designed to iteratively identify and confirm psychosocial variables that must be taken into account to inform continuous improvements in the reach, effectiveness and efficiency of a stepped evidence-based intervention to promote adherence to mammography guidelines among diverse inner-city women and community service settings. Our plan is to implement the initial stepped adherence intervention as described above for a six-month period in each of 10 community organizations. Although we will make preliminary modifications with our collaborating community organizations to ensure feasibility and standardization, our starting point represents a combination of "off-the-shelf" strategies woven into a practical continuum for delivery to a population. During the initial six months, we project the Delivery System will reach approximately 600 women per agency (Step 0), will enroll about 150 women to receive reminder prompts, and will offer more intensive services (Steps 2 and 3) to about 100 women. This six-month period represents a "baseline" period to monitor performance of the stepped intervention, with each site maintaining high fidelity to the same evidence-based protocol (and thus analogous to static dissemination in a top-down, phase IV trial). After six months, we will begin systematic efforts to improve intervention performance, in conjunction with the Intervention Synthesis and Translation System (discussed below). Key indicators of performance that we will examine include extent of reach, efficacy in helping women adhere to mammography guidelines, and efficiency in terms of number of intervention steps and amount of contact with the LHA required.
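The three performance indicators named above could be tabulated per site along the following lines. The record fields are our own illustrative assumptions, not the study's measures; the 600-women denominator echoes the projected per-agency reach during the baseline period.

```python
# Sketch of per-site performance indicators: reach (fraction of the
# eligible population contacted), efficacy (adherence achieved among
# women reached), and efficiency (LHA contacts per adherent woman).
# Record fields are hypothetical.

def site_performance(records, eligible_population):
    reached = len(records)
    adherent = sum(1 for r in records if r["adherent"])
    contacts = sum(r["lha_contacts"] for r in records)
    return {
        "reach": reached / eligible_population,
        "efficacy": adherent / reached if reached else 0.0,
        "efficiency": contacts / adherent if adherent else float("inf"),
    }

# Hypothetical encounter records for one agency
records = [
    {"adherent": True, "lha_contacts": 2},
    {"adherent": True, "lha_contacts": 6},
    {"adherent": False, "lha_contacts": 1},
]
metrics = site_performance(records, eligible_population=600)
```

Tracking these indicators per site, per step, and per baseline-versus-improvement period is what would let decision-makers judge whether an embedded experiment improved performance.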
In addition to performance measures, data gathered by our Intervention Delivery System will allow us to address the role of individual psychosocial differences in intervention outcomes. The NCI recommends adapting the following intervention components: theoretical or conceptual framework, complexity of content, fit with community resources, channels of information transmission, terminology/language and fit with the audience's culture (National Cancer Institute 2010). The concept of adaptation is similar to that of tailoring on the individual level and targeting on the group level. In tailoring, communication is intended to reach one specific individual, based on specific characteristics of that person that have been formally assessed. In targeting, communication is intended to reach a population subgroup based on characteristics presumed to be shared by the group's members (Kreuter and Wray 2003). It is typically expected that adaptation of an intervention strategy

through tailoring will directly optimize its effect on desired outcomes as well as mediators of those outcomes. In order to support intervention adaptation, targeting and tailoring, we will gather data on individual characteristics such as age, race and ethnicity, as well as specific barriers that may impinge on adherence. Intervention adaptation and tailoring require an understanding of mechanisms involved in promoting desired outcomes. Street and Epstein (2008) suggest that key interpersonal communication mechanisms mediate outcomes of interventions such as ours, including the ability to place health communications in the broader context of individuals' lives, to demonstrate empathy and understanding of individuals' emotions, and to empower individuals by maximizing access to resources and opportunities for self-management. Thus, we will also gather data on intervention recipients' perceptions of the quality of group communication and mailings (at earlier steps) and interpersonal exchanges with LHAs (at more intensive steps). Beyond women's perceptions of the quality of communication in our intervention, we are also concerned about how individuals' medical mistrust affects responses to our stepped adherence intervention. Medical mistrust has been reported to be associated with non-adherence to mammography recommendations among racially diverse women (Thompson et al. 2004; Cronan et al. 2008). Screening decisions may be influenced by a woman's past interactions with a specific healthcare provider or setting; her expectations or beliefs about future interactions; her beliefs about the motives of these providers and settings; her perceptions of the risk or potential harm resulting from mammograms; and her sense of vulnerability given the risk of negative outcomes.
Given the many possible aspects of trust that can affect screening, Bronx ACCESS will adopt a social-ecological assessment model, including women's perceptions of public policy, controversies about screening, and the quality of local health resources as well as interpersonal aspects of care (Schulz et al. 2002). We hypothesize that the reach, efficacy and efficiency of our stepped adherence intervention will be directly related to our ability to resolve barriers related to mistrust. It is important to note that resolution of mistrust does not necessarily mean reduction of mistrust. Rather, mistrust may be resolved or overcome by the introduction of third-party mediation (e.g., the LHA), the structuring of incentives to encourage cooperation and trustworthy behavior, or the creation of a transparent system to monitor the mistrusted (Larson 2004). In any event, modifications to our stepped adherence intervention must foster better interpersonal communication and resolution of medical mistrust. Given the importance of communication and trust, the success of the stepped adherence intervention clearly depends upon LHAs' ability to effectively carry out their

role. Thus, our Intervention Delivery System includes considerable resources to support these staff. LHAs are typically peers who share cultural, linguistic, social, and economic backgrounds with program participants and who have an established relationship with the community they serve (Eng et al. 1997; Rhodes et al. 2007). Important characteristics include demographic similarity to the target population and the ability to maintain close, supportive, reciprocal connections with others (Earp et al. 1997). As recommended in the literature (Jackson and Parks 1997), community leaders will be involved in the process of establishing LHA criteria and in the recruitment and selection of candidates. Our LHA Core will provide a centralized infrastructure to bring LHAs working at different sites together for training, supervision, and problem-solving. A particular concern that arises in the CDT paradigm is that we are asking intervention staff to function under conditions of continuous quality improvement. LHAs' job performance, fidelity to the intervention protocol, and observed outcomes of the stepped intervention will necessarily be under considerable scrutiny. In the best case, this may be a very positive aspect of the LHAs' role (Kahan and Goodstadt 1999). They will receive considerable feedback about the results of their work and support in figuring out ways to improve the intervention. In the worst case, LHAs may experience stress associated with frequent monitoring of their activity and results and subsequent modification of stepped intervention strategies. LHAs may also feel responsible if the intervention does not work as intended, especially if modifications that they suggested or helped to develop are not successful. Given the central importance of building continuous quality improvement procedures into dissemination of evidence-based interventions, devising ways to support delivery staff takes on additional weight.
The LHA Core Coordinator and faculty will provide training to directly support LHAs' ability to engage in quality improvement efforts. Core Faculty will develop a curriculum to provide the LHAs with the skills needed to collect intervention tracking indicators, examine data, reflect on the meaning of the information shared in the CQI technical assistance sessions, and present findings and their own recommendations. The training curriculum will cover completing tracking forms and logs; reading and interpreting data tables, charts, and figures; using discussion guides to structure group discussions; and public speaking, giving presentations, and participating in professional meetings. Following initial training, the Intervention Delivery System will conduct monthly sessions with LHAs to prepare them to provide feedback to the Intervention Synthesis and Translation System regarding experiences in implementing the intervention and their perceptions of intervention effectiveness and performance.


Through individual and group supervision, the LHA Core Coordinator will also ensure that LHAs are able to deliver the stepped adherence intervention with high fidelity. However, in the context of quality improvement, criteria for fidelity will necessarily change throughout the dissemination process, as we continually adapt the stepped adherence intervention. Further, problems with fidelity might indicate a need for modifications to the stepped adherence intervention rather than inadequate staff performance. In order to support LHAs’ ability to maintain fidelity, they will be trained to use laptop computers programmed with the most up-to-date intervention logic model for their site. Revision and programming of logic models will be carried out by technical staff working in the Intervention Synthesis and Translation System, as we shall describe below. These laptops will automatically track the sequence and amount of time spent on specific activities, which will inform our evaluation of fidelity and resources required. Fidelity will also be self-assessed by LHAs themselves, using brief forms to be completed after each outreach event and each encounter with individual clients. Project field specialists working within the Intervention Synthesis and Translation System will also periodically observe the LHAs’ work at each site, particularly after new experimental procedures are in place. Timely exchange of information about intervention performance is critical to the conduct of a CDT. This means that it is important to automate data collection and minimize burden for staff as well as participants. In order to achieve this, we plan to gather all data in the field as an intrinsic part of conducting the stepped intervention. 
At the earliest step of the intervention, participants signing in to outreach programs and events will provide minimal contact information, the approximate date of their most recent mammogram, plus an indication of their interest in further information on mammography and breast health. Individuals who receive mail reminders will be contacted by telephone to determine whether they were able to obtain needed information and/or screening. For those progressing to later steps of the intervention, structured data will be gathered as an integral part of the LHAs' interaction with each participant. For example, LHAs will necessarily assess relevant health history, knowledge and beliefs, and barriers to care. For efficiency, some information may also be captured using automated, computer-assisted audio survey interviews prior to meeting with the LHA. All data collection will be programmed into the intervention script that LHAs will follow using a laptop or tablet computer. Data gathered at point of service are essential to ensure that participants receive the proper intervention components, and they are also critical for quality improvement, in order to understand who is benefiting from the intervention and who is not. We will also ask women for permission to

contact providers, so that additional data to verify medical appointments and screening exams can be gathered by research assistants. Data entered in the field will be automatically uploaded and stored in our project database, for analysis by research staff working within the Intervention Synthesis and Translation System as discussed below.
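The automatic tracking of activity sequence and time described above could be structured along these lines; this is a hypothetical sketch (class and field names are ours), not the project's actual software:

```python
import time
from contextlib import contextmanager
from dataclasses import dataclass, field
from typing import List

@dataclass
class ActivityLog:
    """Timed record of one intervention activity within an encounter."""
    activity: str
    seconds: float

@dataclass
class Encounter:
    """Point-of-service record for one participant contact."""
    participant_id: str
    step: int
    activities: List[ActivityLog] = field(default_factory=list)

    @contextmanager
    def timed(self, activity: str):
        """Log how long one scripted activity takes, in encounter order."""
        start = time.monotonic()
        try:
            yield
        finally:
            self.activities.append(
                ActivityLog(activity, time.monotonic() - start))
```

An LHA script step would wrap each activity, e.g. `with enc.timed("barriers_assessment"): ...`, and the accumulated log would be uploaded with the encounter record for fidelity and resource analyses.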

The Bronx ACCESS Intervention Synthesis and Translation System

Despite the central importance of modifying evidence-based strategies to fit local preferences, circumstances, and needs, there is relatively little research on this process and a lack of tested models to guide adaptation of public health interventions (Breslau et al. 2010). Intervention adaptation takes on central importance in the CDT paradigm. Continuous modification and enhancement is the key to achieving greater effectiveness, efficiency and sustainability of an evidence-based intervention. Moreover, observing when, where and how interventions need to be tweaked, combined or re-imagined to improve performance provides critical data for advancing the science of community intervention. The best way to gain this sort of understanding is to systematically experiment in specific places with specific people, to determine what works. Rather than strive for homogeneity through the imposition of arbitrary exclusion criteria and controls, dissemination research can only advance by maximizing heterogeneity of populations, settings, and variations in strategies. However, it is not sufficient to simply expand inclusion criteria or loosen fidelity criteria. Rather, we need to be able to describe the characteristics of interventions, delivery processes and settings, and recipients as fully as possible, to make sense of what happens in dissemination. In Bronx ACCESS, the Intervention Synthesis and Translation System is charged with using intervention performance data in real time to guide improvements to the intervention, examine the impact of those changes, and capture lessons learned to inform future decisions about intervention dissemination. In the CDT paradigm, modifications to interventions are made through "embedded experiments." The general idea is that decision-makers examine trends in intervention performance.
Necessary data on intervention process and performance may be generated by line staff in the process of carrying out the intervention, under the auspices of the Intervention Delivery System, and/or gathered by research staff outside the context of intervention delivery. Drawing upon a combination of exploratory data analysis plus their own grounded understanding of local populations and settings, decision-makers generate hypotheses about changes that might be made to improve performance.

Experiments can then be introduced as provisional changes in the intervention logic model, for implementation by the intervention line staff. Experiments must be designed so their impact is evident in routine tracking data. Depending upon the hypotheses under consideration, embedded experiments may use a wide variety of designs. Experiments may be examined by looking at their effects upon performance indicators. In many instances, non-parametric runs tests, time-series analyses, or reversal designs (which turn the modification on, off and on again) would be sufficient to answer questions of interest. However, when decision-makers are faced with competing hypotheses about what changes to make, they may choose to introduce an embedded randomized controlled trial. This is particularly important when different decision-makers are in disagreement about how to improve intervention performance. A local RCT can help to resolve impasses in a manner that stakeholders perceive as fair and transparent. Note that although the RCT is not an appropriate overarching paradigm for dissemination research because it obscures diversity and ignores external validity, local use of small-scale controlled experiments to address highly focused questions within a circumscribed and well-characterized context is highly appropriate. Thus, in CDT, knowledge builds from the juxtaposition of results from multiple embedded experiments in diverse settings, rather than from the aggregate test of a single monolithic effect that is explicitly intended to ignore context. Data collection for individual embedded experiments may take weeks or months, depending upon the design of the experiment and the number of eligible individuals. Experiments that involve all intervention participants could be completed more quickly than those targeting specific sub-groups. However, use of clinically significant effects and non-parametric tests will make it possible to complete experiments relatively quickly.
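As an illustration of the simpler designs mentioned, a Wald-Wolfowitz runs test can flag non-random clustering of successes and failures in a tracking series, such as adherence outcomes before and after a modification. The sketch below uses the standard normal approximation and is illustrative only, not code from the project:

```python
import math
from typing import List

def runs_test_z(outcomes: List[bool]) -> float:
    """Wald-Wolfowitz runs test (normal approximation).

    Returns the z statistic for the number of runs in a binary sequence;
    |z| > 1.96 suggests the successes and failures are not randomly
    ordered, e.g. clustered after an intervention change.
    """
    n1 = sum(outcomes)          # successes
    n2 = len(outcomes) - n1     # failures
    if n1 == 0 or n2 == 0:
        raise ValueError("need both outcome types for a runs test")
    # A run is a maximal block of identical outcomes; count transitions
    runs = 1 + sum(a != b for a, b in zip(outcomes, outcomes[1:]))
    n = n1 + n2
    mean = 2 * n1 * n2 / n + 1
    var = 2 * n1 * n2 * (2 * n1 * n2 - n) / (n * n * (n - 1))
    return (runs - mean) / math.sqrt(var)
```

A sequence of mostly failures followed by mostly successes produces few runs and a large negative z, consistent with a change in performance after the modification.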
Hypothesis tests within local agencies will be designed in accord with principles of continuous quality improvement and small-scale trials, consistent with the goal of estimating the local magnitude of effect but not of generalizing to a larger, theoretical population (Rapkin and Trickett 2005; Epstein et al. 2010). Experiments that are successful will be instituted as formal changes in the intervention logic model at a particular setting. Since data collection is an ongoing part of program delivery, it is natural to continuously monitor, confirm and even replicate experiments. Experiments that are unsuccessful may be subject to further data collection, modification or rejection by decision-makers. In Bronx ACCESS, each community agency's Cancer Action Council is charged with devising and evaluating embedded experiments, with the support of Intervention Adaptation and Tailoring Core scientists and field specialists. The field specialists will work with our community

agency partners to form Cancer Action Councils at the ten organizations involved with our project. Efforts to adapt interventions can benefit from participation of representatives of different constituencies and organizations that stand to gain from dissemination (Barton-Villagrana et al. 2002; Miller and Shinn 2005; Yoshikawa et al. 2003). Representation can be best achieved through the development of community-based partnerships composed of stakeholders that understand the populations in need of services (Speer and Zippay 2005; Chinman et al. 2005; Foster-Fishman et al. 2001; Horowitz et al. 2009; Israel et al. 2001). Thus, Cancer Action Councils will include agency clients and other community members, LHAs, agency representatives and researchers. Cancer Action Councils will conduct embedded experiments to improve local performance of the stepped adherence intervention. In order to learn how to formulate these experiments, Cancer Action Council members will be trained and guided by faculty and field specialists in the use of a systematic problem-solving approach to formulate ways to enhance the stepped adherence intervention (Rapkin et al. 2006). A well-functioning partnership that involves invested community members has the greatest potential to identify or devise optimal solutions, taking into account local history, concerns and available resources (Horowitz et al. 2009; Schensul 2009). As part of our research on the dissemination process, we will examine how these partnerships identify opportunities to enhance the intervention, how they make decisions about these changes, and how these changes are related to trends in intervention reach, efficacy and efficiency. Our approach will allow us to test the hypothesis that community partnerships' observed use of problem-solving strategies will be related to partnership effectiveness, and will contribute to improved intervention performance (Weiss et al.
2010; Lasker and Weiss 2003a, b; Weiss et al. 2002; Butterfoss 2006; El Ansari 2003; Schulz et al. 2002). As important as it is to elicit informed local perspectives about ways to improve intervention performance, it is also necessary to provide Cancer Action Councils with scientific support. Master's-level field specialists will work with each Cancer Action Council to provide technical assistance and serve as the communication conduit between the Councils, scientific advisors and other Bronx ACCESS systems with respect to interpretation of data, formulation of hypotheses, and design and interpretation of embedded experiments. Field specialists will be charged with orienting and training Cancer Action Council members to make use of data. We expect wide variation in Council members' familiarity and comfort in using quantitative information. Thus, we will adopt a number of strategies to make it easier for members to discuss data, including standard graphic presentation formats to represent change over time,

participatory interpretation of results, and use of illustrative narrative based on the experiences of LHAs, agency staff and others. At every meeting, we will also elicit Council members' feedback about how to enhance and clarify data presentations. Preparing research findings to support local community planning is a critical role of the Intervention Synthesis and Translation System in a quality improvement context. Scientists must digest intervention performance data and present it in forms that are accessible and understandable to end users. Throughout the process of implementation, the Intervention Adaptation and Tailoring Core Director and research staff will prepare a variety of straightforward graphs, maps and charts to highlight key trends in intervention performance for each Cancer Action Council. Data will address who is being reached and how that compares to the organization's neighborhood and client base, who is participating in the intervention, and how they are responding to each step. Cancer Action Councils will also receive data on the number of intervention steps and the amount of time required for different recipients as well as measures of LHAs' ability to maintain fidelity with the current iteration of the intervention logic model. At every meeting, Cancer Action Councils will review these data on intervention performance. Discussion of these data will focus on noteworthy trends in outreach or intervention performance, areas where LHAs are having difficulty with fidelity or spending inordinate time, and sub-groups of women or settings that are not benefiting from the intervention. Council members will be asked to prioritize concerns and opportunities, and to state one or several hypotheses about what needs to happen to address these priorities. Based on their hypotheses, we will ask members to brainstorm specific changes to evidence-based strategies.
In order to support formulation of strategies, field specialists will facilitate Councils' access to relevant, up-to-date scientific literature. We also expect that there will be instances when Councils wish to invent novel strategies or modify interventions in ways that have not been previously examined in the literature. Thus, we will assemble a national panel of experts to address Councils' questions and ideas on interventions to improve screening and promote access to care. Evolution of the stepped intervention need not be bound by initial intervention content or theory. What matters is what can be demonstrated to work. This is not to suggest that theory is unimportant. Whatever modifications they propose, Cancer Action Councils must be able to specify a hypothesis (or competing hypotheses) that they want to test. Systematic examination of locally generated hypotheses will inform further development of theory. In order to test their hypotheses about intervention modifications, Intervention

Adaptation and Tailoring Core investigators and external scientific advisors will help Councils to determine appropriate experimental designs and procedures, as well as the statistical power needed for their experiments. In order to implement experiments, the Intervention Adaptation and Tailoring Core director and research staff will program changes into LHAs' laptops to incorporate provisional (that is, experimental) changes to the logic model. If an experiment is successful, changes to the logic model will be finalized. Intervention Adaptation and Tailoring Core faculty and staff will also work with Intervention Delivery System faculty and coordinator to establish revised criteria for fidelity, and to modify LHA training accordingly. One additional function planned for the Intervention Synthesis and Translation System is the development of a coding system to describe the nature of modifications to the stepped adherence intervention. Systems to code the components and features of behavioral interventions would have great utility for many studies of dissemination and comparative effectiveness. In order to understand how to optimize interventions for different individuals and settings, we must know what is being delivered. The Intervention Adaptation and Tailoring Core coordinator and faculty will develop an intervention coding system based on input from Cancer Action Councils and the scientific panel. We expect that this coding system will capture aspects of informational content, modality (individual, group), format (in-person, written, video, etc.), interactivity, and flexibility for tailoring, among other features. We will develop a coding manual with clear criteria for characterizing each aspect of the intervention. Ultimately, we want to be able to express all changes to the intervention in terms of features such as these, so that we can better understand the kinds of adaptations required for successful dissemination.
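To make the idea concrete, a coding system along these lines might be represented as structured records. The enumerated values below simply echo the dimensions named in the text (content, modality, format, interactivity, tailoring flexibility); actual codes would come from the Councils and the scientific panel:

```python
from dataclasses import dataclass
from enum import Enum

class Modality(Enum):
    INDIVIDUAL = "individual"
    GROUP = "group"

class Format(Enum):
    IN_PERSON = "in-person"
    WRITTEN = "written"
    VIDEO = "video"

@dataclass(frozen=True)
class InterventionCode:
    """One coded feature-profile of an intervention component."""
    content: str        # informational content covered
    modality: Modality
    format: Format
    interactivity: int  # e.g. 0 = static material ... 3 = fully interactive
    tailorable: bool    # flexibility for individual tailoring

# Hypothetical example: coding a change from a mailed reminder to
# LHA-delivered, in-person counseling on the same content.
before = InterventionCode("screening reminder", Modality.INDIVIDUAL,
                          Format.WRITTEN, interactivity=0, tailorable=False)
after = InterventionCode("screening reminder", Modality.INDIVIDUAL,
                         Format.IN_PERSON, interactivity=3, tailorable=True)

# The modification is then expressible as the set of features that differ:
changed = {f for f in ("content", "modality", "format", "interactivity", "tailorable")
           if getattr(before, f) != getattr(after, f)}
```

Expressing each adaptation as a before/after feature difference is one way to compare modifications across sites, as the text proposes.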

The Bronx ACCESS Intervention Support System

Ultimately, the institutionalization of effective interventions into local delivery systems is the central challenge of dissemination research. Institutionalization is essential for the long-term survival of programs (Goodman and Steckler 1989). However, successful institutionalization of evidence-based interventions cannot be reliably achieved by a simple "hand-off" or exchange of information. As Kanter describes: "it is when the structures surrounding a change also change to support it that we say a change is 'institutionalized'—that it is now part of legitimate, ongoing practice, infused with value and supported by other aspects of the system" (1983, p. 299). We expect that an intervention adapted and optimized by local stakeholders through a deliberative problem-solving process will have the best chance of being institutionalized in a given setting.

However, even adaptation of an intervention does not guarantee that it can or will be sustained by the local community. According to Durlak and DuPre (2008), implementation often deteriorates over time. Regardless of a program's success during a demonstration period, relatively few interventions are ultimately institutionalized by organizations (Goodman et al. 1993; Rogers 2003; Schensul 2009). Building upon earlier work by Goodman and Steckler (1989, 1990) as well as Wandersman et al. (2000), the CDT paradigm is designed to foster institutionalization of evidence-based practices. Rather than viewing institutionalization or maintenance as taking place after an intervention has been implemented, CDT attempts to address sustainability from the outset. Implementation, adaptation and institutionalization are each components of the dissemination process. As discussed above, CDT requires repeated cycles of implementation and deliberation in order to adapt and optimize an intervention. The time needed to accomplish these local intervention quality improvement efforts also provides an ideal "incubation" period to determine the organizational and community-level challenges that must be addressed to achieve successful institutionalization. We are particularly interested in facilitating organizations' responses to potentially disruptive events and to positive opportunities that become evident. Thus, our organizational-level assessment uses a prospective, ecological approach based on Barker's work on the assessment of episodes (cf. Barker and Shedd Barker 1963) and event history analysis (Allison 1984; Rapkin et al. 2008).
Episodes of interest include any problems with the stepped adherence intervention itself, effects of the stepped adherence intervention on other agency programs, changes in staff and leadership, changes in overall agency funding, disruptions in the local health care delivery systems such as hospital closures, changes in screening recommendations, and policy changes that significantly reduced local discretionary funds that for many years had been used to supplement the New York State Department of Health's Cancer Services Program (the program that administers the CDC's Breast and Cervical Cancer Early Detection Program in our state). We are also interested in opportunities that may arise, such as increased demand for the intervention, the emergence of new funding sources, requests to expand program scope, or invitations to partner with other programs and organizations. Disruptions and opportunities such as these are commonplace in real-world service delivery. As such, the science and practice of dissemination must make provisions to take such events into account. From the perspective of CDT, these events can serve as "teachable moments" in a community-academic collaboration, to learn how best to

overcome obstacles and make organization-level changes necessary for institutionalization. The Bronx ACCESS project is designed to foster measurement of sustainability in several ways. Of course, we are interested in how the organization incorporates the stepped intervention. An agency may ultimately choose to carry out all of the steps, or to incorporate specific components (for example, mailed reminders or assistance for women who face significant barriers to care). Agencies may choose to reach all eligible women that they serve or focus on specific sub-groups such as new immigrants or those with chronic mental health problems. Agencies that opt to specialize in some aspect of the stepped intervention may need to arrange cross-referral with other agencies to ensure that all women are adequately served. In incorporating the intervention, we expect that many of the agencies will seek to hire the LHA with whom they have been working. Alternatively, agencies may decide to combine LHA functions into other existing staff roles. In either case, bringing the functions of the Lay Health Advisor into the organization is in some ways emblematic of the whole dissemination process, as outside knowledge, skills and technologies become part of the host organizations. The ability to achieve these milestones of institutionalization depends on the availability and mobilization of necessary resources and capacities. A central role of the Intervention Support System is providing technical assistance to the community organizations charged with assuming responsibility for an intervention (Durlak and DuPre 2008; Wandersman et al. 2008; Wallerstein et al. 2005). As Durlak and DuPre note, "The goals of technical assistance are to maintain providers' motivation and commitment, improve their skills levels where needed, and support local problem-solving efforts.
Depending on the situation, technical assistance may include some combination of re-training of initial providers, training of new staff, and providing emotional support" (2008, pp. 338–339). Consistent with these goals, doctoral-level personnel working with the Bronx ACCESS Intervention Support System will provide technical assistance through process consultation. Process consultation has been used effectively in business settings (Schein 1969, 1987) and in schools to improve teachers' capacity to address student needs and draw on each other for support (Farouk 2004; Hanko 1999; Jason and Ferone 1978). It is defined as "a set of activities on the part of the consultant that help the client to perceive, understand, and act upon the process events that occur in the client's environment" (Schein 1987, p. 34). Process consultation entails collaborative problem solving that is based on openness of communication and the development of mutual trust, leadership and authority (Schein 1999; Lambrechts et al. 2009). We selected process consultation from a variety of approaches

(Lambrechts et al. 2009) because of its emphasis on flexible problem solving, given the wide variety of disruptive events and opportunities that could affect participating organizations. Process consultation emphasizes engaging in a joint activity in which the consultant and organizational client each have a contribution and a stake in the encounter (Lambrechts et al. 2009), yet the problem and problem resolution are ultimately owned by the organization. Through process consultation, organization participants develop insight, knowledge, and skills in the process that can be utilized to achieve immediate aims and resolve future problems (Lambrechts et al. 2009; Schein 1969, 1987). Triggers for process consultation may be identified by agency leadership or emerge from the work of Bronx ACCESS' other interacting systems, such as problems with training or deployment of LHAs (in the Intervention Delivery System) or conflicts involving Cancer Action Councils' recommendations for improving the stepped intervention (in the Intervention Synthesis and Translation System). Periodic interviews with key organizational informants may also flag issues for consultation. In addition to these built-in ways in which the ISF will support proactive offers of consultation, process consultation may also be initiated at the request of the organization. The use of process consultation to facilitate the institutionalization of an evidence-based stepped adherence intervention is consistent with empirically-based recommendations that university partners should ideally play a facilitative role that enhances what organizations do without diminishing any of their own initiative and mission, and that supports informed decision making and problem solving (Umemoto et al. 2009). Process consultation is designed to promote organizational changes necessary to institutionalize the stepped adherence intervention and to encourage continued involvement in collaborative research.
However, it is also essential to address ‘‘here and now’’ issues in relationships among organizations involved in Bronx ACCESS, including the relationship of community collaborators to the research team and our academic settings. Thus, senior faculty investigators and administrative support staff will work closely with all other investigators, staff, and consultants involved in Bronx ACCESS, as well as the 10 collaborating agencies, to facilitate and coordinate communication, to establish and maintain a culture of accountability and commitment, and to create a community-wide partnership for sustaining cancer prevention and control dissemination research. Senior faculty leaders are charged with monitoring and synchronizing efforts of Bronx ACCESS components, and will be responsible for establishing transparent structures and procedures for managing the relationships with the 10 community
agencies as well as the identification and resolution of problems as they arise. In order to foster effective functioning of the collaboration, the senior faculty will establish Memoranda of Understanding with the leadership of each participating organization, embodying principles of transparency, mutual benefit, effective communication and shared authority (Lounsbury et al. 2007). In order to study our own collaborative capacity, we also plan to closely monitor process consultation provided in support of dissemination. As the project progresses, our administrative team will also maintain a record of collaborative decisions to create a web-based manual of Standard Operating Procedures. Formalized rules, roles, and procedures provide a critical foundation for the maintenance of collaborative efforts of this scope (Wandersman et al. 1997; Woulfe et al. 2010; Zuckerman 1995). These guidelines will explicitly foster inclusiveness, respect, and effective problem resolution, recognizing the importance of taking into account the diversity of perspectives and organizational cultures implicit in a study of this nature (Alter and Hage 1993; Fonner 1998; McKinney et al. 1993). Ultimately, these ‘‘administrative’’ guidelines will be an important product of Bronx ACCESS, providing a template to help outside organizations to replicate our interactive systems and extend CDT procedures. In moving programs from the research incubator into routine practice, the ultimate challenge to sustainability and institutionalization is to ensure on-going funding and/ or in-kind resources. Thus, an explicit goal of our collaborative process consultation and administrative oversight with agencies is to identify and obtain needed resources. We will monitor and support agencies’ efforts to raise funds for institutionalizing the stepped intervention. 
We will determine whether and how the Intervention Systems that we have created should be sustained to expand Bronx ACCESS functions such as on-going Cancer Action Council activities and organizational process consultation. We also plan to maintain the capacity for community organizations to collaborate with our universities on this and other public health initiatives.

Synthesizing Results Across Interactive Systems

The Bronx ACCESS CDT uses intervention approaches and research methods specifically intended to orchestrate dissemination, based on the rationale outlined above. Our primary goal is to learn how to best orchestrate the process of quality improvement to maximize the benefits of dissemination. As Green and Glasgow (2006) remind us, dissemination research is about determining best processes. Our emphasis on continually tailoring and making site-
specific adaptations to the stepped adherence intervention throughout the study leads to different ways of thinking about intervention design, delivery, and analysis. In intervention research, community psychologists and most other public health researchers have long been trained to believe that standardization is equivalent to rigor and that diversity equals noise (or error). We must move past these untenable simplifying assumptions if we hope to understand dissemination of interventions in the real world. Perhaps paradoxically, we must depart from this problematic understanding of "rigor" by strategically creating opportunities to observe how different people experience potentially advantageous changes to interventions, so that we obtain the descriptive information we need to understand intervention dissemination and adaptation. Thus, in Bronx ACCESS we plan to analyze our data by embedding every person within a specific intervention step, at a given site, at a certain point in time. In terms of statistical analysis, this provides a nesting structure amenable to hierarchical linear modeling. Rather than treating "intervention" as a simple categorical variable (i.e., experimental versus control), our analysis must characterize the intervention components actually received by each individual. This can be accomplished by using a vector of variables indicating key intervention components, based on the logic model in use at a given site at a given individual's time of participation in the study. In order to describe intervention components, strategies will be specified using codes developed by the director and staff of the Intervention Adaptation and Tailoring Core component of the Intervention Synthesis and Translation System. Sites will also be described by Intervention Support System measures of changes in intervention institutionalization, readiness and capacity gathered as part of process consultation.
Individual variables obtained by the Intervention Delivery System through assessments routinely conducted by LHAs are also available, so we can examine multi-level main effects as well as within and cross-level interactions among individual, intervention and site characteristics. Many candidate variables will fall out of the analysis due to little variation or lack of effect. Those that emerge are the most important variables for dissemination and comparative effectiveness.
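To make the analytic setup concrete, the following sketch fits a linear mixed model in which women are nested within sites and the "intervention" enters as a vector of component indicators rather than a single treatment flag. The data, variable names, and effect sizes are entirely hypothetical, invented for illustration; they are not drawn from the Bronx ACCESS database or coding scheme.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_sites, n_per_site = 10, 60

# One row per woman, nested within site. The intervention is described as
# a vector of component indicators (here randomized per record for
# simplicity; in the design described above they would vary by site and
# time according to the logic model in use).
df = pd.DataFrame({
    "site": np.repeat(np.arange(n_sites), n_per_site),
    "comp_reminder": rng.integers(0, 2, n_sites * n_per_site),    # reminder calls on/off
    "comp_navigation": rng.integers(0, 2, n_sites * n_per_site),  # LHA navigation on/off
})
site_effect = rng.normal(0, 0.05, n_sites)[df["site"]]
p = 0.4 + 0.10 * df["comp_reminder"] + 0.15 * df["comp_navigation"] + site_effect
df["screened"] = rng.binomial(1, p.clip(0, 1))

# Linear probability model with a random intercept per site -- a sketch;
# a multilevel logistic model would be the fuller analysis.
model = smf.mixedlm("screened ~ comp_reminder + comp_navigation",
                    data=df, groups=df["site"]).fit()
print(model.params.round(3))
```

Cross-level interactions (e.g., component indicators by site readiness measures) would be added as product terms in the same formula.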

Linking the Interactive Systems to Planning and Policy

We anticipate that the direct population benefits of Bronx ACCESS will include more appropriate use of screening, greater detection of early-stage cancers, fewer false positives, greater access to care, and greater decisional satisfaction. We hypothesize that these outcomes will emerge as we adapt, refine and institutionalize our stepped
adherence intervention to better serve diverse communities of women, who have unique needs and concerns, who live and work in diverse community settings, and who are served by different health care systems. Beyond achieving these benefits for the communities and organizations served by our program, our larger goal is to advance understanding of dissemination as an iterative, cyclical process of implementation, adaptation and institutionalization. The CDT paradigm is designed to offer an increasingly nuanced understanding of the types of adaptations and resources necessary to optimize and sustain an evidence-based intervention in different settings. In order to capture and integrate these lessons as they are learned, we will use simulation modeling techniques to assess the interdependent, collective impact of the interactive systems involved in Bronx ACCESS. Simulation modeling draws upon statistical findings, qualitative data, key informants and other sources of information to develop algorithms that can help us to understand what is happening in a dissemination process (Mabry et al. 2008). It is important to emphasize that simulation modeling is not merely statistical analysis by another name. Rather, models emerge from an evolving understanding of what is happening in the process of intervention implementation, adaptation and institutionalization. Models seek to capture and synthesize multiple perspectives about processes of change in individual behavior and organizational capacity, and to extrapolate the implications of those changes. Simulation algorithms represent complex interactions among multiple variables at multiple levels. By inputting different values for these variables, simulation modeling may be used to forecast the results of interventions and policy changes under different scenarios. Two complementary modeling approaches—system dynamics modeling and stochastic microsimulation modeling—will be applied in this project.
We have selected these two approaches because together, they can address key questions policy makers will necessarily have about dissemination of evidence-based interventions: How do you do it? And, what difference will it make? System dynamics modeling is a methodology to elucidate the connections, or ‘‘feedback structures,’’ among factors that influence a given problem (Forrester 1971; Sterman 1989; Richardson 1991; Levine and Fitzgerald 1992; Andrews et al. 2001; Kiefe et al. 2001). System dynamics models have recently gained considerable attention as a way to represent and examine complex processes in public health interventions (Albarracin et al. 2010; Homer and Hirsch 2006; Hirsch et al. 2010; Homer et al. 2004; Latkin et al. 2010; Lounsbury and Mitchell 2009; Mabry et al. 2008). This integrative approach provides an ideal way to represent the interplay of intervention
modifications, progress toward institutionalization and trends in intervention performance over time. We will begin the Bronx ACCESS project by developing a preliminary system dynamics model of factors that shape women's adherence to guidelines, which we can share with all project personnel, Cancer Action Councils, community agencies and outside scientists involved with the project. This preliminary model will be based on perspectives from the scientific literature, national experts in women's health and screening, and input from the key actors in all three interactive systems, including the LHAs, Cancer Action Councils, key representatives and leaders of collaborating agencies, Field Specialists and Process Consultants. This initial model will be continually refined, tested against data from the field, and reviewed by the LHAs, Cancer Action Councils and agency leaders throughout the project period. We will ask all of these individuals to comment on factors that affect how well the stepped intervention performs, and how well it is incorporated and sustained in different settings. The model that emerges from this participatory process will help us to account for different results in different segments of the community, to better anticipate the kinds of enhancements and tailoring needed to improve performance of the stepped adherence intervention in different settings, and to identify the organizational capacities and challenges that affect institutionalization. In short, system dynamics modeling will provide a comprehensive way to answer questions about how to successfully disseminate evidence-based practices to promote screening to different settings serving different populations. As is generally the case with population health studies, it will not be possible to directly observe Bronx ACCESS' effect on long-term outcomes of breast cancer morbidity and mortality within a five-year time span.
However, it is possible to estimate public health benefits over longer time horizons and larger segments of the population. Stochastic microsimulation modeling generates simulated samples of individuals with specified mixtures of characteristics that may affect outcomes (rates of screening, utilization of services, risk factors, and the like). By taking these characteristics into account, it is possible to examine a likely range of outcomes, including extent of illness, mortality and associated costs. Over the past decade, the NCI CISNET program has supported the development of models to examine how changes in screening and treatment have influenced breast cancer mortality (Boer et al. 2004; Clarke et al. 2006; Mandelblatt et al. 2003, 2009; Tosteson et al. 2008). We will adapt the CISNET Spectrum model to examine net improvements in breast cancer outcomes associated with improvement of adherence to screening guidelines and levels of intervention performance achieved by our project. Spectrum stands for Simulating Population Effects of Cancer Control Interventions—Race and Understanding Mortality
(Mandelblatt et al. 2006). Our prospective design will allow us to examine how the extent of intervention adaptation and institutionalization affects parameters in the adapted Spectrum model. As dissemination using CDT progresses, our stepped intervention should lead to ever greater proportions of women receiving timely follow-up and improved access to appropriate treatment, when needed. Spectrum can be used to project how achieving a given level of adherence will affect service use, false-positive rates, stage distribution, morbidity and mortality—all outcomes that are not possible to observe directly in the course of a five-year study. This provides the answer to the question, why does this approach matter? Observing what can be achieved when behavioral strategies are fully disseminated and operating under real world conditions is the best way to determine whether benefits of efforts to promote adherence to mammography screening guidelines outweigh their costs. Applying both modeling approaches will enable us to synthesize multiple sources of data and information so that we can better see ‘the big picture’ regarding intervention effectiveness. Our models will explore various scenarios of interest to dissemination. For example, what if LHAs worked half-time instead of full time? What if we restricted the number of sessions focused on barriers counseling? What if we had chosen not to include women under 50? More complex intervention dynamics can also be explored. For example, what are the results of placing more resources for outreach to more isolated segments of the community, or of using resources to reach only those women with the greatest difficulties in connecting to care? Results of these different ‘‘what-if’’ scenarios can be generated by system dynamics models and then used as input to the Spectrum breast cancer model, to project how choices about dissemination would impact longer-term utilization, morbidity and mortality. 
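The logic of stochastic microsimulation (though not the calibrated CISNET Spectrum model itself, whose parameter structure is far more elaborate) can be illustrated with a deliberately toy example in which every rate is hypothetical: simulated women develop disease and are detected early or late depending on whether they adhere to screening, and two adherence scenarios are compared.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate(adherence, n=100_000):
    """Toy microsimulation with illustrative parameters only (not Spectrum's):
    women who develop cancer are detected early if screened, and otherwise
    have only a small chance of incidental early detection."""
    develops_cancer = rng.random(n) < 0.03       # hypothetical incidence
    screened = rng.random(n) < adherence         # adheres to guidelines
    early = develops_cancer & (screened | (rng.random(n) < 0.25))
    late = develops_cancer & ~early
    return late.sum()

baseline = simulate(adherence=0.55)   # status quo scenario
improved = simulate(adherence=0.80)   # post-dissemination scenario
print(f"late-stage cases per 100,000: {baseline} -> {improved}")
```

A real model would additionally carry each simulated woman's age, risk factors, stage-specific survival and costs, allowing mortality and cost projections over decades rather than a single cross-section.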
This forecasting approach is critical to the study of interactive systems, particularly in respect to the need to evolve interventions through continuous quality improvement. Although it is untenable to expect findings in one place to directly generalize to the next, it should be possible to achieve an ever greater understanding of the conditions and rules that affect dissemination as we gain experience across multiple situations. Experience with a wide array of representative programs will allow us to continually refine and improve both system dynamics and stochastic models. Note that both types of models are open to refinements and extensions based on future research.
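On the system dynamics side, the core idea of feedback among stocks and flows can likewise be sketched minimally. The model below is a toy, with invented rates rather than project estimates: women move from a "non-adherent" stock to an "adherent" stock through LHA outreach, and drift back through attrition (lapsed annual screening), integrated with simple Euler steps.

```python
# Toy stock-and-flow model (hypothetical rates): two stocks, two flows.
def run(outreach_rate=0.04, attrition_rate=0.02, months=60, dt=1.0):
    non_adherent, adherent = 8000.0, 2000.0
    history = []
    for _ in range(int(months / dt)):
        recruit = outreach_rate * non_adherent * dt   # flow into adherence
        lapse = attrition_rate * adherent * dt        # flow out of adherence
        non_adherent += lapse - recruit
        adherent += recruit - lapse
        history.append(adherent)
    return history

traj = run()
print(f"adherent women after 5 years: {traj[-1]:.0f}")
```

"What-if" scenarios of the kind described above amount to rerunning such a model with different parameter values (e.g., halving `outreach_rate` to represent half-time LHAs) and passing the resulting adherence trajectories to the microsimulation as inputs.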

Advancing the Use of the Interactive Systems Framework for Intervention Dissemination

In order to effectively reduce the personal and societal impact of cancer and other chronic diseases, we must learn
how to move evidence-based health promotion interventions into widespread practice. Incorporating dynamic mechanisms for problem solving and quality improvement increases the likelihood that dissemination efforts will lead to lasting benefits for individuals and organizations. The Interactive Systems Framework provides an insightful and adaptable way to organize and structure research of this sort. Based on the work we have done so far to develop the Bronx ACCESS model, we emphasize several principles that are necessary to foster dissemination of research to practice. Most fundamentally, we contend that all dissemination research should incorporate principles of continuous quality improvement. The notion of taking something off the shelf and having it work everywhere (or anywhere) is misguided. Even efforts to "pre-adapt" interventions by tailoring them for specific ethnic communities or other subgroups presume that only one aspect of diversity matters. In contrast, our argument is that it is necessary to engage in the dissemination process with minimal assumptions, rigorous description and an open mind. Methods must be sufficient to modify interventions as we learn about local cultures, service ecologies, and personal and organizational relationships. Researchers have long been socialized to view continuous quality improvement as somehow different from (and inferior to) research, perhaps because quality improvement goals were specific to local settings and not intended to generalize. In fact, dissemination is always local. To be truly rigorous, dissemination research must find ways to glean basic principles from local examples. Understanding local variation in responses to interventions is the hallmark of external validity. Continuous quality improvement provides the necessary raw data for this kind of discovery. The CDT paradigm suggests methods and variables that must be considered in dissemination research using the ISF.
First, dissemination research must be designed to track markers of intervention performance. Rather than thinking about outcomes such as screening or behavior change only at the end of the study, we must monitor whether or not the intervention is working over brief intervals (even on a case-by-case or within-person basis), in a time frame that permits opportunities to steer, refine and rethink the strategy. However, in order to understand how to change the intervention, we must track more than outcome variables. One critical aspect of performance is program reach, measured against the target background population. It is also necessary to measure fidelity to the intended delivery process, as well as mechanistic variables, to know whether the intervention is working as intended. Measures of potential moderators, such as risk factors, level of connection to the local agency or trust in the LHA, or preferences about interventions, are also necessary to understand heterogeneity of responses to the
intervention. Information about costs, staff burden and impact on other agency activities and programs is also key. Although this seems like a lot to measure, much of this information is in fact routinely generated in the course of day-to-day service delivery through outreach, interactions with clients, supervision and the like. Unfortunately, the information is not routinely captured. The ISF Intervention Delivery System must be explicitly tasked with finding ways to track necessary performance indicators in ways that are sustainable and useful to delivery staff and settings as well as researchers. Recent initiatives to expand use of information technology in health care support this approach. Note that the need to capture this kind of performance information pertains to the broader notion of capturing "practice-based" evidence. Just as economists attempt to make sense of multiple, interacting trends over time, dissemination research must pay attention to the interplay of different performance indicators in different contexts. This information is needed to manage what is happening over the course of dissemination. Thus, we view this analytic and interpretive function as intrinsic to the Intervention Synthesis and Translation System. This system is also charged with defining and describing components of interventions in meaningful ways. In intervention research, we have become accustomed to treating multi-session complex curricula as "the" intervention. In fact, there are always many things going on in such programs, even when they are delivered as intended. The Intervention Synthesis and Translation System must have ways to describe what is being synthesized and translated, equivalent to performing a chemical assay to identify the active ingredients of a pharmaceutical compound.
This is particularly important in the context of continuous quality improvement, in order to specify exactly which ingredients of an intervention are being intensified, modified, enhanced or replaced. In both intervention and dissemination research, focus often falls on the results of intervention implementation. The process of intervention development and adaptation is usually presented in sparse detail, and is rarely treated as the focus of research. In any quality improvement effort, the results clearly depend upon the effectiveness of the planning and deliberation process employed to solve problems and make modifications. In Bronx ACCESS, activities to implement and optimize these deliberations are included in the Intervention Synthesis and Translation System. The content of deliberations can be described in terms of the intervention components and ingredients that they consider and reject. The process of deliberation must be measured in terms of the quality of communication among members and the problem-solving process that they use. We can also take into account whether and how well different segments of the community are included in the
deliberative process. Analyses of trends over time can thus take into account how the deliberative process produces changes to intervention content, which in turn should affect performance. As has long been recognized in community-based participatory research, the success of interventions depends on meaningful community input. CDT designs incorporate ways to measure the quality of the deliberative process involved in intervention synthesis. We have adopted the idea of the embedded experiment as the central way that deliberators can make use of data to plan and test changes to interventions. The idea here is that changes can and should be made in ways that are amenable to evaluation. Many organizations are familiar with "plan-do-study-act" cycles, including measuring and examining results. Experiments in local settings may lack direct generalizability to other places, but that is not their purpose. Rather, embedded experiments are meant to inform local decisions. Although not every embedded experiment needs to be an RCT, the possibility of using a controlled test to see whether a given modification works as intended is a powerful tool for local communities. Of course, it is critical that all decision makers understand and buy into this process, which necessarily implies ceding expert authority or privileged community perspective to the experimental result. In the ISF, using the CDT methods proposed here, continued monitoring of changes and replication of experiments is quite feasible, and can add to or detract from confidence in any decision. In line with other investigators using the ISF, we place central emphasis on the development of capacities for conducting and sustaining an intervention. As is implicit in the notion of dissemination, our discussion here treats the intervention as a set of activities that originate from outside of the host organization, with the intent of incorporating those activities within the organization over time.
Thus, the Intervention Support System must take into account capacities of the disseminators (that is, our research team, our relationships, and our respective university resources) as well as the host agencies receiving the intervention. Support to organizations such as process consultation can be described in terms of the nature and timing of problem identification, the success at engaging agency leadership and others in the problem-solving process, the resources and capacities needed to undertake solutions, the nature of solutions considered and their consequences in terms of intervention performance or more effective deliberations to improve the intervention. Over time, effective process consultation should lead to increased capacity at the host organizations to implement, manage and even continue to improve the intervention at hand. Data about the nature and extent of intervention support are needed in order to anticipate and justify the level of effort and resources required to get research into practice.
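An embedded experiment of the plan-do-study-act type described earlier can be analytically very simple. The sketch below uses invented counts for a hypothetical local test of whether evening outreach calls reach more women than the current daytime calls, analyzed with a two-proportion z-test; the scenario and numbers are illustrative, not project data.

```python
from statistics import NormalDist

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided two-proportion z-test using the pooled standard error."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = (pooled * (1 - pooled) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical embedded experiment: daytime calls contacted 62/200 women,
# evening calls contacted 91/200.
z, p = two_proportion_z(62, 200, 91, 200)
print(f"z = {z:.2f}, p = {p:.4f}")
```

The point is not the statistic itself but that the result feeds a local decision: a Cancer Action Council could adopt evening calls, replicate the test at another site, or revisit the question as conditions change.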


As noted above, the CDT uses a variety of analytic and modeling techniques to make full use of data from multiple sources and levels. It is important to understand that the simulation models we recommend do not simply produce results like statistical analyses. Rather, the models themselves are the results. Developing and refining models is a fully collaborative process, consistent with principles of community-based participatory research. Modeling provides an opportunity for on-going dialogue and reflection about what is happening in the CDT and what it means. Models are intended to incorporate lessons learned as we gain greater understanding of principles governing our interactive systems’ efforts to disseminate and evolve effective strategies for community problem solving. Lessons learned may be shared across efforts to address health issues and other community concerns. Further, while models may be developed to the point that they are very useful, they are not necessarily ever finished. Subsequent experience conducting dissemination through interactive systems can lead to ever more refined and nuanced models. Note that once they are developed, results of forecasting models themselves can be used to support deliberations to enhance interventions and process consultation to build agency capacity.

Limitations and Conclusions

Despite our enthusiasm for the promise of incorporating CDT into the ISF, it is important to be appropriately circumspect about the current state of the science. The Bronx ACCESS project is still a proposal at this point, and given the challenges of federal funding, may never be realized in the form described here. More generally, a project of this scope and complexity falls outside the usual comfort zone for many scientists in public health. Social science methods have forced us to think about measuring small numbers of variables under tightly controlled (read "decontextualized") conditions, rather than about processes that interact and change in complex and dynamic ways over time and that differ from place to place. Clearly, in Bronx ACCESS we subscribe to an ecological understanding of dissemination rather than the more widely accepted metaphors of the medical model. More to the point, in the CDT paradigm, studying dissemination is tantamount to studying and evolving the performance of the interactive systems themselves. Bringing continuous quality improvement into dissemination opens the door to this sort of reflexive inquiry. Although these concepts run counter to predominant research paradigms in health psychology and public health research, our argument rests on the clear need for new methods to address the challenges of establishing external validity and promoting dissemination of research to practice.


The Bronx ACCESS CDT uses structures and functions suggested by Wandersman et al. (2008) to ensure the levels and types of information needed to manage and evaluate dissemination. Although not yet widely familiar, these structures are not intrinsically more complicated than data coordinating centers, technical assistance programs, and information clearinghouses that support large multisite randomized clinical trials. We believe the ISF is necessary to support the dynamic approach to research presented here. These methods offer sufficient rigor and flexibility to orchestrate successful dissemination of strategies to improve many aspects of community health.

References

Airhihenbuwa, C. O. (1992). Health promotion and disease prevention strategies for African Americans: A conceptual model. In R. L. Braithwaite & S. E. Taylor (Eds.), Health issues in the black community (pp. 267–280). San Francisco: Jossey-Bass.
Albarracin, D., Rothman, A. J., Di Clemente, R., & Del Rio, C. (2010). Wanted: A theoretical roadmap to research and practice across individual, interpersonal, and structural levels of analysis. AIDS and Behavior, 14, s185–s188.
Allison, P. D. (1984). Event history analysis: Regression for longitudinal event data. Newbury Park, CA: Sage.
Alter, C. A., & Hage, J. (1993). Organizations working together. Newbury Park, CA: Sage Publications, Inc.
American Cancer Society. (2010). Cancer prevention and early detection: Facts and figures 2010. Atlanta, GA: American Cancer Society.
Andrews, J., Tingen, M. S., et al. (2001). Provider feedback improves adherence with AHCPR smoking cessation guideline. Preventive Medicine, 33, 415–421.
Barker, R. G., & Shedd Barker, L. (1963). The stream of behavior. New York: Appleton-Century-Crofts.
Barton-Villagrana, H., Bedney, B. J., & Miller, R. L. (2002). Peer relationships among community-based organizations providing HIV prevention services. Journal of Primary Prevention, 23, 217–236.
Baum, F. (2009). Wealth and health: The need for more strategic public health research. Journal of Epidemiology and Community Health, 59, 542–545.
Berry, D. A., Cronin, K. A., Plevritis, S. K., Fryback, D. G., Clarke, L., Zelen, M., et al. (2005). Effect of screening and adjuvant therapy on mortality from breast cancer. New England Journal of Medicine, 353(17), 1784–1792.
Boer, R., Plevritis, S., & Clarke, L. (2004). Diversity of model approaches for breast cancer screening: A review of model assumptions by the Cancer Intervention and Surveillance Network (CISNET) Breast Cancer Groups. Statistical Methods in Medical Research, 13, 525–538.
Bonfill Cosp, X., Marzo Castillejo, M., Pladevall Vila, M., Marti, J., & Emparanza, J. I. (2001). Strategies for increasing the participation of women in community breast cancer screening. Cochrane Database of Systematic Reviews, 1.
Bowen, D. J., & Powers, D. (2010). Effects of a mail and telephone intervention on breast health behaviors. Health Education and Behavior, 37, 479–489.
Breslau, E. S., Rochester, P. W., Saslow, D., Crocoll, C. E., Johnson, L. E., & Vinson, C. A. (2010). Developing partnerships to reduce
disparities in cancer screening. Preventing Chronic Disease, 7(3), A62.
Bull, S. S., Gillette, C., Glasgow, R. E., & Estabrooks, P. (2003). Worksite health promotion research: To what extent can we generalize the results and what is needed to translate research to practice? Health Education and Behavior, 30, 537–549.
Butterfoss, F. D. (2006). Process evaluation for community participation. Annual Review of Public Health, 27, 323–340.
Chinman, M., Hannah, G., Wandersman, A., Ebener, P., Hunter, S., Imm, P., et al. (2005). Developing a community science research agenda for building community capacity for effective preventive interventions. American Journal of Community Psychology, 35, 143–158.
Chlebowski, R. T., Chen, Z., Anderson, G. L., Rohan, T., Aragaki, A., Lane, D., et al. (2005). Ethnicity and breast cancer: Factors influencing differences in incidence and outcome. Journal of the National Cancer Institute, 97, 439–448.
Clarke, L. D., Plevritis, S. K., Boer, R., Cronin, K. A., & Feuer, E. J. (2006). A comparative review of CISNET breast models used to analyze U.S. breast cancer incidence and mortality trends. Journal of the National Cancer Institute Monographs, 36, 96–105.
Collins, L. M., Murphy, S. A., & Bierman, K. L. (2004). A conceptual framework for adaptive preventive interventions. Prevention Science, 5, 185–196.
Cosp, B., Castillejo, M., Pladevall Vila, M., et al. (2009). Strategies for increasing the participation of women in community breast cancer screening (Review). Cochrane Database of Systematic Reviews, 1.
Costanza, M. E., Stoddard, A. M., Luckmann, R., White, M. J., Spitz Avrunin, J., & Clemow, L. (2000). Promoting mammography: Results of a randomized trial of telephone counseling and a medical practice intervention. American Journal of Preventive Medicine, 19, 39–46.
Cronan, T. A., Villalta, I., Gottfried, E., Vaden, Y., Ribas, M., & Conway, T. L. (2008).
Predictors of mammography screening among ethnically diverse low-income women. Journal of Women’s Health, 17, 527–537. Derose, K. P., Fox, S. A., Reigadas, E., & Hawes-Dawson, J. (2000). Church-based telephone mammography counseling with peers. Journal of Health Communication, 5, 175–188. Duan, N., Fox, S. A., Derose, K. P., & Carson, S. (2000). Maintaining mammography adherence through telephone counseling in a churchbased trial. American Journal of Public Health, 90, 1468–1471. Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41, 327–350. Earp, J. L., Altpeter, M., Mayne, L., Viadro, C. I., & O’Malley, M. S. (1995). The orth Carolina Breast Cancer Screening Program: Foundations and design of a model for reaching older, minority, rural women. Breast Cancer Research Treatment, 35, 7–22. Earp, J. A., Viadro, C. I., Vincus, A. A., Altpeter, M., Flax, V., Mayne, L., et al. (1997). Lay health advisors: A strategy for getting the word out about breast cancer. Health Education & Behavior, 24(4), 432–451. El Ansari, W. (2003). Educational partnerships for public health: Do stakeholders perceive similar outcomes? Journal of Public Health Management and Practice, 9(2), 136–156. Ellis, P., Robinson, P., Ciliska, D., Armour, T., Brouwers, M., O’Brien, M. A., et al. (2005). A systematic review of studies evaluating diffusion and dissemination of selected cancer control interventions. Health Psychology, 24, 488–500. Eng, E. (1993). The save our sisters project: A social network strategy for reaching rural black women. Cancer, 72, 1071–1077. Eng, E., Parker, E., & Harlan, C. (1997). Lay health advisor intervention strategies: a continuum from natural helping to

Am J Community Psychol (2012) 50:497–517 paraprofessional helping. Health Education & Behavior, 24, 413–417. Eng, E., Rhodes, S. D., & Parker, E. (2009). Natural helper models to enhance a community’s health and competence. In R. J. DiClemente, R. A. Crosby, & M. C. Kegler (Eds.), Emerging theories in health promotion practice and research (pp. 303–330). San Francisco: Jossey-Bass. Eng, E., & Smith, J. (1995). Natural helping functions of lay health advisors in breast cancer education. Breast Cancer Research and Treatment, 35, 23–29. Epstein, J. N., Langberg, J. M., Lichtenstein, P. K., Kolb, R. C., & Stark, L. J. (2010). Sustained improvement in pediatricians’ ADHD practice behaviors in the context of a community-based quality improvement initiative. Children’s Health Care, 39(4), 296–311. Erwin, D. O., Spatz, T. S., Stotts, R. C., Hollenberg, J. A., & Deloney, L. A. (1996). Increasing mammography and breast self-examination in African American women using The Winess Project Model. Journal of Cancer Education, 11, 210–215. Esserman, L., Shieh, Y., & Thompson, I. (2009). Rethinking screening for breast can-cer and prostate cancer. Journal of American Medical Association, 302, 1685–1692. Farouk, S. (2004). Group work in schools: A process consultation approach. Educational Psychology Practice, 20(3), 207–220. Fonner, E., Jr. (1998). Relationship management: Assessing coalitions, networks, and other workgroups. Administrative Radiology Journal, 17(6), 18–21. Forrester, J. W. (1971). Principles of systems. Cambridge: WrightAllen Press. Foster-Fishman, P. G., Berkowitz, S. L., Lounsbury, D. W., Jacobson, S., & Allen, N. A. (2001). Building collaborative capacity in community coalitions: A review and integrative framework. American Journal of Community Psychology, 29(2), 241–261. Gemignani, M. L. (2010). The new mammographic screening guidelines: What were they thinking? Obstetrics and Gynecology, 115, 484–486. Glasgow, R. E. (2003). 
Translating research to practice: Lessons learned areas for improvement, and future directions. Diabetes Care, 26, 2451–2456. Glasgow, R. E., & Emmons, K. M. (2007). How can we increase translation of research into practice? Types of evidence needed. Annual Review of Public Health, 28, 413–433. Glasgow, R. E., Green, L. W., Klesges, L. M., Abrams, D. B., Fisher, E. B., Goldstein, M. G., et al. (2006). External validity: We need to do more. Annual of Behavioral Medicine, 31, 105–108. Glasgow, R. E., Lichtenstein, E., & Marcus, A. C. (2003). Why don’t we see more translation of health promotion research to practice? Rethinking the efficacy-to-effectiveness transition. American Journal of Public Health, 93(8), 1261–1267. Glasgow, R. E., Marcus, A. C., Bull, S. S., & Wilson, K. (2004). Disseminating effective cancer screening. Cancer, 101(Suppl 5), 1239–1250. Goodman, R. M., McLeroy, K. R., Steckler, A., & Hoyle, R. H. (1993). Development of level of institutionalization (LoIn) scales for health promotion programs. Health Education Quarterly, 20(2), 161–178. Goodman, R. M., & Steckler, A. (1989). A model for the institutionalization of health promotion programs. Family and Community Health, 11(4), 63–78. Goodman, R. M., & Steckler, A. (1990). Moblizing organizations for health enhancement: Theories of organizational change. In K. Glanz, F. M. Lewis, & B. K. Rimer (Eds.), Health behavior and health education: Theory, research and practice (pp. 314–341). San Francisco: Jossey-Bass.

Green, L. W. (2001). From research to "best practices" in other settings and populations. American Journal of Health Behavior, 25, 165–178.
Green, L. W., & Glasgow, R. E. (2006). Evaluating the relevance, generalization, and applicability of research: Issues in external validity and translation methodology. Evaluation & the Health Professions, 29, 126–153.
Green, L. W., & Ottoson, J. M. (2004). From efficacy to effectiveness to community and back: Evidence-based practice vs. practice-based evidence. Paper presented at: From clinical trials to community: The science of translating diabetes and obesity research, Bethesda, MD.
Greenhalgh, T., Robert, G., Macfarlane, F., Bate, P., & Kyriakidou, O. (2004). Diffusion of innovations in service organizations: Systematic review and recommendations. The Milbank Quarterly, 82(4), 581–629.
Grimshaw, J., Eccles, M., Thomas, R., MacLennan, G., Ramsay, C., Fraser, C., et al. (2006). Toward evidence-based quality improvement: Evidence (and its limitations) of the effectiveness of guideline dissemination and implementation strategies 1966–1998. Journal of General Internal Medicine, 21(Supplement 2), S14–S20.
Hanko, G. (1999). Increasing competence through collaborative problem-solving: Using insight into social and emotional factors in children's learning. London: David Fulton Publishers Ltd.
Hawe, P., Shiell, A., Riley, T., & Gold, L. (2009). Methods for exploring implementation variation and local context within a cluster randomised community intervention trial. Journal of Epidemiology and Community Health, 58, 788–793.
Hirsch, G. B., Homer, J. B., Evans, E., & Zielinski, A. (2010). A system dynamics model for planning cardiovascular disease interventions. American Journal of Public Health, 100, 616–622.
Homer, J. B., & Hirsch, G. B. (2006). System dynamics modeling for public health: Background and opportunities. American Journal of Public Health, 96, 452–458.
Homer, J., Hirsch, G. B., Minniti, M., & Pierson, M. (2004). Models for collaboration: How system dynamics helped a community organize cost-effective care for chronic illness. System Dynamics Review, 20, 199–222.
Horowitz, C. R., Robinson, M., & Seifer, S. (2009). Community-based participatory research from the margin to the mainstream: Are researchers prepared? Circulation, 119, 2633–2642.
Erwin, D. O., Spatz, T. S., Stotts, R. C., & Hollenberg, J. A. (1999). Increasing mammography practice by African American women. Cancer Practice, 7, 78–85.
Israel, B. A., Lichtenstein, R., Lantz, P., McGranaghan, R., Allen, A., Guzman, J. R., et al. (2001). The Detroit Community-Academic Urban Research Center: Development, implementation and evaluation. Journal of Public Health Management and Practice, 7(5), 1–19.
Jackson, E. J., & Parks, C. R. (1997). Recruitment and training issues from selected lay health advisor programs among African Americans: A 20-year perspective. Health Education & Behavior, 24(4), 418–431.
Janz, N. K., Schottenfeld, D., Doerr, K. M., Selig, S. M., Dunn, R. L., Strawderman, M., et al. (1997). A two-step intervention to increase mammography among women aged 65 and older. American Journal of Public Health, 87, 1683–1686.
Jason, L. A., & Ferone, L. (1978). Behavioral versus process consultation interventions in school settings. American Journal of Community Psychology, 6(6), 531–543.
Jørgensen, K. J., & Gøtzsche, P. C. (2009). Overdiagnosis in publicly organised mammography screening programmes: Systematic review of incidence trends. British Medical Journal, 339, b2587.


Kahan, B., & Goodstadt, M. (1999). Continuous quality improvement and health promotion: Can CQI lead to better outcomes? Health Promotion International, 14(1), 83–91.
Kanter, R. (1983). The change masters: Innovation for productivity in the American corporation. New York, NY: Simon & Schuster.
Kerner, J. F. (2002). Closing the gap between discovery and delivery. Washington, DC: National Cancer Institute.
Kiefe, C. I., Allison, J. J., et al. (2001). Improving quality improvement using achievable benchmarks for physician feedback: A randomized controlled trial. Journal of the American Medical Association, 285(22), 2871–2879.
Klesges, L. M., Dzewaltowski, D. A., & Christenson, A. J. (2006). Are we creating relevant behavioral medicine research? Show me the evidence? Annals of Behavioral Medicine, 31, 3–4.
Kolata, G. (2009). Panel urges mammograms at 50, Not 40. New York Times, A1. Retrieved from http://www.nytimes.com/2009/11/17/health/17cancer.html.
Kreuter, M. W., Strecher, V. J., & Glassman, B. (1999). One size does not fit all: The case for tailoring print materials. Annals of Behavioral Medicine, 21, 276–283.
Kreuter, M. W., & Wray, R. J. (2003). Tailored and targeted health communication: Strategies for enhancing information relevance. American Journal of Health Behavior, 27(Supplement 3), S227–S232.
Lambrechts, F., Grieten, S., Bouwen, R., & Corthouts, F. (2009). Process consultation revisited: Taking a relational practice perspective. Journal of Applied Behavioral Science, 45(1), 39–58.
Lannin, D. R., Mathews, H. F., Mitchell, J., Swanson, M. S., Swanson, F. H., & Edwards, M. S. (1998). Influence of socioeconomic and cultural factors on racial differences in late-stage presentation of breast cancer. Journal of the American Medical Association, 279, 1801–1807.
Larson, D. W. (2004). Distrust: Prudent, if not always wise. In R. Hardin (Ed.), Distrust (pp. 34–59). New York: Russell Sage Foundation.
Lasker, R. D., & Weiss, E. S. (2003a). Broadening participation in community problem solving: A multidisciplinary model to support collaborative practice and research. Journal of Urban Health, 80(1), 14–47; discussion 48–60.
Lasker, R. D., & Weiss, E. S. (2003b). Creating partnership synergy: The critical role of community stakeholders. Journal of Health and Human Services Administration, 26, 119–139.
Latkin, C., Weeks, M. R., Glasman, L., Galletly, C., & Albarracin, D. (2010). A dynamic social systems model for considering structural factors in HIV prevention and detection. AIDS and Behavior, 14, S222–S238.
Levine, R., & Fitzgerald, H. (1992). Basic approaches to general systems, dynamic systems, and cybernetics. New York: Plenum Press.
Lipkus, I. M., Rimer, B. K., Halabi, S., & Strigo, T. S. (2000). Can tailored interventions increase mammography use among HMO women? American Journal of Preventive Medicine, 18, 1–10.
Lounsbury, D. W., & Mitchell, S. G. (2009). Introduction to special issue on social ecological approaches to community health research and action. American Journal of Community Psychology, 44, 213–220.
Lounsbury, D. W., Reynolds, T. C., Rapkin, B. D., Robson, M. E., & Ostroff, J. (2007). Protecting the privacy of third-party information: Recommendations for social and behavioral health researchers. Social Science and Medicine, 64(1), 213–222.
Mabry, P., Olster, D., et al. (2008). Interdisciplinarity and systems science to improve population health: A view from the NIH Office of Behavioral and Social Sciences Research. American Journal of Preventive Medicine, 35(S2), S211–S224.
Mandelblatt, J. S., Cronin, K. A., Bailey, S., Berry, D. A., de Koning, H. J., Draisma, G., et al. (2009). Effects of mammography screening under different screening schedules: Model estimates of potential benefits and harms. Annals of Internal Medicine, 151(10), 738–747.
Mandelblatt, J. S., Saha, S., & Teutsch, S. (2003). The cost-effectiveness of screening mammography beyond age 65 years: A systematic review for the U.S. Preventive Services Task Force. Annals of Internal Medicine, 139, 835–842.
Mandelblatt, J., Schechter, C. B., Lawrence, W., Yi, B., & Cullen, J. (2006). Chapter 8: The spectrum population model of the impact of screening and treatment on U.S. breast cancer trends from 1975 to 2000: Principles and practice of the model methods. Journal of the National Cancer Institute. Monographs, 36, 47–55.
Mandelblatt, J. S., Schechter, C. B., Yabroff, K. R., Lawrence, W., Dignam, J., Muennig, P., et al. (2004). Benefits and costs of interventions to improve breast cancer outcomes in African-American women. Journal of Clinical Oncology, 22(13), 2554–2566.
Matson, K. D., Granade, S. A., & Anwuri, V. V. (2008). Strategies for establishing policy, environmental, and systems-level interventions for managing high blood pressure and high cholesterol in healthcare settings: A qualitative case study. Preventing Chronic Disease, 5, A83.
McKinney, M. M., Morrissey, J. P., & Kaluzny, A. D. (1993). Interorganizational exchanges as performance markers in a community cancer network. Health Services Research, 28(4), 459–478.
McKleroy, V. S., Galbraith, J. S., Cummings, B., Jones, P., Harshbarger, C., Collins, C., et al. (2006). Adapting evidence-based behavioral interventions for new settings and target populations. AIDS Education and Prevention, 18(Supplement A), 59–73.
Miller, R. L., & Shinn, M. (2005). Learning from communities: Overcoming difficulties in dissemination of prevention and promotion efforts. American Journal of Community Psychology, 35(3–4), 169–183.
Muss, H. B., Hunter, C. P., Wesley, M., Correa, P., Chen, V. W., et al. (1992). Treatment plans for black and white women with stage II node-positive breast cancer: The National Cancer Institute Black/White Cancer Survival Study experience. Cancer, 70, 2460–2467.
National Cancer Institute. (2010). Research-Tested Intervention Programs (RTIPs). Retrieved from http://rtips.cancer.gov/rtips/index.do.
National Center for Health Statistics. (2010). Health, United States, 2009: With special feature on medical technology. Hyattsville, MD.
Newman, L. A. (2005). Breast cancer in African-American women. The Oncologist, 10, 1–14.
Oxman, A. D., Thomson, M. A., Davis, D. A., & Haynes, B. (1995). No magic bullets: A systematic review of 102 trials of interventions to improve professional practice. Canadian Medical Association Journal, 153, 1423–1431.
Paskett, E. D., Tatum, C. M., D'Agostino, R. J., Rushing, J., Velez, R., Michielutte, R., et al. (1999). Community-based interventions to improve breast and cervical cancer screening: Results of the Forsyth County Cancer Screening (FoCaS) Project. Cancer Epidemiology, Biomarkers and Prevention, 8, 453–459.
Rapkin, B. D., Massie, M. J., Jansky, E. J., Lounsbury, D. W., Murphy, P. D., & Powell, S. (2006). Developing a partnership model for cancer screening with community-based organizations: The ACCESS breast cancer education and outreach project. American Journal of Community Psychology, 38(3–4), 153–164.

Rapkin, B. D., & Trickett, E. J. (2005). Comprehensive dynamic trial designs for behavioral prevention research with communities: Overcoming inadequacies of the randomized controlled trial paradigm. In E. J. Trickett & W. Pequegnat (Eds.), Increasing the community impact of HIV prevention interventions (pp. 249–277). New York: Oxford University Press.
Rapkin, B., Weiss, E., Chhabra, R., Ryniker, L., Patel, S., Carness, J., et al. (2008). Beyond satisfaction: Using the dynamics of care assessment to better understand patients' experiences in care. Health and Quality of Life Outcomes, 6, 20.
Ratner, P. A., Bottorff, J. L., Johnson, J. L., Cook, R., & Covato, C. Y. (2001). A meta-analysis of mammography screening promotion. Cancer Detection and Prevention, 25, 147–160.
Rennert, G. (2007). Cancer prevention: From public health interventions to individual tailoring. European Journal of Cancer Prevention, 16, 165–186.
Rhodes, S. D., Foley, K. L., Zometa, C. S., & Bloom, F. R. (2007). Lay health advisor interventions among Hispanics/Latinos: A qualitative systematic review. American Journal of Preventive Medicine, 33(5), 418–427.
Richardson, G. P. (1991). Feedback thought in social science and systems theory. Waltham, MA: Pegasus Communications.
Rogers, E. M. (2003). Diffusion of innovations (5th ed.). New York, NY: Free Press.
Sandler, J. (2007). Community-based practices: Integrating dissemination theory with critical theories of power and justice. American Journal of Community Psychology, 40(3–4), 272–289.
Sargenti, S., & Dresdale, A. (2009). Celeb survivors protest breast cancer screening guidelines: Olivia Newton-John, Jaclyn Smith, and Sheryl Crow express concern. ABC News. Retrieved from http://abcnews.go.com/Health/OnCallPlusBreastCancerNews/celeb-survivors-rail-breast-cancer-screening-guidelines-olivia/story?id=9128759.
Schein, E. H. (1969). Process consultation: Its role in organization development. Reading, MA: Addison-Wesley.
Schein, E. (1987). Process consultation, volume II: Lessons for managers and consultants. Reading, MA: Addison-Wesley.
Schein, E. H. (1999). Process consultation revisited: Building the helping relationship. Reading, MA: Addison-Wesley.
Schensul, J. J. (2009). Introduction to multi-level community-based culturally situated interventions. American Journal of Community Psychology, 43(3–4), 232–240.
Schulz, A. J., Williams, D. R., Israel, B. A., & Lempert, L. B. (2002). Racial and spatial relations as fundamental determinants of health in Detroit. Milbank Quarterly, 80, 677–707.
Simpson, D. D. (2002). A conceptual framework for transferring research to practice. Journal of Substance Abuse Treatment, 22, 171–182.
Speer, P. W., & Zippay, A. (2005). Participatory decision-making among community coalitions: An analysis of task group meetings. Administration in Social Work, 29(3), 61–77.
Sterman, J. D. (1989). Modeling managerial behavior: Misperceptions of feedback in a dynamic decision making experiment. Management Science, 35(3), 321–339.
Street, R. L., & Epstein, R. M. (2008). Key interpersonal functions and health outcomes: Lessons from theory and research on clinician-patient communication. In K. Glanz, B. K. Rimer, & K. Viswanath (Eds.), Health behavior and health education: Theory, research, and practice (pp. 237–270). San Francisco: Jossey-Bass.
Sung, J. F. C., Blumenthal, D. S., Coates, R. J., & Alema-Mensah, E. (1997). Knowledge, beliefs, attitudes, and cancer screening among inner-city African-American women. Journal of the National Medical Association, 89(6), 405–411.
Thompson, H. S., Valdimarsdottir, H. B., Winkel, G., Jandorf, L., & Redd, W. (2004). The group-based medical mistrust scale: Psychometric properties and association with breast cancer screening. Preventive Medicine, 38, 209–218.
Tosteson, A., Stout, N. K., Fryback, D. G., Acharyya, S., Herman, B. A., et al. (2008). Cost-effectiveness of digital mammography breast cancer screening: Results from ACRIN DMIST. Annals of Internal Medicine, 148, 1–10.
Tunis, S. R., Stryer, D. B., & Clancy, C. M. (2003). Practical clinical trials: Increasing the value of clinical research for decision making in clinical and health policy. Journal of the American Medical Association, 290, 1624–1632.
Umemoto, K., Baker, C. K., Helm, S., Miao, T., Goebert, D. A., & Hishinuma, E. S. (2009). Moving toward comprehensiveness and sustainability in a social ecological approach to youth violence prevention: Lessons from the Asian/Pacific Islander Youth Violence Prevention Center. American Journal of Community Psychology, 44, 221–232.
Vernon, S., McQueen, A., Tiro, J. A., & del Junco, D. J. (2010). Interventions to promote repeat breast cancer screening with mammography: A systematic review and meta-analysis. Journal of the National Cancer Institute, 102(14), 1023–1039.
Wallerstein, N., Duran, B., Minkler, M., & Foley, K. (2005). Developing and maintaining partnerships with communities. In B. Israel, E. Eng, A. Schulz, & E. Parker (Eds.), Methods in community-based participatory research (pp. 31–51). San Francisco: Jossey-Bass.
Wandersman, A., Duffy, J., Flaspohler, P., Noonan, R., Lubell, K., Stillman, L., et al. (2008). Bridging the gap between prevention research and practice: The interactive systems framework for dissemination and implementation. American Journal of Community Psychology, 41, 171–181.
Wandersman, A., Goodman, R. M., & Butterfoss, F. D. (1997). Understanding coalitions and how they operate. In M. Minkler (Ed.), Community organizing and community building for health (pp. 261–277). New Brunswick, NJ: Rutgers University Press.
Wandersman, A., Imm, P., Chinman, M., & Kaftarian, S. (2000). Getting to outcomes: A results-based approach to accountability. Evaluation and Program Planning, 23, 389–395.
Weiss, E. S., Miller-Anderson, R., & Lasker, R. D. (2002). Making the most of collaboration: Exploring the relationship between partnership synergy and partnership functioning. Health Education and Behavior, 29, 683–698.
Weiss, E. S., Taber, S. K., Breslau, E. S., Lillie, S. E., & Li, Y. (2010). The role of leadership and management in six southern public health partnerships: A study of member involvement and satisfaction. Health Education and Behavior, 37(5), 737–752.
Woulfe, J., Oliver, T. R., Zahner, S. J., & Siemering, K. Q. (2010). Multisector partnerships in population health improvement. Preventing Chronic Disease, 7(6), A119.
Yabroff, K. R. (2008). Interventions to improve cancer screening: Commentary from a health services research perspective. American Journal of Preventive Medicine, 35(Supplement 1), S6–S9.
Yoshikawa, H., Wilson, P. A., Hsueh, J., Rosman, E. A., Chin, J., & Kim, J. H. (2003). What front-line CBO staff can tell us about culturally anchored theories of behavior change in HIV prevention for Asian/Pacific Islander Americans. American Journal of Community Psychology, 32, 143–158.
Zhu, K., Hunter, S., Bernard, L. J., Payne-Wilks, K., Roland, C. L., Elam, L. C., et al. (2002). An intervention study on screening for breast cancer among single African-American women aged 65 and older. Preventive Medicine, 34, 536–545.
Zuckerman, H. S., Kaluzny, A. D., & Ricketts, T. C. (1995). Alliances in health care: What we know, what we think we know, and what we should know. Health Care Management Review, 20(1), 54–64.
