Operationalizing Evidence-Based Practice: The Development of an Institute for Evidence-Based Social Work

Cheryl Regehr, Susan Stern, and Aron Shlonsky
University of Toronto, Ontario, Canada

Research on Social Work Practice, Vol. 17, No. 3, May 2007, 408-416. DOI: 10.1177/1049731506293561

Although evidence-based practice (EBP) has received increasing attention in social work in the past few years, there has been limited success in moving from academic discussion to engaging social workers in the process of implementing EBP in practice. This article describes the challenges, successes, and future aims in the process of developing a university-based institute for evidence-based social work. Aspects of the development include attempting to address concerns and critiques through developing an inclusive model of EBP; engaging community agency partners as active participants in the process; developing collaborative research projects with community partners to further the research evidence available for practice; conducting systematic reviews; finding means of disseminating review and research findings broadly in order to effect social policy revisions and lead to the development of effective programs and practices; and training agency personnel and social work students in the process of EBP and conducting practice evaluation research. Keywords: evidence-based practice; evidence-based social work; systematic review; evidence-based policy

Authors' Note: Correspondence may be addressed to Cheryl Regehr, PhD, Dean, Faculty of Social Work, University of Toronto, 246 Bloor Street West, Toronto, Ontario, M5A 1A1; e-mail: [email protected]. This article was invited and accepted by the editor.

Social work academics have increasingly asserted that clients of social work services have the right to receive effective, empirically validated treatment that can reasonably be expected to reduce their distress and improve their life situation (Myers & Thyer, 1997). Although we need a more robust base of evidence generated within our profession, including a stronger base of intervention research, a vast amount of accessible, relevant research evidence now exists to help inform social work practice and policy decision making (Fraser, 2003; Reid & Fortune, 2003). Yet findings from various studies suggest that social workers' use of research as a means of informing their practice is disappointing at best (Rosen, 2003). Thus, the impetus for evidence-based practice (EBP) arose out of a concern that social work practices have been conducted in the absence of, or without adequate attention to, evidence of their effectiveness in decision making and client care (Gambrill, 2001). However, although academic discussions and scholarly publications about EBP have mushroomed over the past few years, there remains a gap between the academic discourse and how social workers actually practice. This article describes the development of a university-based institute for evidence-based social work and our efforts to operationalize EBP by moving beyond academic discussions and engaging our community of social work practitioners in the process of determining and implementing EBP.

ENVISIONING AN EVIDENCE-BASED RESEARCH INSTITUTE

The Research Institute for Evidence-Based Social Work at the University of Toronto was conceived by the former dean, James Barber. His vision was articulated as follows:

The Research Institute for Evidence-Based Social Work aims to build a bridge between social work research and the communities it serves. Our mission is to put the best and latest research evidence on social work practice from around the world into the hands of the people who need it most—practitioners and policy-makers in the field. We are at the forefront of a growing international movement whose ultimate aim is to create an unprecedented body of evidence for use by social work professionals everywhere. (Barber, 2005)

The goal was to combine the research strength of the university with the professional expertise of a wide network of local and national community agencies. This would ensure that the institute invested in up-to-the-minute research directly relevant to real-life practice needs and then transferred the results of that research to the field for implementation and evaluation. Barber recognized that such an aim was possible only with funding that was not restricted to particular research projects or outcomes but was committed to a larger goal. As a result, he successfully obtained founding-member support from a commercial corporation, a foundation, and several private donors. This provided a basis from which to begin the work of developing a model for the institute.

Considerable challenges loomed, however, in working to operationalize this vision. The first set of challenges related to the suspicions and hostilities harbored by both academics and practitioners, stemming from varying conceptions of the meaning of EBP. How could we communicate a nuanced understanding of what EBP means and address many of our colleagues' views of EBP as having little to do with the real world of social work practice or policy? Furthermore, how could we articulate an inclusive model of EBP that did not appear reductionistic, mechanistic, and dismissive of some methodologies and epistemologies? Clearly, these were necessary steps before we could begin to engage others in the institute. The second set of challenges related to developing concrete ways of accumulating evidence and sharing it with others. What does an evidence-based institute in social work actually look like?

CONCEPTUALIZING AN INCLUSIVE MODEL OF EBP

Concerns about EBP abound in many disciplines, but the EBP debate in medicine provides an excellent example of the challenges encountered when attempts are made to put EBP into the field. Porta (2004), for instance, refers to "the deepening dichotomy between evidence-based medicine (EBM) and common sense medicine (CSM)" (p. 147) and suggests that many doctors feel the well-intentioned principles of EBM may be increasingly applied for the purposes of cost containment rather than optimal treatments or cures. Others concur that EBM may be rationing disguised as science (Saarni & Gylling, 2004) and raise concerns about the extent to which EBM further marginalizes vulnerable and disadvantaged groups in obtaining health care that is relevant to their needs (Rogers, 2005). Lipman (2004) offers additional reasons why practitioners (in this case, family practice doctors in England) fail to use computerized evidence-based guidelines. He suggests that practitioners are acutely aware that the results of randomized controlled trials (RCTs) regarding any medical problem are aggregated outcomes of heterogeneous populations, which may or may not bear any similarity to the patient currently sitting in the doctor's office, who presents with a wide range of problems (only some of which are biomedical). Lipman contends that the "reductionistic philosophy" of computerized guidelines fails to converge with the needs of practitioners (p. 172). Rosenfeld (2004) further argues that although "EBM was initially liberating and empowering, it quickly became a rigid dogma with a constricting, conflicting, insensitive religion preached by the few and meant to be followed blindly by the masses" (p. 153).

Although social work is newer to EBP, similar concerns have been raised about the application of this model to work with clients. Social workers have questioned the fit of a model originating in medicine with the profession's mission, values, and diverse service populations, as well as the feasibility of transporting it (Webb, 2001; Witkin, 1998). One of social work's most distinctive strengths lies in its holistic view of the person in the environment, and some question whether EBP can adequately address the complexity and context of our clients' lives (Witkin, 1998). The assumption is that EBP is a cookbook approach in which best practices are extracted from the scientific literature and simplistically applied to clients without regard to who the clients are, their personal motivations and goals, or other potentially complicating life situations. Critics also fear that EBP ignores the social worker's expertise—experience and judgment—and may "undermine traditional professional practice" (Webb, 2001, p. 58). Blome and Steib (2004), citing child welfare as an example, caution against the assumption that evidence in any given area of practice will clearly point the way to a desired outcome: "Unfortunately, no one evidence-based program leads to faster reunification, more stable placements, or higher rates of recovery from addiction. Many programs and practices may affect these outcomes depending on a myriad of organizational and staffing issues" (p. 611). As in EBM, skeptics link the increasing emphasis on accountability and EBP to economic and resource issues (Webb, 2001) and to the rise of managed care (Mullen & Streiner, 2004; Witkin & Harrison, 2001) and raise concerns about whether EBP is necessarily in the best interests of clients. In social work, the scientific base of EBP is also attacked on ideological and methodological grounds framed by the epistemological debate on knowledge building in the social sciences (Gambrill, 1999; Webb, 2001). These objections are only illustrative of some of the challenges in developing an inclusive EBP social work model and, in our case, the institute; others (Gambrill, 2003; Gibbs & Gambrill, 2002; Mullen & Streiner, 2004) have described objections and rebuttals in greater detail.


The concerns of those dismissive of EBP are loudly voiced and present barriers to any attempt to bring evidence to practice in a systematic manner. To address these concerns, we first needed to clarify what we meant by EBP, given the many usages and misconceptions (Gibbs & Gambrill, 2002). EBP in medicine originally was defined as "the conscientious, explicit and judicious use of current best evidence in making decisions about the care of individual patients" (Sackett, Rosenberg, Gray, Haynes, & Richardson, 1996, p. 71). Based on this and more recent expanded definitions of EBM (Haynes, Devereaux, & Guyatt, 2002; Sackett, Straus, Richardson, Rosenberg, & Haynes, 2000), the social work model of EBP proposed by Gambrill (2002, 2006) is clearly not reductionistic or mechanistic. In no uncertain terms, she advocates that any model of EBP must incorporate not only evidence but also client values and preferences as well as clinical state and circumstances. We would add that organizational mission, mandate, and context also should be included, in order to capture the environmental strengths and barriers that exist outside the individual. At the policy level, EBP considers the best available evidence in policy development and implementation rather than relying on opinion or authority (Gray, 2001). In this light, EBP can be viewed as a process that blends current best evidence, community values and preferences, and agency, societal, and political considerations in order to establish programs and policies that are effective and contextualized. At this broader level, evidence-based policy involves finding and appraising the evidence, developing the capacity of individuals and organizations to use the evidence wisely, and getting the evidence into practice (Gray, 2001).

The term evidence-based practice has been used in the field in related but distinct ways that do not reflect the process of EBP (Gambrill, 2003; Shlonsky & Gibbs, 2004). For instance, the term has been used to refer to the clinical application of empirically supported interventions identified through critical evaluation of the outcome literature, when these might better be considered EBPs. Treatment models with supporting research evidence are hypothesized to increase practitioners' effectiveness and support competent, ethical practice. In this usage, EBP often is used interchangeably with other terms: empirically validated intervention, empirically supported treatment (EST), empirical practice, research-based treatment, and best practices. When EBPs are identified, they often are promoted for use with clients as a set of guidelines or standards to follow or are disseminated for implementation as entire interventions (usually through training and manuals with quality assurance protocols).

Within the institute, however, and consistent with EBM as described by Sackett et al. (1996, 2000) and carried forward to social work as EBP (Gambrill, 1999, 2003), we use the term EBP to refer to the process of posing a question, searching for and evaluating the evidence, and applying the evidence within a client- or policy-specific context. EBPs, or the other similar terms mentioned above, refer to interventions with evidence of efficacy or effectiveness. Contributing to the knowledge base of effective practices is an important component of the institute that complements the process of examining existing evidence in a contextual framework. Thus, our work encompasses the promotion of both EBP as a process and research on the development, transportability, and dissemination of EBPs.

Contrary to many criticisms of the EBP model, the inclusion of the process components of EBP in our conceptualization means that action is not dictated by current best evidence operating in a vacuum. We have found that the first step in articulating EBP in an inclusive way is to highlight the value of client wishes (whether of an individual, family, group, or community) and practitioner expertise (whether of a clinician, manager, or policy maker). We also highlight the importance of agency mandate and constraints and the broader ecological context as indispensable complements to the use of evidence in practice. We think EBP, by definition, facilitates the very best qualities of social work when a social worker involves clients in a collaborative process to consider the available evidence or lack of evidence. The practitioner fosters self-determination as the client is empowered to contribute to an informed decision about treatment planning in light of his or her goals, values, and situation; at the same time, the practitioner must take into account his or her own skills, resources, and agency context. Although less has been written about EBP as a process at the policy level, the basic tenets remain the same, with social work professional values underpinning a process that encourages transparency and collaborative decision making while recognizing the economic and political exigencies that operate.

Figure 1 is modified from Haynes et al. (2002) to reflect the context and reality of social work practice and policy. Specifically, it considers the emerging research on the ecological influences that may affect moving from evidence to practice, such as intraorganizational, extraorganizational, and practitioner-level factors (Dobrow, Goel, & Upshur, 2004; Fixsen, Naoom, Blase, Friedman, & Wallace, 2005; Glisson & Hemmelgarn, 1998; Schoenwald, Sheidow, Letourneau, & Liao, 2003; Simpson, 2002), as well as the dynamic and reciprocal interactions between levels. Clinical expertise in the model has been replaced with professional expertise to articulate the role in EBP of the social worker's critical thinking and professional expertise, whether in the clinical or policy realm.

[Figure 1: Elements of Evidence-Based Policy and Practice. The figure situates research evidence, client preferences and actions, client state and circumstances, and professional expertise (with training/supervision) within nested community, organizational (mandate, resources/constraints), professional, political, socio/historical, and economic contexts. SOURCE: Modified from Haynes, Devereaux, and Guyatt (2002).]

Although building on the influential contributions of others (Gambrill, 1999, 2003; Haynes et al., 2002; Sackett et al., 1996), the proposed model offers a broader view of the ecology of social work practice and social welfare policy. Thus, there is room for discussion, both within the institute and the broader faculty, about ways to improve the use of evidence in policy and practice.

Last, inherent in the concerns of skeptics and critics of EBP is the notion that evidence to practice is a one-way street, with evidence being unilaterally imposed on clients. Rather, we see evidence as part of a process emanating from social work practice questions and, ultimately, leading to systematic reviews of the literature. In addition, we believe that dissemination, where sufficient research exists, and collaborative primary research, where it does not, are essential parts of the EBP endeavor. In any case, practical questions from the field drive the acquisition of knowledge. For example, a key practice question from a community partner begins with a critical review of the literature. If no current, high-quality systematic review exists, one is conducted. In the absence of sufficient time, a rapid evidence assessment (Davies, 2004) might be used as the best substitute. The review then informs practice and policy decisions (e.g., programs to fund, interventions to develop or test, assessment tools to develop or test), and these are rigorously evaluated over time.

In addition to concerns about the pressure to mindlessly apply evidence to practice, there are questions as to what constitutes evidence and which methodologies and epistemologies are privileged in EBP. Early models of EBP identified a hierarchy of evidence, defined in levels ranging from anecdotal evidence to systematic reviews. One example is the guidance for systematic reviews from the NHS (National Health Service) Centre for Reviews and Dissemination in England (Khan, ter Riet, Popay, Nixon, & Kleijnen, 2001). In such a hierarchy, the RCT is inevitably the gold standard (Macdonald, 1999). Although others have questioned the evidentiary basis of this assertion (Djulbegovic, Morris, & Lyman, 2000), the traditional hierarchy model has caused some to believe that those seeking evidence for practice will view RCTs as the only evidence worth considering. In such a model, design is equated with quality. Others argue that this is not the case, suggesting that evidence comes in many forms and levels of quality and that we must return to the notion of the best available evidence, which includes methodologies other than the RCT (Dreyfus, 2004; Harari, 2001; Rycroft-Malone et al., 2004; Ter Meulen & Dickenson, 2002).

Although it would be difficult to argue that a well-conducted RCT, when possible, or, better yet, a rigorous meta-analysis would not be considered the gold standard for effectiveness questions, we embrace a more inclusive view of what constitutes evidence overall. Qureshi (2004) proposes a model for judging the "worth" of research based on the premise that different research strategies are appropriate at different stages of the evolution of practice; this provides an excellent starting point for our conceptualization of research evidence and for discussions within the institute. There is a growing body of research using a variety of designs in the behavioral and social sciences, and increasingly in social work itself, each with its own strengths and biases, that can inform social work questions at different points in the development of evidence-informed practice models or policy (Rzepnicki & Briggs, 2004; Stern, 1994). Because we embrace EBP as a process model, the choice of study design is contingent on the question posed. Judgments of value focus on the ability of the research to address a gap in the current body of knowledge. This is not to suggest that all knowledge constitutes equally useful evidence but that the accumulation of findings across many different combinations of study designs and methodologies that are rigorously conducted, appraised, and synthesized can contribute to advancing the social work evidence base. Such a framework does demand that we convey the evidentiary status of findings to clients and other decision makers with candor and transparency, in light of the confidence or uncertainty we have in the findings, given a rigorous and systematic appraisal of the highest quality currently available evidence (Gambrill, 2006).

OPERATIONALIZING EBP IN A RESEARCH INSTITUTE

Defining a model of EBP is an important first step, but operationalizing the model in a center that involves faculty, students, and the community in a meaningful way presents a new challenge. The goal of the institute is to make research knowledge accessible to practitioners and policy makers in order to ensure that consumers of social work services and programs obtain services best suited to their needs. We are attempting to accomplish this by (a) identifying and addressing the current priorities in the field; (b) reviewing, evaluating, and establishing a critical mass of the best available data for EBP; (c) engaging in collaborative research with community partners to further the research evidence available for practice; (d) disseminating review and research findings broadly in order to effect social policy revisions and lead to the development of effective programs and practices; and (e) training agency personnel and social work students in the process of EBP and in conducting practice evaluation research.

In recognition of the fact that we are unable to address all aspects of social work practice, the institute developed (and continues to develop) initiatives that build on the research and practice strengths of the faculty. Within these areas of strength, questions and issues in the field are identified and prioritized in collaboration with our network of social service agencies, which work side by side with the faculty as community partners. Each initiative comprises subcomponents including research centers, collaborative research partnerships, systematic reviews, and specific strategies for dissemination. Funding for each of these components has been obtained from a variety of sources, including private foundations, private donors, federal government grants, grants from federal research councils, provincial grants, and municipal grants, as well as research contracts from various governmental sources.

Community Partner Agencies

Community agency partners of the institute bring many strengths to the partnership. To begin, they must have a stated commitment to instituting EBP in their agency and a commitment to participation in ongoing training and capacity building. Agencies must also be willing to participate in externally funded research with a faculty member in the Faculty of Social Work or in the submission of grant applications with faculty members. Wherever possible, agencies should also have demonstrated capacity in research, as evidenced by commitment of funds for research personnel (e.g., a research coordinator), for research projects, and/or for management information systems, although we are cognizant that some service sectors are more likely than others to have these resources in place. The first initiative of the institute, The Welfare of Children, has as its community partners child protection organizations, children's mental health centers, children's hospitals, and government ministries responsible for the needs of children.

For the agency partners, many benefits are available. One representative of each agency becomes a status-only faculty member of the Faculty of Social Work and obtains access to University of Toronto library services, with the understanding that this person will assist with database searches and dissemination of evidence-based reviews within his or her organization. All partner agency staff members have access to EBP seminars and research methods training sessions offered by the institute. Partner agency researchers also present in these seminars and guest lecture in classes where there is a good fit. Finally, the agency and its staff members have the opportunity to develop and participate in research collaborations on topics of pressing need in the field. Agency personnel either actively partner with faculty on writing grant applications or contribute to conceptualization, are named as collaborators, and, where appropriate, share authorship on articles and presentations emanating from those projects. In exchange, agency partners are expected to remain active in the work of the institute or to relinquish their membership in favor of agencies on the waiting list. At this time, we are attempting to limit agency partners in any one initiative to 15 members in order to allow for meaningful participation.

A central goal of the partnerships is capacity building within the partner agencies. This is attempted through a variety of mechanisms. Collaborative research projects (described below) allow agency staff to work closely with team members designing and conducting research. Research seminars and workshops are provided on a variety of topics, including the results of research projects and research reviews; means of conducting systematic information retrieval from databases; setting up data sets, entering data, and conducting basic analyses in SPSS; and coding and conducting thematic analyses in NVivo. For our initial workshop, we were fortunate to have Eileen Gambrill and Leonard Gibbs conduct a day session on EBP in a computer lab, providing hands-on training for agency representatives. One of our ministry partners consults with faculty and hires doctoral students through the institute to conduct rapid evidence assessments and research that contributes to policy design. Another agency has established research chairs by which faculty members are provided with course release to write joint grants and work on collaborative primary research within the agency. In addition, master's-level students in the faculty often engage in systematic reviews or practice-based research in their practicum settings with agency partners as part of course requirements. The design and conduct of these projects is overseen by the faculty members teaching the course. Students are required to discuss and/or present their work with agency staff at the practicum. In one course, student projects are presented in a conference format to which community colleagues are invited.

Associated Centers

Faculty members in the Faculty of Social Work have long been engaged as investigators or collaborators in other research centers, and the work of these centers intersects naturally with the work of the research institute. For instance, one faculty member is the executive director of the Centre of Excellence for Child Welfare (CECW), funded by the Public Health Agency of Canada. The CECW has sites across Canada, including the Faculty of Social Work at the University of Toronto (which provides the central administrative structure), the University of Montreal, McGill University, the University of Manitoba, the Child Welfare League of Canada, and the First Nations Caring Society. The CECW coordinates large research initiatives (e.g., the Canadian Incidence Study of Reported Child Abuse and Neglect; Trocmé et al., 2005) and smaller community-based projects throughout the country. The ultimate goal of the CECW is to inform national and provincial policy makers on issues related to child welfare. Whereas the CECW provides for national partnerships, the institute has developed local partnerships; it can therefore work to integrate national and local efforts, transferring knowledge in an important area of social work practice.

Another example is the Professional Competency Initiative, through which members of the institute are cross-appointed to, or work collaboratively with, the Wilson Centre for Research in Education located in the Faculty of Medicine. That center is designed to bring together professional educators, basic scientists, clinical scientists, and community health care professionals for the purpose of improving teaching, learning, and evaluation and facilitating educational research. Involvement with the Wilson Centre thus allows for interdisciplinary research in assessing and addressing professional competency.

Systematic Reviews

Part of the institute mandate is to amass, evaluate, and disseminate research evidence relevant to current priorities in the field that have been identified in collaboration with our community partners. These efforts are built around identified gaps in knowledge in order to better inform the field and develop effective programs and policies. A systematic review provides a thorough appraisal and synthesis of all available research, published and unpublished, on a given topic, based on a methodical, comprehensive, and transparent search followed by a rigorous analysis of methodology and findings across the identified studies, including, where appropriate, statistical pooling of results using meta-analytic techniques. The original model and the rigorous, peer-reviewed methodology for conducting systematic reviews were developed in clinical medicine by an international group of scientists who formed the Cochrane Collaboration, with the goal of synthesizing and continually updating all the existing evidence on the efficacy of a wide range of health care interventions and establishing a coherent, easily accessible library of knowledge for the practice of EBM. Modeled after the Cochrane Collaboration, the Campbell Collaboration engages in systematic reviews to synthesize evidence on the effectiveness of interventions and policies in social work, education, and criminology.
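The statistical pooling mentioned above can be made concrete with a minimal sketch of the fixed-effect, inverse-variance model, one common meta-analytic approach. This illustration is ours, not drawn from the article, and the symbols are generic:

% Fixed-effect (inverse-variance) pooling across k studies.
% \hat{\theta}_i is the effect estimate from study i; v_i is its
% sampling variance (both hypothetical symbols for illustration).
w_i = \frac{1}{v_i}, \qquad
\hat{\theta}_{\mathrm{pooled}} = \frac{\sum_{i=1}^{k} w_i \hat{\theta}_i}{\sum_{i=1}^{k} w_i}, \qquad
\mathrm{SE}\bigl(\hat{\theta}_{\mathrm{pooled}}\bigr) = \frac{1}{\sqrt{\sum_{i=1}^{k} w_i}}

Under this scheme, larger and more precise studies carry proportionally greater weight; where between-study heterogeneity is substantial, reviewers typically substitute a random-effects variant that adds a between-study variance component to each v_i.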

Each of the institute initiatives will generate at least one systematic review to produce high-quality evidence and inform best practices within that area. The institute has hired a systematic review specialist and has begun to support training for key personnel, involved faculty, and agency partners. Such internal expertise in the systematic review process will facilitate subsequent reviews as the most urgent practice and policy challenges are identified and where pre-existing, methodologically sound syntheses are not available to guide decision making.

Collaborative Knowledge Building

We need to acknowledge that in many areas of social work practice, an extensive body of research literature does not exist. As a result, the institute aims to work with community agencies and colleagues in other universities to develop knowledge where it is currently inadequate. Faculty members affiliated with the institute join with community agencies in preparing and submitting grant applications to various funding sources. The collaborative nature of the research is highly consistent with the funding priorities of both national and provincial research-granting organizations. Because partnerships are already established and community agency personnel meet regularly with faculty members at various meetings, seminars, and workshops, the time previously required to develop partnerships as part of the grant-writing process is largely eliminated. Institute collaborations to build knowledge in social work practice include bullying research with the Toronto school board; grant writing and research on parent engagement in children's mental health and child welfare with institute partners in both service sectors; research on the validity of risk assessment measures with the Ministry of Children and Youth Services and Children's Aid Societies; and a study of the transportability of an evidence-based parenting program to a range of culturally diverse community early learning centers with a children's mental health center partner.

In addition to primary research, partnerships lead to knowledge building in other ways. Community consultation meetings have been conducted to discuss preliminary research findings and subsequently arrive at final recommendations. The best example is reflected in a report titled In Whose Best Interest? on the impact of child welfare policy in cases of domestic violence (Jenny, Alaggia, Mazzuca, & Redmond, 2006). The notion is that when practitioners, policy makers, and other stakeholders are central to the examination of results and the determination of conclusions, they will be more committed to translating these results into policy and practice.

Knowledge Transfer and Transformation

The institute aims to engage in both knowledge transfer and knowledge transformation. Knowledge transfer occurs through standard academic methods, including writing for scholarly publications and presenting at scholarly conferences. Collaborative presentations at conferences within the community facilitate local transfer of knowledge. As indicated earlier, seminars on the outcomes of research projects and systematic reviews are provided for staff in partner agencies and students of the faculty. Research reports are available on the institute Web site, and, in addition, we are creating a series of two-page "fact sheets" that summarize research findings in a quick and accessible manner for students and practitioners.

The ultimate aim, however, is the transformation of policy and practice as a result of research evidence. To this end, faculty members in the institute provide consultations to government policy makers and community agencies on effective practices. Faculty members also serve on the boards of other organizations committed to advancing the use of evidence in practice, which enhances coordination and the synergistic effects that can be achieved through broadening our reach. To illustrate, one faculty member serves on the EBPs committee of a provincial children's mental health organization that represents more than 80 provider agencies and is responsible for accreditation standards. We also believe that truly collaborative partnerships from the outset are likely to increase the success of knowledge transfer efforts.

SUMMARY AND CONCLUSION

Operationalizing EBP is an ongoing journey. The institute is evolving as we engage in dialogue and learn together, with one another and with our community partners. We are now ready to extend beyond our first initiative, the Welfare of Children, to launch additional initiatives. The movement into new areas may require expanded and innovative ways of thinking about teaching and implementing EBP, as well as ongoing assessment of what has worked and of the barriers we have encountered. Furthermore, the faculty was designing a new master of social work curriculum with an emphasis on EBP at the same time as we developed the institute. As we move forward in both these realms, we need to consider how to strengthen the bridge between them and potentiate their combined impact, which should be enhanced by a new institute initiative on social work education.

EBP itself is an evolving field, and the experience of others, as well as our own continued experience and lessons learned, will shape our continued journey. We anticipate ongoing challenges as we strive to attain a more inclusive vision within the institute and bring on board a range of colleagues and community collaborators. Enhancing the relevance and practical application of evidence in order to influence the practice and policy community is no small task. One of our larger challenges centers on the further development of evidence-based policy, which has perhaps advanced further in Europe than in North America. Another major challenge is the continued educational effort to reconcile the EBP model described here with the complexity of the populations and issues dealt with by social work practitioners and agencies.

The institute was launched with a great deal of enthusiasm and commitment from our original partners. We think the best way to sustain this momentum is to ensure that the work of the institute stays directly relevant to real-life practice and policy needs and that we remain faithful to the integrity of the partnership process that makes this possible. We hope that by making explicit a conceptualization of the multidimensional ecology of EBP, one that places clients at the heart of practice and policy, is responsive to context at all levels, and underscores the role of research evidence, the institute will stimulate dialogue, increase the uptake of EBP, and contribute to its continued evolution.

REFERENCES

Barber, J. (2005). Research Institute for Evidence-Based Social Work [Brochure]. Toronto, Canada: University of Toronto, Faculty of Social Work.
Blome, W., & Steib, S. (2004). Whatever the problem, the answer is "evidence-based practice"—or is it? Child Welfare, 83, 611-615.
Davies, P. (2004). Rapid evidence assessment: A tool for policy making. London: Government Social Research. Retrieved July 9, 2006, from http://www.gsr.gov.uk/new_research/archive/rae.asp
Djulbegovic, B., Morris, L., & Lyman, G. (2000). Evidentiary challenges to evidence-based medicine. Journal of Evaluation in Clinical Practice, 6, 99-109.
Dobrow, M., Goel, V., & Upshur, R. (2004). Evidence-based health policy: Context and utilization. Social Science and Medicine, 58, 207-217.
Dreyfus, D. (2004). Beyond randomized controlled trials. Current Opinion in Critical Care, 10, 574-578.
Fixsen, D. L., Naoom, S. F., Blase, K., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature (FMHI Publication No. 231). Tampa: University of South Florida, Louis de la Parte Florida Mental Health Institute, National Implementation Research Network.
Fraser, M. (2003). Intervention research in social work: A basis for evidence-based practice and practice guidelines. In A. Rosen & E. K. Proctor (Eds.), Developing practice guidelines for social work intervention (pp. 17-36). New York: Columbia University Press.
Gambrill, E. (1999). Evidence-based practice: An alternative to authority-based practice. Families in Society, 80, 341-350.
Gambrill, E. (2001). Evaluating the quality of social work education. Journal of Social Work Education, 37, 418-429.
Gambrill, E. (2003). Evidence-based practice: Sea change or the emperor's new clothes? Journal of Social Work Education, 39, 3-23.
Gambrill, E. (2006). Social work practice: A critical thinker's guide (2nd ed.). New York: Oxford University Press.
Gibbs, L. E., & Gambrill, E. (2002). Evidence-based practice: Counterarguments to objections. Research on Social Work Practice, 12, 452-476.
Glisson, C., & Hemmelgarn, A. L. (1998). The effects of organizational climate and interorganizational coordination on the quality and outcomes of children's service systems. Child Abuse and Neglect, 22, 401-421.
Gray, J. A. M. (2001). Evidence-based healthcare (2nd ed.). New York: Churchill Livingstone.
Harari, E. (2001). Whose evidence? Lessons from the philosophy of science and the epistemology of medicine. Australian and New Zealand Journal of Psychiatry, 35, 724-730.
Haynes, R. B., Devereaux, P. J., & Guyatt, G. H. (2002). Clinical expertise in the era of evidence-based medicine and patient choice [Editorial]. ACP Journal Club, 136, A11.
Jenny, A., Alaggia, R., Mazzuca, J., & Redmond, M. (2006). In whose best interest? The impact of child welfare policy in cases of domestic violence (Research report). Retrieved from http://www.socialwork.utoronto.ca/index.php?section=222
Khan, K., ter Riet, G., Popay, J., Nixon, J., & Kleijnen, J. (2001). Study quality assessment. In Undertaking systematic reviews of research on effectiveness: CRD's guidance for those carrying out or commissioning reviews (Section 2.5, pp. 1-20). York, UK: NHS Centre for Reviews and Dissemination, University of York.
Lipman, T. (2004). The doctor, his patient, and the computerized evidence-based guideline. Journal of Evaluation in Clinical Practice, 10(2), 163-176.
Macdonald, G. (1999). Evidence-based social care: Wheels off the runway? Public Money and Management, 19(1), 25-32.
Mullen, E. J., & Streiner, D. L. (2004). The evidence for and against evidence-based practice. Brief Treatment and Crisis Intervention, 4(2), 111-121.
Myers, L., & Thyer, B. (1997). Should social work clients have the right to effective treatment? Social Work, 42, 288-298.
Porta, M. (2004). Is there life after evidence-based medicine? Journal of Evaluation in Clinical Practice, 10, 147-152.
Qureshi, H. (2004). Evidence in policy and practice: What kinds of research designs? Journal of Social Work, 4(1), 7-23.
Reid, W. J., & Fortune, A. E. (2003). Empirical foundations for practice guidelines in current social work knowledge. In A. Rosen & E. K. Proctor (Eds.), Developing practice guidelines for social work intervention (pp. 59-82). New York: Columbia University Press.
Rogers, W. (2005). Evidence based medicine and justice: A framework for looking at the impact of EBM upon vulnerable and disadvantaged groups. Journal of Medical Ethics, 30, 141-145.
Rosen, A. (2003). Evidence-based social work practice: Challenges and promise. Social Work Research, 27, 197-208.
Rosenfeld, J. (2004). The view of evidence-based medicine from the trenches: Liberating or authoritarian? Journal of Evaluation in Clinical Practice, 10, 153-155.
Rycroft-Malone, J., Seers, K., Titchen, A., Harvey, G., Kitson, A., & McCormack, B. (2004). What counts as evidence in evidence-based practice? Journal of Advanced Nursing, 47(1), 81-90.
Rzepnicki, T., & Briggs, H. E. (2004). Using evidence in your practice. In T. Rzepnicki & H. E. Briggs (Eds.), Using evidence in social work practice (pp. ix-xxiii). Chicago: Lyceum Books.
Saarni, S., & Gylling, H. (2004). Evidence based medicine guidelines: A solution to rationing or politics disguised as science? Journal of Medical Ethics, 30, 171-175.
Sackett, D., Rosenberg, W., Gray, J., Haynes, R., & Richardson, W. (1996). Evidence-based medicine: What it is and what it isn't. British Medical Journal, 312, 71-72.
Sackett, D. L., Straus, S. E., Richardson, W. S., Rosenberg, W., & Haynes, R. B. (2000). Evidence-based medicine: How to practice and teach EBM (2nd ed.). New York: Churchill Livingstone.
Schoenwald, S. K., Sheidow, A. J., Letourneau, E. J., & Liao, J. G. (2003). Transportability of multisystemic therapy: Evidence for multilevel influences. Mental Health Services Research, 5, 223-239.
Shlonsky, A., & Gibbs, L. (2004). Will the real evidence-based practice please step forward: Teaching the process of EBP to the helping professions. Brief Treatment and Crisis Intervention, 4(2), 137-153.
Simpson, D. (2002). A conceptual framework for transferring research to practice. Journal of Substance Abuse Treatment, 22(4), 171-182.
Stern, S. B. (1994). Wanted! Social work practice evaluation and research: All methods considered. In E. Sherman & W. J. Reid (Eds.), Qualitative research in social work (pp. 285-290). New York: Columbia University Press.
Ter Meulen, R., & Dickenson, D. (2002). Into the hidden world behind evidence based medicine. Health Care Analysis, 10, 319-328.
Trocmé, N., Fallon, B., MacLaurin, B., Daciuk, J., Felstiner, C., Black, T., et al. (2005). Canadian Incidence Study of Reported Child Abuse and Neglect 2003: Major findings. Ottawa, Canada: Minister of Public Works and Government Services.
Webb, S. (2001). Some considerations on the validity of evidence-based practice in social work. British Journal of Social Work, 31, 57-79.
Witkin, S. L. (1998). The right to effective treatment and the effective treatment of rights: Rhetorical empiricism and the politics of research. Social Work, 43, 75-80.
Witkin, S. L., & Harrison, W. D. (2001). Whose evidence and for what purpose? Social Work, 46, 293-296.