ACAD EMERG MED
July 2005, Vol. 12, No. 7
Assessing the Suitability of Intervention Sites for Quality Improvement Studies in Emergency Departments

Joshua P. Metlay, MD, PhD, Carlos A. Camargo Jr., MD, DrPH, Karen Bos, BA, Ralph Gonzales, MD, MSPH

Abstract

Objectives: As an initial step in disseminating an emergency department (ED)-based quality improvement program (QIP) to improve antibiotic prescribing for patients with acute respiratory infections, the authors conducted a nationwide survey to assess the value and feasibility of the QIP.

Methods: Directors of EDs at 119 Veterans Administration hospitals and 160 non–Veterans Administration hospitals (identified based on the existence of accredited emergency medicine training programs and/or participation in an existing ED-based research network) were surveyed. The survey included questions on the current existence of an antibiotic QIP in the ED, enthusiasm for an antibiotic QIP, and the existence of physical features in the ED that would support the QIP intervention.

Results: Overall, 77% of ED directors reported they did not have an existing antibiotic QIP, and 84% reported they would benefit from having such a program (either new or in addition to their current program). In addition, 63% of respondents indicated that improving antibiotic prescribing was an intermediate to high priority in the ED. Forty-five percent reported that they did not have a suitable location for a key component of the intervention (an interactive computer kiosk), and 26% reported that they could not display educational posters on the walls of the examination room.

Conclusions: Many EDs identify barriers to implementing an antibiotic QIP. Perceived and real barriers are important factors to consider in translating successful QIPs into routine clinical practice.

Key words: emergency medical services; data collection; quality improvement. ACADEMIC EMERGENCY MEDICINE 2005; 12:667–670.
Over the past several years, there has been substantial growth in quality improvement research.1 However, such research is often conducted in settings where the perceived value of the intervention is high and the practical barriers to implementation are low. As a result, successful quality improvement programs (QIPs) may have limited applicability to other clinical sites and may be poorly reproducible.2 Emergency departments (EDs) are an important setting for testing and disseminating QIPs.3 One area
that has been relatively ignored in these ED-based QIPs is the overuse of antibiotic drugs for the management of patients with acute respiratory tract infections. The aim of this study was to describe the perceived benefits and feasibility of an ED-based antibiotic QIP across a national sample of hospital sites, both as a first step to identify sites for a randomized trial of the antibiotic QIP and as a means of estimating the acceptability of the program across a broad range of potential sites.
From the Center for Health Equity Research and Promotion, VA Medical Center and University of Pennsylvania School of Medicine (JPM), Philadelphia, PA; EMNet Coordinating Center, Department of Emergency Medicine, Massachusetts General Hospital (CAC, KB), Boston, MA; and Division of General Internal Medicine, University of California, San Francisco (RG), San Francisco, CA. Received November 22, 2004; revisions received January 5, 2005, and January 19, 2005; accepted January 19, 2005. Presented in part at the Agency for Healthcare Research and Quality national meeting, Translating Research into Practice, Washington, DC, July 2004. Supported by grants R01 HS13915 from the Agency for Healthcare Research and Quality and AVA-03-239 from the Health Services Research and Development Service of the Department of Veterans Affairs. Dr. Metlay is supported through an Advanced Research Career Development Award from the VA. Address for correspondence and reprints: Joshua P. Metlay, MD, PhD, 712 Blockley Hall, 423 Guardian Drive, Philadelphia, PA 19104. Fax: 215-573-0198; e-mail: [email protected]
METHODS

Study Design. We conducted a cross-sectional study of hospital EDs and urgent care centers in August 2003. This study was approved by the institutional review board at each of the authors’ institutions.

Survey Content and Administration. Our questionnaire was developed to measure two domains: 1) the perceived benefit of an antibiotic QIP based within the ED and 2) the feasibility of implementing a specific antibiotic QIP in the ED. For the first domain, we included questions on the current existence of any antibiotic QIP activity in the ED, the relative importance of improving antibiotic use for the hospital ED, and the perceived benefit of an antibiotic
QIP for the ED. To frame the latter question, we provided a link on the survey site to a recent publication describing a multidimensional patient and physician intervention designed to reduce antibiotic overuse in urgent care settings.4 However, we did not specifically ask the respondents to indicate their enthusiasm for our proposed QIP; rather, we asked them to indicate their enthusiasm for an antibiotic QIP in general. For the questions on feasibility of the antibiotic QIP, we included specific questions regarding the ability to display posters on examination room walls and the availability of a semiprivate place for a computer kiosk in or near the waiting room. The question on the availability of a semiprivate place for a computer kiosk was included because the antibiotic QIP included an interactive computer program that provides patient education on antibiotics through both displayed messages and dialogue. The actual wording of each question is included in Table 1.
We surveyed directors of hospital EDs from two groups. First, we identified all 133 Veterans Administration (VA) medical centers, excluding satellite clinics and nursing homes, through manual review of the Federal Register of Medical Facilities.5 At each site, we contacted the chief of staff’s office to identify the staff physician responsible for directing the ED/urgent care center. Fourteen VA hospitals indicated that they did not have an ED or urgent care center, leaving a total of 119 VA hospitals. Second, we identified all non-VA hospitals that participate in an existing national research network of EDs, the Emergency Medicine Network (EMNet). We used this network because it provides access to a large national sample of academic EDs and because the network would ultimately facilitate the performance of a multisite trial. There were 120 hospitals in the network at the time of the survey. The full list of participating sites can be found at http://www.emnet-usa.org.
TABLE 1. Perceived Benefit and Feasibility of an Antibiotic QIP at VA and Non-VA Hospitals

Values are % (95% CI) where available; EMNet hospitals n = 132, VA hospitals n = 94.

Perceived benefit
  Do you currently have an antibiotic QIP in place in your ED? (% Yes)
    EMNet: 21          VA: 27
  Do you think your ED would benefit from an antibiotic QIP (either new or in addition to what you currently have in place)? (% Yes)
    EMNet: 86          VA: 82
  If yes, how much improvement do you think is needed?
    Small amount          EMNet: 14 (8, 27)    VA: 25 (16, 36)
    Intermediate amount   EMNet: 68 (59, 77)   VA: 69 (58, 79)
    Large amount          EMNet: 18 (11, 26)   VA: 6 (2, 14)

Perceived feasibility
  Do you think your ED would be capable and willing to place posters on examination room walls as part of an antibiotic QIP? (% Yes)
    EMNet: 65          VA: 87
  Does your ED have a semiprivate place for a computer kiosk (approximately 2 feet wide, 2 feet deep, and 4 feet high) in or near the waiting room area? (% Yes)
    EMNet: 64          VA: 42
  Does your ED have a separate area or zone within or near the ED (e.g., ‘‘fast track’’ area) where nonurgent patients with acute respiratory infections can be evaluated and treated? (% Yes)
  Where do you think improving antibiotic use ranks on the list of priorities facing your ED at the present time?
    Low            EMNet: 50 (41, 59)   VA: 30 (21, 40)
    Intermediate   EMNet: 45 (36, 54)   VA: 53 (43, 64)
    High           EMNet: 5 (2, 11)     VA: 17 (10, 26)

Missing data accounted for <5% of respondents for each survey item. Between-group comparisons used the chi-square test for binary responses and the Mantel–Haenszel chi-square test for trend for ordinal responses. EMNet = Emergency Medicine Network; QIP = quality improvement program.
We also included all other non-VA hospitals with accredited emergency medicine residency programs. The total number of hospitals with accredited emergency medicine residency programs was 121, of which 67% (n = 81) were already identified in EMNet. Thus, an additional 40 hospitals were added to our final list of non-VA hospitals, which we refer to as EMNet hospitals, for a total sample size of 160 hospitals. We compiled e-mail addresses for all of the directors of the EDs/urgent care centers. We e-mailed each director and provided a link embedded within the e-mail to a Web-based survey instrument. Respondents were asked to record a preassigned facility number on the Web tool so that we could track respondents versus nonrespondents. We sent up to three e-mails to nonrespondents and then attempted to contact directors by telephone. For those respondents contacted by telephone, the survey could be completed over the telephone or the respondent could enter responses on the Web-based survey tool.

Data Analysis. For the primary survey results, we calculated the percentage of sites providing each response to each item on the survey, both overall and stratified by VA and EMNet sites. We calculated the 95% binomial confidence limits for each set of responses. We tested the difference between VA and EMNet hospitals using the chi-square test for binary response categories and the Mantel–Haenszel chi-square test for trend for ordinal response categories.
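As an illustration of this analysis, the sketch below computes an exact 95% binomial confidence interval and a between-group chi-square comparison for one binary item. The counts are back-calculated from the percentages reported in the Results (approximately 64% of 132 EMNet sites and 42% of 94 VA sites for the kiosk item) and are hypothetical reconstructions for illustration, not the study data.

```python
# Sketch of the primary analyses: exact binomial CIs and a chi-square
# comparison. Counts are back-calculated from reported percentages
# (hypothetical, illustrative only -- not the actual study data).
from scipy.stats import binomtest, chi2_contingency

# Kiosk-space item: "yes" counts reconstructed from 64% of 132 EMNet
# sites and 42% of 94 VA sites.
emnet_yes, emnet_n = 84, 132
va_yes, va_n = 39, 94

# 95% exact (Clopper-Pearson) binomial confidence interval per group
ci_emnet = binomtest(emnet_yes, emnet_n).proportion_ci(confidence_level=0.95)
ci_va = binomtest(va_yes, va_n).proportion_ci(confidence_level=0.95)

# Chi-square test comparing the binary responses between the two groups
table = [[emnet_yes, emnet_n - emnet_yes],
         [va_yes, va_n - va_yes]]
chi2_stat, p, dof, expected = chi2_contingency(table)

print(f"EMNet: {emnet_yes / emnet_n:.0%} (95% CI {ci_emnet.low:.2f}-{ci_emnet.high:.2f})")
print(f"VA:    {va_yes / va_n:.0%} (95% CI {ci_va.low:.2f}-{ci_va.high:.2f})")
print(f"chi-square p = {p:.3f}")
```

With these reconstructed counts the test, like the published comparison, finds a significant difference between the two groups.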
RESULTS

We received a total of 132 responses from the 160 EMNet hospitals surveyed (83%) and a total of 94 responses from the 119 VA hospitals surveyed (79%). Among the EMNet sites, 77% of participating hospitals versus 70% of nonparticipating hospitals had accredited emergency medicine residency programs (p = 0.4 by chi-square test). Among the VA sites, 97% of participating sites versus 96% of nonparticipating sites had academic affiliations (p = 0.84 by chi-square test). Table 1 summarizes the survey results. Only 21% of EMNet hospitals and 27% of VA hospitals reported that they had an antibiotic QIP in place in their EDs. However, 50% of EMNet hospital ED directors and 70% of VA hospital ED directors reported that improving antibiotic prescribing was either an intermediate or high priority for their site (p < 0.001 by Mantel–Haenszel chi-square test for trend). Eighty-six percent of EMNet sites and 82% of VA sites indicated that their site would benefit from an antibiotic QIP, but this difference was not statistically significant (p = 0.34 by chi-square test). In terms of feasibility measures, 65% of EMNet hospitals and 87% of VA hospitals indicated that they would and could display posters in examination rooms as part of the antibiotic QIP (p < 0.001 by chi-square test). In contrast, 64% of EMNet hospitals and 42% of VA hospitals responded that they had a semiprivate place for a computer kiosk as part of the antibiotic QIP (p = 0.002 by chi-square test).
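The Mantel–Haenszel chi-square test for trend used for the ordinal priority item is equivalent to the linear-by-linear association statistic M^2 = (N - 1) r^2, where r is the Pearson correlation between group membership and ordinal score. A minimal sketch, assuming counts back-calculated from the reported percentages (hypothetical, not the study data):

```python
# Mantel-Haenszel chi-square test for trend, computed as the
# linear-by-linear statistic M^2 = (N - 1) * r^2. Counts are
# back-calculated from reported percentages (hypothetical).
import numpy as np
from scipy.stats import chi2

# Rows: EMNet (n = 132), VA (n = 94); columns: Low, Intermediate, High
counts = np.array([[66, 59, 7],
                   [28, 50, 16]])

# Expand the table into one (group, score) pair per respondent
groups, scores = [], []
for g in range(counts.shape[0]):
    for s in range(counts.shape[1]):
        groups += [g] * counts[g, s]
        scores += [s] * counts[g, s]

# Pearson correlation between group indicator and ordinal score
N = counts.sum()
r = np.corrcoef(groups, scores)[0, 1]
m2 = (N - 1) * r ** 2
p = chi2.sf(m2, df=1)
print(f"M^2 = {m2:.1f}, p = {p:.4f}")
```

With these reconstructed counts the trend statistic is highly significant, consistent with the p < 0.001 reported for the priority item.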
DISCUSSION

Sites that self-select to participate in quality improvement research activities may be atypical of most health care sites and therefore may experience results that will not be realized when the study results are disseminated more widely. As a prelude to a multihospital cluster-randomized trial of an antibiotic QIP, we conducted a nationwide survey to assess the perceived benefit and feasibility of the intervention. In terms of perceived benefit, the majority of sites indicated that they did not have currently active antibiotic QIPs, believed antibiotic improvement was an intermediate to high priority, and strongly indicated interest in adopting an antibiotic QIP. The perceived benefit was relatively similar across VA and EMNet sites. In contrast, perceived feasibility of the intervention was more heterogeneous, with VA hospitals less likely to have the physical ED layout required for a computer kiosk intervention but more likely to endorse placing informational posters in examination rooms. VA and non-VA hospital EDs differ in several fundamental ways. Most VA hospital EDs operate in a manner more similar to urgent care centers, typically not handling major trauma or medical emergencies. Similarly, the staffing of VA hospital EDs is often more closely aligned with primary care services than non-VA hospital EDs. Given the small size of many VA hospital EDs, it is not surprising that the physical options for implementing a kiosk-based patient education program are more limited in VA sites.
LIMITATIONS

One limitation of our study is that we measured the perceived benefits and feasibility of implementing our antibiotic QIP based on the single survey responses of the director or senior physician at each hospital ED site. Clearly, the priority estimates of these respondents might not accurately reflect the priority estimates of other clinical staff and administrative leaders. Nonetheless, these persons are usually opinion leaders in their respective clinical settings. In addition, the survey asked respondents whether they thought the ED would benefit from an antibiotic QIP; while we provided information on a specific antibiotic QIP, it is possible that respondents indicated their enthusiasm for an antibiotic QIP in general, not the specific QIP we are planning to test. Indeed, the QIP that we plan to test was developed and evaluated in primary care settings. It is possible that the unique features of
ambulatory care in emergency settings will require a different set of interventions to improve antibiotic prescribing for acute respiratory infections. It is interesting to note that even though the majority of sites indicated that improving antibiotic prescribing was an intermediate to high priority, only a minority of sites reported actually having an antibiotic QIP in place. Clearly, unique barriers may exist that inhibit the adoption of QIPs developed in primary care settings into ED settings. Results of the ongoing trial will specifically address this issue.
CONCLUSIONS

Most major VA and non-VA hospital ED directors reported that they would benefit from the implementation of an antibiotic QIP. The feasibility of specific elements of the QIP varied across sites and limits the true pool of eligible sites for the intervention. Future research will determine the actual impact of implementing this antibiotic QIP in a sample of eligible hospital EDs.

The authors thank Anee Lee at the Center for Health Equity Research and Promotion at the VA Medical Center in Philadelphia for help administering the survey.
References

1. Chassin MR. Quality of health care. Part 3: improving the quality of care. N Engl J Med. 1996; 335:1060–3.
2. Mittman BS. Creating the evidence base for quality improvement collaboratives. Ann Intern Med. 2004; 140:897–901.
3. Burstin HR, Conn A, Setnik G, et al. Benchmarking and quality improvement: the Harvard Emergency Department Quality Study. Am J Med. 1999; 107:437–49.
4. Harris RH, MacKenzie TD, Leeman-Castillo B, et al. Optimizing antibiotic prescribing for acute respiratory tract infections in an urban urgent care clinic. J Gen Intern Med. 2003; 18:326–34.
5. US Medicine Directory 2004: Federal Medical Treatment Facilities. Vol 39. Washington, DC: U.S. Medicine, Inc., 2003.