Anatomic Pathology Databases and Patient Safety

Stephen S. Raab, MD; Dana M. Grzybicki, MD, PhD; Richard J. Zarbo, MD, DMD; Frederick A. Meier, MDCM; Stanley J. Geyer, MD; Chris Jensen, MD

Accepted for publication May 17, 2005.

From the Department of Pathology and Laboratory Medicine, University of Pittsburgh Medical Center/Shadyside, Pittsburgh, Pa (Drs Raab and Grzybicki); Department of Pathology, Henry Ford Health System, Detroit, Mich (Drs Zarbo and Meier); Department of Pathology, Western Pennsylvania Hospital, Pittsburgh (Dr Geyer); and Department of Pathology, University of Iowa Hospitals and Clinics, Iowa City (Dr Jensen).

Presented at the College of American Pathologists Special Topic Symposium, Error in Pathology and Laboratory Medicine: Practical Lessons for the Pathologist, Phoenix, Ariz, September 20, 2004.

The authors have no relevant financial interest in the products or companies described in this article.

Reprints: Stephen S. Raab, MD, Department of Pathology, University of Pittsburgh Medical Center, Shadyside Hospital, 5150 Centre Ave, Pittsburgh, PA 15232 (e-mail: [email protected]).

● Context.—The utility of anatomic pathology discrepancies has not been rigorously studied.

Objective.—To outline how databases may be used to study anatomic pathology patient safety.

Design.—The Agency for Healthcare Research and Quality funded the creation of a national anatomic pathology errors database to establish benchmarks for error frequency. The database is used to track more frequent errors and errors that result in more serious harm, in order to design quality improvement interventions intended to reduce these types of errors. In the first year of funding, 4 institutions (University of Pittsburgh, Henry Ford Hospital, University of Iowa, and Western Pennsylvania Hospital) reported cytologic-histologic correlation error data after standardizing correlation methods. Root cause analysis was performed to determine sources of error, and error reduction plans were implemented.

Participants.—Four institutions self-reported anatomic pathology error data.

Main Outcome Measures.—Frequency of cytologic-histologic correlation error, case type, cause of error (sampling or interpretation), and effect of error on patient outcome (ie, no harm, near miss, and harm).

Results.—The institutional gynecologic cytologic-histologic correlation error frequency ranged from 0.17% to 0.63%, using the denominator of all Papanicolaou tests. Based on the nongynecologic cytologic-histologic correlation data, the specimen sites with the highest discrepancy frequency (by project site) were lung (ranging from 16.5% to 62.3% of all errors) and urinary bladder (ranging from 4.4% to 25.0%). Most errors detected by the gynecologic cytologic-histologic correlation process were no-harm events (ranging from 10.7% to 43.2% by project site). Root cause analysis identified sources of error on both the clinical and pathology sides of the process, and error intervention programs are currently being implemented to improve patient safety.

Conclusions.—A multi-institutional anatomic pathology error database may be used to benchmark practices and target specific high-frequency errors or errors with high clinical impact. These error reduction programs have national import.

(Arch Pathol Lab Med. 2005;129:1246–1251)

In 1999, the Institute of Medicine reported on the scope of medical error.1 The Institute of Medicine defined medical error to be a failure of a planned action to be completed as intended or the use of the wrong plan to achieve an aim.1 Medical errors are found at all levels of patient care, and the Institute of Medicine reported on means of reducing error and improving patient safety.2–7 Researchers have directed more effort toward studying some error types, such as action-related errors (eg, medication errors in which a wrong dose or a wrong drug is administered), than toward other error types, such as diagnostic errors.8–10

We define anatomic pathology diagnostic error as the assignment of a pathologic diagnosis that does not represent the true nature of disease (or nondisease) in a patient.11 Diagnostic errors detected in a pathology laboratory occur as a result of errors in the pathology process or in the clinical medicine process. Anatomic pathologists may make 2 major types of error, namely, interpretation errors and reporting errors (including typographic errors and mistakes in specimen description or clinical comments).12 Clinical medicine errors detected in the pathology laboratory may be secondary to sampling, in which the tissue that exhibits the features of a disease process is not obtained.

Pathology errors are detected through several methods, and the pathology community has not reached consensus on the optimal methods for error detection.13 Most methods of error detection include a form of secondary review by 1 or more pathologists. These methods include review of all specimens or a subgroup of specimens,14,15 review of a fixed percentage of cases,16–25 review of all discrepancies,26 review of cases presented at conferences,27 and review of cases through consultative services.28–30 Most of the error data reported in the pathology literature are from studies of cases reviewed after sign-out,31 although most anatomic pathology laboratories perform some form of secondary review before sign-out. Little is known about the benefits of review before sign-out. In general, pathology error analysis to date simply has documented errors without using methods to control and limit errors and improve patient outcomes.

Anatomic Pathology Error Percentages

Most anatomic pathology diagnostic error studies are limited to single institutions and contain relatively few patients and errors. Institutional anatomic pathology error percentages are highly variable, ranging from less than 1% to 43% of all patient specimens (median, between 1% and 5%).13–23,26–29,32–48 These studies indicate that the most common overall type of error is the false-negative error and that cytology errors are more common than surgical pathology errors.43 Clary et al26 reported that gynecologic discrepancies occurred in 0.87% and 7.7% of all cytologic and histologic specimens, respectively, and nongynecologic discrepancies occurred in 2.3% and 0.45% of all cytologic and histologic specimens, respectively. Of the gynecologic discrepancies, 67% and 33% were a result of interpretive and sampling error, respectively.26 A confounding element in error frequency analysis is interobserver diagnostic variability (even among expert pathologists).49–55 A limitation in anatomic pathology diagnostic error analysis is the lack of a sufficiently large database from which researchers may draw confident inferences regarding error types and percentages and the effect of errors on clinical outcomes.

Web-Based Systems for Storing and Analyzing Medical Errors

Patient safety experts have reported that computer systems designed for the large-scale collection, analysis, and reporting of data are necessary for error reduction.56–59 The Minnesota Society of Thoracic Surgeons database exemplifies the utility of such a system.60 The primary goal behind the establishment (in 1993) and use of this database was local self-improvement and quality assurance. The society developed a physician-driven, standardized, computerized process for the collection of individual provider and institutional performance measures and created a risk-adjusted statewide database. This database has been used to benchmark local and regional patient outcomes. A status report on this database published in 1999 described an increase in the number of registered cases from 7000 to greater than 1 million.61 Over the years, a progressive decrease in operative mortality and length of stay has been observed.

MATERIALS AND METHODS

To more thoroughly study and decrease the clinical impact of anatomic pathology errors, in September 2002 the Agency for Healthcare Research and Quality (Rockville, Md) began funding a 5-year anatomic pathology patient safety project, entitled Improving Pathology Practice by Examining Pathology Errors.11 The participating institutions are the University of Pittsburgh Medical Center (Pittsburgh, Pa), Western Pennsylvania Hospital (Pittsburgh, Pa), the University of Iowa Hospitals and Clinics (Iowa City), and the Henry Ford Hospital System (Detroit, Mich). These project sites proposed the following specific aims:

Specific Aim 1.—To use a Web-based database to collect diagnostic errors detected by cytologic-histologic correlation and by second-pathologist review of specific cases.

Specific Aim 2.—To quantitatively analyze the collected error data and generate quality performance reports useful for institutional quality improvement programs.

Specific Aim 3.—To plan and implement interventions to reduce errors and improve clinical outcomes, based on information derived from root cause analysis of diagnostic errors.

Specific Aim 4.—To assess the success of implemented interventions by quantitative measure of postinterventional errors and clinical outcomes and by qualitative assessment by project participants.

This report details aspects of this Agency for Healthcare Research and Quality project, including data collection, data analysis, and root cause analysis. Currently, this project is in its third year of funding. The first generation of the database was completed in 2003, and all project sites may now enter deidentified error data directly through the Web. The database has been constructed so that any new type of error (eg, frozen section–permanent section correlation error) may be added. The database allows for the collection of denominator data so that error frequencies may be calculated. Additional project sites that join this project may benchmark themselves against other project sites with similar demographic characteristics (number of specimens, practice type, etc).
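To make the data model concrete, the following sketch shows one way such deidentified error entries and denominator data might be represented so that frequencies can be computed; the field names and the helper function are hypothetical illustrations, not the project's actual schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ErrorRecord:
    """One deidentified error entry (illustrative fields only)."""
    site: str                       # deidentified project site, eg "A"
    error_type: str                 # eg "cytologic-histologic correlation"
    organ_system: str               # eg "lung", "urinary bladder"
    specimen_type: str              # eg "bronchial washing", "cervical biopsy"
    cause: Optional[str] = None     # "sampling" or "interpretation"
    outcome: Optional[str] = None   # "no harm", "near miss", or "harm"
    patient_age: Optional[int] = None
    patient_sex: Optional[str] = None

@dataclass
class Denominator:
    """Case volume for one site and specimen category."""
    site: str
    category: str                   # eg "gynecologic cytology"
    total_cases: int

def frequency_percent(error_count: int, total_cases: int) -> float:
    """Error frequency expressed as a percentage of all cases in the category."""
    return 100.0 * error_count / total_cases
```

For example, site A's 139 gynecologic discrepancies against 22 325 Pap tests (Table 2) correspond to a frequency of roughly 0.6%.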

Data Collection

The data collection process begins with the identification of specific types of error. In the first year of the project, institutions collected errors identified by the cytologic-histologic correlation process. In the second year, errors identified through conference review were collected. Currently, in the third year, errors identified by review of amended reports are being collected. For cytologic-histologic correlation errors, institutions record prospective cases and have added previously collected cases from their files dating back to September 1997. Thus, in 2005, the database houses 9 years of cytologic-histologic correlation data. Following the identification of a case that contains an error, institutions record data points on the patient (eg, age and sex), the specimen (eg, organ system and type), and the provider (eg, clinician, cytotechnologist, or pathologist characteristics). All case-identifying characteristics are removed prior to entry into the database. For all errors, clinical outcome data (eg, mortality, delays in diagnosis, unnecessary additional tests, and complications of unnecessary tests) and an assessment of the clinical impact of the error are determined.

At the initiation of the project, the methods of data collection were standardized across all sites. For example, the method of cytologic-histologic correlation adopted by all sites included identifying cases in which a cytology specimen and a surgical pathology specimen were obtained within 6 months of each other. Cases with a step difference of 2 or more between the cytologic and the histologic diagnosis were determined to be discrepant cases for this project. Table 1 shows the categories of nongynecologic and gynecologic cytologic and histologic diagnoses and the steps assigned to these diagnoses (an illustrative sketch of the step-difference rule follows Table 1). For example, a diagnosis of "atypical" made on a bronchial washing specimen and a diagnosis of "non–small cell carcinoma" made on a bronchial biopsy specimen are considered discrepant; a diagnosis of "atypical" on a liver aspiration specimen and a diagnosis of "cirrhosis" on a liver biopsy specimen are not discrepant. The University of Pittsburgh Medical Center project site served as the arbitrator if classification uncertainty existed at the other project sites.

The clinical impact of error was classified according to traditional taxonomy schemes.62–64 An error that did not affect patient care was classified as a no-harm event (eg, a typographic error in which the word breast was spelled as breastt). If an intervention occurred after an error was made, and that intervention prevented harm from occurring, that error was classified as a near-miss event (eg, a bronchial washing specimen is diagnosed as benign, the clinician performs a lobectomy 3 days later, and the diagnosis is non–small cell carcinoma). A significant event was an error that affected patient outcome; a significant event was further classified as causing mild harm (eg, an error leading to an additional noninvasive diagnostic procedure), moderate harm (eg, an error leading to an additional invasive diagnostic procedure), or severe harm (eg, an error leading to loss of life or limb, or long-lasting morbidity secondary to an unnecessary diagnostic test). Data from the calendar year 2002 were chosen for detailed analysis, and some of these data are presented below.
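The outcome taxonomy just described can be written down as a small classification; the category names and the helper below are illustrative assumptions rather than the project's actual coding scheme.

```python
from enum import Enum

class Outcome(Enum):
    NO_HARM = "no harm"          # error never affected care (eg, a typographic error)
    NEAR_MISS = "near miss"      # a later intervention prevented harm
    MILD = "mild harm"           # led to an additional noninvasive diagnostic procedure
    MODERATE = "moderate harm"   # led to an additional invasive diagnostic procedure
    SEVERE = "severe harm"       # loss of life or limb, or long-lasting morbidity

def is_significant(outcome: Outcome) -> bool:
    """Significant events are errors that actually affected patient outcome."""
    return outcome in (Outcome.MILD, Outcome.MODERATE, Outcome.SEVERE)
```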

Table 1. Step Differences in Nongynecologic and Gynecologic Diagnoses

Step No. | Nongynecologic Cytology and Histology Diagnoses | Gynecologic Cytology Diagnoses | Gynecologic Histology Diagnoses
1 | Benign | Within normal limits | Benign
2 | Atypical | Atypical squamous (ASC-US) or glandular cells | No equivalent
3 | Suspicious | Low-grade squamous intraepithelial lesion | Cervical intraepithelial neoplasia I
4 | Malignant | High-grade squamous intraepithelial lesion | Cervical intraepithelial neoplasia II or III
5 | | Carcinoma | Carcinoma
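As a concrete illustration of the step-difference rule, the sketch below encodes the nongynecologic scale from Table 1 and flags a cytologic-histologic pair as discrepant when the step difference is 2 or more; the dictionary and function names are assumptions made for illustration, and the gynecologic columns of Table 1 would be handled analogously.

```python
# Nongynecologic step numbers from Table 1.
NONGYN_STEPS = {"benign": 1, "atypical": 2, "suspicious": 3, "malignant": 4}

def is_discrepant(cytology_dx: str, histology_dx: str, threshold: int = 2) -> bool:
    """Flag a correlation pair whose step difference is at least the threshold."""
    return abs(NONGYN_STEPS[cytology_dx] - NONGYN_STEPS[histology_dx]) >= threshold

# Examples paralleling the text: an "atypical" bronchial washing paired with a
# malignant bronchial biopsy is discrepant (steps 2 vs 4), whereas an "atypical"
# liver aspirate paired with a benign biopsy showing cirrhosis is not (steps 2 vs 1).
print(is_discrepant("atypical", "malignant"))  # True
print(is_discrepant("atypical", "benign"))     # False
```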

Table 2. Error Frequency by Project Site

Site | Gynecologic Discrepancies | Nongynecologic Discrepancies | Gynecologic Cytology Total Case Volume | Gynecologic Cytology % Discrepant | Nongynecologic Cytology Total Case Volume | Nongynecologic Cytology % Discrepant
A | 139 | 74 | 22 325 | 0.63 | 5228 | 1.42
B | 103 | 100 | 72 641 | 0.14 | 8463 | 1.18
C | 430 | 477 | 118 952 | 0.36 | 8746 | 5.45
D | 18 | 23 | 10 379 | 0.17 | 1363 | 1.69


RESULTS

Table 2 shows the frequency of cytologic-histologic correlation error (also known as a discrepancy) by deidentified project site and subclassified into gynecologic and nongynecologic specimen types. Gynecologic correlation cases are from patients who had a Papanicolaou (Pap) test and a procedure that generated histologic tissue (eg, cervical or vaginal biopsy, endocervical or endometrial curettage, cone biopsy, or hysterectomy). Nongynecologic correlation cases are from patients who had a procedure that generated a cytology specimen (eg, brush, wash, scrape, discharge, or fine-needle aspiration) and a procedure that generated a histologic specimen (eg, surgical biopsy or excision) from other anatomic sites (generally excluding the gynecologic tract).

The data in Table 2 show a varied discrepancy frequency by project site. Project site A had the highest gynecologic discrepancy frequency, and project site C had the highest nongynecologic discrepancy frequency. Note that the denominator is the number of all cases and not the number of correlating specimens. If the discrepancy frequency is calculated using the number of correlating specimens as the denominator, the frequency, in some instances, is 10 times higher. Although discrepancy frequencies vary markedly across project sites, this variation has a number of causes, including reporting and recording differences.

Table 3 shows the percentage of nongynecologic cytologic-histologic correlation discrepancies by body site. For example, for project site C, 0.8% of all nongynecologic discrepancies were associated with noncorrelating cytologic and histologic bile duct specimens, and 1.3% of all nongynecologic discrepancies were associated with noncorrelating cytologic and histologic small bowel specimens. The lung and urinary bladder were the most common body sites for discrepancies across all project sites. In addition, at each project site, these particular cytologic specimens were high volume.

In addition to measuring the frequency of discrepancy, we also measured the clinical impact of discrepancy by performing chart reviews.
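The effect of the denominator choice can be illustrated with a short calculation; the count of correlating specimens below is a hypothetical figure used only to show the roughly 10-fold difference, not project data.

```python
# Site A gynecologic data from Table 2: 139 discrepancies among 22 325 Pap tests.
rate_all_cases = 100.0 * 139 / 22_325        # about 0.6% of all Pap tests

# Hypothetical: if only about 2000 of those Pap tests had a correlating biopsy,
# the same 139 discrepancies represent a much larger share of that subset.
rate_correlating_only = 100.0 * 139 / 2_000  # about 7% of correlating pairs

print(f"{rate_all_cases:.2f}% vs {rate_correlating_only:.2f}%")
```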

Table 3. Percentage of All Discrepancies by Specific Specimen Site

Specimen Site | Site A (n = 74) | Site B (n = 100) | Site C (n = 477) | Site D (n = 23)
Bile duct | 0 | 4.0 | 0.8 | 1.3
Bowel, small | 0 | 0 | 1.3 | 0
Brain | 0 | 6.0 | 0.4 | 0
Kidney | 0 | 0 | 4.4 | 8.7
Lung | 62.3 | 47.0 | 16.5 | 52.1
Lymph node | 4.1 | 1.0 | 3.8 | 13.0
Ovary | 0 | 1.0 | 12.8 | 9.2
Pleura | 5.4 | 0 | 4.8 | 0
Thyroid gland | 0 | 1.0 | 5.5 | 8.7
Urinary bladder | 20.3 | 25.0 | 15.7 | 4.4

Values are the percentage of all nongynecologic discrepancies at each site; n indicates number of discrepancies.

The clinical assessment of the impact of discrepancies varied considerably from project site to project site. For example, for gynecologic specimens, the percentage of cytologic-histologic discrepancies that were no-harm events varied from 10.7% to 43.2% by project site. The percentage of cytologic-histologic gynecologic discrepancies that resulted in severe harm ranged from 0% to 5.8% by project site.

Root Cause Analysis

Root cause analysis was used to determine the causes of error. Root cause analysis is a qualitative method that focuses on discovering the underlying system that enables errors to occur. Two root cause analysis methods are being used. The first is a modification of the medical version of the Eindhoven Classification model for the Medical Event Reporting System for Transfusion Medicine.65–67 The Eindhoven Classification method focuses on 3 main causes (technical, organizational, and human) and involves construction of a causal tree that visually represents the factors, activities, and decisions leading to the diagnostic error. Root cause analysis focuses on system and human factors involved in the routine collection, processing, and examination of cytology and surgical pathology specimens.
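A causal tree of the kind used in the Eindhoven Classification can be sketched as a small recursive structure; the node descriptions and cause categories below are illustrative placeholders, not the codes actually used in the project.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class CauseNode:
    """One node in a causal tree: an event or a contributing factor."""
    description: str
    category: Optional[str] = None   # eg "technical", "organizational", "human"
    latent: bool = False             # latent (system) failure vs active (human) error
    children: List["CauseNode"] = field(default_factory=list)

def root_causes(node: CauseNode) -> List[CauseNode]:
    """The terminal causes are the leaves of the tree."""
    if not node.children:
        return [node]
    leaves: List[CauseNode] = []
    for child in node.children:
        leaves.extend(root_causes(child))
    return leaves

# Illustrative tree for a hypothetical false-negative cytologic diagnosis.
tree = CauseNode("false-negative bronchial washing diagnosis", children=[
    CauseNode("scant diagnostic material on the slide", children=[
        CauseNode("no protocol specifying a minimum number of cytospin preparations",
                  category="organizational", latent=True)]),
    CauseNode("clinical history not available at screening",
              category="organizational", latent=True),
])
for cause in root_causes(tree):
    print(cause.category, "-", cause.description)
```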

Figure. Lean production A3 diagram of urinary tract specimen processing.

Using a classification system and codes for root causes of error,65–67 a causal tree shows that latent errors result from underlying system failures involving protocols, procedures, and management priorities, and that active errors result from human knowledge-based behaviors, rule-based behaviors involving lack of coordination, and skill-based behaviors. Other patient-related factors (beyond the control of the health care team) are involved in some of these errors.

The second root cause analysis method is lean production process investigation of error.68,69 The lean production method focuses on creating a work process that leads to improvements in quality and efficiency while simultaneously reducing costs. One aspect of the quality improvement process is reducing errors. In the lean production system, an ideal working environment functions as a continuous-flow, one-by-one working process. The ideal state may be constructed by using 4 work design principles (also known as rules in use).

Pathway Rule.—Product, information, or service is given its value over simple, predefined, direct pathways without loops or forks.

Connection Rule.—The handoff between people is highly specified, direct, and binary, with an unambiguous way of making requests and providing responses.

Activity Rule.—Work is highly specified as to timing, content, sequence, and location.

Improvement Rule.—When a problem occurs, that problem is fixed toward an ideal, close in time, place, and person to the occurrence of the problem.

Results of the statistical analyses performed on error data guided the choice of which specific errors to examine using root cause analysis. Errors of high frequency and/or high severity were targeted for error reduction. Thus, errors in pulmonary and urinary bladder specimens were targeted at all project sites, and error reduction plans were developed.

The Figure shows an example of a lean production A3 diagram, outlining the processes involved in the processing of urinary tract cytology specimens. A3 diagrams may be drawn "on the shop floor" and illustrate the processes on a real-time basis. Similar A3 diagrams may be depicted for the collection and sign-out of these specimens. The lines between the individuals represent connections. The text or processing steps enclosed in the jagged shapes are called "hairy clouds" and are possible sources of error. Based on this diagram, these hairy clouds may be targeted for error reduction. Hairy clouds may be observed in specimen accessioning, processing, screening, sign-out, and reporting. For example, cytology laboratory processes that could be changed to reduce errors included increasing the number of cytospin slides to decrease the number of false-negative diagnoses, derivation of adequacy checklists to quantify "good" preparatory technique, recording clinical history (including specimen type) with each case, and repreparing all cases that a cytotechnologist diagnoses as atypical or suspicious.

The error reduction interventions that were attempted in the second year of the project included (1) double-viewing of all bronchial washing and brushing specimens to decrease the number of interpretive errors; (2) decreasing sampling error in Pap test procurement by applying lean production methods in the gynecologist's office to create a one-by-one, continuous-flow process; (3) creating a standardized reporting process for cervical biopsy specimens that automatically incorporates correlating Pap test results and recommendations; (4) creating adequacy terminology for thyroid gland fine-needle aspiration cytology to decrease false-negative diagnoses that are made on less than optimal specimens; (5) profiling Pap test providers to determine frequency of unsatisfactory Pap tests and providing feedback to these providers; (6) measuring the utility of obtaining Pap tests at the same time as cervical biopsies and/or endocervical curettings at the time of colposcopy; and (7) use of cell block specimens in pulmonary cytology specimens to decrease false-negative diagnoses. These error reduction interventions generally were aimed at either the pathology component (interpretation) or the clinical medicine component (sampling). The data from these interventions currently are being collected and analyzed, and the successes and failures of these interventions will be published.

COMMENT

This 5-year Agency for Healthcare Research and Quality–funded anatomic pathology error project has benchmarked specific anatomic pathology error frequencies and types of error across multiple institutions. Currently, error reduction interventions are being performed. We have determined baseline error frequencies based on the cytologic-histologic correlation detection method. Error frequencies vary by anatomic site and vary in terms of the clinical harm. We have used root cause analysis to identify sources of error and have designed error reduction initiatives in an attempt to decrease error. In the future, we hope to add additional sites interested in devising real-time patient safety initiatives.

This project was funded by the Agency for Healthcare Research and Quality, Rockville, Md.

References

1. Kohn LT, Corrigan JM, Donaldson MS, eds. To Err Is Human: Building a Safer Health System. Washington, DC: National Academy Press; 1999.
2. Becher EC, Chassin MR. Improving the quality of health care: who will lead? Health Aff (Millwood). 2001;20:164–179.
3. Ioannidis JPA, Lau J. Evidence on interventions to reduce medical errors: an overview and recommendations for future research. J Gen Intern Med. 2001;16:325–334.
4. Joint Commission on Accreditation of Healthcare Organizations. Sentinel events: approaches to error reduction and prevention. J Qual Improv. 1998;24:175–186.
5. Kizer KW. Patient safety: a call to action: a consensus statement from the National Quality Forum. Med Gen Med. 2001;3:10.
6. Meyer G, Foster N, Christrup S, Eisenberg J. Setting a research agenda for medical errors and patient safety. Health Services Res. April 2001;36(1 pt 1):x–xx.
7. Meyer G, Lewin DI, Eisenberg JM. To err is preventable: medical errors and academic medicine. Am J Med. 2001;110:597–603.
8. Leape LL, Kabcenell AI, Gandhi TK, Carver P, Nolan TW, Berwick DM. Reducing adverse drug events: lessons from a breakthrough series collaborative. Jt Comm J Qual Improv. 2000;26:321–331.
9. Silver MP, Antonow JA. Reducing medication errors in hospitals: a peer review organization collaboration. Jt Comm J Qual Improv. 2000;26:332–340.
10. Stump LS. Re-engineering the medication error-reporting process: removing the blame and improving the system. Am J Health Syst Pharm. 2000;57(suppl 4):S10–S17.
11. Raab SS. Improving patient safety by examining pathology errors. Clin Lab Med. 2004;24:849–863.
12. Goldstein NS. Diagnostic errors in surgical pathology. Clin Lab Med. 1999;19:743–756.
13. Ramsay AD. Errors in histopathology reporting: detection and avoidance. Histopathology. 1999;34:481–490.
14. Hahm GK, Niemann TH, Lucas JG, Frankel WL. The value of second opinion in gastrointestinal and liver pathology. Arch Pathol Lab Med. 2001;125:736–739.
15. Safrin RE, Bark CJ. Surgical pathology signout: routine review of every case by a second pathologist. Am J Surg Pathol. 1993;17:1190–1192.


16. Cree IA, Guthrie W, Anderson JM, et al. Departmental audit in histopathology. Pathol Res Pract. 1993;189:453–457.
17. Hocking GR, Niteckis N, Cairns BJ, Hayman JA. Departmental audit in surgical anatomical pathology. Pathology. 1997;29:418–421.
18. Ramsay AD, Gallagher PJ. Local audit of surgical pathology: 18 month's experience of peer-review–based quality assessment in an English teaching hospital. Am J Surg Pathol. 1992;16:476–482.
19. Whitehead ME, Fitzwater JE, Lindley SK, Kern SB, Ulirsch RC, Winecoff WF 3rd. Quality assurance of histopathologic diagnoses: a prospective audit of three thousand cases. Am J Clin Pathol. 1984;81:487–491.
20. Whitehead ME, Grieve JHK, Payne MJ, et al. Quality assurance of histopathologic diagnosis in the British army: role of the Army Histopathology Registry in completed case review. J R Army Med Corps. 1986;132:71–75.
21. Zuk JA, Kenyon WE, Myskow MW. Audit in histopathology: description of an internal quality assessment scheme with analysis of preliminary results. J Clin Pathol. 1991;44:10–15.
22. Lind AC, Bewtra C, Healy JC, Sims KL. Prospective peer review in surgical pathology. Am J Clin Pathol. 1995;104:560–566.
23. Chan YM, Cheung AN, Cheng DK, Ng TY, Ngan HY, Wong LC. Pathology slide review in gynecologic oncology: routine or selective? Gynecol Oncol. 1999;75:267–271.
24. Murphy WM, Rivera-Ramirez I, Luciani LG, Wajsman Z. Second opinion of anatomical pathology: a complex issue not easily reduced to matters of right and wrong. J Urol. 2001;165:1957–1959.
25. Wurzer JC, Al-Saleem TI, Hanlon AL, Freedman GM, Patchefsky A, Hanks GE. Histopathologic review of prostate biopsies from patients referred to a comprehensive cancer center. Cancer. 1998;83:753–759.
26. Clary KM, Silverman JF, Liu Y, et al. Cytohistologic discrepancies: a means to improve pathology practice and patient outcomes. Am J Clin Pathol. 2002;117:567–573.
27. McBroom HM, Ramsay AD. The clinicopathological meeting: a means of auditing performance. Am J Surg Pathol. 1993;17:75–80.
28. Aldape K, Simmons ML, Davis RL, et al. Discrepancies in diagnoses of neuroepithelial neoplasms. Cancer. 2000;88:2342–2349.
29. Arbiser ZK, Folpe AL, Weiss SW. Consultative (expert) second opinions in soft tissue pathology: analysis of problem-prone diagnostic situations. Am J Clin Pathol. 2001;116:473–476.
30. Gupta D, Layfield LJ. Prevalence of inter-institutional anatomic pathology slide review: a survey of current practice. Am J Surg Pathol. 2000;24:280–284.
31. Zarbo RJ. Monitoring anatomic pathology practice through quality assurance measures. Clin Lab Med. 1999;19:713–742.
32. Adad SJ, Souza MA, Etchebehere RM, Saldanha JC, Falco VA, Murta EF. Cyto-histological correlation of 219 patients submitted to surgical treatment due to diagnosis of cervical intraepithelial neoplasia. Sao Paulo Med J. 1999;117:81–84.
33. Bakhos R, Selvaggi SM, DeJong S, et al. Fine-needle aspiration of the thyroid: rate and causes of cytohistopathologic discordance. Diagn Cytopathol. 2000;23:233–237.
34. Bruner JM, Inouye L, Fuller GN, Langford LA. Diagnostic discrepancies and their clinical impact in a neuropathology referral practice. Cancer. 1997;79:796–803.
35. Furness PN, Lauder I. A questionnaire-based survey of errors in diagnostic histopathology throughout the United Kingdom. J Clin Pathol. 1997;50:457–460.
36. Jorda M, Rey L, Hanly A, Ganjei Azar P. Fine-needle aspiration cytology of bone: accuracy and pitfalls of cytodiagnosis. Cancer. 2000;90:47–54.
37. Labbe S, Petitjean A. False negatives and quality assurance in cervicouterine cytology. Ann Pathol. 1999;19:457–462.
38. Lerma E, Fumanal V, Carreras A, Esteva E, Prat J. Undetected invasive lobular carcinoma of the breast: review of false negative smears. Diagn Cytopathol. 2000;23:303–307.
39. Logrono R, Kurtycz DF, Molina CP, Trivedi VA, Wong JY, Block KP. Analysis of false-negative diagnoses on endoscopic brush cytology of biliary and pancreatic duct strictures: the experience at 2 university hospitals. Arch Pathol Lab Med. 2000;124:387–392.
40. MacDonald L, Yazdi HM. Fine needle aspiration biopsy of Hashimoto's thyroiditis: sources of diagnostic error. Acta Cytol. 1999;43:400–406.
41. Nakhleh RE, Zarbo RJ. Amended reports in surgical pathology and implications for diagnostic error detection and avoidance. Arch Pathol Lab Med. 1998;122:303–309.
42. Naryshkin S. The false-negative fraction for Papanicolaou smears: how often are "abnormal" smears not detected by a "standard" screening cytologist. Arch Pathol Lab Med. 1997;121:270–272.
43. Renshaw AA. Measuring and reporting errors in surgical pathology: lessons from gynecologic cytology. Am J Clin Pathol. 2001;115:338–341.
44. Ruijter E, van Leenders G, Miller G, Debruyne F, van de Kaa C. Errors in histological grading by prostatic needle biopsy specimens: frequency and predisposing factors. J Pathol. 2000;192:229–233.
45. Stewart CJ, MacKenzie K, McGarry GW, Mowat A. Fine-needle aspiration cytology of salivary gland: a review of 341 cases. Diagn Cytopathol. 2000;22:139–146.
46. Troxel DB, Sabella JD. Problem areas in pathology practice uncovered by a review of malpractice claims. Am J Surg Pathol. 1994;18:821–831.
47. Wakely SL, Baxendine-Jones, Gallagher PJ, Mullee M, Pickering R. Aberrant diagnoses by individual surgical pathologists. Am J Surg Pathol. 1998;22:77–82.


48. Zardawi IM, Bennett G, Jain S, Brown M. Internal quality assurance activities of a surgical pathology department in an Australian teaching hospital. J Clin Pathol. 1998;51:695–699.
49. Bethwaite P, Smith N, Delahunt B, Kenwright D. Reproducibility of new classification schemes for the pathology of ductal carcinoma in situ of the breast. J Clin Pathol. 1998;51:450–454.
50. Gupta DK, Komaromy-Hiller G, Raab SS, Nath ME. Interobserver and intraobserver variability in the cytologic diagnosis of normal and abnormal metaplastic squamous cells in Pap smears. Acta Cytol. 2001;45:1–7.
51. Plummer M, Buiatti E, Lopez G, et al. Histological diagnosis of precancerous lesions of the stomach: a reliability study. Int J Epidemiol. 1997;26:716–720.
52. Raab SS, Geisinger KR, Silverman JF, Thomas PA, Stanley MW. Interobserver variability of a Pap smear diagnosis of atypical glandular cells of undetermined significance (AGUS). Am J Clin Pathol. 1998;110:653–659.
53. Raab SS, Robinson RA, Jensen CS, et al. Mucinous tumors of the ovary: interobserver variability and utility of sectioning protocols. Arch Pathol Lab Med. 1997;121:1192–1198.
54. Raab SS, Robinson RA, Snider TE, et al. Telepathology on a difficult case consultation service: utility, diagnostic accuracy and interobserver variability. Mod Pathol. 1997;10:630–635.
55. Raab SS, Snider TE, Potts SA, et al. Atypical glandular cells of undetermined significance: diagnostic accuracy and interobserver variability using select cytologic criteria. Am J Clin Pathol. 1997;107:299–307.
56. Bagian JP, Lee C, Gosbee J, et al. Developing and deploying a patient safety program in a large health care delivery system: you can't fix what you don't know about. J Qual Improv. 2001;27:522–532.
57. Bates DW, Cohen M, Leape LL, Overhage JM, Shabot MM, Sheridan T. Reducing the frequency of errors in medicine using information technology. J Am Med Inform Assoc. 2001;8:299–308.


58. Sheikh A, Hurwitz B. A national database of medical error. J R Soc Med. 1999;92:554–555.
59. Sheikh A, Hurwitz B. Setting up a database of medical error in general practice: conceptual and methodological considerations. Br J Gen Pract. 2001;51:57–60.
60. Arom KV, Petersen RJ, Orszulak TA, et al. Establishing and using a local/regional cardiac surgery database. Ann Thorac Surg. 1997;64:1245–1249.
61. Grover FL. The Society of Thoracic Surgeons national database: current status and future directions. Ann Thorac Surg. 1999;68:367–373.
62. Fernald DH, Pace WD, Harris DM, West DR, Main DS, Westfall JM. Event reporting to a primary care patient safety reporting system: a report from the ASIPS collaborative. Ann Fam Med. 2004;2:327–332.
63. Pace WD, Staton EW, Higgins GS, Main DS, West DR, Harris DM; ASIPS Collaborative. Database design to ensure anonymous study of medical errors: a report from the ASIPS Collaborative. J Am Med Inform Assoc. 2003;10:531–540.
64. Dovey SM, Meyers DS, Phillips RL Jr, et al. A preliminary taxonomy of medical errors in family practice. Qual Saf Health Care. 2002;11:233–238.
65. Battles JB, Kaplan HS, Van der Schaaf TW, Shea CE. The attributes of medical event reporting systems: experience with a prototype medical event reporting system for transfusion medicine. Arch Pathol Lab Med. 1998;122:231–238.
66. Battles JB, Shea CE. A system of analyzing medical errors to improve GME curricula and programs. Acad Med. 2001;76:125–133.
67. Fernandes CMB, Walker R, Price A, Marsden J, Haley L. Root cause analysis of laboratory delays to an emergency department. J Emerg Med. 1997;15:735–739.
68. Ohno T. Toyota Production System: Beyond Large-Scale Production. Portland, Ore: Productivity Press; 1988.
69. Spear S, Bowen K. Decoding the DNA of the Toyota Production System. Boston, Mass: Harvard Business School Press; 1999.
