International Journal of Medical Informatics 82 (2013) e139–e148

journal homepage: www.ijmijournal.com

A comparative review of patient safety initiatives for national health information technology

Farah Magrabi a,∗, Jos Aarts b, Christian Nohr c, Maureen Baker d, Stuart Harrison d, Sylvia Pelayo e, Jan Talmon f, Dean F. Sittig g, Enrico Coiera a

a Centre for Health Informatics, Australian Institute of Health Innovation, The University of New South Wales, Sydney, Australia
b Institute of Health Policy and Management, Erasmus University Rotterdam, Rotterdam, The Netherlands
c Danish Centre for Health Informatics, Department of Development and Planning, Aalborg University, Denmark
d Department of Health Informatics Directorate, Leeds, England, United Kingdom
e EVALAB – INSERM CIC IT, University Hospital of Lille, University of Lille Nord de France, F-59000 Lille, France
f Department of Medical Informatics, Maastricht University, Maastricht, The Netherlands
g University of Texas – Memorial Hermann Center for Healthcare Quality & Safety, School of Biomedical Informatics, University of Texas Health Sciences Center, Houston, TX, United States

Article info

Article history:
Received 1 March 2012
Received in revised form 20 November 2012
Accepted 23 November 2012

Keywords:
Health information technology; Patient safety; Oversight; Regulation; Standards; Clinical governance; e-Health; Safety governance; Internet/legislation and jurisprudence; Patient safety/legislation and jurisprudence; Internet/standards; Medical Informatics/legislation and jurisprudence; Medical Informatics/standards

Abstract

Objective: To collect and critically review patient safety initiatives for health information technology (HIT).

Method: Publicly promulgated sets of advisories, recommendations, guidelines, or standards potentially addressing safe system design, build, implementation or use were identified by searching the websites of regional and national agencies and programmes in a non-exhaustive set of exemplar countries including England, Denmark, the Netherlands, the USA, Canada and Australia. Initiatives were categorised by type and software systems covered.

Results: We found 27 patient safety initiatives for HIT, predominantly dealing with software systems for health professionals. Three initiatives addressed consumer systems. Seven of the initiatives specifically dealt with software for diagnosis and treatment, which is regulated as a medical device in England, Denmark and Canada. Four initiatives dealt with blood bank and image management software, which is regulated in the USA. Of the 16 initiatives directed at unregulated software, 11 were aimed at increasing standardisation using guidelines and standards for safe system design, build, implementation and use. Three initiatives for unregulated software were aimed at certification in the USA, Canada and Australia. Safety is addressed alongside interoperability in the Australian certification programme but it is not explicitly addressed in the US and Canadian programmes, though conformance with specific functionality, interoperability, security and privacy requirements may lead to safer systems. England appears to have the most comprehensive safety management programme for unregulated software, incorporating safety assurance at a local healthcare organisation level based on standards for risk management and user interface design, with national incident monitoring and a response function.

Conclusions: There are significant gaps in the safety initiatives for HIT systems. Current initiatives are largely focussed on software. With the exception of diagnostic, prognostic, monitoring and treatment software, which is subject to medical device regulations in some countries, the safety of the most common types of HIT systems such as EHRs and CPOE without decision support is not being explicitly addressed in most nations. Appropriate mechanisms for safety assurance are required for the full range of HIT systems for health professionals and consumers, including all software and hardware, throughout the system lifecycle. In addition to greater standardisation and oversight to ensure safe system design and build, appropriate implementation and use of HIT is critical to ensure patient safety.

∗ Corresponding author at: Centre for Health Informatics, Australian Institute of Health Innovation, The University of New South Wales, Sydney 2052, Australia. Tel.: +61 2 9385 8694; fax: +61 2 9385 8692. E-mail address: [email protected] (F. Magrabi). 1386-5056/$ – see front matter © 2012 Elsevier Ireland Ltd. All rights reserved. http://dx.doi.org/10.1016/j.ijmedinf.2012.11.014



1. Introduction

1.1. HIT can pose a risk to patient safety

The safety of health information technology (HIT) needs to be urgently addressed [1,2]. HIT broadly includes computer hardware and software used by health professionals and consumers to support care. Evidence that problems with HIT can pose a risk to patient safety is emerging, even though such systems are central to improving the safety and quality of health services [3–8]. For instance, there were 42 reports of patient harm and four deaths in 436 critical incidents involving HIT that were reported to the US Food and Drug Administration (FDA) over a 30-month period ending July 2010 [5]. HIT problems also disrupt clinical work, contributing to new types of errors and leading to delays and re-work [4,9–12]. Safety issues involving information technology are not unique to healthcare [13], but this sector has lagged behind other industries in addressing such problems [2].

1.2. Background: safety needs to be addressed throughout the system lifecycle

Safety is an emergent system property and needs to be addressed throughout the lifecycle of HIT systems [14]. The safety of patients is not solely dependent upon HIT systems on their own but is influenced by their interactions with users and other technology in a given environment [15]. Patients are harmed when interactions between system components (human and machine) create unsafe states [3]. As safety is inextricably linked to how a system operates, safety considerations should be at the forefront of design and build processes. Not all possible interactions among system components are predictable at design time, especially when HIT is used in the context of a broader sociotechnical system. In large complex systems, safety problems tend to emerge from unexpected interactions between system components. Therefore safety must also be addressed during and after system implementation. The potential for unsafe interactions needs to be investigated when HIT is integrated with local clinical workflows, including other technology and the organisational structure. Beyond implementation, any unsafe interactions that emerge during routine use need to be detected and mitigated by monitoring for critical incidents involving patient harm (adverse events) and near misses.

1.3. Ensuring the safety of HIT

Historically HIT has not been subject to regulation. In particular, standalone software systems (i.e. software not embedded in hardware) such as electronic health records (EHR) or computerised provider order entry (CPOE) systems have not been subject to regulatory requirements, whereas software that is part of a medical device is regulated [16]. With the increasing proliferation of HIT, the safety of software is at last being addressed within large national programmes in many countries. Examples of such programmes are the National Health Service (NHS) Connecting for Health (CfH) programme in England; the American Recovery and Reinvestment Act (ARRA) "meaningful use" programme for adoption and use of EHRs; and the Australian Government's Personally Controlled Electronic Health Record System that was launched in 2012. Little has been published about HIT safety initiatives. Recommended strategies to address software safety are largely based upon increasing standardisation and introducing mechanisms for oversight [2,17,18], though some argue that such measures may hamper innovation [19]. The recently revised European Union (EU) Medical Device Directive 93/42/EC (MDD) considers certain standalone software for diagnosis, prognosis, treatment and monitoring as medical devices, and compliance with it has become mandatory [20]. A directive is a type of legislation issued by the EU which sets out certain end results that must be achieved in every member country. As directives are not self-executing, each country within the EU is now required to implement the MDD within their national regulations. In this paper we thus set out to collect and critically review current national initiatives for HIT relevant to improving patient safety.

2. Methods

We defined HIT safety initiatives as any publicly promulgated set of advisories, recommendations, guidelines, or standards potentially addressing safe system design, build, implementation or use. Using a broad definition of HIT we included, “computer hardware and software that deals with the storage, retrieval, sharing, and use of health care information, data, and knowledge for communication and decision making” [21]. To understand the full range of initiatives a non-exhaustive set of exemplar countries was selected by the authors (England, Denmark, the Netherlands, the USA, Canada and Australia).


The websites of national agencies and programmes were searched including the NHS CfH programme in England [22], the European Commission’s Directorate general for health and consumers [20], the Danish Patient Safety Database [23], the Dutch Health Care Inspectorate [24], the US Office of the National Coordinator for HIT [25], the US FDA [26], Health Canada [27], Canada Health Infoway [28], and the Australian National e-Health Transition Authority [29]. Initiatives were categorised by type and software systems covered.

3. Results

We found 27 safety initiatives for HIT across the six nations examined (Table 1). England had eight initiatives and the USA had nine. HIT safety initiatives predominantly dealt with software systems for health professionals while hardware was implicitly addressed. For example, requirements for appropriate hardware were included within guidance for implementing electronic medication management systems [30] or managing the risks of EHRs [21]. Seven initiatives dealt with software for diagnosis and treatment, which is regulated as a medical device in England, Denmark and Canada. Four initiatives dealt with blood bank and image management software which are regulated in the USA, and 16 were directed at unregulated software. More than half of the initiatives (n = 16) were directed at increasing standardisation including guidance, standards and regulation-mandated standards (Table 2). The remaining initiatives were concerned with providing oversight using certification, regulation and incident monitoring. Only three initiatives addressed consumer systems.

3.1. Standardisation

Guidelines and standards can be applied at many different points in the lifecycle of HIT, from design, to build, to implementation and use in a clinical setting, and can take the form of informative guidance or mandatory standards. There is a range of international standards for HIT; for this review we chose to focus upon standards and guidelines for patient safety.

3.1.1. Guidelines

Efforts to promote safe design and implementation of software have largely relied on guidance for manufacturers and healthcare organisations. With a specific focus on the safety of user interactions, the English NHS CfH Common User Interface programme has developed interface design guides for the display of patient identifiers, clinical notes, medications and navigation [34]. Guides for patient demographics will become mandatory NHS standards in December 2015. The CfH programme has also developed guides for design and implementation of software based upon their standards (discussed in Section 3.1.2) [35,36]. A similar guide for implementing hospital medication management systems was released by the Australian Safety Commission in 2011 [30]. In the USA, the 2011 Agency for Healthcare Research and Quality (AHRQ) guide addresses safety risks as part of reducing the broader unintended consequences of EHR implementation [21]. The FDA also provides guidance for validation of software by manufacturers [42], with a particular focus on the implementation and use of blood bank software [43] and image management systems [44]. Considering safety in the broader context of usability, the US National Institute of Standards and Technology (NIST) published a guide to evaluate EHR usability in 2012 [40]. The NIST guide is specifically focused on improving safety by encouraging user centred development processes. It proposes formative usability evaluation by experts and summative testing in the hands of users, incorporating a risk-based approach to examining usability problems. NIST has also published consensus recommendations for critical user interactions influencing the usability and safety of paediatric EHRs [41]. While improved usability can contribute to safety, a highly usable system is not necessarily safe [47].

3.1.2. Standards

We found only three standards that formally address the safety of software that is not subject to regulatory requirements. Along with standards for patient demographics, the English CfH programme has implemented two standards for risk management in software design, implementation and use [37,38]. These standards are consistent with those for safety critical software (e.g. International Electrotechnical Commission IEC 61508) and medical devices (e.g. International Organisation for Standardisation ISO 14971), and were formally adopted as NHS standards in 2009. Based upon these standards the CfH Clinical Safety Management System requires manufacturers and health organisations to demonstrate safe design and implementation practices. An internal assurance process encompasses hazard assessment, a safety case and safety closure report. It should be noted that the NHS standards were originally proposed as ISO standards in 2008 but were withdrawn before they were put up for ballot. The ISO is currently working on ISO/TR 17791 Health informatics – Guidance on standards for enabling safety in health software, which will provide recommendations on best practices in assuring the safer implementation of software, irrespective of whether it is regulated as a medical device.
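To make the shape of such risk-management requirements concrete, the short Python sketch below shows the kind of hazard log entry and severity-by-likelihood scoring that ISO 14971-style processes imply. The scales, acceptability threshold and example hazard are illustrative assumptions for this paper, not values taken from the NHS or ISO documents.

```python
from dataclasses import dataclass

# Illustrative five-point scales; the real scales and acceptability criteria
# are defined by the applicable standard (e.g. ISO 14971 or the NHS clinical
# risk management standards), not by this sketch.
SEVERITY = {"minor": 1, "significant": 2, "considerable": 3, "major": 4, "catastrophic": 5}
LIKELIHOOD = {"very low": 1, "low": 2, "medium": 3, "high": 4, "very high": 5}

@dataclass
class Hazard:
    description: str   # e.g. "wrong patient record opened"
    cause: str         # e.g. "ambiguous display of patient identifiers"
    severity: str      # key into SEVERITY
    likelihood: str    # key into LIKELIHOOD
    mitigation: str = ""

    def risk_score(self) -> int:
        # Simple severity-by-likelihood product, as in many risk matrices.
        return SEVERITY[self.severity] * LIKELIHOOD[self.likelihood]

    def acceptable(self, threshold: int = 6) -> bool:
        # Hypothetical acceptability threshold, for illustration only.
        return self.risk_score() <= threshold

# A one-entry hazard log of the kind a safety case might summarise.
hazard_log = [
    Hazard("Wrong patient record opened", "ambiguous display of patient identifiers",
           severity="major", likelihood="low",
           mitigation="apply patient-banner display guidance; add a confirmation step"),
]
for h in hazard_log:
    status = "acceptable" if h.acceptable() else "requires further mitigation"
    print(f"{h.description}: score {h.risk_score()} ({status})")
```

In a full safety case, entries like these would be accompanied by evidence that each mitigation has been implemented and verified before the safety closure report is issued.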

3.1.3. Regulation mandated standards

For software formally regulated as a medical device (see Section 3.2.2), international standards for usability, risk management and networks are mandatory [31–33].

3.2. Oversight

As with medical devices, certification and regulation offer formal mechanisms to assure the safety and effectiveness of standalone software (e.g. an EHR or CPOE). Certification can provide independent assurance for specific software requirements. Regulation, on the other hand, ensures that manufacturers comply with legal requirements for software to be designed and built in a manner such that its use does not compromise patient safety. In general, the level of oversight or regulatory control should be proportional to the degree of risk that the software poses to patients. Beyond the stages of design and implementation, incident monitoring is a mechanism to track any emergent safety problems associated with routine use of HIT.


Table 1 – National and regional safety initiatives for HIT.

European Union (software covered: diagnosis and treatment software)
(b) Medical devices – application of risk management to medical devices (EN ISO 14971:2007) [31]
(b) Medical devices – application of usability engineering to medical devices (IEC 62366:2007) [32]
(b) Application of risk management for IT-networks incorporating medical devices (IEC 80001:2010) [33]
Revised Medical Device Directive 93/42/EC [20]

England (software covered: all software delivered through NHS CfH contracts, except regulated software, i.e. for diagnosis and treatment, unless noted)
The National Health Service (NHS) Connecting for Health (CfH) Common User Interface programme [34]
Safer Implementation: Clinical Risk Guidance – A Guide to Implementation [35]
Safer Design: Clinical Risk Guidance Document [36]
Health Informatics – Application of clinical risk management to the manufacture of health software [37]
Health informatics – Guidance on the management of clinical risk relating to the deployment and use of health software [38]
Common User Interface standards for patient identifiers [34]
NHS CfH clinical safety management system incl. incident monitoring [22]
UK Medical Device regulations [39] – Diagnosis and treatment software

Denmark
Danish Patient Safety Database [23] – Diagnosis and treatment software

Netherlands
Health Care Inspectorate [24] – All patient safety incidents

USA
Guide to Reducing Unintended Consequences of Electronic Health Records [21] – EHRs
Technical Evaluation, Testing and Validation of the Usability of Electronic Health Records [40] – EHRs
Human Factors Guide to Enhance EHR Usability of Critical User Interactions when Supporting Paediatric Patient Care [41] – Paediatric EHRs
General Principles of Software Validation: Final Guidance for Industry and Food and Drug Administration (FDA) Staff [42] – All healthcare software
Draft Guidance Blood Establishment Computer System Validation in the User's Facility [43] – Blood bank software
Guidance for the Submission of Premarket Notifications for Medical Image Management Devices [44] – Image management software
(a) Certification for meaningful use including the Certification Commission for Health Information Technology (CCHIT) programme [45] – EHR; prescribing systems
FDA Manufacturer and User Facility Device Experience database [46] – Reports about medical devices
Draft Guidance for Industry and FDA Staff – Mobile Medical Applications [26] – Mobile applications incl. consumer applications

Canada
(a) Canada Health Infoway [28] – Registries: client, provider, immunisation; consumer systems; diagnostic imaging; drug information systems; EHRs
Health Canada – software regulated as a Class I or Class II Device [27] – Diagnosis and treatment software incl. consumer systems

Australia
Electronic Medication Management Systems: A Guide to Safe Implementation [30] – Medication management systems
National e-Health Transition Authority Compliance, Conformance and Accreditation programme [29] – All software connected to shared EHR

(a) Initiatives that do not directly address patient safety.
(b) Global standards that are mandatory in the EU.

For the initiatives we reviewed, it was usually mandatory to report incidents associated with regulated software, whereas the reporting of general patient safety incidents (including those involving unregulated software) by healthcare professionals was voluntary.

3.2.1. Certification

We found HIT certification programmes in three nations; however, safety was explicitly addressed in only one programme.


Table 2 – National and regional HIT safety initiatives by type. Guidance, standards and regulation mandated standards are standardisation mechanisms; certification, regulation and incident monitoring are oversight mechanisms.

European Union: Regulation mandated standard [31–33]; Regulation [20]
England: Guidance [34–36]; Standard [34,37,38]; Regulation [39]; Incident monitoring [22]
Denmark: Incident monitoring [23]
Netherlands: Incident monitoring [24]
USA: Guidance [21,40–44]; Certification (a) [45]; Regulation [26]; Incident monitoring [46]
Canada: Certification (a) [28]; Regulation [27]
Australia: Guidance [30]; Certification [29]

(a) Initiatives that do not directly address patient safety.

US certification programmes are focussed around the meaningful use of EHRs. The Office of the National Coordinator for HIT has authorised six testing and certification bodies, including the Certification Commission for Health Information Technology (CCHIT), to support the ARRA programme. A second CCHIT programme covers functionality, interoperability, and security for EHRs in ambulatory, inpatient, emergency, behavioural health and long-term care; and electronic prescribing [45]. It requires key aspects of successful use to be verified at live sites. For ambulatory EHRs, usability is assessed by expert review using validated instruments (e.g. the After Scenario Questionnaire, the Perceived Usability Questionnaire and the System Usability Survey [48]). Professional training and certification for EHR use are another response to ARRA (e.g. www.healthitcertification.com). Focusing on a broader range of software, the Canadian Health Infoway's (Infoway) certification programme includes consumer software, shared records, EHRs and core modules including medications, imaging, and registries for patients, providers and immunisations [28]. Infoway's certification criteria focus on privacy, security, and interoperability based on international and Canadian standards. A fourth set of criteria, focusing on manufacturer practices for managing risk, data and system security, as well as third party solutions and services, is noteworthy from a safety perspective. Risk management is based upon the Canadian Standards Association's Risk Management guidance and international IT governance standards. The programme requires manufacturers to report all changes to certified software and to report incidents involving the software. In Australia, the National e-Health Transition Authority has established a Compliance, Conformance and Accreditation programme to oversee a national framework for assurance with respect to clinical safety alongside interoperability and security [29].
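As a small illustration of the kind of instrument referred to above for ambulatory EHR usability assessment, the sketch below computes a score for the standard 10-item System Usability Scale, assuming the usual 1–5 response format; it is offered only as an example of such instruments and is not part of any certification body's published criteria.

```python
def sus_score(responses):
    """Compute the standard 0-100 System Usability Scale score.

    `responses` is a list of ten ratings on a 1-5 scale, in questionnaire
    order (odd-numbered items are positively worded, even-numbered negatively).
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected ten ratings between 1 and 5")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)   # items 1,3,5,7,9 vs 2,4,6,8,10
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# Example: one participant's ratings after a set of ambulatory EHR tasks.
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # -> 85.0
```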

3.2.2. Regulation

Although HIT (i.e. standalone software) has largely been outside the strict regulatory regimen applied to medical devices, current initiatives indicate a gradual move towards regulation. In the EU the safety of medical devices is regulated through a directive that focuses on proper manufacturing and premarket testing leading to a declaration of conformity [49,50]. Each member state is required to appoint a Competent Authority who is responsible for implementing the requirements of the Directive. For example, in the Netherlands the Health Care Inspectorate is entrusted with this responsibility and the Medicines and Healthcare products Regulatory Agency fulfils this role in England. The revised EU MDD considers standalone software intended by its manufacturer to be used for, "(a) diagnosis, prevention, monitoring, treatment or alleviation of disease, (b) diagnosis, monitoring, treatment, alleviation of or compensation for an injury or handicap, (c) investigation, replacement or modification of the anatomy or of a physiological process," as a medical device in its own right [20]. Such software or devices incorporating software are required to be validated in accordance with state of the art processes for software development, risk management, validation and verification. The MDD explicitly mentions the role of clinical evaluation for proving that the device is safe and meets the stated objectives. While patient safety considerations are part of the essential requirements, usability considerations are implicit. The directive states that software should minimise risks of use error using ergonomic features in the context of the operating environment. Manufacturers must (i) adopt a usability engineering process during design and development; (ii) document the process; and (iii) ultimately prove the "ergonomic quality" (usability) of the device. Technical documentation must include the usability engineering process based on the IEC 62366:2007 standard [27], which relies on widely accepted usability methodologies [e.g. ISO 9241-210:2010]. Follow-up guidance to the MDD (MEDDEV 2.1/6) was published in early 2012 and provides a flowchart to help decide whether or not standalone software should be considered as a medical device [51]. Software that merely stores, archives or communicates clinical information (e.g. an EHR) is not considered a medical device. However, decision support modules which combine medical knowledge databases and algorithms with patient specific data to provide recommendations for diagnosis, prognosis, monitoring and treatment of individual patients are considered as devices. Image viewers, medication modules and software that generates alarms based on the monitoring and analysis of patient specific physiological parameters are also considered as devices. The MDD applies across the EU, yet regulators in different member states interpret it in differing fashions. In England, software qualifies as a medical device if a clinician relies on its output (e.g. a radiotherapy treatment planning system) and it is regulated by the Medicines and Healthcare products Regulatory Agency.


For example, software intended to enhance images from an X-ray or ultrasound is considered a medical device. Health Canada similarly announced the regulation of specific software as medical devices in 2010. In this regimen, software used to store, acquire, transfer, or view clinical data or images is considered a Class I device, and Class II if it manipulates the data and images upon which clinical decisions are based. Software that simply replaces patients' paper records is excluded if it is only intended to store and view patient information (for example: age, weight, notes about patients' appointments, test results, order processing, scheduling, or managing patient movements). Software that (i) performs administrative calculations and manipulations (e.g. appointment and workflow management systems); or (ii) connects two or more HIT applications (i.e. middleware) so that they can exchange data is also excluded. As for consumer HIT, software used to manage data (e.g. blood pressure, spirometry) from patients' homes is classified as a Class II device if it is used to analyse data for diagnosis and treatment, and Class I if it only transmits and stores the data. At present there is considerable debate in the USA about the FDA's role in regulating HIT [52,53]. Under the Federal Food, Drug, and Cosmetic Act, HIT are considered as medical devices. However, the FDA does not currently enforce its regulatory requirements with respect to HIT. In 2011, the FDA moved towards enforcing its regulatory requirements with respect to software for mobile platforms (e.g. smartphones and tablet computers). Specifically, this applies to applications (apps) for mobile platforms which are used as an accessory to a currently regulated device (e.g. an app to view medical images on an iPad and make a diagnosis) or which transform a mobile platform into a device (e.g. "an app that turns a smartphone into an electrocardiography, or ECG, machine to detect abnormal heart rhythms or determine if a patient is experiencing a heart attack.").
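The qualification logic summarised in this subsection can be caricatured as a simple rule set. The Python sketch below is only a rough paraphrase of the MEDDEV 2.1/6 flowchart and the Canadian class descriptions as reported here; the function name and category labels are assumptions, and real qualification decisions rest with the official guidance and the national regulator.

```python
def qualifies_as_medical_device(functions: set) -> str:
    """Rough qualification sketch based on the MEDDEV 2.1/6 and Health Canada
    criteria as summarised in this section; real decisions follow the official
    flowchart and definitions, not this simplification."""
    device_functions = {
        "diagnosis", "prognosis", "monitoring", "treatment",
        "patient-specific decision support", "image enhancement",
        "physiological alarm generation",
    }
    excluded_functions = {
        "storage", "archiving", "communication", "simple search",
        "appointment scheduling", "administrative calculation", "middleware",
    }
    if functions & device_functions:
        return "likely qualifies as a medical device"
    if functions and functions <= excluded_functions:
        return "likely outside the device definition (e.g. plain EHR storage and viewing)"
    return "unclear: consult the MEDDEV 2.1/6 flowchart or the national regulator"

print(qualifies_as_medical_device({"storage", "communication"}))
print(qualifies_as_medical_device({"patient-specific decision support"}))
```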

3.2.3. Incident monitoring

The monitoring of critical incidents is central to detecting emerging problems in mainstream patient safety programmes, which are now well-established in most developed nations [54]. The only large-scale programme directed at monitoring and responding to HIT incidents is part of the English NHS Clinical Safety Management System, which has been in place since 2005 and has handled more than 900 incidents reported by health organisations and manufacturers. Yet HIT incidents are also being reported among general patient safety incidents. An analysis of 40,000 incidents submitted by health professionals to a state-wide incident monitoring system in Australia yielded 99 HIT-related incidents associated with 32 categories of safety problems [4]. Another potential source of HIT incidents is reports about medical device failures and hazards submitted by users and manufacturers. One source of such reports is the US FDA Manufacturer and User Facility Device Experience database. Although the FDA does not enforce its regulatory requirements with respect to HIT, some manufacturers have voluntarily listed their systems and reported incidents. A search of almost 900,000 reports from January 2008 to July 2010 yielded 436 incidents; 11% of these were associated with patient harm, including four deaths [5,6].

To facilitate the reporting of such incidents the US AHRQ has included categories for HIT in its new standard for reporting hazardous events called the "common format" [55]. The AHRQ has also developed and tested a comprehensive software tool to support detection and management of hazards throughout the HIT lifecycle. The Health IT Hazard Manager facilitates the characterisation and communication of hazards, along with their actual and potential adverse effects, to support learning within healthcare organisations, across organisations using the same software, and by vendors and policymakers [56]. The Danish Patient Safety Database is another example, which has collected reports about adverse events from clinicians since 2004 [23]. In addition to medical devices the system collects reports about "software necessary for its proper application intended by the manufacturer to be used for human beings for the purpose of: diagnosis, prevention, monitoring, treatment or alleviation of disease" [57]. It does not have an inbuilt coding system for automatic identification of incidents involving HIT, and was opened up to reporting by patients and their relatives in 2011. In the Netherlands, HIT safety problems are addressed on a case-by-case basis by the Health Care Inspectorate and no provision exists for collection of incidents involving HIT [24].
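Where a general incident database has no inbuilt coding for HIT, a simple free-text screen can at least flag candidate reports for manual review. The sketch below is a hypothetical keyword filter in that spirit; the trigger terms are assumptions, and in practice a validated classification such as that developed in [4,6] would be needed.

```python
import re

# Hypothetical trigger terms; a real screen would be built from a validated
# classification of HIT-related safety problems rather than this ad hoc list.
HIT_TERMS = [
    r"\bEHR\b", r"\bEMR\b", r"\bCPOE\b", r"order entry", r"\bPACS\b",
    r"software", r"computer", r"system (?:down|crash|outage)",
    r"interface", r"wrong patient record", r"auto-?populate",
]
PATTERN = re.compile("|".join(HIT_TERMS), re.IGNORECASE)

def flag_hit_candidates(reports):
    """Return (report_id, matched_text) pairs for manual review."""
    hits = []
    for report_id, text in reports:
        match = PATTERN.search(text)
        if match:
            hits.append((report_id, match.group(0)))
    return hits

sample = [
    ("R1", "CPOE was down overnight; verbal orders used and one dose missed."),
    ("R2", "Patient fell while walking to the bathroom unassisted."),
]
print(flag_hit_candidates(sample))  # -> [('R1', 'CPOE')]
```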

4. Discussion

4.1. Current safety initiatives

We found that current national safety initiatives for HIT are largely focussed upon software for health professionals. Standalone software that is "intended by its manufacturer to be used specifically for diagnostic, prognostic, monitoring and/or therapeutic purposes" is subject to regulatory oversight [20]. Where enforced, only a small subset of software is legally required to be safe; the vast majority of common HIT systems, such as EHRs and CPOE without decision support, are seen to be outside this subset (Fig. 1).

Fig. 1 – Current safety initiatives primarily address safety of software with limited oversight.


For such unregulated software, safety initiatives aim to increase standardisation using guidelines and standards to ensure safe design, build, implementation and use. Our review identified 11 such initiatives [21,30,34–38,40–42]. At present most certification programmes for unregulated software do not explicitly address safety, although improvements to specific functionality, interoperability, security and privacy requirements may indirectly lead to safer systems [28,29,45]. At the time of writing England appears to have the most comprehensive safety management programme for unregulated software, incorporating safety assurance at a local healthcare organisation level based on standards for risk management and user interface design, with national incident monitoring and a response function. However the effectiveness of such initiatives is not known and further research is required to evaluate their impact on minimising risks to patients.

4.2. Improving oversight of HIT

Certification and regulation offer two mechanisms to improve oversight of software. Canada has announced regulation of a select group of software very similar to the new EU device directive. The US Institute of Medicine (IOM) report recommended compulsory registration of all manufacturers and public listing of HIT systems with the Office of the National Coordinator, which is already administering EHRs as part of the meaningful use programme [2]. Secondly, it recommended mandatory adoption of quality management and risk management principles in processes to build HIT systems based upon ISO standards, with a particular focus on human factors, safety culture, and usability. Thirdly, the IOM called for oversight to ensure compliance, along with mandatory manufacturer and voluntary user reporting of HIT related patient safety incidents, including aggregation, analysis, and investigation by an independent body. The IOM recommendations to monitor and publicly report progress towards safety at the top level, and to direct the FDA to develop a regulatory framework, seem to indicate that regulation is inevitable in the USA. National safety initiatives need to address a broad range of HIT including consumer systems. While it is appropriate to deal with specific software systems (e.g. e-prescribing) for certification of particular functionality, safety assurance is required for all software and interactions at a system level. For instance, software intended to store patients' demographic details and appointment schedule (currently excluded in the EU directive and the Canadian regulatory regimen) may pose risks if data corruption results in a failure to follow up on an abnormal test result indicating a potential cancer [58]. Moreover, software systems are usually interconnected; for example, software storing demographic details may provide input to a safety critical prescribing module. Therefore even software that is merely 'storing' patient information requires safety assurance, even though it may not be appropriate to subject such systems to full regulation. Ultimately the level of oversight should be proportional to the risk posed by HIT. Safety assurance also needs to be extended to the hardware that hosts software. In an analysis of 712 HIT problems reported to the US FDA, 16% directly involved computer hardware [6]. In general, hardware and network configurations across disparate organisations and clinical settings are likely to be highly heterogeneous.


While existing standards for risk management (such as IEC 80001) can be applied to networks, standardisation of hardware and operating systems may prove to be more challenging. One strategy is to design software with the ability to detect and activate pre-defined corrective or fail-safe responses in the event of hardware or network failures [59]. In this regard HIT safety efforts can be usefully informed by the experience of implementing computer systems in other safety critical industries [14,60–62].
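A minimal sketch of the "detect and activate a pre-defined fail-safe response" strategy mentioned above is given below; the dependency host, port and fallback behaviour (warn, queue for re-check, log) are illustrative assumptions rather than recommendations drawn from [59].

```python
import socket

def check_dependency(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to a dependency (e.g. a drug
    interaction checking service) can be opened within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def queue_for_recheck(order: dict) -> None:
    # Stand-in for persisting the order so checks can be re-run later.
    print("queued for interaction re-check:", order.get("id"))

def run_full_checks_and_submit(order: dict) -> str:
    # Stand-in for the normal order pathway.
    return "submitted"

def submit_order(order: dict) -> str:
    # Hypothetical host and port, for illustration only.
    if not check_dependency("interaction-check.local", 8443):
        # Pre-defined fail-safe: do not silently skip the check; warn the user,
        # queue the order for re-checking and record the degraded mode.
        queue_for_recheck(order)
        return "accepted with warning: interaction checking unavailable"
    return run_full_checks_and_submit(order)

print(submit_order({"id": "RX-1", "drug": "warfarin"}))
```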

4.3. Need for HIT specific standards

The effectiveness of any mechanism to improve oversight is contingent on sound certification criteria and safety standards. Patient safety is currently on the agenda of the ISO and the European Committee for Standardisation (CEN) Technical Committees on Health Informatics. A new technical report providing guidance on standards for safety in design, implementation and use of software is scheduled for release in 2013. While existing standards for the safe design and use of medical devices might be a good starting point, there is a need for HIT specific standards for development, evaluation and post-market surveillance of software. For instance, hazard assessment techniques need to focus on identifying classes of HIT specific errors [3–6]. There is a need for suitable approaches to interface design, as well as methods to ensure that the implemented interfaces are indeed safe to use. The question arises whether approaches that are proposed as a means to develop software that will meet users' requirements (e.g. Agile approaches like Scrum [63]) are sufficiently grounded in safety principles.

4.4. Beyond certification and regulation – ensuring safe operation

Regulation and certification of software will not address safe implementation and use of HIT within the context of clinical work processes and the underlying IT infrastructure (i.e. hardware), as this must be addressed locally by the organisation implementing and using the technology [64]. For instance, a certified CPOE system can pose a risk to patient safety due to a range of local issues, including if: (i) it is not available; (ii) it cannot reliably communicate with the local EHR system; (iii) there is a mismatch between the system and actual clinical work process; or (iv) users are not adequately trained (use error) [7]. Currently the approach to safety in implementation is through guidelines, with the exception of the English standard for risk management. However there are no guides about safe system use and further research is required to address this gap. One possible way to address safe use and operation is to incorporate specific use requirements into existing accreditation of health organisations' safety practices (e.g. the US Joint Commission's safety standards [65]).
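As an illustration of the kind of local operational check that safe-use requirements might ask of a healthcare organisation (point (ii) above), the sketch below raises an alert when a CPOE-to-EHR interface has been silent for too long; the threshold and escalation text are assumptions, not requirements from any accreditation standard.

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Hypothetical threshold; a real deployment would set this per interface
# based on a local risk assessment.
MAX_SILENCE = timedelta(minutes=10)

def interface_alert(last_message_at: datetime) -> Optional[str]:
    """Return an escalation message if the CPOE-to-EHR feed looks stalled."""
    silence = datetime.now(timezone.utc) - last_message_at
    if silence > MAX_SILENCE:
        minutes = int(silence.total_seconds() // 60)
        return f"CPOE-EHR interface silent for {minutes} min: escalate to on-call support"
    return None

# Example: the last order message was received 25 minutes ago.
print(interface_alert(datetime.now(timezone.utc) - timedelta(minutes=25)))
```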


5. Conclusions

There are significant gaps in safety initiatives for HIT systems. Current initiatives are largely focussed on software for health professionals. With the exception of diagnostic, prognostic, monitoring and treatment software, which is subject to medical device regulations in some countries, the safety of most types of HIT is not being explicitly addressed in most nations. England appears to have the most comprehensive programme explicitly addressing the safety of unregulated software. Appropriate mechanisms for safety assurance are required for all types of HIT systems for health professionals and consumers, including all software and hardware, throughout the system lifecycle. In addition to greater standardisation and oversight to ensure safe design and build, appropriate implementation and use is critical to ensure patient safety.

Authors' contributions

This paper emerged from a workshop on the "Safety of health information technology: Identifying and mitigating risks" held by FM, JA, Marie-Catherine Beuscart-Zephir and CN at the XXIII International Conference of the European Federation for Medical Informatics (EFMI), 28–31 August 2011. FM, EC, JA and CN conceptualised the study; all authors contributed to collating the safety initiatives, and the analysis and interpretation of findings. FM drafted the initial version. All the authors contributed to subsequent versions and gave final approval.

Conflict of interest

MB is Clinical Director for Patient Safety and SH is a Lead Safety Engineer at NHS Connecting for Health, Department of Health Informatics Directorate in England. The other authors have no conflict of interest to declare.

Acknowledgements

This research is supported in part by grants from the Australian National Health and Medical Research Council (NHMRC): Program Grant 568612 and Project Grant 630583. The funding sources did not play any role in the study design; in the collection, analysis and interpretation of data; in the writing of the manuscript; or in the decision to submit the manuscript for publication. The authors wish to thank Tim Chearman, Stephen Corbett and Don Newsham for information about the English and Canadian initiatives, and Dr David Greenfield for his valuable advice. We also wish to thank the EFMI Working Group on Human and Organisational Factors of Medical Informatics and participants of the workshop at the 2011 EFMI meeting.

Summary points

What was already known on the topic:
• Alongside benefits, health information technology (HIT) can pose risks to patient safety.
• The safety of HIT needs to be urgently addressed.

What this study added to our knowledge:
• There are significant gaps in patient safety initiatives for HIT.
• Current safety initiatives are largely focused on software.
• The safety of the most common types of HIT systems, such as EHRs and CPOE without decision support, is not being explicitly addressed in most nations.
• Greater standardisation and oversight is required to ensure safety throughout the lifecycle of all types of HIT systems for health professionals and consumers, including all software and hardware.

References

[1] E. Coiera, J. Aarts, C. Kulikowski, The dangerous decade, J. Am. Med. Inform. Assoc. 19 (1) (2012) 2–5. [2] IOM (Institute of Medicine), Health IT and Patient Safety: Building Safer Systems for Better Care, The National Academies Press, Washington, DC, 2012. [3] J.S. Ash, M. Berg, E. Coiera, Some unintended consequences of information technology in health care: the nature of patient care information system-related errors, J. Am. Med. Inform. Assoc. 11 (2) (2004) 104–112. [4] F. Magrabi, M.S. Ong, W. Runciman, E. Coiera, An analysis of computer-related patient safety incidents to inform the development of a classification, J. Am. Med. Inform. Assoc. 17 (6) (2010) 663–670. [5] F. Magrabi, M.S. Ong, W. Runciman, E. Coiera, Patient safety problems associated with healthcare information technology: an analysis of adverse events reported to the US Food and Drug Administration, AMIA Annu. Symp. Proc. 2011 (2011) 853–857. [6] F. Magrabi, M.S. Ong, W. Runciman, E. Coiera, Using FDA reports to inform a classification for health information technology safety problems, J. Am. Med. Inform. Assoc. 19 (1) (2012) 45–53. [7] D.F. Sittig, H. Singh, Defining health information technology-related errors: new developments since To Err is Human, Arch. Intern. Med. 171 (14) (2011) 1281–1284. [8] J.P. Weiner, T. Kfuri, K. Chan, J.B. Fowles, “e-Iatrogenesis”: the most critical unintended consequence of CPOE and other HIT, J. Am. Med. Inform. Assoc. 14 (3) (2007) 387–388, discussion 9. [9] T.L. Hanuscak, S.L. Szeinbach, E. Seoane-Vazquez, B.J. Reichert, C.F. McCluskey, Evaluation of causes and frequency of medication errors during information technology downtime, Am. J. Health Syst. Pharm. 66 (12) (2009) 1119–1124. [10] M.I. Harrison, R. Koppel, S. Bar-Lev, Unintended consequences of information technologies in health care – an interactive sociotechnical analysis, J. Am. Med. Inform. Assoc. 14 (5) (2007) 542–549. [11] S.J. Perry, R.L. Wears, R.I. Cook, The role of automation in complex system failures, J. Patient Saf. 1 (1) (2005) 56–61.


[12] T.B. Wetterneck, J.M. Walker, M.A. Blosky, R.S. Cartmill, P. Hoonakker, M.A. Johnson, E. Norfolk, P. Carayon, Factors contributing to an increase in duplicate medication order errors after CPOE implementation, J. Am. Med. Inform. Assoc. 18 (6) (2011) 774–782. [13] D. Jackson, M. Thomas, L.I. Millett (Eds.), Software for Dependable Systems: Sufficient Evidence?, The National Academies Press, Washington, DC, 2007. [14] N.G. Leveson, Engineering a Safer World: Systems Thinking Applied to Safety, Massachusetts Institute of Technology, USA, 2011. [15] E. Coiera, Interaction design theory, Int. J. Med. Inform. 69 (2–3) (2003) 205–222. [16] F.E. Young, Validation of medical software: present policy of the Food and Drug Administration, Ann. Intern. Med. 106 (4) (1987) 628–629. [17] J.M. Walker, P. Carayon, N. Leveson, R.A. Paulus, J. Tooker, H. Chin, A. Bothe Jr., W.F. Stewart, EHR safety: the way forward to safe and effective systems, J. Am. Med. Inform. Assoc. 15 (3) (2008) 272–277. [18] H. Singh, D.C. Classen, D.F. Sittig, Creating an oversight infrastructure for electronic health record-related patient safety hazards, J. Patient Saf. 7 (4) (2011) 169–174. [19] C.A. Longhurst, H.M. Landa, Health information technology and patient safety, BMJ 344 (2012) e1096. [20] European Commission’s Directorate for public health and risk assessment, available from: ec.europa.eu/health/medical-devices/regulatory-framework/ (September 2012). [21] S.S. Jones, R. Koppel, M.S. Ridgely, T.E. Palen, S. Wu, M.I. Harrison, Guide to Reducing Unintended Consequences of Electronic Health Records (Prepared by RAND Corporation under Contract No. HHSA290200600017I, Task Order #5), AHRQ Publication No. 11-0105-EF, Agency for Healthcare Research and Quality, Rockville, MD, available from: http://www.ucguide.org/ (August 2011). [22] NHS Connecting for Health: Clinical Safety, available from: http://www.connectingforhealth.nhs.uk/systemsandservices /clinsafety (December 2011). [23] Dansk Patient-Sikkerheds-Database, available from: http://www.dpsd.dk (September 2012). [24] The Dutch Health Care Inspectorate, available from: http://www.igz.nl/english/ (February 2012). [25] US Office of the National Coordinator for Health IT, available from: http://www.healthit.hhs.gov/portal/server.pt/community/ healthit hhs gov home/1204accessed (December 2011). [26] US Food and Drug Administration: Draft Guidance for Mobile Medical Applications, available from: http://www.fda.gov/MedicalDevices/ProductsandMedical Procedures/ucm255978.htm (January 2012). [27] Health Canada, available from: http://www.hc-sc.gc.ca/dhp-mps/md-im/activit/announce -annonce/md notice software im avis logicels-eng.php (December 2011). [28] Canada Health Infoway, available from: http://www.infoway-inforoute.ca/working-with-ehr/solution -providers/certification (December 2011). [29] NEHTA – National E-Health Transition Authority: http://www.nehta.gov.au/connecting-australia/cca (December 2011). [30] Australian Commission on Safety and Quality in Health Care, Electronic Medication Management Systems – A Guide to Safe Implementation, 2nd edition, ACSQHC, Sydney, 2012, available from http://www.safetyandquality.gov.au/publications-resources/ publications/page/3/ (September 2012).


[31] International Organization for Standardization – Medical Devices – Application of Risk Management to Medical Devices (EN ISO 14971:2007) – Geneva, 2007. [32] International Electrotechnical Commission – Medical Devices – Application of usability engineering to medical devices (IEC 62366:2008) – Geneva, 2007. [33] International Electrotechnical Commission – Medical Devices – Application of risk management for IT-networks incorporating medical devices (IEC 80001:2010) – Geneva, 2010. [34] NHS CfH Common User Interface, available from: http://www.connectingforhealth.nhs.uk/systemsandservices /data/cui (December 2011). [35] NHS Connecting for Health, Safer Implementation – A Guide to Implementation, 2010, available from: http://www.connectingforhealth.nhs.uk/systemsandservices /clinsafety/key/ [36] NHS Connecting for Health, Safer Design – A Guide to Design, 2010, available from: http://www.connectingforhealth.nhs.uk/systemsandservices /clinsafety/key/ [37] NHS Connecting for Health, Health Informatics – Application of Clinical Risk Management to the Manufacture of Health Software (Formerly ISO/TS 29321:2008(E)) – DSCN14/2009. [38] NHS Connecting for Health, Health Informatics – Guidance on the Management of Clinical Risk Relating to the Deployment and Use of Health Software (Formerly ISO/TR 29322:2008(E)), DSCN18/2009. [39] UK Medicines and Healthcare Products Regulatory Agency, available from: http://www.mhra.gov.uk/Howweregulate/Devices/index.htm (September 2012). [40] NIST Technical Evaluation, Testing, and Validation of the Usability of Electronic Health Records, US National Institute of Standards and Technology (NISTIR 7804), US National Institute of Standards and Technology, 2012, available from: http://www.nist.gov/manuscript-publication-search.cfm? pub id=909701 [41] S.Z. Lowry, M.T. Quinn, M. Ramaiah, D. Brick, E.S. Patterson, J. Zhang, P. Abbott, M.C. Gibbons, A Human Factors Guide to Enhance EHR Usability of Critical User Interactions when Supporting Pediatric Patient Care (NISTIR 7865), US National Institute of Standards and Technology, 2012. [42] General Principles of Software Validation, Final Guidance for Industry and FDA Staff, http://www.fda.gov/cdrh/comp/guidance/938.html (11.01.02). [43] Draft Guidance for Industry: Blood Establishment Computer System Validation in the User’s Facility, http://www.fda.gov/BiologicsBloodVaccines/Guidance ComplianceRegulatoryInformation/Guidances/Blood/ ucm072560.htm (October 2007). [44] Guidance for the Submission of Premarket Notifications for Medical Image Management Devices, http://www.fda.gov/downloads/MedicalDevices/Device RegulationandGuidance/GuidanceDocuments/ (July 2007). [45] US Certification Commission for Healthcare IT, available from: http://www.cchit.org/cchit-certified (September 2012). [46] Manufacturer and User Facility Device Experience Database – (MAUDE), available from: http://www.fda.gov/MedicalDevices/DeviceRegulationand Guidance/PostmarketRequirements/ReportingAdverseEvents /ucm127891.htm (accessed December 2011). [47] B.T. Karsh, Beyond usability: designing effective technology implementation systems to promote patient safety, Qual. Saf. Health Care 13 (5) (2004) 388–394.


[48] T. Tullis, A. Albert, Measuring the User Experience: Collecting, Analyzing and Presenting Usability Metrics, Elsevier Inc., 2008. [49] J.Y. Chai, Medical device regulation in the United States and the European Union: a comparative study, Food Drug Law J. [Comp. Study] 55 (1) (2000) 57–80. [50] J.D. Kuyvenhoven, P. Lahorte, K. Persyn, E. De Geest, P.W. van Loon, F. Jacobs, P.P. van Rijk, I. Lemahieu, R.A. Dierckx, European Medical Device Directive: impact on nuclear medicine, Comput. Med. Imaging Graph. 25 (2) (2001) 207–212. [51] Guidelines on the Qualification and Classification of Stand Alone Software Used in Healthcare Within the Regulatory Framework of Medical Devices, ec.europa.eu/health/medical-devices/documents/guidelines/ index en.htm (February 2012). [52] K.W. Goodman, E.S. Berner, M.A. Dente, B. Kaplan, R. Koppel, D. Rucker, D.Z. Sands, P. Winkelstein, Challenges in ethics, safety, best practices, and oversight regarding HIT vendors, their customers, and patients: a report of an AMIA special task force, J. Am. Med. Inform. Assoc. 18 (1) (2011) 77–81. [53] R.A. Miller, R.M. Gardner, Summary recommendations for responsible monitoring and regulation of clinical software systems. American Medical Informatics Association, The Computer-based Patient Record Institute, The Medical Library Association, The Association of Academic Health Science Libraries, The American Health Information Management Association, and The American Nurses Association, Ann. Intern. Med. 127 (9) (1997) 842–845. [54] J.C. Pham, S. Gianci, J. Battles, P. Beard, J.R. Clarke, H. Coates, L. Donaldson, N. Eldridge, M. Fletcher, C.A. Goeschel, E. Heitmiller, J. Hensen, E. Kelley, J. Loeb, W. Runciman, S. Sheridan, A.W. Wu, P.J. Pronovost, Establishing a global learning community for incident-reporting systems, Qual. Saf. Health Care 19 (5) (2010) 446–451. [55] Agency for Healthcare Research and Quality, Common Formats, US Department of Health and Human Services, Rockville, MD, available from:

http://www.pso.ahrq.gov/formats/commonfmt.htm (September 2012). [56] J.M. Walker, A. Hassol, B. Bradshaw, M.E. Rezaee, Health IT Hazard Manager Beta-Test: Final Report (Prepared by Abt Associates and Geisinger Health System, under Contract No. HHSA290200600011i, #14), AHRQ Publication No. 12-0058-EF, Agency for Healthcare Research and Quality, Rockville, MD, May 2012. [57] European Commission DG Enterprise and Industry, Directorate F Consumer Goods, Unit F3 Cosmetics and Medical Devices: Guidelines on a Medical Device Vigilance System, MEDDEV 2.12-1 rev 6, December 2009. [58] H. Singh, L. Wilson, L.A. Petersen, M.K. Sawhney, B. Reis, D. Espadas, D.F. Sittig, Improving follow-up of abnormal cancer screens using electronic health records: trust but verify test result communication, BMC Med. Inform. Decis. Making 9 (2009) 49. [59] B.S. Medikonda, S.R. Panchumarthy, An approach to modeling software safety in safety-critical systems, J. Comp. Sci. 5 (4) (2009) 311–322. [60] The Use of Computers in Safety-Critical Applications, Health & Safety Executive, Norwich, UK, 1998, available from: http://www.hse.gov.uk/nuclear/computers.pdf (19.09.12). [61] A.J. Kornecki, Software Development Tools for Safety-Critical, Real-Time Systems Handbook, Federal Aviation Administration, Washington, DC, 2007, Contract No.: DOT/FAA/AR-06/35. [62] N. Storey, Safety-Critical Computer Systems, Pearson Education Limited, UK, 1996. [63] D. Marshall, J. Bruno, Solid Code: Optimizing the Software Development Life Cycle, O'Reilly Media, Inc., 2009. [64] D.F. Sittig, H. Singh, Electronic health records and national patient-safety goals, N. Engl. J. Med. 367 (19) (2012) 1854–1860. [65] D.F. Sittig, D.C. Classen, Safe electronic health record use requires a comprehensive monitoring and evaluation framework, JAMA 303 (5) (2010) 450–451.