
Should the Causes of Healthcare Accidents and their Subsequent Management be Addressed Through a Person-centred Approach or a Systems Approach?

Special Reports

Dr Christopher Dalley, Consultant Haematologist, Southampton General Hospital, University Hospitals Southampton NHS Foundation Trust


Person-centred approach to healthcare accidents

When a healthcare error occurs and a patient or worker is harmed, all concerned (investigators, healthcare workers and patients) attempt to make sense of what happened. According to Reason, the person-centred approach to the problem of human error “focuses on the unsafe acts (errors and procedural violations) of people at the sharp end” (Reason, 2000). In healthcare, the ‘sharp end’ is typically the direct interface between patient and healthcare worker. Attribution theory, introduced by Heider in 1958 and developed by others (Jones and Davis, 1965), is concerned with how people try to make sense of events through attributional activities.

The link between attribution theory and safety emerged in the 1990s, and in 1994 Dejoy suggested that attributional processes and inferences lie at the centre of workplace safety and that ‘causal inferences in turn, broadly determine the actions that are taken or not taken to correct hazards and prevent injuries. In a very real sense, actions to manage safety derive more from attributions than from actual causes’ (Dejoy, 1994). Viewed through the lens of attribution theory, the person-centred approach to healthcare accidents can be seen as the process of assigning blame to an individual or individuals. However, blame and causal attribution are not necessarily the same; blame implies inappropriate behaviour, yet workplace behaviour may still lead to an accident even when an agent’s acts are not considered inappropriate or deviant. Despite this subtlety, the person approach tends to satisfy a professional and societal need to apportion causality, fault or responsibility for an accident to the individual(s) rather than the organisation. This may in part be cultural, as individual agency predominates in Western societies. In addition, the physical actions of an individual are often more conspicuous to an observer than latent events, which may be more proximal to, and more relevant for, the propagation of an accident. Furthermore, in attribution theory a person’s beliefs and motives may influence or bias attributional thinking and lead to actor-observer bias, self-serving bias and correspondence bias (Holden, 2009). Nevertheless, attribution theory and the person-centred approach to healthcare accidents are applicable not only to the unintentional errors and mistakes of individuals but also to their violations of protocols and procedures.

Empirically, when healthcare accidents are thought to result from a knowledge gap in an individual, education and training are considered appropriate remedial actions. Healthcare professionals with leadership roles, and those involved in the investigation of healthcare accidents, should also be cognisant of the biases inherent in attribution theory and the risk of assigning fault to individuals, who may be falsely labelled. The person-centred approach and attribution theory may underestimate the contribution of non-human factors to an accident. The recommendation of the External Inquiry into the adverse incident that occurred at the Queen’s Medical Centre, Nottingham, on 4th January 2001, which led to the design of new spinal needles that do not allow intravenous syringe connection, was an important ergonomic safety enhancement for intrathecal chemotherapy procedures (Toft, 2001). However, person-centred solutions to healthcare accidents may downplay or miss such engineering and environmental remedies. Furthermore, the culture within the medical profession and societal norms dictate the use of medico-legal and disciplinary actions against individuals who may have violated professional standards and protocols and caused harm to patients.

Systems approach to healthcare accidents

The origin of the systems approach to healthcare accident investigation lies within high-risk, high-reliability industries, including aviation and nuclear power, where accidents and errors may lead to a catastrophic loss of life. In these industries, as in healthcare, accidents are due to a combination of active and latent errors (Rasmussen & Pedersen, 1984; Reason, 1990). Active errors, mediated through the actions and behaviours of individuals (slips, lapses, mistakes or protocol violations), have immediate effect. Latent errors are distal, in time and space, to the active errors and have a period of dormancy before they become operational. Their genesis lies within the decisions and activities of individuals who operate away from the sharp end, and in healthcare these may include politicians, commissioners, hospital directors, managers, healthcare personnel and others.

The External Inquiry into the adverse incident that occurred at the Queen’s Medical Centre, Nottingham, encapsulates the benefits of a systematic approach to healthcare incident investigation and learning when applied to a serious adverse event (Toft, 2001). Although the inquiry report did not disclose the methodology used by Toft, the scope of the investigation into how an intrathecal injection of vincristine was wrongly given to a patient with acute leukaemia, resulting in death, was far-reaching. It exposed active errors on the part of doctors and nurses, and a multitude of latent failures, including a national trial protocol for leukaemia that permitted the administration of intravenous and intrathecal chemotherapy to a patient on the same day. More than 50 safety recommendations were made by Toft, covering operational practices (pharmacy and ward), protocols (local and national), drug collection and administration, staff training, protocol revision and document control, communication, labelling of medication, and national issues such as the redesign of spinal needles.

The theoretical framework for understanding the aetiology of accidents in complex industries, and the effective restorative efforts needed to prevent accident recurrence, was described by Reason in his Organisational Accident Causation Model (1990, 1995). In this model, accident causation is the culmination of latent failures resulting from fallible decisions at corporate and line-management level, coupled with psychological precursors of unsafe acts such as stress and motivation, the unsafe acts of the individual(s), and breaches in the safeguards against error. Eagle, Davies and Reason used a systems approach based on Reason’s model to investigate an anaesthetic incident in which a patient died (Eagle, Davies & Reason, 1992). They conducted a multidisciplinary meeting that included anaesthetists, nurses and surgeons, uncovered four active and four latent failures associated with the incident, and rightly noted: “Unfortunately, analysis only of active failures leads to a constrained view of the problem. Focusing on the anaesthetists excludes the influence and interaction of, and latent failures extant in, other operating room personnel”. Reason’s analysis has been adapted and formalised by other researchers. For example, using ‘human-factor’ checklists and structured 20-30 minute interviews, Stanhope et al. (1997) investigated an obstetric near-miss incident resulting in the birth of a pre-term baby by Caesarean section. They systematically analysed the active failures, conditions of work and organisational issues surrounding the case and concluded that the most important factors were operational and attributable to staff shortages, poor communication, supervision and training. They highlighted the advantages of investigating near misses: such investigations expose deficiencies in the systems involved in care delivery which, once understood, allow appropriate preventative remedial actions to be instigated to reduce the chance of a serious incident recurring. They also suggested that individuals involved in near-miss incidents may be more candid and open than those involved in serious adverse events, as recrimination and legal action are less likely.

Currently there is no agreed standard approach to the systematic investigation of healthcare accidents. However, the London Protocol, with its framework of seven levels of safety factors, extends and adapts Reason’s model to healthcare, with the advantage that it can be applied to acute medicine, mental health and primary healthcare (Taylor-Adams & Vincent, 2004). Importantly, the protocol uses ‘systems analysis’ and expands the contributory factors for an accident to include patient factors such as condition (complexity and seriousness), language and communication, and personality and social factors, as well as task design, technological factors and the availability of protocols. The term ‘care delivery problems’ is used in preference to ‘unsafe acts’ to reflect that a problem may extend over time. The protocol also includes a framework for selecting the investigative team and conducting interviews, as well as for formulating recommendations and developing an action plan from the investigation’s findings.

Learning from adverse healthcare events is imperative if patient care is to be improved by reducing the risk of patient harm. However, learning opportunities are reduced if incidents are under-reported. Interestingly, the patient as a potential source of incident reporting is often ignored. Recent studies indicate that patient-identified incidents are often not reported to hospital incident-reporting databases (Weingart et al., 2005; Weissman et al., 2008). In addition, patient surveys suggest that patients report a much higher rate of adverse events than the published rates derived from hospital records (King et al., 2010; Lehmann et al., 2010).

With revisions by Vincent and Amalberti in 2016, the London-ALARME (Association of Litigation and Risk Management-European) protocol sets out a methodology through which the factors that contribute to a medical incident are systematically investigated, with a greater emphasis on the patient’s perspective:

• The patient’s perspective of the events contributing to the medical error is actively sought.

• Analysis of the medical incident includes the whole ‘event journey’ undertaken by the patient rather than just the error incident itself. This broader time frame allows a more complete assessment of the possible contributory factors in a medical incident.

• Greater emphasis is placed on learning from the medical incident by conducting a detailed assessment of the failures and successes in error detection linked to the incident, and combining them to produce an overall risk-benefit ratio of harm for the patient concerned.

Vincent and Amalberti also argue that vital learning from adverse events can be achieved by analysing near-miss events and, crucially, by understanding how near misses are detected and recovered from.

In 2003 the Department of Health, through its National Patient Safety Agency (NPSA), introduced the National Reporting and Learning System (NRLS) database. The NRLS provides a framework for National Health Service (NHS) organisations and their staff to register and investigate serious incidents and to learn from them using root cause analysis methodology. Although some academics have criticised this approach (Taylor-Adams & Vincent, 2004), the NRLS provides a systems approach methodology that is near identical to that of the London Protocol.

Summary

Systems-based, human-factors methodologies for investigating and learning from healthcare incidents offer a comprehensive evaluation of the active and latent failures that underlie medical incidents and errors. By extension, a systems approach to understanding healthcare incidents is more likely to produce effective actions to address latent and active failures than person-centred methodologies.

Recent research in the field of patient safety has placed more emphasis on a patient-centred approach to medical incident investigation, an approach adopted in the London-ALARME protocol. However, healthcare incident investigation using systematic methodologies can be complicated and time-consuming, and may not always lead to improvements in patient safety if the investigation is poorly conducted or if the subsequent safety recommendations are inadequate or poorly implemented. These pitfalls can be mitigated if healthcare organisations incorporate systematic incident investigation methodology and analysis in their quality and safety programmes and provide appropriate training and support to the healthcare staff who act as incident investigators. In turn, healthcare organisations should encourage patient participation in medical incident investigation, and have guidelines and protocols that facilitate reporting and the sharing of learning from both near-miss and adverse medical events (Scholefield, 2007; Mahajan, 2010).

References
1. Dejoy, D.M. (1994). Managing safety in the workplace: An attribution theory analysis and model. Journal of Safety Research, 25(1), 3-17.

2. Eagle, C.J., Davies, J.M., & Reason, J. (1992). Accident analysis of large-scale technological disasters applied to an anaesthetic complication. Canadian Journal of Anaesthesia, 39(2), 118-122.

3. Mid Staffordshire NHS Foundation Trust Public Inquiry, chaired by Robert Francis QC (2013). Final report. London: The Stationery Office.

4. Heider, F. (1958). The psychology of interpersonal relations. New York: Wiley

5. Holden, R. J. (2009). People or systems? To blame is human. The fix is to engineer. Professional Safety, 54(12), 34–41.

6. Jones, E.E., & Davis, K.E. (1965). From acts to dispositions: The attribution process in person perception. In L. Berkowitz (Ed.), Advances in Experimental Social Psychology (Vol. 2). New York: Academic Press.

7. King, A., Daniels, J., Lim, J., Cochrane, D.D., Taylor, A., & Ansermino, J.M. (2010). Time to listen: a review of methods to solicit patient reports of adverse events. Quality and Safety in Health Care, 19(2), 148-157.

8. Lehmann, M., Monte, K., Barach, P., & Kindler, C. H. (2010). Postoperative patient complaints: a prospective interview study of 12,276 patients. Journal of Clinical Anesthesia, 22(1), 13-21.

9. Mahajan, R.P. (2010). Critical incident reporting and learning. British Journal of Anaesthesia, 105(1), 69-75.

10. Reason, J. (1990). Human Error. Cambridge: Cambridge University Press.

11. Reason, J. (1995). Understanding adverse events: human factors. Quality in Health Care, 4, 80-89.

12. Reason, J. (2000). Human error: models and management. British Medical Journal, 320, 768-770.

13. Stanhope, N., Vincent, C., & Taylor-Adams, S.E. (1997). Applying human factors methods to clinical risk management. British Journal of Obstetrics and Gynaecology, 104, 1225-1232.

14. Taylor-Adams, S., & Vincent, C. (2004). Systems analysis of clinical incidents: the London protocol. Clinical Risk, 10(6), 211-220.

15. Toft, B. (2001). External Inquiry into the adverse incident that occurred at Queen's Medical Centre, Nottingham, 4th January 2001. London: Department of Health.

16. Scholefield, H. (2007). Embedding quality improvement and patient safety at Liverpool Women's NHS Foundation Trust. Best Practice & Research Clinical Obstetrics & Gynaecology, 21(4), 593-607.

17. Vincent, C., & Amalberti, R. (2016). The Consequences for Incident Analysis. In Safer Healthcare (pp. 47-58). Springer International Publishing.

18. Weingart, S. N., Pagovich, O., Sands, D. Z., Li, J. M., Aronson, M. D., Davis, R. B., & Phillips, R. S. (2005). What can hospitalized patients tell us about adverse events? Learning from patient-reported incidents. Journal of General Internal Medicine, 20(9), 830-836.

19. Weissman, J.S., Schneider, E.C., Weingart, S.N., Epstein, A.M., David-Kasdan, J., Feibelmann, S., & Gatsonis, C. (2008). Comparing patient-reported hospital adverse events with medical record review: do patients know something that hospitals do not? Annals of Internal Medicine, 149(2), 100.