DOI: 10.4244/EIJV13I17A321

What interventionalists can learn from the aviation industry

Robert A. Byrne, MB, BCh, PhD, Deputy Editor

Plane crashes tend to have three classical preconditions: a minor technical malfunction, bad weather and a tired pilot [1].

Good doctors have at least three things in common: they know how to observe, they know how to listen and they are very tired [2].

Just recently, I read an interesting report on aviation crashes in the newspaper [3]. It presented a brief overview of the history of aviation accidents, starting with the story of the 24 airline pilots who met in Chicago in the 1930s to form the first pilots’ trade union. Foremost among their concerns at the time was safety, since it was estimated that approximately half of them would die in air crashes. Nowadays, the author wrote, take any 24 pilots, or even 24,000 of them, and it is a safe bet that they will die peacefully in their beds.

Plane crashes have much in common with medical mistakes. Firstly, both are far more likely to result from an accumulation of minor errors and seemingly trivial malfunctions (the so-called Swiss cheese effect) than from a single major mistake [4]. Secondly, human error remains a common cause, perhaps the dominant cause, of both. As far as plane crashes are concerned, human error is estimated to be causal in approximately 70% of cases [3]. For medical errors, the relative contribution is less clear. Although a recent report suggested that about 250,000 people die annually because of medical errors in the USA, potentially making medical error the third leading cause of death [5], I was unable to find reliable estimates of the relative contributions of human and systems error. The contribution of human error, however, is likely to be very considerable. As Fiona Godlee wrote recently in the BMJ, “Who of us has not made mistakes that have harmed patients?” [6].

The improvement in aviation safety has much to do with a culture of dealing with errors and sharing experiences in a transparent manner. Because errors happen. And, if possible, they should only happen once. Pilots have long realised that high-performance technology and preparedness are not the only elements important in preventing adverse incidents; minimising the human error factor is critical. High-reliability organisations, such as most commercial airlines, recognise the contribution of human variability to errors and try to account for and deal with it [4]. In this respect, although some progress has been made in reducing human error in medicine, there remains much that physicians can learn from pilots.

In fact, a number of initiatives based on cooperation between doctors and pilots are ongoing. One example was recently reviewed in the pages of the Deutsches Ärzteblatt [7]. The report detailed how a hospital group in Germany is currently putting about 1,000 of its staff through the Lufthansa training school. The course, entitled “interpersonal competence”, has been offered since 2015, and each course is led jointly by a pilot and a physician. One message learned from the aviation industry is key: strong interpersonal competence is central to reducing the rate of errors arising from human mistakes [7].

A number of interventions may be expected to reduce medical errors. Firstly, a culture of transparency in relation to errors is vital; for pilots, this is an ingrained behaviour. For physicians, critical incident reporting systems (CIRS) are increasingly being implemented in hospitals to assist error reporting. But the issue is challenging. While many doctors feel guilty after a medical error, many also feel fear about the consequences for their professional reputation, their job, their licensure and their future [8].

The complexity of this issue was acutely felt in the last few weeks. In the United Kingdom, the case of trainee paediatrician Hadiza Bawa-Garba made headlines [9]. Convicted of gross negligence manslaughter for her failings in the death of a six-year-old patient, she was given a suspended two-year prison sentence and, in 2017, a 12-month suspension from practice. However, the suspension was appealed as too lenient, and on 25th January 2018 the High Court in London ordered that she be struck off the medical register. Her case sparked concern in the medical profession because of the contribution of systems failure to the incident. The perception among many doctors was that of a broken system seeking to heap blame on individual practitioners who had made human errors [9]. Amid the outcry, one senior cardiologist went a step further and referred himself for investigation, stating: “during four decades of practice, I have made clinical errors including delayed diagnosis and errors in treatment. In some cases my errors are likely to have contributed to poor outcomes and some patient deaths” [10]. Critically, it has been contended that the use of her personal reflective notes in the case against her may lead to a reluctance among doctors to reflect on and learn from medical errors in the future.

Other lessons to learn from aviation include the definition of clear communication rules, the recognition of the limits of personal competence, the use of structured teamwork to accomplish tasks and the mitigation of steep hierarchies. Interpersonal communication and the issuing of clear commands between co-workers are central features of work practices in aviation. Although fog was clearly a major contributory factor in the world’s deadliest airline crash, when two Boeing 747 jets collided at Tenerife airport in 1977, killing 583 people, confused communications between the aircraft and the tower and between the cockpit crews were also major factors [3]. This led to revised rules and practices, with each incident being used to reduce the risk of further events.

A related intervention to reduce errors is the development of a work culture in which superiors can be freely challenged on their actions. This type of flat hierarchy is something that physicians, in contrast to pilots, still need to improve on. The airline industry has already learned some hard lessons in this respect. Anyone who has read Malcolm Gladwell’s book “Outliers” will know about the ethnic theory of plane crashes [1]. It cites the example of Korean Air, which had such safety issues in the 1980s and 1990s that its aircraft loss rate was 4.79 per million departures, more than 17 times higher than that of United Airlines, for example (0.27 per million departures).

A turning point was reached when Korean Air flight 801 crashed on approach to Guam airport in 1997. The first officer had been trying, in vain, to challenge the captain, politely recommending that he go around because the runway could not be seen. “Mitigated speech” is the term linguists use for any attempt to downplay or sugar-coat the meaning of a statement, usually out of politeness or deference to superiors. In aviation, it is well recognised as something to be scrupulously avoided. Korea is identified as a country with a high power-distance index, referring to a culture in which it can be difficult to challenge superiors. This is also reflected in linguistics: while many European languages have two forms of second-person address (formal and informal), Korean has six. English, of course, has only one, and the five countries with the lowest power-distance indexes, as recounted by Gladwell, are all English-speaking [1]. Nowadays, Korean Air is one of the safest airlines in the world. One reason may have been the switch to English as the language of all cockpit communication from around 2000 and its impact on “mitigated speech”. The parallels with cath lab communication are obvious, and the potential for intervention is clear.

Error prevention in aviation also has a lot to do with structured work processes. Who amongst us has not experienced a situation in which a miscommunication meant that heparin was not administered when it should have been, or a loading dose of a P2Y12 inhibitor was thought to have been administered but subsequently transpired not to have been? In most cases, of course, there are no immediate consequences for the individual patient. However, the implications for work practices are clear. Nowadays, as in aviation, and in clinical research (as discussed recently in these pages) [11], working to checklists and including “team time-outs” is becoming increasingly common in cath labs in many countries. This should reduce sources of error. One recent study showed that a structured handover intervention between shifts, tested at nine hospitals, reduced medical errors by 23% [12].

The role of fatigue is also well recognised in aviation [3]. A very tired pilot can have the performance ability of a drunk and be incapable of making quick, correct decisions when things go wrong. I am sure most of us would agree that a very tired interventionalist is liable to make similar errors. Somewhat surprisingly, however, studies in the medical literature of interventional performance post call do not always bear this out [13].

Finally, returning to the course for medical personnel run at Lufthansa: one of its leaders, an orthopaedic surgeon from Cologne, continually reiterates the importance not just of learning, but of putting into practice the lessons learned during the course. “When I am having a bad day,” he says, “I enter the operating room and say: I have a lot on my mind today; if one of you notices something is off, just say it. And keep an eye on me.” Maybe it is an approach we could use from time to time in the cath lab too.


References
