Fatal Medical Error: Too common to investigate thoroughly?
Regulatory Updates | Medical Devices
EMERGO BY UL SUMMARY OF KEY POINTS:
- Investigations of preventable medical errors in the US healthcare system in need of revamp;
- Scale of fatal medical errors hindering more rigorous investigations;
- Multidisciplinary approach incorporating human factors components needed to address ongoing medical use errors.
On Friday, March 1, 2019, a Tesla automobile was involved in a fatal crash in Palm Beach, Florida. Considering the possibility that the crash occurred while the vehicle was in Autopilot mode, the National Highway Traffic Safety Administration (NHTSA) and the National Transportation Safety Board (NTSB) initiated an investigation of the accident.
On Saturday, February 23, 2019, an Atlas Air cargo plane (Boeing 767-300) crashed into a bay near Houston, Texas, killing the three people aboard. The NTSB immediately sent a “Go Team” to the crash site to investigate.
How the healthcare system reacts to fatal medical errors
On the day you read this blog post, enough people are likely to die due to preventable medical error (a Johns Hopkins team estimates 689 people a day) to fill two Boeing 747s. Clinicians are not the primary cause per se (remember, to err is human). Rather, the data suggest causes such as poor coordination within the healthcare system, misdiagnosis, and the absence of “safety nets” that would otherwise prevent an error from causing harm.
The response to the hundreds of daily fatal medical errors will pale in comparison to the cases above, at least in terms of scope and publicity. Locally, a hospital team will undoubtedly conduct an internal investigation of the death and discuss the case in a morbidity and mortality meeting. This team might even institute changes in its healthcare delivery process to reduce the risk of a recurrence.
At a higher level, organizations such as the Joint Commission might learn about the adverse outcome as well as a hospital’s associated risk mitigation response, and then build important safety insights into future guidance and requirements. FDA officials concerned with medical error might also draw lessons from the fatal accident and incorporate the learnings into future guidance and regulations, as has been the case with medical device reprocessing concerns (see the response by FDA, the Centers for Disease Control and Prevention and other entities to duodenoscope and endoscope infection and reprocessing issues).
Limitations of the MAUDE database
If the error involved a medical device, it must be documented in the Manufacturer and User Facility Device Experience (MAUDE) database. Unfortunately, the medical error investigation process is far less exhaustive than that conducted in response to an aviation accident. By law, all aviation accidents require investigation and the findings must be published. A principal reason for the lack of rigorous investigation of medical errors is the sheer scale of the problem. Remember, it amounts to two Boeing 747s per day, every day of the year.
Expanding investigations of use errors
How can we prevent the daily toll arising from medical error? I advocate for establishing a large multidisciplinary body dedicated to investigating (and learning from) the staggering number of fatal medical errors. Such an entity would pay for itself many times over in terms of saved lives and lower healthcare costs (estimated by the Journal of Health Care Finance to be $19.8 billion in 2008 alone). I envision that human factors professionals would be important members of investigation teams, not only to help determine the role that human beings and medical device user interfaces played in the accidents, but also to develop guidance on how to prevent future accidents (aka adverse events).
Such an approach to investigating medical errors would take the current system, which allows for investigation on a limited scale, and put it on steroids (at the right dose, of course). This approach would need to be supported by a new degree of transparency about medical errors that is missing in some institutions and often blocked by legal protection maneuvers. By the way, human factors professionals long ago stopped using the term “user error” and switched to the term “use error” because we realized that blaming the user gets you nowhere. Most use errors are system and design problems, and these problems can be solved if society:
(1) adopts a “no tolerance” attitude comparable to that adopted regarding air travel, and
(2) calls for vigorous governmental and industry actions that will make the difference.
Michael Wiklund is General Manager of Emergo by UL's Human Factors Research & Design division.