Earlier this year, Alan Smith, COMET's Head of Investigations, spoke at several industry events, including the 2024 Safety Expo, on the topic "Human Error: The start of your investigation, not the end." He explained why human error and human factors should be treated as the starting point of an investigation rather than its conclusion.
Redefining Human Error
The traditional definition of human error is a 'human action or a lack of action that can lead to unexpected and often unwanted results'. This definition can make human error seem like the end point: someone made a mistake, and that's where the investigation stops.
At COMET, we take a different approach. We define human error as "unexpected opportunities to better encourage learning". We should not view an error as a final judgment. Instead, it should prompt us to examine the system, process, or management issues that led to it.
Human Error Statistics: A Global Issue
Human error is a significant factor across various industries:
- 74% of data breaches involve poor human judgment.
- 90% of traffic collisions are attributed to driver error.
- 80% of aircraft disasters cite pilot error as the primary cause.
- 250,000 deaths in the U.S. in 2022 were attributed to medical error.
- 75% to 96% of marine accidents can be traced back to human error.
These statistics show how often human error is cited as the cause, yet in most cases the error stems from larger systemic problems. This leads us to the next point: we need to look deeper than simply blaming the individual.
Examples of Human Error in Action
Alan discussed examples of accidents and incidents where human error was initially blamed, but further investigation revealed deeper, systemic problems.
1. The Titanic
Original blame:
The disaster was attributed to the captain’s decision to change course, resulting in the collision with an iceberg.
Actual factors:
The belief that the ship could not sink, design problems in the hull, and water entering through weak rivets were all major contributing factors. Additionally, the 'watertight' compartments were not truly watertight. Blaming human error here masked larger organisational and design issues.
Source: Medium (3 March 2023). The Untold Story of the Titanic Disaster: Was it All Down to Human Error? Link
2. Tenerife Airport Disaster
Original blame:
A miscommunication between the KLM captain and air traffic control led to a catastrophic collision on the runway.
Actual factors:
Air traffic control had failed to address earlier communication issues, the airport was overcrowded with planes, and poor visibility combined with defective runway lights worsened the situation further. What seemed like a simple miscommunication actually reflected a systemic failure in the airport's management.
Source: McCreary et al. (1998). Human Factors: Tenerife Revisited. Journal of Air Transportation World Wide. Link
3. Vanderbilt Medical Center Fatal Injection
Original blame:
A nurse administered the wrong injection, resulting in a patient’s death, for which she faced criminal charges.
Actual factors:
The hospital was understaffed, and nurses were tired after working double shifts. The medicine phials looked almost identical and sat next to each other, and the hospital had ignored previous near-miss incidents just like this one. The error was not the result of negligence but a symptom of larger systemic issues.
Source: Williams et al. (2023). Investigative approaches: Lessons learned from the RaDonda Vaught case. ScienceDirect. Link
Case Study: The Japanese Train Crash
One of the most compelling examples Alan shared was the Japanese train crash in 2005. The crash caused 106 deaths and over 500 injuries.
At first, authorities blamed the crash on the train driver going too fast. The driver was trying to make up time after running slightly late. However, further investigation revealed a deep-rooted cultural issue within the company.
The organisation punished late-running drivers severely, using a culture of shame (known as "haji") and financial penalties. The company’s focus was on controlling driver behaviour rather than improving error resilience. Additionally, the curve where the accident occurred did not have an Automatic Train Stop (ATS) system, which could have prevented the derailment.
This case shows that focusing only on an individual's mistake ignores the organisational and cultural factors that can greatly influence incidents.
Source: Chikudate, N. (2009). If human errors are assumed as crimes in a safety culture: A lifeworld analysis of a rail crash. Human Relations. Link
Performance Influencing Factors (PIFs)
Alan emphasised the importance of understanding Performance Influencing Factors (PIFs). These factors can help identify why errors occur, and they typically fall into three categories:
Task PIFs:
These include clarity of signs, time allotted for task completion, divided attention, and whether adequate tools were provided.
Individual PIFs:
These factors relate to the worker’s physical or mental state, including fatigue, stress, time pressure, and motivation.
Organisational PIFs:
These include pressure to meet targets versus safety, the quality of supervision, and the organisation’s ability to learn from previous mistakes.
In the case of the Japanese train crash, several PIFs influenced the incident, including time pressure, stress, inadequate engineering solutions, and a lack of safety mechanisms.
Investigating Human Error: A Deeper Dive
At COMET, we believe you must get into the mind of the person involved at the time of the error: whatever the outcome, their actions likely made sense to them in that moment. Alan discussed the importance of avoiding hindsight bias, resisting the urge to ask, "What were you thinking?" Instead, investigations should focus on the context surrounding the incident.
One effective tool for investigators is the Substitution Test. This involves asking, "If someone else with the same skills had been in the same situation, could they have made the same mistake?" If the answer is yes, it indicates a broader systemic issue that needs addressing.
Avoiding Error Traps
Finally, Alan identified several common error traps that can lead to incidents:
- Unrealistic deadlines or time pressures
- Procedures that are too long or complex
- A distracting work environment
- Inadequate resourcing
- Confusing controls or alarms
These traps can lead even skilled workers to make mistakes, and many companies may find one or more lurking in their operations. One way of identifying and resolving potential error traps is to organise 'trap hunts' as part of a team-building exercise. This helps open staff's eyes to the traps around them and encourages more employees to report the error traps they spot, preventing future accidents.
Conclusion
In his talk, Alan Smith reminded us that human error is only the beginning of an investigation, not the end. By examining the broader context (organisational culture, task complexity, and individual stressors), we can learn from errors and improve our systems to prevent future incidents.
At COMET, we believe that improving the system, not the worker, is the key to safer and better workplaces.
Contact us today to learn how we can help your organisation.