The instinct to assign blame when things go wrong is natural because it gives us a sense of control. However, that instinct may be making us less safe. In the following article, Rohit Talwar, HSSEQ Manager at V. Group, draws on the Stolt Groenland incident report to show that while mistakes are rarely intentional, the tendency to focus on who is at fault remains widespread.
When things go wrong, our instinct is often to ask, “Who’s at fault?” It feels natural – blaming an individual, group or a company gives us a sense of control. But what if that very instinct is making us less safe?
Lessons learned from the Stolt Groenland incident report
(Note: This is not a critique of the report, which is thorough and well-constructed. Rather, it is used here as an example to emphasize the importance of framing findings in a way that promotes learning rather than assigning blame.)
The MAIB report on the 2021 Stolt Groenland incident highlights a recurring challenge in incident investigations: the unintentional attribution of blame. While no one involved intended for the incident to occur, and errors are seldom the result of deliberate negligence, the language used in reports can sometimes imply fault.
For example, the report cites “a failure to appreciate the extent of heat transfer between cargo tanks” as a primary cause. While this statement is technically accurate, it can be interpreted as placing responsibility on those managing the cargo. This raises a critical question: is the primary objective of such investigations to identify systemic vulnerabilities—such as gaps in knowledge about heat transfer risks, deficiencies in cargo compatibility assessments, and operational pressures affecting decision-making—or do we risk dwelling on individual errors, which can inadvertently foster a culture of blame rather than one of learning?
The way findings are framed is crucial. Reports should not only analyze what went wrong but also emphasize what can be learned to prevent recurrence. Consider this alternative approach: instead of stating, “Had the incident on the Stolt Focus been shared more widely, it could have provided valuable insights to prevent the subsequent explosion in Ulsan,” a more constructive phrasing would be:
“The investigation highlighted the importance of sharing information about similar incidents, such as the Stolt Focus case, across fleets and with relevant authorities. Strengthening incident reporting and communication protocols is essential to ensuring that lessons learned contribute to enhanced safety and the prevention of future incidents.”
By focusing on systemic improvements rather than individual shortcomings, incident reports can better serve their purpose—enhancing safety through knowledge-sharing and proactive risk mitigation.
What hinders us from addressing the true root causes
Traditionally, we believe that highlighting mistakes prevents others from repeating them or that better training will fix the issue. Both approaches focus on the question: “Who’s to blame?”
Research shows that blame—whether overt or subtle—often leads to fear and concealment. When people fear repercussions, they hide mistakes instead of reporting them, preventing us from addressing the root causes.
The IMO emphasizes that investigations should prevent incidents, not assign blame. Safety experts like James Reason and Sidney Dekker argue that most failures stem from systemic issues, not individual mistakes.
Emphasis on systemic failures
James Reason’s work highlights that accidents are often the result of ‘latent conditions’ within a system. This means that focusing on individual errors is less effective than addressing the underlying systemic issues.
His ‘Swiss Cheese Model’ illustrates how accidents occur when multiple layers of defense fail, shifting the focus from individual blame to system vulnerabilities.
As Reason puts it, “We cannot change the human condition, but we can change the conditions under which humans work.” Dekker adds, “Human error is a symptom, not a cause.” This implies that instead of blaming the human, we should change the work environment.
Advocacy for just culture
Sidney Dekker emphasizes the importance of understanding ‘why’ people make mistakes, rather than simply punishing them.
He promotes the idea that ‘human error is a symptom, not a cause’ which encourages a deeper investigation into the factors that contributed to the error.
His work pushes for an understanding of the local rationality of the person involved in the incident: when something goes wrong, we should seek to understand why the person’s actions made sense to them at the time, from their own perspective.
Building a culture of trust
Instead of asking who made the mistake, we should ask why the mistake made sense at the time.
When we focus on learning, not punishment, we build a culture of trust—where people report errors and systems improve.
Next time something goes wrong, resist the urge to blame. Ask instead: “What can we learn from this?”
You might be surprised by the answers.
The above article by Rohit Talwar, HSSEQ Manager at V. Group, was originally published on his LinkedIn account and is reproduced here with kind permission.
The views presented are those of the author alone and do not necessarily reflect those of SAFETY4SEA; they are shared for information and discussion purposes only.