Safety

Cultural Disasters: Learning From Yesterday’s Failures To Be Safe Tomorrow

Although offshore disasters are rare, they have resulted in significant loss of human life, environmental damage, and negative impacts on the larger society. After each one, we vow to learn from it and address the technical causes, but we less often examine the cultural causes, such as a poor safety culture.


The history of the offshore oil and gas industry contains many great achievements and, sadly, the occasional tragic failure. Although offshore disasters are rare, they have resulted in significant loss of human life, environmental damage, and negative impacts on the larger society. After each disaster, we vow to learn lessons from the event so we can prevent a recurrence. The industry is effective in addressing the technical causes of disasters but is less successful in dealing with the cultural causes, such as a poor safety culture.

Safety culture describes group members’ shared norms and values related to risk, which determine acceptable behavior and the desirability of different outcomes. Although there are many definitions of safety culture, the one proposed by the Advisory Committee for Safety in Nuclear Installations (ACSNI) is the most widely used and the most comprehensive: “Safety culture is the product of individual and group values, attitudes, competencies, and patterns of behavior that determine the commitment to, and the style and proficiency of, an organization’s health and safety programs” (Advisory Committee for Safety in Nuclear Installations 1993).

To create a better understanding of how a poor safety culture increases the probability of a disaster, we reviewed the available inquiry reports for 15 major incidents that occurred offshore between 1980 and 2010, as shown in Table 1. These incidents are representative of the types of events possible in the offshore industry. Nine occurred on offshore installations, five occurred during flights to or from offshore installations, and one involved a support vessel.

Table 1: The 15 major offshore incidents reviewed, 1980–2010.

Five of the 15 reports make direct reference to safety culture: Piper Alpha (1988), Usumacinta (2007), Montara (2009), Sikorsky S-92A (2009), and Deepwater Horizon (2010). Excluding Piper Alpha, the four remaining disasters that directly reference safety culture occurred within the last 6 years, reflecting the recent widespread understanding and acceptance of the importance of safety culture.

Classifying Cultural Causes of Offshore Disasters

We used James Reason’s safety culture model as a framework to conduct a detailed review of the cultural causes of these disasters (Reason 1997). In 12 of the 15 reports reviewed, we classified at least one of the causal factors identified in the report as cultural. Only three of the inquiry reports, covering the Cormorant Alpha helicopter accident (1992), the East Cameron block blowout (1997), and the Leman field helicopter accident (2002), had no causal factors that could be classified into one of the three dimensions of a poor safety culture described below. This does not prove that a poor safety culture played no role in these three events, because we reviewed only the findings presented in the official public investigations. The majority of the reports identified multiple cultural factors that contributed to the disaster.

  • Tolerance of inadequate systems and resources

In a poor safety culture, there is an acceptance of having to make do with inadequate systems and resources (e.g., equipment that is not fit for purpose and a large maintenance backlog). This lack of priority placed on safety can also result in considering only a narrow range of hazards, such as those directly specified by legislation or those that the organization has sole responsibility to manage. For example, in the Deepwater Horizon disaster, the original well design called for 16 or more centralizers to be placed along the long string, but BP’s supplier had only six in stock. Even after further modeling indicated that 21 centralizers were needed, only six were used.

  • Normalization of deviance

In organizations with a poor safety culture, deviations from documented management systems are accepted as a requirement for getting the job done. Alternatively, the documented management system itself may be allowed to deviate from accepted safety practices or regulatory requirements. For example, in the Piper Alpha disaster, the permit-to-work system did not function as intended and had been identified as a causal factor in previous incidents. In organizations with a positive safety culture, everyone shares the belief that compliance with safety rules and procedures is essential and the only way to act.

  • Complacency

Because disasters are low-probability events, the absence of a major event does not mean an operation is safe. In a poor safety culture, people forget to be afraid and do not give safety the attention warranted by the risk. People start to believe that their organization is in some way special or unique and does not need to conform to industry standards. Ironically, in these cultures people often cite their strong safety culture as an example of how they are different. Organizations that use occupational injuries as their primary measure of safety are particularly at risk of complacency. Cultures that prioritize the management of occupational safety hazards can compromise process safety controls, for example by skipping inspection and maintenance routines because performing them increases the risk of minor injuries. Because major hazards are low-probability events, the increased risk created by not performing these routines is not apparent. Complacency was highlighted as a contributing factor in the Maersk Victory accident, for example. The official inquiry determined that “a degree of complacency had developed at a number of points in the total system which reduced its effectiveness to below that required to achieve a risk level as low as reasonably practical” (Aust 1997).

Avoiding Disaster by Learning the Lessons From History

The disasters that we reviewed differed in their types of hazards, immediate causes, and sectors of the industry, yet they shared many common underlying cultural causes. From a cultural perspective, the offshore industry is not learning the lessons of history and, therefore, may be doomed to repeat them. Managing these cultural threats to safety is challenging because they are psychological rather than physical; for example, it is much more difficult to prevent a culture of complacency than to replace fire walls with blast walls.

To effectively manage safety culture, the industry needs to adopt the same systematic approach used to manage the physical hazards present in the offshore environment. All offshore companies (contracting and operating) should create a safety culture continuous improvement (SCCI) plan. This plan should adopt the same framework used to manage safety in general, such as the elements of a safety management system. At a minimum, the SCCI plan should include a policy statement and should assign responsibility for implementing the plan, specify strategies to manage culture, and provide for performance evaluation, continual improvement, and audits. Companies should also regularly conduct comprehensive safety culture self-assessments. These assessments should not rely solely on employee perception surveys but should incorporate more objective indicators, such as observation and document analysis.

Senior managers need continuous information about the health of the safety culture in every asset they manage so they can identify threats and take action before a major event occurs. Currently, there is no standard key performance indicator (KPI) for safety culture, so individual companies need to develop their own measures. To collect this information efficiently, the KPI should draw on information that is already being collected. For example, many companies track the number of safety observation or hazard identification reports submitted. In addition to the number of reports, companies could rate the quality of the reports submitted to gauge employee engagement over time. It is important to use multiple safety culture metrics because safety culture is a complex phenomenon comprising a number of dimensions. Because safety culture is perceptual in nature, the product of employee values, attitudes, and perceptions of how to behave, its assessment is prone to bias; using multiple metrics minimizes this effect.
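To illustrate how such a composite indicator might be tracked, the sketch below combines two of the metrics mentioned above, monthly report volume and a rated report quality score, into a single trend value. This is a minimal illustration only: the data fields, the expected-reports baseline, and the equal weighting are assumptions a company would need to calibrate against its own data, not a prescribed industry method.

```python
from dataclasses import dataclass

@dataclass
class MonthlyObservations:
    """Safety observation data for one asset in one month (hypothetical fields)."""
    month: str               # e.g., "2012-06"
    reports_submitted: int   # number of safety observation/hazard reports filed
    avg_quality: float       # mean reviewer rating of report quality, 0.0 to 1.0

def culture_kpi(data: MonthlyObservations, expected_reports: int,
                volume_weight: float = 0.5) -> float:
    """Combine report volume and report quality into a single 0-to-1 indicator.

    The equal weighting of volume and quality is an assumption, not a standard.
    """
    # Cap the volume ratio at 1.0 so sheer quantity cannot mask poor quality.
    volume_score = min(data.reports_submitted / expected_reports, 1.0)
    return volume_weight * volume_score + (1 - volume_weight) * data.avg_quality

# Example: track the indicator across months to spot a declining trend.
history = [
    MonthlyObservations("2012-04", 120, 0.82),
    MonthlyObservations("2012-05", 95, 0.74),
    MonthlyObservations("2012-06", 60, 0.61),
]
for m in history:
    print(m.month, round(culture_kpi(m, expected_reports=100), 2))
```

A falling score across months would be a prompt for further investigation, such as interviews or document analysis, rather than a verdict on its own, consistent with the multiple-metric approach described above.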

Finally, creating and maintaining a positive safety culture involves actively promoting the desired values, attitudes, and behaviors. Leaders play a key role in determining the culture within their organization. To be effective, they require safety leadership skills. Organizations should consider safety leadership skills when selecting managers and create development programs to enhance managers’ leadership skills. Organizations should also ensure that leader performance evaluation promotes a positive safety culture. Safety culture interventions should also be targeted at frontline workers to ensure that they share the desired values and have the skills to intervene if they observe an unsafe situation or action. Organizations that want to learn from previous disasters should systematically cultivate a positive safety culture by developing leadership skills, promoting the desired values, and assessing safety culture on an ongoing basis.


References

Advisory Committee for Safety in Nuclear Installations, Human Factors Study Group. 1993. Organising for Safety: Third Report. London: HSE Books.

Aust, T. 1997. Accident to the Mobile Offshore Drilling Unit Maersk Victory on November 6, 1996. Mines and Energy Resources, South Australia.

Reason, J. 1997. Managing the Risks of Organizational Accidents. Aldershot, England: Ashgate Publishing Limited.


Mark Fleming is an associate professor in the Department of Psychology at Saint Mary’s University, Halifax, Nova Scotia. Mark is an applied psychologist with nearly 20 years of experience in industrial health and safety management in high-hazard industries, including offshore oil and gas, nuclear power, petrochemical, power generation, and construction.

Natasha Scott is the director of scientific instruments in the Applied Science Division at Pascal Metrics. Natasha is a PhD candidate in the Department of Psychology at Saint Mary’s University, Halifax, Nova Scotia. Natasha’s area of expertise is safety culture. For the past 6 years she has conducted applied research and consulted in various high-hazard industries, including oil and gas, construction, nuclear energy, and healthcare.