Are Egos a Main Obstacle to Intelligent Energy Implementation—And Can We Get Around Them?
Published December 9, 2013
Most of the information in this column is wrong.
I don’t plan to waste your time—I believe that I have useful ideas here that could make a big contribution to offshore safety.
But when the CEO of a Norwegian oilfield services company said to me last year that “most people are wrong most of the time,” I thought, well, he’s right in the sense that I cannot think of anyone who is right most of the time. So, that probably goes for me, too.
I also want to make the point that, as reflective individuals, we do have the capacity to train our egos; and, if we believe that egos are the source of many obstacles to achieving safety, perhaps we can solve the problem at the source.
To start explaining what I mean, I would like to tell you about some talks that were presented at a breakfast forum at Offshore Europe in September 2013, about 2 weeks after the Sumburgh helicopter disaster in Scotland when four people lost their lives.
The speakers included Martin Rune Pedersen with Maersk Oil UK; Judith Hackitt, chairperson of the UK’s Health and Safety Executive; and Ian Sharp, chief operating officer for Fairfield Energy.
Pedersen explained that, every time a new drilling rig is brought to Maersk, the company organizes 2-day workshops with the drilling company staff that include team building exercises and technical discussions. Maersk values “humbleness,” which it says is “about listening and learning and giving space to others.” It also values what it calls “uprightness,” where people stick to their word.
Judith Hackitt advocated a mindset of “constant unease,” which she said “means never thinking the problem is fixed.” “Constant unease means never being complacent, being prepared to ask hard questions, and not seeking reassurance from what you know is right.”
Meanwhile, Sharp presented the results of a survey of worker engagement in the North Sea, which looked as expected at first glance; but, when examined more deeply, some questions emerge. For example: Why do site leaders feel less personally engaged in the site’s safety culture than workers do?
I am trying to pull out a common thread between all of these points: all of the speakers were actually focusing on ego, a main threat to safety, and on how to stop it from causing problems so that we can work more intelligently.
The ego tells us that everything is fine when it is not; the ego stops us from questioning too hard; and the ego flares up in difficult personal discussions when we get defensive talking to people we do not know very well.
I should probably try to define what I mean by “ego” if we are going to discuss it.
I am skipping over the Wikipedia definitions of ego and offering one of my own, which I think you will recognize: the storybook self we create as a kind of defense.
Our actual selves can use the full force of our subconscious minds to weigh situations and work out the best response and judgment for the benefit of everybody involved, which is exactly what you want when trying to mitigate risks or find the best response after an accident.
Meanwhile, our storybook selves are worrying about how we have been treated and whether the way someone spoke to us matches the storybook self’s idea of how it should be addressed. Our storybook self cares far more about our position in the organization and feels that being asked to change could be a sign that someone else has power over us.
But living behind our storybook selves can be easier if our real selves are not strong, or have not had enough exposure to gain strength. It is easier to see things as we would like them to be.
Drilling Rigs and Fighter Pilots
At the Integrated Operations forum in Trondheim in October 2013, I heard a talk by Arent Arntzen, project manager for Statoil’s Arctic Drilling Unit and a former fighter pilot with the Royal Norwegian Air Force for 22 years. In it, he spoke about how his pilot experience is relevant to his role now.
Much of the air force training is about avoiding the negative effects of the ego so people can do what is best for the organization, not themselves, he explained.
By contrast, oil and gas drillers make most of their decisions around not looking bad, he said. “Drillers are all mortally afraid of doing something foolish. If you know that, you can probably handle them.”
Arntzen was asked what advice he had for the oil and gas industry as to how to better manage people’s egos. In the air force, “every mission is briefed and debriefed,” he said. “When you debrief, everyone is subjected to his or her errors during this mission. That tends to shave away your ego every time.”
“Because, whether you are colonel or lieutenant, it is the same thing; you are all the same when you debrief, there is no hierarchy when you are debriefed.
“This is part of becoming an integrated team. You are able to put your position in the military hierarchy to the side, because you were a team at the time. When you leave it, you shut the book and you go back into the other structure.
“This takes some practice. And that will help with your ego.”
At the Aberdeen Piper 25 conference in June 2013, we heard from Lord Cullen, who conducted the enquiry into the 1988 Piper Alpha disaster. One of Lord Cullen’s key observations was that it is important to have people in a position to question the people who make safety decisions.
He might have said (although he did not) that this is a good way to prick people’s egos.
The “safety representative” idea was introduced in 1989 after Piper Alpha. In this idea, people elected by staff, not management, have powers to carry out investigations and can put safety concerns to senior managers without worrying about their jobs.
Companies should also have to present safety cases, or structured arguments, showing that their systems are as safe as reasonably practicable. And this should be subject to interrogation by someone with expertise and independence.
Lord Cullen’s talk was followed by Andrew Hopkins, professor of sociology at Australian National University, who explained the critical factors in making safety cases work and the reasons they fail.
The important features of a safety case regime are that (1) it must have a risk/hazard framework, (2) there must be workforce involvement, (3) operators must be required to make the case to a regulator, (4) the regulator must be engaged, and (5) there must be a requirement of duty of care, he said.
There is little point in introducing a safety case regime unless all five components are in place, he said.
“The US has Items 1 and 2, but Items 3, 4, and 5 are lacking. People ask which you should do first. My argument is this system won’t work unless you see it as a package. The safety case is not worth the paper it is written on unless it is presented to a regulator for scrutiny.”
The “as low as reasonably practicable” (ALARP) requirement means that people cannot hide behind the security of having complied with a requirement, because the requirement can change as soon as someone finds a less risky way to do something.
“One of the really tragic outcomes of the Macondo accident is that the US Department of Justice is prosecuting two of the wellsite leaders on the rig, who are basically foremen, low-level managers in the role they performed,” Hopkins said, calling it a “clumsy and misdirected prosecution.”
“These are the only two individuals the Department of Justice is going to prosecute for criminal negligence. That seems to me to show a complete misunderstanding of what is going on and what the causes are.”
Hopkins also argued that the decentralization of BP, which took place around 2000, with (for example) local drilling engineers reporting to the local asset manager rather than to the company’s most senior drilling engineer, could have led to problems.
And (he might have said, but did not) the senior drilling engineer has the most expertise and is perhaps best able to prick the egos of his juniors, whereas the asset manager might, if anything, just end up in a conflict over who is right, leaving both egos inflated.
We hear a lot about “management of change” as a main problem with digital energy implementations when we really mean “trouble convincing people to accept a change,” which sounds like the ego is in the room. The ego doesn’t like the idea that someone else can tell it what to do or to do things differently.
We hear about conflicts people get into, which can be driven more by the ego wanting to get its own way than by the substance of the disagreement.
We hear about people given petty rules to follow, which feel like, and maybe are, someone with more power showing it off to satisfy their ego.
We also hear about people who do not notice things, as though they are living more in their storybook world.
Four years ago at Intelligent Energy, I heard a great quote from Satish Pai, then vice president of operations with Schlumberger, about how so many people in the oil and gas industry want to save the world and want to convince their colleagues that the technology that they work on, or their expertise, is vital for saving the world. This also sounds like the ego in the room. Perhaps the real self doesn’t care if it saves the world or not.
We are an industry that loves to manage things. Perhaps the ego is one more thing to manage; and, perhaps if we actively thought about it, we could do it very well.
Karl Jeffery is editor and cofounder of Digital Energy Journal. He is also publisher of Carbon Capture Journal and Tanker Operator, and cofounder of Digital Ship, a publishing and events company covering digital technology for the deep sea maritime industry. Jeffery holds a BEng degree in chemical engineering from Nottingham University.