19 Feb 2015
Industry Continues Discussion on Human Factors
In May 2014, the Society of Petroleum Engineers (SPE) held a Web event that examined human factors as they pertain to process safety and culture. The event revisited the 2-day summit on human factors held in 2012, which resulted in a technical report intended to provide guidance on human factors risks in exploration and production operations and on what can be done to reduce those risks and increase safety.
The technical report can be downloaded here.
The 2014 Web event was moderated by Roland Moreau, safety, security, health, and environment manager with ExxonMobil. The speakers were Kenneth E. Arnold, a consultant with more than 45 years of industry experience, including 16 years with Shell; J. Ford Brett, a consultant in petroleum project management who has delivered workshops and short courses in more than 20 countries; and Andrew Dingee, chairman of the SPE Human Factors Technical Section. Dingee worked extensively in aviation safety after leaving active duty in the Marine Corps, where he was an aviation instructor. He transitioned to the oil and gas industry in 2010, bringing lessons learned from the recent revolution in aviation safety to the oilfield environment.
The following is an edited transcript of the question-and-answer session from the Web event.
Moreau: The first question is from the UK. “Would the panel be kind enough to comment on or suggest practical ways of making an organization consider human factors throughout their processes? What are the suggested first steps?”
Arnold: It’s pretty hard to say what a first step is. One step, which a lot of companies are already doing and which is an easy step, is to consider human factors in engineering and design. And there are organizations that have human factors specialists who review their designs to make sure that human factors are included.
I want to go back and create an analogy. When people first came up with the idea that we need to do hazard analysis as a separate analysis of our designs, many of us in the industry said, “Well, why do we need to do this? We think about all of these things as we’re doing the design.” And what we found is, when you just have a workshop that is focused only on safety and operability, you find things that people knew they shouldn’t have done. They didn’t see the implications of it.
So, having this separate step of safety review turns out to be a pretty good thing. And some of us old-timers, who at the time thought, “Oh, this is just a waste of time,” have had to rethink our thought processes.
The same thing happens with the human factors review. If you have someone who is truly trained in human factors engineering involved in a review process of what is being designed, you will be amazed what they can find. And when they find it, you look at it and you say, “Oh, this is just common sense.” But, it isn’t just common sense because we didn’t do it in the design.
So that’s an easy thing, and it’s something that can be implemented fairly quickly. But the real implementation is in leadership, in getting leadership on board and getting leadership to lead at every level all the way through the organization, really understanding what it means to do the things that are in the section on leadership and culture in the technical report.
Brett: This is something that may work, depending upon the organizational situation. It’s certainly not the only possible first step, but you can get together the leadership of the organization and talk about the past three, five, seven, or 10 biggest problems that we’ve had. Let’s understand them, work on them, figure them out, and do an analysis of that with appropriate facilitation to elicit the human factors components. Almost everyone in that process will come to the self-discovery that, “Hey! Humans had some big fat factor in this problem that we all experienced and lived with.”
We need to work on communication. We need to improve how we communicate. Specifically, how are we going to do that? Crew resource management is a good way to do that in a structured way. But, if you just start from zero, let’s talk about what’s gone wrong around here and try to understand.
Moreau: The next question we received is one I’ve had in my mind. “In my industry experience, the individuals that have a lower risk tolerance or are emotionally connected to safety have witnessed an event. Do you have any thoughts on creating that connection to safety without living through such an event?”
Dingee: One thing from my background that has helped us is what we call realistic scenario-based training. And, obviously, the airlines and the military fighter community have put millions and millions of dollars into putting us in the exact same scenario in a way that doesn’t cost your life or the airplane.
So, I think, the closer we can put our own workers to that environment, the more we can make them feel, smell, and taste it in a safe scenario, the more they’re going to learn from it.
The other half of that comes down to effective communications. Some companies do fantastic jobs communicating lessons learned back to the field, and others struggle with it.
Arnold: One of the things that I’ve learned from our buddy John Thorogood is that, in some industries, when there is a major accident, everybody in that industry gets to learn a lot about what happened and remember it. We don’t do a good enough job in our industry of talking about the major accidents. I don’t know how many people in our industry really know what happened at Piper Alpha, other than that 167 people died, and understand the mistakes that were made.
I think it would be a good thing if we evaluated things like Piper Alpha and Macondo and P-36 and the other disasters of that nature on a regular basis. We should make sure that everyone in our organization, not just the engineers, because anyone who’s in an operating mode needs to know this stuff, knows what happened in the past and what we learned from it. That is a way of giving them the feeling of actually being there, putting themselves in that position, and realizing that, if they had been there, they may have died.
Moreau: A couple more questions have been asked. One is, “Can we talk a little bit about situational awareness, which, in my analysis, was the biggest factor in the Macondo incident.” And then the other one is, “How do we refocus our competency and approach to this attitude?”
Arnold: Well, it’s really hard. Situational awareness is something that, when we look at crew resource management in the air industry, it falls into that same category. And one of the things that makes it difficult is we have a lot of different situations. And it’s just a multitude of things that can go wrong. And there’s a multitude of times when you have information that’s not complete and you have to do something with it, take some action or avoid taking an action.
Sometimes, the best thing you can do is not take action. Three Mile Island is a good example. If the people in the control room had just let the automatic system do its job, the accident would not have advanced to the position that it did. But they thought they were reacting to something that was bad information, and so they started to do things rather than just let the system take care of itself.
So it’s very difficult, especially when you’re under time pressures and your own personal safety is at risk. The way they do it in the airline industry is, every year, a pilot goes into a simulator and they put him in some bad situations and give him the experience of having to deal with that.
Now we’re doing that more and more in the drilling side of our business, where we do have around the world several good drilling simulators as training tools. We’re bringing operational people into these simulations, and designers as well, and having them deal with real bad things as a way of getting them used to what happens.
I think we can do more with that. I think we could do a lot more of that kind of simulation training even in the production end of our business. We have very elaborate control systems now. We measure everything under the sun. But, do we really put people in a mock control room that mimics their actual operation and feed them a disaster and get them used to responding to it the way we want them to respond to it?
Moreau: Another question: “A lot of corporate cultures are averse to litigation, and the leaders are afraid of tackling poor performance in a positive way, deferring to take punitive enforcement actions rather than fostering an uplifting culture. How do you strike the balance?”
Arnold: One way to look at this is to look at the difference in societies, the greater society culture, between the way Norway handles safety and the way the US handles safety.
The US is a culture that is always looking for who the bad guy is. And, if we just punish him, then whatever happened bad isn’t going to happen again. And we have this litigious culture of we’re going to sue immediately. We’re going to sue somebody. Look, our own government from Day 1 was out to sue various individuals even at BP, and they’re still doing it. The Justice Department is still doing this because that way we’re going to catch the bad guys. And if we just punish them enough, nobody will ever make that same mistake again.
Well, that’s not the way Norway approaches that stuff. Norway understands that it’s not a pass/fail system. You have to have punishments. You have to have some room for punishment in the system somewhere. But, if we’re going to learn from it and if we’re going to disseminate knowledge, we can’t focus on litigation.
Moreau: I have another question here: “Often new employees who have recently been trained are more safety conscious and try to intervene in unsafe behaviors of highly experienced personnel but get shunned by them. What are your thoughts on this?”
Dingee: Overall, it’s leadership failure. If the senior leadership doesn’t come down and influence the workers who are currently there, it is very difficult. It becomes the hammer approach to leadership, which, in my opinion, is unsuccessful and such a negative culture for learning. So, it takes time.
Moreau: Then, it goes back to basically the important role that leadership has in fostering a positive culture.
Arnold: Can I just say it’s not just senior leadership. It’s at every level. Leadership at every level can do that.
Moreau: Yes, there’s always a fear, I think, of what people sometimes call the hard clay layer in middle leadership—that the passion from the top doesn’t make it to the bottom and the issues from the bottom don’t make it to the top. And I think those challenges continue to happen.
Moreau: And one last question: “How do you transfer a corporate culture to a contractor?”
Arnold: If you look at how that improvement happened from the ’70s until now, it was a multidecade process of getting everybody to think that safety was important. And when I joined the oil industry, people actually said that, if you hadn’t lost any fingers, you hadn’t been working hard enough. And they meant it. Well, we changed that culture. And now, that’s not even funny, whereas it used to kind of be funny.
And so, how do you do it? It has to be us as an industry addressing this issue. Common definitions of terms are one way, so that people just understand. It’s a relatively complicated problem, but it’s something that every one of us needs to engage in addressing as an industry. And, if you tried to do it only within your own company, it would not be successful, because it’s something that we as an industry have to do.
Listen to an archive of the Web event here.