A subject perhaps little known in the oil and gas industry is action science, a strategy for increasing the skills and confidence of individuals in groups and for fostering long-term individual and organizational effectiveness. This strategy applies to human relations in organizational, group, or interpersonal contexts in which individuals work together on challenging tasks.
Defensiveness and silo behavior are facts of life in group settings. To understand defensive behaviors and social limits to action, we must understand the concept of theories of action. The discussion below is based on the concepts in Flawed Advice and the Management Trap by Chris Argyris and Making Sense and Making Decisions by Howard Duhon.
A theory of action is defined as the set of rules, conscious and unconscious, that you use to design your actions in social situations to achieve your objectives.
On a macro level, our theories of action are similar; we are more alike than different. For instance, in business and personal dealings, most people claim that they try to be fair, consider other people’s points of view, and seek valid and accurate data.
Suppose that you have been drinking with friends in a bar and upon your return from a restroom break, you find your friends in a fight with strangers.
If you enter the fight, whose side will you be on?
That may be a foolish question. You will, of course, join the fight on the side of your friends. But is it fair? Who started the fight? What is it about? Did you bother to ask?
According to your espoused theory of action, you try to be fair and to seek valid information. Clearly, there are times when you may violate that theory.
We tend to violate our espoused theory of action when something important is at stake.
Most of us behave according to our theories in routine, low-threat encounters. But when we feel threatened, or when the outcome of a decision is important to us, we may take a different tack.
The theory of action we use when threatened has been labeled Model I behavior (also referred to as "theory in use") by Argyris. The espoused theory has been labeled Model II behavior. Fig. 1 compares the situations in which Model I and Model II behaviors come into play.
Model I behavior kicks in when the outcome is important. We are more than willing to be fair when nothing important is at stake, but we want to be in control if what is at stake is important. We are at our worst in the most important situations. The cost of such behavior to us in business and in social settings is high.
When we act in ways that violate our espoused theory of action, it follows that we will deny doing so. A typical pattern is to deny the behavior and then to treat the denial itself as undiscussable.
This behavior is perhaps most readily apparent among groups sharing religious beliefs (some issues are intensely important to church members). It is also present in business. The impact of such behavior can be dramatic (Fig. 2).
What makes a matter important to us? Some subjects are inherently important, while many are important only because they are our team's position, and being a team player is important.
For example, sharing of data and information is important when working on a project. But we are less likely to share data that are embarrassing to our team.
In The Reflective Practitioner, Donald Schon investigates the effects of Model I behavior on engineering practice. He views engineering as a conversation with the environment.
When confronted with a situation that we want to improve, we identify and analyze a possible improvement. As we analyze the idea, the environment “talks back to us,” telling us whether the change will work as planned or if there will be unexpected side effects.
The more changes we consider, and the more effectively we consider them, the more back talk we get and the more we learn about the situation. Each move we make is an experiment, and both the problem definition and the solution evolve from this ongoing conversation.
Just as engineering practice involves an interactive conversation with the environment, it also involves an interactive conversation with the client. This conversation is limited by the theories of action of both the engineer and the client.
The engineer is supposed to understand the problem and confidently develop a solution with minimal input. In addition, the engineer will likely have preferences about how the problem should be solved.
We should expect the engineer to attempt to be in control, seek data that support her position, and fully develop her position before engaging the client in serious discussions so that she will be able to exhibit competence and defend her solution.
The client will also have preferences for how the problem should be solved and will wish to exert some control. In a situation in which the engineer is playing it close to the vest, the client feels left out and out of control.
The relationship is based on a degree of deception: the engineer conceals her uncertainties in order to appear fully in control of the problem.
The client observes the stonewalling and suspects a lack of competence, which the engineer is trying her best to avoid portraying. Worse still, the client misses out on the fun part: the sense of discovery as the engineer discusses the problem. Deprived of this experience, the client may not have the same appreciation for the final design that the engineer does and cannot appreciate the tradeoffs the engineer made to arrive at the final solution.
The alternative, reflective practice, involves the client in the conversation with the environment: the engineer exposes the client to the ambiguity of the problem and to her own uncertainties and concerns, solves the problem together with the client, and gives up a measure of control in order to make the client a meaningful partner in developing the solution.
This pattern also colors our conversations with the public, including communities in which we conduct projects. Engineers would rather not talk about the project until they have finished the design and know what should be done.
Is it any wonder that members of the public feel left out? Using our current approach, they do not get to see the fun part.
Argyris, C. 2000. Flawed Advice and the Management Trap: How Managers Can Know When They’re Getting Good Advice and When They’re Not, first edition. Oxford University Press.
Duhon, H. 2012. Making Sense and Making Decisions: An Engineer’s Guide to Project Decision Making, second edition. GATE.
Schon, D. 1983. The Reflective Practitioner: How Professionals Think in Action, first edition. Maurice Temple Smith.
Howard Duhon is the systems engineering manager at GATE and the SPE technical director of Projects, Facilities, and Construction. He is a member of the Editorial Board of Oil and Gas Facilities. He may be reached at firstname.lastname@example.org.