Influence is an important part of engineering. It is not enough to develop a brilliant design; you have to convince someone to build it.
There is a great deal written about influence. Robert Cialdini’s Influence: Science and Practice is a classic. Jay Heinrichs’ Thank You for Arguing is one of my all-time favorite books; it summarizes the Greek art of rhetoric. Nancy Duarte’s Resonate: Present Visual Stories That Transform Audiences is a brilliant book about designing PowerPoint presentations that move people. John Maxwell, one of the authors of Becoming a Person of Influence, has built a company on the ideas of servant leadership and influence.
I recommend all these books, but this column is a book review of an even more interesting book, Jonathan Haidt’s The Righteous Mind: Why Good People Are Divided by Politics and Religion. Actually, I only discuss the first third of the book and do not delve into his writing about why people believe different things about politics and religion. Perhaps I will tackle those topics in a future column.
The book is structured around this intriguing metaphor:
The mind is divided like a rider on an elephant.
The elephant is intuition, the rider is reason.
The rider is there to serve the elephant.
Intuition comes first; the role of reason is to justify opinions served up by intuition. Moral judgment is a cognitive process with two parts: intuition and reasoning. Intuition is a rapid, automatic response over which we have little or no control. Reasoning is a slow, conscious process.
Kahneman’s (2011) discussion of thinking is useful here. The mind comprises two systems: System 1 is an automatic, unconscious system, and System 2 is an effortful, conscious system. One bias identified in System 1 is substitution: when asked a hard question, this kind of thinking quietly answers an easier question instead.
Part of Haidt’s research consisted of telling stories with disgusting aspects in which no one was obviously harmed. For instance, he presented to research subjects this story of a family that ate its dog:
A family’s dog was killed by a car. They had heard that dog meat is very tasty. So they cooked and ate the dog, being very careful that no one saw them do it. Was it wrong for them to eat the dog?
Almost everyone responded immediately (intuition) that this was wrong. Haidt then asked them why it was wrong. People gave very different and often nonsensical reasons. Many tried to construct a story in which someone was harmed by the family’s act. These stories were never very convincing, even to the subjects themselves. Yet few of the subjects changed their minds. They mostly stuck with their initial intuition that eating the family pet is wrong: “I’m just not sure why.”
Is moral thinking different from other forms of thinking? Early thinking on the subject suggested that even children make a distinction between moral and conventional rules, and that the distinction lies in causing harm to others. The story above (and many other Haidt stories) shows that moral reasoning is not simply about causing harm to others. Why do all cultures have moral rules that have nothing to do with causing harm? (Comment: In The Moral Case for Fossil Fuels, Epstein pins his arguments directly and solely on what is good for human beings. I did not fully buy into his arguments when I read his book, and I think Haidt’s book explains where Epstein errs.)
People make moral judgments immediately, but it takes time to reason out why. Given a paradoxical situation, individuals will search for reasons to support their instinctive moral judgments. Note that the reasoning is not a search for truth; it is a search for arguments that justify the intuition. And the search is usually aimed at convincing other people—the intuition is good enough for us unless it is challenged by others.
Moral thinking is neither fully innate nor something children can learn entirely from avoiding harm to others. The moral domain varies by culture. We are born to be righteous, but we have to learn what, exactly, people like us should be righteous about. Cultures must decide how to balance the needs of society against the needs of individuals. Western culture leans toward individual rights; most other cultures focus on the good of society.
So the reason for reason—the evolutionary purpose of our capacity for reasoned argument—seems to be to equip us to convince other people that our intuitions are correct.
People reason, and people have moral intuitions. The relationship between the two has been described as follows:
Affect refers to flashes of positive or negative feelings that prepare us to approach or avoid something. Affective reactions are so tightly integrated with perception that we find ourselves liking or disliking something before we even know what it is. (Comment: The affect heuristic is one of the more powerful biases. If we have a positive opinion of a technology, then we will overestimate its positive aspects and underestimate its negative aspects.)
Affective reactions are so fast and powerful that they act as blinders do on a horse; they limit the universe of alternatives. If the elephant makes a small move to the right, the rider focuses on things to the right and loses interest in everything to the left.
Reason was designed to seek justification, not truth. Moral reasoning developed to help us pursue socially strategic goals such as guarding our reputations and convincing others to support us in disputes.
Exploratory thought is an even-handed consideration of alternative points of view. Confirmatory thought is a one-sided attempt to rationalize a particular point of view.
Accountability can increase exploratory thought and decrease confirmatory thought, but only if three conditions apply: decision makers learn, before forming an opinion, that they will be accountable to an audience; the audience’s views are unknown; and the decision makers believe the audience is well informed and interested in accuracy.
The rest of the time, accountability simply increases confirmatory thought: people try harder to find justification for their position than to find truth.
The main reason for thought is to act in ways that can be persuasively justified to others or excused by them.
The intelligence quotient was found to be the best predictor of how well people argued, but it predicted only the number of “my-side” arguments. Smart people are not better at finding truth; they are better at finding reasons that support their side. Interestingly, educated people who do not believe in climate change are much better at developing arguments for their position than uneducated nonbelievers are, and they are hence much harder to “convert.”
Reasoning can take you wherever you want to go:
If people can so easily see what they want to see, is it any wonder that scientific arguments often fail to persuade the general public? For nonscientists, there is no such thing as a study that you have to believe.
We should not expect individuals to produce good, open-minded, truth-seeking reasoning, particularly when self-interest or reputational concerns are at play. But if individuals are brought together in the right way, such that some individuals can use their reasoning powers to disconfirm the claims of others, a group can be created that ends up producing good reasoning as an emergent property of the social system.
An ethics class is not likely to make people behave more ethically after they step out of the classroom. Classes are for riders, and riders will use their new knowledge to serve the elephants more effectively. If you want to make people behave more ethically, you can either change the elephant (difficult) or change the path that the elephant and the rider are on.
The heuristics and biases school of thought (e.g., Kahneman) argues that the function of reasoning is to correct mistakes made by System 1 thinking (intuition). But Haidt shows that reasoning itself is the source of some mistakes, and that the function of reasoning is actually to reinforce intuitions rather than correct them.
This is particularly true when one is reasoning alone or with others who hold similar views. Reasoning evolved to enable humans to create persuasive arguments in social situations in order to achieve influence.
When people disagree with an argument, they spend more time evaluating it, trying to find fault. Sometimes the effect is exactly the opposite of the argument’s intention: the reviewers find so much fault that their opinion is actually strengthened by the argument with which they disagree.
However, when reasoning occurs between people who disagree but want to arrive at the correct answer, confirmation bias can contribute to an efficient division of cognitive labor: each side probes the other’s claims harder than it probes its own.
Do you want to influence the people who disagree with you? You have to talk to their elephants.
The main way we change our minds on moral issues is by interacting with other people. We are terrible at seeking evidence that challenges our own beliefs—others must do us that favor. We are good at finding errors in other people’s beliefs. But the interactions must be civil. When discussions are hostile, the elephant leans away and the rider works frantically to rebut the opponent’s charges. But if there is affection, admiration, and trust, the elephant leans in and the rider tries to find truth in the other person’s arguments.
The elephant may not usually change in response to objections from its own rider, but it may be steered by the mere presence of other friendly elephants.
Cialdini, R.B. 2001. Influence: Science and Practice. Allyn and Bacon.
Duarte, N. 2010. Resonate: Present Visual Stories That Transform Audiences. John Wiley and Sons.
Epstein, A. 2014. The Moral Case for Fossil Fuels. Portfolio/Penguin Group.
Haidt, J. 2012. The Righteous Mind: Why Good People Are Divided by Politics and Religion, first edition. Vintage Books.
Heinrichs, J. 2013. Thank You for Arguing: What Aristotle, Lincoln, and Homer Simpson Can Teach Us About the Art of Persuasion. Penguin Random House.
Kahneman, D. 2011. Thinking, Fast and Slow. Farrar, Straus and Giroux.
Maxwell, J. and Dornan, J. 1997. Becoming a Person of Influence: How To Positively Impact the Lives of Others. Thomas Nelson.
Howard Duhon is the systems engineering manager at GATE and the SPE technical director of Projects, Facilities, and Construction. He is a member of the Editorial Board of Oil and Gas Facilities. He may be reached at email@example.com.