Volume: 4 | Issue: 6

Hedgehogs vs. Foxes: Improving the Accuracy of Predictions


Thank you to the many readers who commented on my column, “Oil at USD 20 per Barrel: Can It Be?” in the October issue of Oil and Gas Facilities. The future oil price is clearly on everyone’s mind. When I wrote the article, I thought that generating a confident prediction was hopeless.

Based on your comments and suggestions, articles on energy scenario evaluations by consultant Jorge Leis, a news program on forecasting by journalist Fareed Zakaria, and a book on superforecasting by professor Philip Tetlock, I have changed my mind.

I now believe that the future oil price can be forecast with reasonable confidence, and that the key to such forecasting is aggregative contingent estimation (ACE).

Predicting a Soviet Leadership Change

Soviet leader Leonid Brezhnev died in 1982 and was replaced by a frail old man, Yuri Andropov, who died in 1984. Konstantin Chernenko, another frail old man, led the Soviet Union until he died 13 months later in 1985.

Political experts predicted that the Soviets would appoint another stern, old conservative. Instead, they appointed a young, energetic liberal named Mikhail Gorbachev, who changed the country’s direction sharply toward glasnost (openness).

The political experts did not see this coming, but soon most of them knew why it happened and what was coming next. In retrospect, it suddenly made perfect sense.

The important questions are: Could it have made sense a priori? Could or should the experts have seen it coming?

The Good Judgment Project

The Intelligence Advanced Research Projects Activity (IARPA), a US government research agency, initiated the ACE program to enhance the accuracy, precision, and timeliness of intelligence forecasts for a broad range of event types through advanced techniques that elicit, weight, and combine the judgments of many analysts. Since 2011, the agency has funded a forecasting tournament with five competing teams (http://www.iarpa.gov/index.php/research-programs/ace). Tetlock and research partner Barbara Mellers led the most successful team, the Good Judgment Project.

Tetlock coauthored Superforecasting: The Art and Science of Prediction, which describes the success of a group of “superforecasters” in the tournament.

The IARPA tournament posed concrete questions about future world events to thousands of volunteer forecasters, each question with a verifiable outcome.

The results showed that the best of the volunteers, the superforecasters, performed significantly better than experts in the US intelligence community.

Tetlock famously noted in an earlier book, Expert Political Judgment, that the average expert is roughly as accurate as a dart-throwing monkey.

An explanation for expert inaccuracy comes from the essay The Hedgehog and the Fox by philosopher Isaiah Berlin, which describes two categories of thinkers: hedgehogs, who view the world through the lens of a single defining idea, and foxes, who draw on a wide variety of experiences and for whom the world cannot be boiled down to a single idea.

Hedgehogs are poor predictors (worse than dart-throwing monkeys). But they are more likely to be considered experts and to make the evening news because they can tell tight, simple stories and express great confidence in their predictions based on a single defining idea. Foxes are more nuanced, less confident of the future, and less interesting, but much more accurate.

It was the hedgehogs who confidently predicted that the Soviets would replace Chernenko with another aged conservative. A fox might have identified alternative drivers of the decision and made a different prediction.

Ten Commandments of Forecasting

Tetlock (2015) identifies 10 rules to effectively predict the future, or more precisely, to effectively narrow the uncertainty about the future.

1. Triage. Focus your effort on important questions that can be answered, and in situations in which your effort will improve the answer. Some questions are too easy and others are impossible to answer, both of which are a waste of your time.

2. Reductionism. Break large problems into pieces. For example, what will the price of oil be in 2020? The question cannot be answered directly with any confidence, but it can be broken into many parts: Where is oil produced today? How will production change in each location between now and 2020? Where is the oil used? How will demand change? Will there be oil revenue tax increases in the UK? These are not easy questions, but they are more manageable and can be reduced further into more granular questions.

3. Inside vs. outside views. Seek analogues. Consider the problem from multiple points of view and ensure that you are answering the right question. Identifying analogues is a good starting point, and one we use often in the oil field: the best first guess about the performance of a new reservoir is the performance of a similar reservoir elsewhere. However, we are not as good at using historical data to predict project outcomes. Although the average major project exceeds its budget by 33%, that history does not seem to be used to improve future budget estimates.

4. React appropriately to evidence. Once we form an opinion, we tend to favor evidence that supports our view and discount evidence that does not. This phenomenon, called confirmation bias, degrades the quality of forecasts.

5. Look for clashing factors. For every good argument, there is a counterargument, and for every action, there is a reaction. A decision often will produce both winners and losers. Almost everyone in the oil and gas industry would have predicted confidently that Saudi Arabia would cut oil production in the face of the declining oil price because the country’s producers historically did so. But the superforecasters accurately predicted that they would not by observing the different driving forces this time: The shale oil boom made this case different.

6. Granularity and measurement. Strive for precision in estimates (for example, “I am 65% confident” instead of a general statement such as “I think that is likely”). In 2002, George Tenet, the director of the CIA, said it was a “slam dunk case” that Saddam Hussein, the president of Iraq, had weapons of mass destruction. He might have said something more nuanced had he been required to state a numerical confidence level.

7. Strike a balance between under- and overconfidence. The main problem with being a subject matter expert (SME) can be overconfidence. SMEs, who often are hedgehogs, form opinions effortlessly and may not see the need to evaluate the situation carefully.

8. Look for the errors behind your mistakes and successes. Understand why an estimate turned out to be wrong or right.

9. Bring out the best in others and let others bring out the best in you. Create conditions in which people with varying opinions can contribute; teams forecast better than individuals.

10. Practice forecasting. Like any skill, forecasting improves with repetition and feedback on results.
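The granular probability estimates called for in commandment 6 only pay off if forecasts are eventually scored against outcomes. Tetlock's tournaments scored forecasters with the Brier score, essentially the mean squared difference between the stated probability and what actually happened (lower is better). A minimal sketch in Python, with invented forecasts and outcomes for illustration:

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between stated probabilities (0..1) and
    outcomes (1 = happened, 0 = did not). 0.0 is a perfect score;
    a forecaster who always says 50% earns 0.25."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# A granular, well-calibrated forecaster vs. a hedger who always says 50%.
outcomes  = [1, 0, 1, 1, 0]            # what actually happened (invented)
granular  = [0.9, 0.2, 0.8, 0.7, 0.1]  # precise numerical estimates
hedger    = [0.5] * 5                  # "I think that is likely-ish"

print(f"{brier_score(granular, outcomes):.3f}")  # prints 0.038
print(f"{brier_score(hedger, outcomes):.3f}")    # prints 0.250
```

The hedger can never be badly wrong, but also can never beat 0.25; only numerical, granular forecasts make calibration measurable and improvable.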

Forecasting Oil Prices

Predicting the price of oil is more difficult than answering the questions in the IARPA tournament. But perhaps the complexity of the problem is an advantage.

I remember having similar concerns years ago about estimating project costs. There were many parts to the project for which we needed to estimate costs, and I knew that many of the numbers in the estimates were suspect. Yet, the total cost estimates were usually close to the truth.

Statistics is our friend. We can be far off on some of the subsystem/component costs, but those errors balance out if the estimates are unaffected by a large overall bias and if our estimates for the largest parts of the project are right.
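This cancellation argument can be checked with a quick simulation. Assuming, purely for illustration, a project of 50 components whose individual cost estimates each carry independent, unbiased errors of up to ±30%, the error on the total comes out far tighter than the error on any single line item:

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def total_cost_error(n_items=50, item_error=0.30, trials=10_000):
    """Average relative error of the summed estimate when each of n_items
    true costs is estimated with independent, zero-mean noise of +/- item_error."""
    true_costs = [random.uniform(1, 10) for _ in range(n_items)]
    true_total = sum(true_costs)
    rel_errors = []
    for _ in range(trials):
        estimate = sum(c * (1 + random.uniform(-item_error, item_error))
                       for c in true_costs)
        rel_errors.append(abs(estimate - true_total) / true_total)
    return sum(rel_errors) / trials

# Each item is off by 15% on average, yet the total is off by only a few percent.
print(f"mean error on total estimate: {total_cost_error():.1%}")
```

The caveat in the text is the important one: the errors must be independent and free of a shared bias. A systematic optimism applied to every line item does not cancel; it compounds.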

To predict the future price of oil, we need to break the problem into many smaller, answerable questions, as the reductionism commandment suggests.

This exercise could evolve into questions similar to those used in the IARPA tournament. The answers may be probability statements, continuously updated with new factual information and multiple opinions (tapping into the wisdom of crowds), and rolled up to generate statements, such as “There is a 70% probability that the price of oil will be between X and Y at the end of 2016.”
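One simple way to roll up many opinions in the spirit of ACE is to average the individual probability estimates and then push the average away from 50%; the Good Judgment Project reported that such "extremizing" of the crowd mean improves the aggregate. A sketch, with made-up forecaster inputs and an illustrative extremizing exponent:

```python
def aggregate(probs, a=2.0):
    """Average individual probabilities, then extremize with exponent a.
    a = 1 returns the plain mean; a > 1 pushes the consensus away from 0.5,
    on the idea that each forecaster holds only part of the evidence."""
    p = sum(probs) / len(probs)
    return p ** a / (p ** a + (1 - p) ** a)

# Five forecasters on "oil price between X and Y at end of 2016" (invented numbers).
panel = [0.6, 0.7, 0.65, 0.8, 0.55]
print(f"plain mean: {sum(panel) / len(panel):.2f}, extremized: {aggregate(panel):.2f}")
```

When several independent forecasters all lean the same way, the aggregate should lean harder than any one of them; the exponent controls how hard, and would need tuning against scored outcomes.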

Think of how reasonably accurate predictions might improve your career, your company’s prospects, and the condition of the industry. Imagine how much healthier the industry would be today if we had anticipated the recent drop in oil price a year or two before it happened.

There is a lot of work to be done to become confident about the future price of oil, but imagine the benefits of doing it well.

Forecasting Project Performance

Predicting the price of oil is hard, but there is another area in which accurate forecasting would be easier and hugely beneficial: predicting project performance.

Most of our projects exceed budget, exceed schedule, or fail to deliver the targeted production volumes, and we are not very good at forecasting these outcomes. Superforecasting techniques may significantly improve our results. OGF

For Further Reading

Berlin, I. 2013. The Hedgehog and the Fox: An Essay on Tolstoy’s View of History, second edition. Princeton University Press.

Leis, J. 2015. What the Recent Oil Price Shock Teaches About Managing Uncertainty. Bain & Company, 27 May 2015.
http://www.bain.com/publications/articles/what-the-recent-oil-price-shock-teaches-about-managing-uncertainty.aspx (accessed 30 October 2015).

Leis, J. and Gottfredson, M. 2014. Beyond Forecasting: Energy Markets in a Time of Unprecedented Uncertainty. J Pet Technol 66 (3): 72–78.

Tetlock, P. 2006. Expert Political Judgment: How Good Is It? How Can We Know?, new edition. Princeton University Press.

Tetlock, P. and Gardner, D. 2015. Superforecasting: The Art and Science of Prediction, first edition. Crown Publishing Group.

Zakaria, F. 2015. Global Public Square transcript. CNN, 25 October 2015. http://transcripts.cnn.com/TRANSCRIPTS/1510/25/fzgps.01.html (accessed 30 October 2015).


Howard Duhon is the systems engineering manager at GATE and the SPE technical director of Projects, Facilities, and Construction. He is a member of the Editorial Board of Oil and Gas Facilities. He may be reached at hduhon@gateinc.com.