Technology

Invited Perspective: Reservoir Modeling Today and Tomorrow



Reservoir performance forecasting is still the “Grand Central Station” of all the exploration and production (E&P) disciplines because this is where everything (the large-scale reservoir plumbing system, reservoir properties, fluid properties, wells, completions, lift curves/hydraulics, facilities, fluid-handling capacities, and much more) is integrated into a life-of-field forecast. The revenue from the field development is the composite forecast of oil and gas production rates into the future multiplied by the forward oil and gas prices. The reservoir description, which I call the “subsurface plumbing system,” in a forecasting model can range from a simple homogeneous analytical tank model to an unbelievably complex dual-porosity, compositional simulation model with one billion grid cells, pseudos (from endless upscaling), and many other sophisticated features built in to try to mimic a very complex reality. I had the opportunity to discuss the state of the art and the future of reservoir modeling and reservoir performance forecasting with one of the finest members of the reservoir engineering profession, Professor Larry W. Lake of the University of Texas at Austin (UT).
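
To make that revenue arithmetic concrete, here is a minimal Python sketch; the rates and the price deck are hypothetical placeholders, not data from any real field.

```python
# Toy illustration of the revenue arithmetic described above: yearly field
# revenue is the forecast oil and gas rates multiplied by a forward price deck.
# All numbers are hypothetical placeholders.

oil_rate_stb_d = [20_000, 18_000, 15_000, 12_000, 10_000]   # yearly average oil rate, STB/D
gas_rate_mscf_d = [30_000, 27_000, 22_000, 18_000, 15_000]  # yearly average gas rate, Mscf/D
oil_price_usd_stb = [60, 62, 65, 65, 65]                    # forward oil price, USD/STB
gas_price_usd_mscf = [3.0, 3.1, 3.2, 3.2, 3.2]              # forward gas price, USD/Mscf

revenue_usd = [
    365 * (qo * po + qg * pg)
    for qo, qg, po, pg in zip(oil_rate_stb_d, gas_rate_mscf_d,
                              oil_price_usd_stb, gas_price_usd_mscf)
]
for year, rev in enumerate(revenue_usd, start=1):
    print(f"Year {year}: revenue ~ {rev/1e6:,.0f} million USD")
```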

You have taught reservoir modeling to undergraduates and graduate students for more than 30 years. How has the reservoir modeling discipline evolved?

The biggest evolution has been driven by the capacity of computers. That capacity has allowed combinations of traditional simulation with facilities, improved resolution of heterogeneity, progress in automated history matching, and uncertainty estimation, to name just a few.

In order for universities such as UT to understand the main reservoir modeling challenges of the E&P industry, close collaboration and a dialogue on real industry needs must happen. What is your view of the state of industry/academia collaboration? Are you collaborating more with service companies than with international oil companies, national oil companies, and independents?

We at UT do well in the collaboration department and on all the major avenues of E&P. The challenge for us is the boundary between technology development and research. We do, or at least we should do, research.

Why are both individual well and total field forecasts still so wrong? Are we not looking enough at uncertainties and risks of all kinds? Should we define and model more possible scenarios? Should a reservoir forecast always be an outcome range rather than one expected number?

The simulation forecast should be reported as an outcome range. Experienced modelers understand this and can estimate the range from experience. Sometimes, instead of a range, they apply an optimism reduction. Inaccurate field forecasts appear to be (and remain) largely a result of unresolved or unrecognized reservoir heterogeneity. There may also be issues with the lack of optionality in traditional forecasting.
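
A minimal sketch of what reporting an outcome range can look like in practice, assuming hypothetical input distributions and a simple proxy calculation in place of a full simulation study:

```python
# Sample uncertain inputs, run a simple proxy "forecast" for each sample, and
# quote percentiles instead of a single number. Distributions and the proxy
# model are hypothetical.
import numpy as np

rng = np.random.default_rng(seed=1)
n = 10_000

# Hypothetical uncertain inputs: recovery factor and original oil in place.
recovery_factor = rng.normal(0.35, 0.05, n).clip(0.1, 0.6)
ooip_mmstb = rng.triangular(80, 120, 200, n)

eur_mmstb = recovery_factor * ooip_mmstb  # proxy forecast per realization

# Under the common oil-and-gas convention, P90 is the low case (exceeded with
# 90% probability), i.e. the 10th percentile of the distribution.
p90, p50, p10 = np.percentile(eur_mmstb, [10, 50, 90])
print(f"EUR outcome range: P90 = {p90:.0f}, P50 = {p50:.0f}, P10 = {p10:.0f} MMSTB")
```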

Even if individual well forecasts are way off, would the variance even out across the field such that fields with many wells would be close to the expected field forecast? If not, why not?

Results from groups of wells are always easier to match than individual wells in the same way that cumulatives are easier to match than rates. Why this is so seems related to the law of large numbers or a simulation version of the wisdom of the group.
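
A quick numerical illustration of that law-of-large-numbers point, assuming independent, hypothetical per-well forecast errors (real wells share geology and are rarely fully independent, so the actual reduction is usually smaller):

```python
# If individual well forecast errors are independent, the relative spread of
# the field total shrinks roughly as 1/sqrt(n_wells). All numbers hypothetical.
import numpy as np

rng = np.random.default_rng(seed=2)
true_well_rate = 500.0   # "true" average well rate, STB/D
well_cv = 0.60           # 60% coefficient of variation per individual well

for n_wells in (1, 10, 100, 400):
    # 5,000 trials of forecasting a field made up of n_wells wells
    rates = rng.normal(true_well_rate, well_cv * true_well_rate,
                       size=(5_000, n_wells))
    field_totals = rates.sum(axis=1)
    cv_field = field_totals.std() / field_totals.mean()
    print(f"{n_wells:4d} wells: field-level CV ~ {cv_field:.2f} "
          f"(theory ~ {well_cv/np.sqrt(n_wells):.2f})")
```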

Why are forecasts usually too optimistic compared with what actually happens when fields start producing? Are we deliberately sugar coating—making things look better than they are?

They are not always optimistic but usually so. There is no deliberate sugar coating. Simulators and the folks responsible for simulation input are dedicated, hardworking professionals. I cannot prove this, but I think there is something being lost in scaling up. There is also a lack of benchmarking.

Do you believe that students understand and have an intuitive feel for the physics and transport phenomena at play in the field they are trying to model?

No. This is something we work really hard on at universities, but it seems to disappear once they are on the job. It comes back after some years of experience. For the good ones, anyhow.

What have been the greatest positive reservoir modeling developments in the past 10 years?

Physical process modeling, incorporation of effects outside the reservoir, and better resolution of heterogeneity are the most positive reservoir modeling developments. We have decided to declare victory on truncation error and grid orientation effects.

What is the next big thing in reservoir modeling in your view?

So many possibilities here! I think it will be a technology that will allow us to realize value from the flood of field data that industry is collecting. It will turn into a tsunami soon. Let us make the most of “Big Data.”

Should students learn to run a simulation model while still in university so they are ready to model a real field on their first day at work or would their time be better spent on understanding the transport phenomena and the fundamentals?

You set me up for this one. Large-scale simulations tend to overwhelm students (and practitioners), even to the point of making them weaker engineers than, say, 30 years ago. We should be teaching fundamentals and understanding at universities, not technology, or at least treat technology only as icing on the cake of understanding.

How has your own journey as a reservoir modeler been? Are you now a spokesperson for simplicity or for a billion grid blocks?

I come down on simplicity. Some might say I do this because simple is all I can do. I look at every simulation run (or to be precise, graphical depictions of simulation output) for evidence of gravity segregation (Dietz model), linear displacement (Buckley-Leverett model), large-scale bypassing (Koval and Dykstra-Parsons models), and the distinction between compressibility (Havlena and Odeh) and displacement. The signs are usually there and they are the keys to understanding.
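
For readers who want to try this kind of screening themselves, here is a rough sketch of the Buckley-Leverett/Welge piece of it, with hypothetical Corey-type relative permeability and fluid parameters standing in for real rock and fluid data:

```python
# Build a Buckley-Leverett water fractional-flow curve from Corey-type relative
# permeabilities and use the Welge tangent construction to estimate the
# shock-front saturation. All parameters are hypothetical placeholders.
import numpy as np

swc, sor = 0.20, 0.25          # connate water and residual oil saturations
krw_end, kro_end = 0.30, 0.80  # endpoint relative permeabilities
nw, no = 2.0, 2.0              # Corey exponents
mu_w, mu_o = 0.5, 2.0          # water and oil viscosities, cp

sw = np.linspace(swc + 1e-6, 1.0 - sor, 500)
s_norm = (sw - swc) / (1.0 - swc - sor)
krw = krw_end * s_norm**nw
kro = kro_end * (1.0 - s_norm)**no

# Water fractional flow (horizontal flow, capillary pressure neglected)
fw = (krw / mu_w) / (krw / mu_w + kro / mu_o)

# Welge construction: the front saturation is where the chord drawn from
# (Swc, 0) to the fractional-flow curve has its maximum slope (tangent point).
slope = fw / (sw - swc)
i_front = np.argmax(slope)
print(f"Shock-front saturation   Swf ~ {sw[i_front]:.3f}")
print(f"Fractional flow at front fwf ~ {fw[i_front]:.3f}")
```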

GIGO means “garbage in garbage out” when you are trying to model something. Do we have enough data (core data, relative permeability data, capillary pressure data, residual saturation data, well test data, fluid data, and similar) to meaningfully fill up our reservoir models?

Not even close. The amount of data required to START a field-scale simulation is overwhelming. There are whole technologies devoted to generating the input, never mind the output. The success we have had so far is because of skillful practitioners interpolating or reading between the lines of the data. This is a philosophical question for all of us. When the complexity of the simulation output approaches the complexity of the reservoir, what then? A simulator to model the simulator output?

Do we understand unconventional shale reservoirs such as the Bakken, Eagle Ford, and Marcellus well enough to model them adequately or correctly at this point?

Yes. There is no evidence (yet) that esoteric mechanisms such as Knudsen flow are important. Unconventionals are conventionals with very small permeability. This changes a lot of things operationally and performance-wise but there is no established need for “a whole new physics.” I am unconvinced also that the rush to more geomechanics will improve predictions. This is, after all, an ever more complicated level of modeling with even more parameters to be determined.
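
One rough way to gauge whether such mechanisms matter is a kinetic-theory estimate of the gas mean free path and the resulting Knudsen number; the sketch below assumes a methane-like molecular diameter and hypothetical reservoir conditions and pore sizes.

```python
# Estimate the gas mean free path at reservoir conditions and the Knudsen
# number Kn = mean free path / pore size. Conditions and molecular diameter
# are assumed, typical values, not measurements.
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 350.0            # reservoir temperature, K (assumed)
p = 20e6             # reservoir pressure, Pa (assumed, ~20 MPa)
d = 3.8e-10          # effective methane molecular diameter, m (approximate)

mean_free_path = k_B * T / (math.sqrt(2) * math.pi * d**2 * p)
print(f"mean free path ~ {mean_free_path*1e9:.2f} nm")

# Kn well below ~0.1 indicates slip or continuum flow rather than
# free-molecular (Knudsen) transport.
for pore_nm in (5, 20, 100):
    kn = mean_free_path / (pore_nm * 1e-9)
    print(f"pore = {pore_nm:3d} nm  ->  Kn ~ {kn:.3f}")
```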

Could the long-term forecasts of rates and recoveries in unconventional oil and gas shales still be wrong?

No more wrong than for conventional reservoirs, which is still pretty wrong. I think we do not understand, a priori, how much of the volume of a reservoir an unconventional well will influence, and hence we cannot determine percent ultimate recoveries.

What do we need to do to model unconventional tight oil and gas reservoirs better? Is understanding more of the physics the way to go or will a forecast based on assigning an initial rate and a slope for the decline curve (for various well types) do?

I wish I could invent a method that has the longevity of the decline curve methods. They have been reservoir engineering standbys for 50 years. But they have no physics, or very little. It turns out that any number of models with more physics can match field data for unconventional production. The problem is not a lack of models; it is a lack of uniqueness.
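
For reference, the decline-curve workhorse in question is the Arps family; a minimal hyperbolic-decline forecast, with hypothetical parameters, looks like this:

```python
# Arps hyperbolic decline, q(t) = qi / (1 + b*Di*t)^(1/b), forecast forward in
# time with cumulative production. Parameters are hypothetical placeholders.
import numpy as np

qi = 1_000.0   # initial rate, STB/D
Di = 0.8       # initial nominal decline, 1/year
b = 1.2        # Arps b-factor (hypothetical)

t = np.arange(0, 11)                      # years
q = qi / (1.0 + b * Di * t) ** (1.0 / b)  # rate at each year, STB/D

# Cumulative production by trapezoidal integration of the daily rate
cum_stb = np.concatenate(([0.0], np.cumsum(0.5 * (q[1:] + q[:-1]) * 365.0)))

for year, rate, cum in zip(t, q, cum_stb):
    print(f"year {year:2d}: rate ~ {rate:7.1f} STB/D, cum ~ {cum/1e3:6.1f} MSTB")
```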

How do you think reservoir modeling will be taught at universities in 2040?

Not by me—I will be 94 years old. I am certain we will be using computers, although their size and form would surprise us today. I am also certain that modeling will be widely used, since it is key to applying the scientific method (SM). If the SM has been abandoned, I did not get the memo.

JPT reaches 143,000 SPE members around the world. What are your top three recommendations to practicing reservoir engineers working in companies around the world whose main responsibility is forecasting reservoir or field performance (i.e., forecasts of oil, gas, and water production rates, oil and gas reserves, fluid handling capacities needed on platforms, and more) in conventional and unconventional reservoirs?

I am amazed at SPE’s growth! It is very impressive. It is the best professional organization that I know of. My recommendations are:

  1. Learn to understand what the simulation is telling you.
  2. Acknowledge the timeliness of an answer. More than 50% of the capital expenditures for many large projects are committed before there is enough data for a simulation.
  3. Remember that a simulator, for all its complexity, is only a model. There is a real reservoir with real-life people and real-life decision makers out there.

What gives you the greatest sense of purpose in your job as a professor at UT?

You saved the easy one for last. The greatest sense of purpose is knowing that I have a hand in making fulfilling careers for young professionals. Why else would anyone teach?

Conclusion

So there you have it. Professor Lake offers simple yet profound insight when it comes to reservoir modeling and forecasting. In 1997, Rob Eastaway published an article in the Financial Times called “Jardin’s Principle” (http://www.robeastaway.com/etc). This is how it applies to E&P:

  • Fresh out of university, you are at Jardin’s Level 1 and you do not yet fully understand the E&P business, nor are the reservoirs yet “talking to you.” You are, according to Eastaway, “simple and naïve” and you find it best never to write more than one page.
  • After 5 years in the E&P industry, you have progressed to Jardin’s Level 2 and you now fully appreciate how complex the industry and reservoirs really are. Everything is a function of everything, complexity is overwhelming, and you never manage to write fewer than 100 pages.
  • Finally, after more than 25 years in the E&P business, you have “seen it all,” you have become wise after many predictions that turned out to be both successes and failures, and you have again become simple in many ways, having arrived at Jardin’s Level 3. Again, you never write more than one page, as you have mastered the skill of reducing complexity to something tractable and actionable. You have become “simple and profound”—and when you speak about a reservoir modeling challenge and its solution, everybody listens.

Eastaway concluded, as did Professor Lake, that unless you have a profound understanding of a subject (Malcolm Gladwell, in his book Outliers, says that this may take 10,000 hours of practice), you will either overcomplicate or oversimplify it. I agree. Learn the fundamentals in university and pick up the skills to run the industry toolbox later. When you can practically feel gravity, capillary, and viscous forces fighting one another (the kind of intuition Malcolm Gladwell describes in his book Blink), with three phases (oil, gas, and water) present at a certain permeability level, and you understand at your core what will happen directionally in a producing well some distance away from the injector given a certain vertical permeability profile, you know you have arrived at Jardin’s Level 3. Then, when you speak, people will listen and proceed to order the USD 7 billion production platform that will receive and process your forecasted oil, water, and gas production rates for the next 30 years!

Professor T.D. van Golf-Racht, the author of Fundamentals of Fractured Reservoir Engineering (Developments in Petroleum Science), passed away on 27 February. Professor van Golf-Racht was my MSc advisor in 1978 and a longtime mentor and friend. He had a unique feel for and understanding of multiphase flow in porous media—and especially so in dual-porosity systems. He generously shared his wisdom, and his trademark was to always find the best in others, lifting us up in insight and engineering confidence. He loved paradoxes and insisted that reservoir engineers, if wrong, should always be wrong “in the right direction,” and to this day I still ponder what that really means. Think about it! Peace over his memory.—Helge Hove Haldorsen