A radical digital revolution is happening all around us (or so we are told). This digital domain is moving so rapidly that new acronyms are being added to our lexicon almost weekly (BD, DA, DNN, GAN, and ML, to name but a few). So how will this affect reservoir simulation? Speaking for myself, I really do not know. However, I came across an interesting article from the Harvard Business Review that examined the effect of innovative information technology (IIT) on some large US-based consumer businesses, and, while specific benefits of IIT were clearly stated, the authors opined, “In times of radical technological change, there’s a lot of figuring out to do. [We] have to understand what new technologies can do.” Applying this to reservoir simulation, we apparently need to understand better when and, more importantly, when not to use such technology—to appreciate its bounds, its limitations, its range of validity, and so on.
Deep neural networks (DNNs), for example, are excellent at labeling images (cat, dog, axolotl, etc.). However, some futurists seem to be under the impression that IIT can solve almost any kind of problem, which sometimes leads to extravagant claims about its potential utility.
Futurists are optimists by nature (the ones I have encountered certainly are). However, such buoyant proclamations require some counterbalance through honest questions, healthy discussion, and a reluctance to accept such bold assertions as dogma.
Without wishing to appear too skeptical, I came across an article from The Royal Society that is well worth quoting directly: “No matter their ‘depth’ and the sophistication of data-driven methods, in the end, they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated … they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system.” While that article focuses on biology and medicine, the authors extend their insight to any complex, multiscale system, which certainly covers our domain of interest. I feel that a measured, restrained approach to IIT is indeed wise (as applied to reservoir simulation), at least until we have had time to fathom where this digital revolution is leading.
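The Royal Society's point about curve fitting can be illustrated with a minimal, hypothetical sketch (the function and parameters below are invented for illustration, not drawn from any reservoir model): a polynomial fitted to noisy data from a simple decline curve matches the data beautifully inside the training range, yet fails badly when extrapolated beyond it, precisely because the fit encodes no structure of the underlying system.

```python
import numpy as np

# Hypothetical "true" system response: exponential decline,
# a stand-in for any underlying physics.
def truth(t):
    return np.exp(-2.0 * t)

rng = np.random.default_rng(0)
t_train = np.linspace(0.0, 1.0, 50)                     # training range only
y_train = truth(t_train) + rng.normal(0.0, 0.01, 50)    # noisy observations

# A purely data-driven fit: degree-4 polynomial, no physics.
coeffs = np.polyfit(t_train, y_train, deg=4)

# Inside the training range the fit looks excellent ...
err_in = abs(np.polyval(coeffs, 0.5) - truth(0.5))

# ... but extrapolated well beyond the data it fails badly.
err_out = abs(np.polyval(coeffs, 3.0) - truth(3.0))

print(f"error at t=0.5 (interpolation): {err_in:.4f}")
print(f"error at t=3.0 (extrapolation): {err_out:.4f}")
```

The fit is not "wrong" so much as blind: nothing in the polynomial knows that the system decays toward zero, so outside the data its behavior is arbitrary. That is the essence of the quoted caution.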
This Month's Technical Papers
Recommended Additional Reading
SPE 186079 Implicit Modeling for Permeability Enhancement in Carbonate Reservoirs: A Novel Approach To Bridge Data Gaps for Honoring Dynamic Observations by Arthur P.C. Lavenu, Abu Dhabi Marine Operating Company, et al.
SPE 187046 Integrated Reservoir-Network Simulation Improves Modeling and Selection of Subsea Boosting Systems for a Deepwater Development by Gaurav Seth, Chevron, et al.
SPE 187453 Assessing Single EOS Predictability Using PVT Properties of a Wet-Gas Reservoir on a Compositional Simulator by Bander N. Al Ghamdi, Saudi Aramco, et al.
William Bailey, SPE, Principal, Schlumberger
01 July 2018
Enhancing Model Consistency in Ensemble-Based History Matching
The aim of this work is to demonstrate the effectiveness of a fully integrated approach to ensemble-based history matching on a complex real-field application.
Ensemble-Based Assisted History Matching With 4D-Seismic Fluid-Front Parameterization
An ensemble-based 4D-seismic history-matching case is presented in the complete paper. Seismic data are reparameterized as distance to a 4D anomaly front and assimilated with production data.
Rapid S-Curve Update Using Ensemble Variance Analysis With Model Validation
In the complete paper, the authors propose a novel method to rapidly update the prediction S-curves once early production data arrive, without performing additional simulations or model updates.