Well testing and surveillance have always been, and continue to be, the foundations of reservoir management. Fundamental data, such as pressure, rate, and temperature, together with fluid samples, are collected during a well test and used to investigate the subsurface. With advances in modern technology such as smart wells, distributed pressure/temperature sensing, real-time measurements, and extended-reach drilling, we face conditions of increasing complexity and unprecedented amounts of data.
In 2017, several key partnerships with major information-technology companies were announced. The oil and gas industry is undergoing a digital transformation driven by data science; terms such as data cloud/lake, machine learning, Internet of Things, high-performance computing, automation, and model management are the new buzzwords. Applied correctly, analytics provide valuable insights, especially in cases such as unconventional reservoirs with large numbers of wells. Many of the industry’s leading experts agree that, for the transformation to succeed, the quality of data matters as much as its quantity.
Experience teaches us that the subsurface is always more complex than we expect, that feedback is not instantaneous, and that issues are difficult to mitigate at the reservoir scale. Hence, essential, information-rich data such as exploration- and production-well tests, proper fluid samples, and sufficient, periodic surveillance should still be used effectively as indicators to peel away layers of uncertainty in the complex subsurface. Reservoir-engineering fundamentals must still be applied, and data must still be quality checked, especially when collecting and analyzing massive amounts of information. The age-old “garbage in, garbage out” mantra will continue to apply in the coming era of data science.
The papers selected for this issue cover advances and opportunities in well testing. They apply reservoir fundamentals and sound engineering judgment to data sets of both quantity and quality from conventional and unconventional assets.
This Month's Technical Papers
Recommended Additional Reading
IPTC 18924 Current State and Future Trends of Wireline-Formation-Testing Downhole Fluid Analysis for Improved Reservoir-Fluid Evaluation by S.R. Ramaswami, Shell International Exploration and Production, et al.
SPE 187348 New Variable Compliance Method for Estimating In-Situ Stress and Leakoff From DFIT Data by HanYi Wang, The University of Texas at Austin, et al.
SPE 185795 Step-Rate Test as a Way To Understand Well Performance in Fractured Carbonates by A. Shchipanov, IRIS, et al.
Heejae Lee, SPE, Senior Engineer, ExxonMobil Production Company
01 February 2018