Management: Using Big Data Analysis Tools To Understand Bad Hole Sections on the UK Continental Shelf

Topics: Data and information management; Drilling operations

Research and analysis show that 90% of all the data in the world today was created in the past two years alone. Already, the growth in data volume outpaces our conventional capacity to analyze and understand it, and the trend is only accelerating.

This trend is also occurring in the oil and gas industry, with, for example, the continued growth in seismic channel counts, the integration of multiphysics information, logging while drilling, and the constant flow of information from “intelligent wells” in “digital oil fields.”

Current data analysis and interpretation approaches follow well-established but rigid workflows that study the same small set of relationships between entities. It is common, for example, to use core data in well log analysis, or wellbore acoustic measurements in surface seismic interpretation. Core data, however, is never used alongside seismic data. As data grows in volume and variety, it is overwhelming these traditional methods. What business opportunities are being missed?

Big Data

The term “big data” refers to more than simply a large volume of different types of data, both structured and unstructured, with varying degrees of accuracy. It also includes a suite of applications providing solutions and analysis. But big data is really a movement. The big data approach is said to be a data-centric method, adept at uncovering otherwise invisible patterns and connections by linking disparate data types. It can search and analyze data of any type and size with great agility, without regard to user group or project area. Examples from other fields include retailers analyzing buying patterns and automotive manufacturers predicting faults and failures.
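
To make the idea concrete, the following minimal sketch (in Python with pandas) links two disparate data types, core measurements and seismic attributes extracted along the wellbore, on shared well and depth keys. The file names, column names, and depth tolerance are hypothetical illustrations, not data from the study.

    import pandas as pd

    # Hypothetical structured inputs: one row per core plug, one row per
    # seismic-attribute sample extracted along the wellbore.
    core = pd.read_csv("core_samples.csv")           # well, depth_m, porosity
    seismic = pd.read_csv("seismic_attributes.csv")  # well, depth_m, acoustic_impedance

    # merge_asof pairs each core plug with the nearest seismic sample in depth,
    # so the two data types can be studied together instead of in separate silos.
    core = core.sort_values("depth_m")
    seismic = seismic.sort_values("depth_m")
    linked = pd.merge_asof(
        core, seismic,
        on="depth_m", by="well",
        direction="nearest", tolerance=0.5,  # accept matches within 0.5 m
    )

    # A cross-domain question a rigid, single-domain workflow never asks:
    # does acoustic impedance correlate with core porosity?
    print(linked[["porosity", "acoustic_impedance"]].corr())

The specific join matters less than the point that a data-centric store places no barrier in its way; the same pattern extends to any pair of data types.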

Large volumes of data are nothing new to the oil and gas industry. The seismic business in particular has been successfully dealing with rapidly increasing volumes for a long time. For example, work by seismic equipment manufacturer Sercel suggests the channel count available for acquiring a seismic survey has been steadily increasing by one order of magnitude every 10 years.
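
As a back-of-the-envelope illustration of that growth rate (the 1995 baseline of 1,000 channels below is a hypothetical starting point, not a Sercel figure):

    # Project channel count under the stated trend: 10x growth per decade.
    # The base year and base channel count are hypothetical.
    def channel_count(year, base_year=1995, base_channels=1_000):
        return base_channels * 10 ** ((year - base_year) / 10)

    for year in (1995, 2005, 2015):
        print(year, f"{channel_count(year):,.0f} channels")
    # 1995 -> 1,000; 2005 -> 10,000; 2015 -> 100,000 channels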

Until now, much of our understanding of reservoirs has come from the study of physics-based models rather than directly from data itself. The challenge we face today is converting ever-increasing data volumes into models in decreasing time frames, ideally in “real time.” The big data approach will instead allow us to construct new types of data-driven models to bypass these traditional bottlenecks. It is also expected to lead to different views of standard models, providing new and valuable insights in the process.
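
As a sketch of what such a data-driven model might look like in practice, consider flagging likely bad hole intervals directly from historical drilling and log data. The input table, feature names, and choice of a random-forest classifier below are hypothetical illustrations, not the authors' method.

    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import classification_report

    # Hypothetical per-interval table from offset wells; bad_hole is 1 where
    # the interval was washed out or overgauge.
    data = pd.read_csv("offset_well_intervals.csv")
    features = ["rop_m_per_hr", "wob_tonnes", "mud_weight_sg", "caliper_in", "gamma_api"]
    X, y = data[features], data["bad_hole"]

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0, stratify=y
    )

    # The model learns directly from historical intervals rather than from a
    # physics-based wellbore-stability model.
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_train, y_train)
    print(classification_report(y_test, clf.predict(X_test)))

Once trained, such a model can score intervals in a planned well as data arrives, which is one way the "real time" aspiration above could be approached.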

...

Joe Johnston, CGG, and Aurelien Guichard, Teradata

01 October 2015

Volume: 67 | Issue: 10