Workforce Education Key To Understanding Drilling Data
In trying to reduce risk and uncertainty in drilling and improve overall drilling efficiency, operators are developing more reliable analytic capabilities and adopting novel sensor and data-streaming technologies to help them process the massive amounts of data coming from their wells. An industry expert said that workforce education will be critical to helping companies adapt to a changing data landscape and optimize their operations.
At a presentation held by the SPE Drilling and Uncertainty Technical Section, Eric van Oort discussed the issues involved in analyzing data for drilling optimization and the work being done at the University of Texas at Austin to help ease the process. Van Oort is a professor of petroleum engineering at UT Austin and a former onshore gas technology manager at Shell.
Van Oort said engineers entering the industry must expect to work in a data-centric environment. While elements of the traditional drilling curriculum such as casing design and directional drilling will remain valuable, companies will need to adapt their training methods to better suit the modern oil field. He said the industry will see an uptick in the use of advanced simulators with sophisticated human/machine interfaces, similar to the drilling simulator National Oilwell Varco donated to UT in 2014.
“It makes sense in the drilling curriculum to include data analytics and coding. These are really going to be important requirements for the future—familiarization with machine and statistical learning, data recognition, and artificial intelligence. Notwithstanding that traditional skills are still needed, we must realize that this is the age of big data and that post-well analysis, real-time data analysis, and so on, provide great value and a great opportunity for talent education,” van Oort said.
Data Standards Lacking
While touting the role that analytics will play in workforce education moving forward, van Oort acknowledged that data collection remains a difficult and expensive task for many companies. He said data standards are often lacking, and the data collection process can be labor-intensive, a problem as the industry focuses on leaner operations in the wake of the oil price downturn. The turnaround time for some data may also be too long for meaningful optimization. Van Oort said he had seen drilling data produced a year after drilling, in some instances long enough for an operator to have completed its drilling campaign.
Sometimes, data analytics efforts suffer from the sheer volume of data mined from a wellsite. Separating useful data from useless data may require a massive labor investment, one van Oort said is compounded by what he termed a “messy data” problem.
“If you look at our sources, it’s a range, a whole cornucopia of structured vs. unstructured data, static vs. dynamic information, data that we get in real time, and other data that are stored in daily drilling reports, well programs, and so on. Extracting that information effectively is not all that easy. Optimization often requires information from various sources, and this can be a daunting task,” he said.
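The mix of sources van Oort describes can be pictured with a minimal sketch. The report lines, field names, and depth figures below are hypothetical, invented for illustration: free-text entries of the kind found in a daily drilling report are mined with a regular expression and cross-checked against a structured real-time record, a small instance of the extraction task he calls daunting at scale.

```python
import re

# Hypothetical free-text lines of the kind found in a daily drilling report
# (unstructured data).
report_lines = [
    "06:00 Drilled 12-1/4in section from 8450 ft to 8610 ft, ROP 45 ft/hr",
    "14:30 Circulated bottoms up, no losses observed",
    "18:00 Drilled ahead from 8610 ft to 8795 ft, ROP 52 ft/hr",
]

# Hypothetical structured real-time record for the same day.
realtime = {"end_depth_ft": 8795, "avg_rop_ft_hr": 48.5}

# Pull depth intervals and ROP figures out of the free text with a regex.
pattern = re.compile(r"from (\d+) ft to (\d+) ft, ROP (\d+) ft/hr")
extracted = []
for line in report_lines:
    m = pattern.search(line)
    if m:
        start, end, rop = (int(g) for g in m.groups())
        extracted.append({"start_ft": start, "end_ft": end, "rop_ft_hr": rop})

# Cross-check the unstructured report against the structured record.
report_end = max(item["end_ft"] for item in extracted)
consistent = report_end == realtime["end_depth_ft"]
print(extracted)
print("Report and real-time depths agree:", consistent)
```

In practice the formats are far less regular than this toy pattern assumes, which is exactly why the lack of data standards van Oort cites makes the labor cost so high.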
Sensor data quality is a potential area of concern. Van Oort said sensor technology used in field operations is often decades old, yielding data of poor quality. Data collected by operators are seldom used, with the prime reason being a lack of qualified manpower to prepare the data and carry out the analysis, along with what he termed “awkward” storage and data security requirements.
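Preparing such data often starts with a simple quality-control pass. The readings and the `despike` helper below are hypothetical, not a method attributed to van Oort: a rolling-median filter replaces sensor dropouts and spikes with the local median, one common first step in cleaning aged-sensor data.

```python
# Hypothetical rate-of-penetration readings (ft/hr) from an aging surface
# sensor: mostly plausible values plus a dropout (0.0) and a spike (180.0).
readings = [42.0, 43.5, 44.1, 0.0, 43.8, 44.6, 180.0, 45.2, 44.9, 45.5]

def despike(values, window=3, tolerance=10.0):
    """Replace any reading that deviates from its local median by more
    than `tolerance` with that median (a simple quality-control pass)."""
    cleaned = []
    for i, v in enumerate(values):
        lo = max(0, i - window // 2)
        hi = min(len(values), i + window // 2 + 1)
        neighborhood = sorted(values[lo:hi])
        median = neighborhood[len(neighborhood) // 2]
        cleaned.append(median if abs(v - median) > tolerance else v)
    return cleaned

print(despike(readings))  # dropout and spike replaced by local medians
```

Even a filter this crude requires someone to choose windows and tolerances per sensor, illustrating why van Oort points to qualified manpower as the bottleneck.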
In addition, van Oort said data-analysis and machine-learning algorithms are often too immature for oil and gas operations and, if they are mature enough, they’re not being used to their full potential. Even when data and analysis are available, the data are hardly ever used effectively for safety, performance improvement, and unit cost reduction purposes.
“Analysis is one thing, but ultimately you have to transfer it out into the field and really get very comprehensive action,” van Oort said. “Somebody has to do something to capitalize on the information, and it’s often more difficult to accomplish than it is on paper.”
During the presentation, van Oort discussed the work of RAPID, a drilling automation research group at the University of Texas that was created to develop a workflow for data analysis and visualization. Comprising researchers and students from petroleum, mechanical, and aerospace engineering, the group worked with operators to maximize the value derived from surface and downhole data, establish a data analytics toolkit, and familiarize students with drilling processes.
Stephen Whitfield, Senior Staff Writer
01 May 2017