Astronomy, Fiber-Optics Provide Blueprint for Data
Managing an unconventional asset requires the proper analysis of large amounts of data. As operators seek more efficient and effective ways to gather, sort, and analyze data, they must look at every possible avenue for the next crucial technological development, a pair of experts said.
In a special session held at the 2016 Unconventional Resources Technology Conference, two panelists discussed how unconventional asset operators can utilize data processing technologies commonly used in other industries.
Michael Bittar, senior director of technology at Halliburton, said that in developing new technologies, companies must try to accelerate the process of bringing products to market. Increased collaboration with operators, universities, and research institutes is essential to achieving that end.
“In our organizations, we need to take advantage of the global effort,” he said. “The problems we are trying to solve are difficult. No one single company can work on these problems. Networking is key, but more important is innovating with our customers.”
Bittar used fiber-optic technology as an example of problem-solving through integration. Fiber-optic cables are best known for carrying high-speed communications, but they can also help detect, for instance, potential leaks in horizontal wells. Operators can run fiber-optic cable along the length of a well and listen for abnormalities in flow through the signals the cable relays to a telecommunication device at surface.
“We can lay fiber and listen to what’s happening in the land,” Bittar said. “We took some technology from another industry, we adapted it into our industry, and this type of technology will give us a lot of information.”
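The listening workflow Bittar describes can be sketched in a loose, illustrative form as an anomaly scan over per-depth acoustic energy. Everything below is invented for the example, including the channel counts, the synthetic "leak" signature, and the 4-sigma threshold; it is not a description of any vendor's actual processing.

```python
import numpy as np

# Hypothetical sketch: a fiber-optic sensing trace gives one acoustic
# amplitude sample per depth channel along the well. A leak would show up
# as anomalously high acoustic energy at a localized depth.

rng = np.random.default_rng(0)

n_channels = 500  # depth channels along the wellbore (invented)
# Recordings from a quiet well, used to establish a per-channel baseline.
baseline = rng.normal(0.0, 1.0, size=(200, n_channels))

# A new recording, with a synthetic "leak" injected near channel 310.
trace = rng.normal(0.0, 1.0, size=n_channels)
trace[305:315] += rng.normal(6.0, 1.0, size=10)

# Flag channels whose energy exceeds the quiet-well mean by > 4 std devs.
mu = baseline.mean(axis=0)
sigma = baseline.std(axis=0)
z = (trace - mu) / sigma
suspect = np.flatnonzero(z > 4.0)

print("possible leak near channels:", suspect)
```

In practice the raw fiber data would be far larger and noisier than this toy, but the shape of the problem is the same: compare live signals against a baseline and localize the departure along the depth axis.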
George Djorgovski, a professor of astronomy at the California Institute of Technology and the director of the Center for Data-Driven Discovery, said data analysis is still a mostly uncharted frontier in science and technology. He said astronomers have had to manage exponential growth of data volumes and complexity, and that the concepts they have explored to help process data may provide a blueprint for other industries, including unconventionals.
Djorgovski described the virtual observatory concept, in which astronomers share their analyses of complex data sets in a dynamic, distributed, open cyberspace research environment. The virtual observatory gives scientists and students with an Internet connection access to the tools needed to innovate, regardless of their location. It also expands the talent pool of scientists available to larger universities and research institutions.
“Anyone with a computer screen can start a business, participate in the world economy, and it counts for a lot of economic power,” Djorgovski said. “The same thing happens with data. Nowadays, if you have an Internet connection, you don’t have to be at Princeton or Harvard. You can be in North Dakota, India, wherever you want, and you have the same level playing field. All of the data are behind computer screens, as are all the tools and all of the literature.”
One of the core problems in data science is the visualization of complexity. Djorgovski said that organizations must develop standard algorithms for gathering and interpreting data, particularly as data grow in dimensionality beyond the reach of direct human comprehension.
To help with that, astronomers are using machine intelligence tools in all stages of data processing. Among other things, machine intelligence can help recognize patterns and automate object and event classifications. Djorgovski said it may be possible within the next decade for computers to develop bug-free software programs that help streamline the data gathering process.
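The kind of automated classification Djorgovski mentions can be illustrated, in a deliberately minimal form, with a nearest-centroid classifier: each event class is summarized by the average of its known examples, and a new event is assigned to the closest class. The two feature dimensions and the class labels below are invented for the example.

```python
import numpy as np

# Illustrative sketch of automated event classification: summarize each
# class of labeled events by its feature centroid, then assign new events
# to the nearest centroid. All features and labels here are synthetic.

rng = np.random.default_rng(1)

# Two synthetic event classes, each described by two features
# (e.g., duration and peak amplitude -- purely hypothetical).
class_a = rng.normal([1.0, 5.0], 0.3, size=(100, 2))
class_b = rng.normal([4.0, 1.0], 0.3, size=(100, 2))

centroids = np.vstack([class_a.mean(axis=0), class_b.mean(axis=0)])
labels = ["type A", "type B"]

def classify(event):
    """Assign an event to the class with the nearest feature centroid."""
    dists = np.linalg.norm(centroids - event, axis=1)
    return labels[int(np.argmin(dists))]

print(classify(np.array([1.1, 4.8])))  # lands near the type A centroid
print(classify(np.array([3.9, 1.2])))  # lands near the type B centroid
```

Real survey pipelines use far richer models than this, but the pattern is the point: once classes are learned from labeled examples, new objects and events can be sorted without a human in the loop.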
“We are pretty sure that nature is not confined to two-dimensional and three-dimensional phenomena, and that there are many variables that can act together in something. You need to be able to visualize that,” he said.
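One standard answer to the visualization problem Djorgovski raises is dimensionality reduction: projecting data with many interacting variables down to two dimensions that can actually be plotted. The sketch below uses principal component analysis via a singular value decomposition on synthetic data; the dimensions and sample sizes are invented.

```python
import numpy as np

# Hedged sketch: principal component analysis (PCA) projects
# high-dimensional data onto the two directions of greatest variance,
# producing 2-D coordinates suitable for a scatter plot.

rng = np.random.default_rng(2)

# 300 synthetic samples in 10 dimensions, whose variance is mostly
# generated by two hidden directions.
latent = rng.normal(size=(300, 2))
mixing = rng.normal(size=(2, 10))
data = latent @ mixing + 0.05 * rng.normal(size=(300, 10))

# Center the data, then take the top-2 right singular vectors as axes.
centered = data - data.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
projected = centered @ vt[:2].T  # 2-D coordinates for plotting

explained = (s[:2] ** 2).sum() / (s ** 2).sum()
print(f"variance captured by 2 components: {explained:.1%}")
```

PCA only captures linear structure; for the "many variables acting together" that Djorgovski describes, nonlinear methods are often layered on top, but the goal is the same: compress what cannot be seen into something that can.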