Digital oilfield

Making Data Pay

Inside the protective white box is a pump monitoring and control system called the Smart Pumper, installed in the Eagle Ford Shale. It uses fluid level data in a well to maximize production and limit the risk of pump damage.
Photo courtesy of Direct DriveHead.

There is talk about digital oil fields and big data and some striking examples of their power. But in real oil fields, a lot of operators are still running fields with systems relying on big paper.

Since retiring from Chevron, where he worked on introducing digital oilfield technology, Jim Crompton has advised smaller operators and learned that “80% are operating on paper” and that, for many of them, established computerized control systems such as SCADA are “the next generation.”

To describe the state of oilfield technology during a presentation at the recent SPE Digital Energy Conference and Exhibition in The Woodlands, Texas, Crompton, an adviser with Noah Consulting, quoted the science fiction writer William Gibson, who said, “The future is here, it is just not evenly distributed.”

The speakers at the conference pointed out a widely shared problem for those working in this field: while much data is gathered, little of it is used. For example, only about 5% of the real-time data from a drillship is sent to shore to help monitor and control drilling, Crompton said.

At the corporate level, the industry has been a leading user of computer power and advanced analysis methods for engineering and managing the enterprise. Exploration and production (E&P) companies own some of the world’s most powerful supercomputers, which are used to do the seismic studies that are the foundation of reservoir models. But frequently, they are unable to update those models with results from producing fields.

“Analytics is used on an enterprise level, but no one is looking at how to use it on an operational basis so we can run data on models of wells or reservoirs as we gather the data,” said Moray Laing, executive lead consultant for oil and gas at the SAS Institute, during a panel discussion at the conference.

On the conference show floor, exhibitors were selling digital tools ranging from the offerings of Entrance, a software services system integrator whose products include systems that help companies move from paper invoices to digital ones, to the Smart Pumper, a pump monitoring and control device sold by a small company. What looks like a boxy laptop uses data from a fluid monitoring system to control pump speed, maximizing production without drawing the fluid level in the hole down so low that the pump runs dry and risks damage. The box is also wired to serve as a communications hub for up to 18 other well monitoring devices.
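
The control logic behind such a device can be pictured as a simple feedback loop. The sketch below is a hypothetical illustration of that idea, not Direct DriveHead’s actual design; the sensor reading, setpoints, and speed limits are assumptions made for the example.

```python
# Hypothetical sketch of fluid-level-based pump speed control.
# Setpoints, speed limits, and the sensor interface are illustrative
# assumptions, not the Smart Pumper's actual design.

MIN_LEVEL_FT = 50      # below this, the pump risks running dry
TARGET_LEVEL_FT = 150  # desired fluid level above the pump intake
MIN_SPEED_HZ = 10
MAX_SPEED_HZ = 60
STEP_HZ = 2

def adjust_pump_speed(fluid_level_ft, current_speed_hz):
    """Speed up when fluid builds, slow down before the pump runs dry."""
    if fluid_level_ft < MIN_LEVEL_FT:
        # Imminent pump-off: drop to minimum speed to protect the pump.
        return MIN_SPEED_HZ
    if fluid_level_ft < TARGET_LEVEL_FT:
        # Level falling toward the danger zone: back off gradually.
        return max(MIN_SPEED_HZ, current_speed_hz - STEP_HZ)
    # Plenty of fluid: speed up to draw the well down and lift more oil.
    return min(MAX_SPEED_HZ, current_speed_hz + STEP_HZ)
```

In a real installation, the same controller would also log its readings and relay data from the other wellhead devices it serves as a hub for.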

The pump control device is not unique. It performs many of the functions already done by devices sold by artificial lift companies and big service companies. The Smart Pumper’s pitch is that it offers a rugged, cost-efficient, simple-to-use option designed to work with widely used field monitoring equipment.

It was a good show for the company, which set up meetings with majors operating thousands of wells with unmonitored pumps, said Sid Shetty, a systems engineer for Direct DriveHead, which makes the Smart Pumper. The operators are looking for ways to reduce their costs at a time when oil prices are depressed, he said.

Based on the feedback since the conference, oil companies are interested in pump monitoring on new wells, but for older fields the cost per well seemed too high unless one device could monitor multiple wells, said Greg Boyles, founder and chief executive officer of Direct DriveHead.

Five Ways of Looking at Change

Turning a new idea into a successful product requires making the leap from a small group of tech-savvy users to a broader audience. These consumer groups, and the gap that separates them, are described in the book Crossing the Chasm:

  • Innovators: Users willing to volunteer time to help develop a product. Critical supporters during early development but lacking in spending power.
  • Early Adopters: Companies with the leadership will necessary to commit resources to new things capable of offering a competitive advantage.
  • The Chasm: Many new product developers fail to make the leap to the wider market, which has different attitudes toward what is new.
  • Pragmatists: Users willing to try new ideas if they solve a significant problem that cannot be fixed otherwise. Seeking ideas endorsed by others and in regular use.
  • Conservatives: Skeptical of new ideas and willing to stay with what works as long as possible.
  • Laggards: Conservatives most resistant to change.

The feedback points to a common question that defines the rate of digital change: Is the potential return for those running an operation worth the time, trouble, and cost of doing things differently?

“It is their business you are trying to impose this solution on,” Crompton said. “How do you get them involved? Their metrics—production, safety, costs—have to get better as a result of that.”

As for why field data is not more widely used, Crompton said, “We have not figured out how to create value out of it.”

Digital Divide

Shell’s global operations offer examples of what is digitally possible. It has wired fields with sensors and digital controls allowing it to remotely shut off valves to isolate zones if the monitoring indicates there is a problem, or exception, such as water production exceeding acceptable limits.

“We can visualize what is going on and analyze by exception and respond quickly to what is going on,” said Frans Van den Berg, smart fields collaboration manager at Shell.
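
The “analyze by exception” idea can be pictured as a small rule check over incoming readings. The sketch below is a generic illustration, not Shell’s system; the watercut field, the 80% limit, and the well names are assumptions made for the example.

```python
# Generic sketch of exception-based monitoring; the limit and the
# well readings are made-up values for illustration only.

WATERCUT_LIMIT = 0.80  # flag wells producing more than 80% water

def find_exceptions(readings):
    """Return only the wells whose latest reading breaks the limit."""
    return [(well, wc) for well, wc in readings.items() if wc > WATERCUT_LIMIT]

latest = {"A-12": 0.45, "A-17": 0.91, "B-03": 0.83}
for well, wc in find_exceptions(latest):
    # In a wired field, this is the point where a remote command
    # could close a valve to isolate the offending zone.
    print(f"Exception: {well} watercut {wc:.0%} exceeds limit")
```

Operators then focus their attention on the short exception list rather than scanning every reading from every well.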

But for Shell and others in the industry, the level of digital technology used varies widely from field to field and country to country. The level of investment in old fields is not as high as in new ones.

For Statoil’s shale operations, adding electronic monitoring and controls is seen as a path to improve safety, increase productivity, and meet regulatory mandates, said Russell Rankin, a geology manager for Statoil. But in remote locations, it often faces a lack of infrastructure, such as fiber-optic lines for high-speed data transmission, he said.

Even before the crash in oil prices, funds for this work were limited. “In 2013, there was a budget limit—it was a big expense to get fiber installed—it was delayed by spending cuts. You will probably see a bit of that this year,” Rankin said, adding that expanded use of data at the wellhead will “require some quick wins.”

The observation about the need for early wins dovetails with the thinking offered by the keynote speaker for the conference, Geoffrey Moore, a business consultant known for his work on the challenges of popularizing new technologies, which are laid out in his book, Crossing the Chasm: Marketing and Selling High-Tech Products to Mainstream Customers.

The popular book’s title refers to the chasm between the relatively small group of tech-savvy pioneers who offer critical support and feedback early in the life of a new product, and the broader market of buyers who range from pragmatists—who are open to new ideas, but can think of a lot of reasons why not to use something new—to conservatives who will change when required.

While the nature of these groups of users and the strategies used by those companies able to cross the chasm are the stuff of a book, one consistent requirement for reaching that larger market is the ability to solve a significant problem that requires action by those reluctant to act otherwise.

Moore said sellers seeking a broader market need to identify “pragmatists in pain.” Specifically, these buyers are willing to deal with the problems that come with change because they are facing “an intractable problem not solvable by conventional means.” The target customer could be a manager in charge of producing wells scattered over a large territory facing a deep budget cut. Or the motivation to act could be in the form of a new environmental regulation.

Five Myths About the Digital Oil Field

A list compiled by Jim Crompton, a consultant for Noah Consulting:

  • It is mostly about technology. Actually the digital oil field is mostly about business process change.
  • It is an information technology matter. While IT is an essential tool, the point is improving how fields are run.
  • It is mostly about automation. While automation can be employed, the goal of digital change is most often to allow people to manage assets better.
  • Field operators trust the models built by asset teams. There is skepticism, in part because those constantly observing performance do not see the field data reflected in models built early in the development.
  • Major capital projects are a great opportunity for digital installations. Project managers often see new ideas as a distraction at best and another risk at worst in a complex project.

Source: SPE 173441.

For example, Rankin of Statoil said pressure to reduce emissions or pipeline leaks could drive the industry to embrace new monitoring and control systems. “In a downturn, compliance applications are more attractive” candidates for finding broader markets, Moore said. “They are not disruptive, they are more sustaining. In the short term, low hanging fruit may be wins around safety.”

Before making a change, though, a pragmatic user will look for reviews from peers who have used the product. That makes it harder for anyone with a new product to establish a beachhead in a new market.

For those selling new ideas, finding and satisfying the initial customers on the far side of the chasm—an early win—is critical. Moore said the ideal premier user is often “a small- to medium-sized company that will see their business transformed by this.”

Management Will

Digital monitoring and control can fundamentally change things, and that makes introducing it complicated because it affects so many people. One of the companies pushing this change is Emerson Process Management, which built its reputation on applying automation and process management to industrial facilities.

“We are talking about a transformation that affects every aspect of operations,” said Peter Zornio, chief strategic officer for Emerson Process Management. It is “a broad-based transformation and very people-oriented.”

Because many of the tools needed for this transformation have been used previously for automating industrial facilities, Moore said, “The digital oil field isn’t an innovation challenge. It is a managerial challenge.”

Shell has installed an always-on video conferencing system linking its offshore crews with the onshore experts supporting the operation, allowing constant face-to-face communication.

This “virtual environment” is used for regular meetings and consultation, said Van den Berg, adding that the advantages for Shell include improved communication, faster decisions, and improved asset performance. For other companies, the value of such a large investment in communications will depend on the “value of total business improvement.”

Digital change requires a management commitment after the spending decision is made to ensure that it is accepted by the organization. For example, the value of Shell’s face-to-face meeting system depends on the willingness of workers in the field to collaborate with experts elsewhere. An inability to build that sort of collaboration can be a barrier.

Introducing new software requires corporate leaders to communicate the need for change, and a sense of urgency about when it needs to be completed. If the system is used to generate operating advice or warnings, it needs to be designed to deliver that information effectively and workers need to be convinced that it can be used to improve performance.

Corporatewide data sharing requires a certain level of consistency that tests the ability of information technology (IT) to impose standards. At BP, that need led to a “data management strategy” aimed at ensuring well-managed reliable data exchanges by imposing hardware and software standards, said David Reed, a solution specialist in drilling and completions for BP.

That entailed some loss of local autonomy. They are working to avoid “engineers going out with their Amex cards” and creating systems that may not connect with others, he said.

The payoff, Reed said, is ensuring connections to monitoring centers and avoiding investment in limited, noncompatible systems, which “does not make sense at this time when we are reducing cost as much as we can.”

The drive to create richer field models requires a corporate computer system able to support new ways of looking for and producing hydrocarbons. “Digital oil fields have been around for 15 years. There is new technology all the way through processing. But taking all data in field and models of data analytics and putting it all together is one thing that is frustrating,” said David Rossi, smart fields collaboration manager for Schlumberger.

The problems begin with the data exchange needed to constantly update models, and they go deeper. For example, different sensors can offer different measures of the same property, which must be reconciled.

Technical experts have strong preferences on what needs to be measured based on their analytical points of view. Some want to stick with physics-based models of reservoirs, others with data-driven analysis. Rossi sees value in the blending of the two.
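
One simple way to picture that blending is to weight each estimate by how much it is trusted. The sketch below is a hypothetical illustration, not a method attributed to Rossi or Schlumberger; the rates and uncertainties are invented for the example.

```python
# Hypothetical sketch of blending a physics-based and a data-driven
# estimate of the same quantity using inverse-variance weighting.

def blend(physics_est, physics_var, data_est, data_var):
    """Weighted average that trusts the less uncertain estimate more."""
    w_physics = 1.0 / physics_var
    w_data = 1.0 / data_var
    return (w_physics * physics_est + w_data * data_est) / (w_physics + w_data)

# Example: the reservoir model predicts 1,200 STB/D with +/-150 uncertainty,
# while the data-driven model predicts 1,050 STB/D with +/-80.
blended = blend(physics_est=1200.0, physics_var=150.0**2,
                data_est=1050.0, data_var=80.0**2)
print(f"Blended rate estimate: {blended:.0f} STB/D")
```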

Sustaining new data-driven modeling methods requires corporate data systems set up to supply these programs with the necessary data, said Shahab Mohaghegh, a petroleum engineering professor at West Virginia University known for his work in new data-driven reservoir analysis.

A software company he founded, Intelligent Solutions, recently had a significant win, selling several licenses to Anadarko Petroleum, which offered an endorsement displayed on the software company’s website.

Oilfield technologies ranked by a longtime expert in the field based on their maturity and the level of current expectations. Source: SPE 173441.

Mohaghegh was involved in the test, which used the data-driven analytics program to do fracture designs, ensuring it was used effectively and tracking down the data needed. While he said the demonstration of added value is critical when selling new software, ease of use can aid acceptance.

“Now they need to go to IT. The problem is all the data needed was located in different places when I did it,” he said. For teams managing wells “to use the software, they will need to do that (download data) automatically.”
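
Mohaghegh’s point about automating data access can be pictured as a small consolidation step. The sketch below is a generic illustration, not Intelligent Solutions’ software; the file names and columns are assumptions made for the example, and it simply merges separately stored records on a common well identifier so the modeling program receives one table instead of hand-assembled files.

```python
# Generic sketch of pulling scattered well records into one table for a
# data-driven model; file names and column names are illustrative assumptions.
import pandas as pd

def assemble_well_dataset(wells_csv, completions_csv, production_csv):
    """Merge separately stored records on a common well identifier."""
    wells = pd.read_csv(wells_csv)              # e.g., well_id, lateral_length
    completions = pd.read_csv(completions_csv)  # e.g., well_id, stages, proppant
    production = pd.read_csv(production_csv)    # e.g., well_id, cum_90day_oil
    return (wells.merge(completions, on="well_id")
                 .merge(production, on="well_id"))

# dataset = assemble_well_dataset("wells.csv", "completions.csv", "production.csv")
```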

Others at the conference talked about the interpersonal skills needed to sell change, and the foresight to create measurement systems to track the pace and benefits when software is upgraded. Those promoting change must remember that users often have mixed feelings about it.

Those selling new software sometimes use the term “legacy software” as a euphemism for old and outdated. But users are likely to see things differently.

“My definition of legacy technology is stuff that works,” Crompton said. If you are installing something new, “make sure you give them something that works. No matter how much more it can do, they will not use it unless it works for them,” he said.

There are many measures to determine what works. It starts with the technology and its state of development at the time it is tried, how well it fits a company’s needs, and whether the operation has the people and willingness to do what it takes to make it work.

“It is not the company that has the newest technology or the most data that wins,” Crompton said in his conference paper on technology hype (SPE 173441). “It is the company that gains the greatest insight into how to improve their operations that comes out on top.”