BP and Startup Beyond Limits Try To Prove That Cognitive AI Is Ready for Oil and Gas
BP has invested more than $100 million into nine different startup companies in the past 2 years—but only one of them wants to turn your brain into a piece of its software.
The international major is working with the ambitiously named firm Beyond Limits on a set of artificial intelligence (AI) programs that will absorb the learnings of geologists and petroleum engineers, and then imitate their decision-making processes as they work on subsurface challenges together.
Before this partnership, Beyond Limits had never been involved in solving the complexities of oil and gas, something that might have counted against it with venture capital firms or prospective upstream clients. But after BP saw what the young company was working on in its office about 10 miles north of downtown Los Angeles, it decided to become both its largest client and its largest investor, injecting $20 million a year ago.
The attraction for BP came down to getting its hands on a strain of AI known as cognitive computing, or what Paul Stone refers to as “the pinnacle of the artificial intelligence pyramid.”
“We haven’t seen that elsewhere, so we wanted to be the first to engage and see where it might go in the oil and gas industry,” said Stone, who serves as a technology director in BP’s digital innovation group.
And where it lacks a prior track record in oil and gas, Beyond Limits compensates with a technology team that helped design much of the intelligent software it licenses from the Jet Propulsion Laboratory (JPL), an institution run by the influential California Institute of Technology in cooperation with NASA. As with oil and gas explorers, immense uncertainty at JPL is the perpetual driver of emerging technologies.
AJ Abdallat has served as chief executive officer of Beyond Limits since its founding in 2014 and previously spearheaded several other startups that have been spun out of work first done at JPL. He believes that this firm’s pedigree gives it a big head start in pushing the envelope on cognitive computing because “the issues and challenges we are tackling today, JPL has been tackling for decades.”
And though there remains a lively debate about where cognitive computing really stands today in terms of its human-like reasoning capabilities, Abdallat said, “People tend to judge AI based on the last 2 or 3 decades—you really need to look at what has happened in the past 5 to 6 years.”
Some of the most notable recent advancements include the AI sector proving that intelligent programs can best human reasoning in complex games such as Go and Texas Hold'em poker. Cognitive programs are also demonstrating growing promise as a diagnostic tool in the healthcare industry, a vertical that Beyond Limits is positioning itself to break into along with oil and gas.
For Stone, who was instrumental in bringing the Glendale, California-based Beyond Limits into BP’s orbit, integrating petrotechnical experts with cognitive computing products can be summed up as an effort to leverage the oil company’s brain trust at scales never possible before.
“Ideally, we’d like the human to be able to work at the speed of a computer with the data, but they can’t do that,” he said. “So the next best thing is to get the computer to work with the knowledge that the human has.”
The future Abdallat hopes for is one where Beyond Limits’ cognitive computing programs become trusted enough to run through every digital vein of an oil company, canvassing reams of information to run all kinds of optimization simulations.
The experiment that will test this vision is well under way.
The AI Reservoir and Production Assistant
A cognitive system can be loosely described as the combination of multiple advanced computing methods that include basic analytics and deep learning tools, with a few others sitting at various points between each end of the spectrum. Practitioners argue that the sum of these parts is a sort of AI cocktail that can reason through problems much the way a human would.
“It has that knowledge layer for how to grapple with all the different inputs that can come in, and how to anticipate how they might evolve, and what to do in light of that,” explained Zack Nolan, the senior vice president of technology programs for Beyond Limits.
Since July, the first of the ensemble programs developed by Beyond Limits have been up and running within a select group of BP's upstream engineering teams in Houston. Their collective mandate is to raise the ceiling on what the oil industry can get out of AI, which has so far centered most prominently on predicting equipment failures and automating artificial lift units.
In one of a handful of early-stage projects, BP says it is looking at how it can use these AI systems to mitigate the impact of sand production, among the last things an operator ever wants coming out of its wells. One of the biggest benefits to BP is that as its most tenured remediation experts train this system, their expertise will live on digitally within the oil company long after they retire.
Outside of its work with BP, Beyond Limits is developing another system that will learn from geologists and reservoir engineers as they look for signs of the prize in offshore seismic data. Such technology could be used by oil companies to propose plausible well locations, or the well designs that it thinks will recover the most product.
The expectation is for this reservoir management advisor to be as reliable as a top expert, only much faster and able to review far more data. However, this system and the others Beyond Limits is building are meant to complement the experts, not replace them. Nolan said one example of how this might work is using AI to catalogue the ideas of an exploration team, especially those that were never acted on.
Before a new offshore well is drilled, more than a dozen professionals may see the same set of seismic data a little differently. Ultimately, they will home in on a single view and start making hole. But if all the discounted angles are kept for retrospective study, they may prove valuable in helping reduce reservoir uncertainty for the next wells.
“That’s the kind of thing you can afford in an AI system,” Nolan said. “Because it can at least store these things indefinitely, with pretty much perfect recall—something that humans are not great at.”
Another common thread with all of these applications is the concept of rapid scenario generation. To solve problems quickly, the idea is that the AI will toss up qualified ideas to the engineers, who will call the shots from there, putting them "in an underwriting position" rather than "an initial creation position," according to Nolan.
Haven’t We Seen This Before?
Because cognitive computing is not monolithic, some aspects of it are further ahead than others. Such is the case for computer vision.
“You will be able to infer the faults, and you will be able to infer horizons in the reservoir using this technology,” noted Chirag Rathi, the director of consulting at Frost & Sullivan, which has researched Beyond Limits along with similar AI vendors for the oil and gas industry.
But when it comes to matching human skill in other areas, he said new cognitive products must overcome an “underwhelming” history of delivering “rudimentary answers to really complex situations.”
This may indicate that there is still a lengthy road to maturity for cognitive systems. “It’s not a comment on the lack of hardware, but more on the software and all the aspects of decision making that need to be programmed,” Rathi said, adding that machine-based extrapolation will also need a higher order of data quality and availability than what the industry has traditionally demonstrated.
The concept of capturing the information a company acquires over the years and then activating it with a recommendation engine is not a new one. What Beyond Limits is working on can be traced back to the computers introduced in the late 1970s as expert systems. They were designed to retain an organization’s knowledge and follow a rules-based approach to generate answers.
Though advanced for their time, expert systems lost much of their luster during the "AI winter" of the 1980s as they proved to be poor extrapolators: they knew only what they were trained to know. But the technology never really went away. People continued working to improve the logic behind it, and Moore's Law kept enabling computing advancements at increasingly lower costs.
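The rules-based approach that defined those early expert systems can be shown in a short sketch. This is purely illustrative; the rule names and oilfield-flavored facts below are invented for the example and are not taken from any BP or Beyond Limits system:

```python
# Minimal forward-chaining expert system: fire any rule whose
# conditions are satisfied by the known facts, and repeat until
# no new facts can be derived.

def run_expert_system(facts, rules):
    """facts: set of strings; rules: list of (conditions, conclusion) pairs."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

# Invented rules for illustration only.
rules = [
    ({"sand_in_separator", "rising_drawdown"}, "suspect_screen_failure"),
    ({"suspect_screen_failure"}, "recommend_rate_reduction"),
]

conclusions = run_expert_system({"sand_in_separator", "rising_drawdown"}, rules)
```

Note the brittleness the article describes: feed the engine a fact that matches no rule and it derives nothing new, which is exactly how these systems "knew only what they were trained to know."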
Among the places that expert systems lived on, and became templates for today’s generation of more capable software, was JPL.
It Came From Outer Space
From his desk, Abdallat can see out his window to the pair of hills that JPL sits behind—his deep ties mean he knows exactly where to point.
Down the hallway, walls are adorned with travel posters of recently discovered planets that lie far beyond our solar system. And in its lobby area, Beyond Limits boasts a 4-foot-tall scale model of the Mars rover Curiosity that stands on a pile of faux rocks with a panorama of the vast and inspiring Mars-scape behind it.
If not for the work that some of its team did on the real Mars rovers, this small company of about 80 employees and interns might never have caught the eye of BP in the first place.
Stone recalled how one of the principals at Beyond Limits authored a unique AI program responsible for the mission-critical task of managing one of the rover's batteries. When that program detected that the solar panels were suffering from dust storms, it did something it was never designed to do: access data from pressure and temperature sensors to build the Red Planet's first weather model.
“That really impressed us,” Stone said, explaining this meant the rover could prep for dust storms by simply knowing which way to turn its solar panels. “I think it is a big step forward—no data scientists created that model.”
This uniquely adaptable software is just one of dozens from JPL that now form the backbone of Beyond Limits’ technology stack. Several bear the fingerprints of its chief technology officer Mark James, who was previously an advanced software scientist at JPL where he spent 25 years.
Among the programs he authored that have since followed him to the startup is a natural-language processing system called Hunter. Company documents say it is capable of "autonomous summarization" and "translating narrative descriptions of algorithms and processes."
Originally developed for military purposes, this software is now central to Beyond Limits’ ability to spell out to end users the origins of answers in what it calls an audit trail. Through a machine-translation process, this program also allows those users to interrogate the reasoning behind each conclusion.
Another program, named Sherlock IQ, arose directly from work on the rover program and uses machine cognition to "autonomously sift through corridors of data to discover plausible facts and scenarios." Tools like this are how a program designed to watch a rover's battery can autonomously access sensor data and become a digital meteorologist. Similar systems are being adapted to form Beyond Limits' AI reservoir management advisor, which aims to cut risk analysis processes that usually require months down to just a few hours.
Whether you can successfully convert software built for billion-dollar deep space projects into software for billion-dollar deepwater projects may come down to the simple concept of trust.
“If I can explain to you how I got the answer, if I can provide you with an audit trail, you’re going to be willing to test it and try things,” Abdallat said.
As central as the audit trail is to fostering human confidence, it is underpinned by two other major components: knowledge bases and inference engines, known in the AI world as intelligent agents.
In the case of the knowledge base, one of the biggest questions is whose internal thought processes and actions are to be encoded into a machine-digestible form.
“We are replicating their best,” said Shahram Farhadi, a data scientist and the head of oil and gas technologies at Beyond Limits. Prior to joining the firm last October, Farhadi had been a petrophysicist and reservoir engineer with Occidental Petroleum. He pointed out that the ideal candidates initially include those who draft company procedures and best practices, and, in many cases, are also the same people who are on call to troubleshoot an operator’s biggest problems.
Those procedural documents go into these knowledge bases, along with the same industry heuristics and first-principles physics found inside textbooks and technical reports. Stone from BP equated the initial result to "a person who just started university."
By the time a knowledge base model is actually put into the hands of engineers, it will have metaphorically graduated and should resemble what a green professional would be expected to know in their first year of work. At that point, “It will learn through being used on the job, interacting with different professionals, and it will start to build experience and store knowledge further,” Stone explained.
Some users will train the system simply by asking it questions. Others, typically senior engineers, will be the most impactful teachers and “can add new rules, variables, and new concepts,” Farhadi said.
These expert users are also the ones who will be popping the hood most often to see the program’s decision tree and read over the audit trail to understand the route taken to an answer.
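The audit-trail idea, recording each inference step so an engineer can later interrogate how an answer was reached, can also be sketched in a few lines. This is a toy illustration with invented rule names, not Beyond Limits' actual implementation:

```python
# Toy inference engine that logs every rule it fires, so the path
# from inputs to conclusion can be replayed as an audit trail.

def infer_with_trail(facts, rules):
    """rules: list of (name, conditions, conclusion) triples."""
    derived, trail = set(facts), []
    changed = True
    while changed:
        changed = False
        for name, conditions, conclusion in rules:
            if conditions <= derived and conclusion not in derived:
                derived.add(conclusion)
                # Record which rule fired, on which facts, and what it added.
                trail.append(f"{name}: {sorted(conditions)} -> {conclusion}")
                changed = True
    return derived, trail

# Invented rules for illustration only.
rules = [
    ("R1", {"high_skin", "declining_rate"}, "suspect_near_wellbore_damage"),
    ("R2", {"suspect_near_wellbore_damage"}, "recommend_stimulation_review"),
]

facts, trail = infer_with_trail({"high_skin", "declining_rate"}, rules)
for step in trail:
    print(step)  # each line is one reasoning step an engineer can inspect
```

In a sketch like this, an expert reviewer can read the trail top to bottom to see exactly which rules carried the inputs to the recommendation, which is the kind of transparency the audit trail is meant to provide.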
How long it might take to build a knowledge base and other supporting systems to drive oilfield decisions depends on a number of factors, including the digital readiness of the operator and the scope of the problem being tackled. But in general, fleshing out an AI technology that can reliably select offshore well locations should be expected to take longer to deploy than one that will predict and advise an operator on asphaltene buildup in a well.
Meanwhile, a number of AI agents will scour these knowledge bases, talk to one another, and interact with the same professionals so they can learn the art of problem solving in the oil and gas business.
And where more brittle AI systems might fall down in the face of missing data, these agents will reason past those gaps, an ability that takes time and training to sharpen.
“If you build an agent today with data and knowledge, you’re not going to put it in charge of making decisions tomorrow,” noted Farhadi. “Initially, it will be used as a design tool, then it will become a recommendation tool, and then once you build trust, it will be used as a control system.”
Keeping Humans in the Equation
Whenever AI is discussed in the context of high-level tasks such as reservoir management, the conversation inevitably turns to the future prospects of the human workforce. While a worthy talking point, many analysts do not see the mass replacement of human engineering talent on the immediate horizon.
In a report published last year, international consultancy Accenture predicted that cognitive computing “will have a profound impact in oil and gas” but said the change for professionals is likely to come in the form of “super-charged teamwork.”
That point of view is largely shared by BP and Beyond Limits, which see the emerging shift toward these powerful computing tools as one that will "augment" engineering groups.
“This is not about automating jobs and getting the human out of the equation,” Abdallat said. The true aim of AI is to “amplify and magnify the human talent.”
Trent Jacobs, JPT Digital Editor
01 October 2018