The Keystone pipeline that would move crude oil from Canada through the US to a refinery in Texas has been controversial, but it would account for only a fraction of the more than 2 million miles of pipelines moving oil and gas around the country. Many existing and proposed pipelines spark the same concern as Keystone: the potential for leaks, especially leaks that go undetected for long periods.
Existing detection systems mostly spot large problems, often visually by inspectors walking or flying over a pipeline. Internal systems commonly used in the oil and gas industry rely on computational pipeline modeling, which searches for anomalies in flow and pressure. That works well for large leaks but falls short on smaller ones of up to 1% of pipeline flow, says Maria Araujo, a manager in the Intelligent Systems Division of the Southwest Research Institute (SWRI).
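The mass-balance idea behind computational pipeline modeling can be sketched in a few lines: a leak shows up as a persistent gap between metered inflow and outflow. The function, thresholds, and data below are illustrative assumptions, not SWRI's or any vendor's actual system.

```python
# Hypothetical sketch of the mass-balance check behind computational
# pipeline modeling (CPM): a leak appears as a sustained imbalance
# between metered inflow and outflow. Names and thresholds are
# illustrative only.

def leak_alarm(inflow, outflow, threshold=0.02, window=5):
    """Flag a possible leak when the relative flow imbalance exceeds
    `threshold` for `window` consecutive samples."""
    consecutive = 0
    for q_in, q_out in zip(inflow, outflow):
        imbalance = (q_in - q_out) / q_in  # fraction of flow unaccounted for
        if imbalance > threshold:
            consecutive += 1
            if consecutive >= window:
                return True
        else:
            consecutive = 0
    return False

steady = [1000.0] * 10
print(leak_alarm(steady, [970.0] * 10))          # sustained 3% shortfall -> True
print(leak_alarm(steady, [1000.0] * 9 + [960.0]))  # one-sample metering blip -> False
```

The persistence window suppresses metering noise, but it also illustrates the article's point: a leak smaller than the imbalance threshold (here 2%) never trips the alarm, no matter how long it runs.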
Even such a small percentage adds up quickly: she notes that 1% of the flow of the Keystone pipeline is in the neighborhood of 8,000 B/D. To improve the efficiency of detection systems, Araujo leads a team taking the technology to the next level using sensors, artificial intelligence, and deep learning. She came to the problem of leak detection while working on machine learning for autonomously driven vehicles.
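The scale is easy to check with back-of-the-envelope arithmetic. The nominal capacity figure below is an outside assumption (Keystone's widely cited design capacity of roughly 830,000 B/D), not a number from this article:

```python
# Back-of-the-envelope check of the 1% figure, assuming a nominal
# Keystone capacity of roughly 830,000 B/D (an outside assumption,
# not stated in the article).
capacity_bpd = 830_000          # barrels per day
leak_fraction = 0.01            # the ~1% detection floor of CPM systems
leak_bpd = capacity_bpd * leak_fraction
leak_gpd = leak_bpd * 42        # 42 US gallons per barrel
print(f"{leak_bpd:,.0f} B/D  =  {leak_gpd:,.0f} gal/D")
# → 8,300 B/D  =  348,600 gal/D
```

Under that assumed capacity, a leak at the 1% detection floor moves on the order of 8,000 barrels, i.e. hundreds of thousands of gallons, every day it goes unnoticed.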
“We’re not adapting technology,” she said. “We’re using existing technology as building blocks. The problem is very different. With cars, you’re looking for objects. Here, you look for liquids. Gasoline and diesel are transparent to the human eye. How do you differentiate between substances?”
Actually, the system looks for a variety of liquids. To begin tackling the challenge, the SWRI team tested four types of sensors: thermal, optical, hyperspectral, and short-wave infrared. They eliminated the hyperspectral and short-wave infrared sensors, keeping off-the-shelf thermal and optical systems.
There’s nothing unusual about using sensors to detect leaks, but Araujo wanted to improve accuracy. So the SWRI team set out to adapt machine-learning techniques, ultimately producing a multiplatform system dubbed SLED (Smart Leak Detection) that uses new algorithms to process images and identify, confirm, or reject potential problems. Using feature extraction and classifier training methods, they taught computers to identify unique features across a wide range of environmental conditions.
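In outline, feature extraction plus classifier training works like this: each image patch is reduced to a small vector of descriptive numbers, and a classifier learns which vectors correspond to which label. The sketch below uses made-up toy data and a deliberately simple nearest-centroid classifier; it illustrates the general technique, not SWRI's actual SLED algorithms.

```python
# Illustrative feature-extraction / classifier-training sketch, not
# SWRI's SLED. Each image patch (a list of RGB pixels) is reduced to
# per-channel mean and spread, and a nearest-centroid classifier is
# trained on labeled examples.
from statistics import mean, pstdev

def extract_features(patch):
    """patch: list of (r, g, b) pixels -> [mean, stdev] per channel."""
    feats = []
    for channel in zip(*patch):
        feats += [mean(channel), pstdev(channel)]
    return feats

def train_centroids(labeled_patches):
    """labeled_patches: list of (label, patch) -> label -> mean feature vector."""
    by_label = {}
    for label, patch in labeled_patches:
        by_label.setdefault(label, []).append(extract_features(patch))
    return {lab: [mean(col) for col in zip(*vecs)]
            for lab, vecs in by_label.items()}

def classify(patch, centroids):
    """Assign the label whose centroid is closest in feature space."""
    f = extract_features(patch)
    return min(centroids,
               key=lambda lab: sum((a - b) ** 2
                                   for a, b in zip(f, centroids[lab])))

# Toy data: dark, low-variance patches stand in for crude oil; a
# green patch stands in for grass.
oil   = [(20, 15, 10)] * 16
grass = [(40, 160, 50)] * 16
centroids = train_centroids([("oil", oil), ("background", grass)])
print(classify([(22, 16, 12)] * 16, centroids))  # → oil
```

A production system would use far richer features (texture, thermal contrast, spectral ratios) and a trained discriminative model, but the train-then-classify structure is the same.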
“These algorithms thrive on lots of data,” Araujo said. The team produced and collected thousands of images of liquids such as gasoline, diesel fuel, mineral oil, crude oil, and water on various surfaces, including grass, gravel, dirt, and hard surfaces such as concrete. The images were shot from numerous angles and under varying conditions from full sunlight to clouds and darkness. “It’s hard to operate under different environmental conditions,” she said. “We found if you train [the system] under certain conditions, it gets tripped up in others, especially shading. Being able to work under shading and different temperatures was a big challenge in modifying algorithms.”
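One common way to harden a classifier against the lighting and shading problem described above is data augmentation: each labeled image is duplicated at several brightness levels so the model sees the same surface under many simulated conditions. The sketch below is a hypothetical illustration of that general technique, not a description of how the SWRI team prepared its data.

```python
# Hypothetical brightness-augmentation sketch: expand one labeled RGB
# patch into several lighting variants (simulated shade and glare) so
# a classifier trains on the same surface under varied conditions.
def scale_brightness(patch, factor):
    """Uniformly brighten or darken a patch, clamping pixels to 0-255."""
    return [tuple(min(255, max(0, int(c * factor))) for c in px)
            for px in patch]

def augment(patch, factors=(0.5, 0.75, 1.0, 1.25)):
    """One original patch -> a list of lighting variants."""
    return [scale_brightness(patch, f) for f in factors]

sunny = [(200, 180, 160)] * 4
variants = augment(sunny)
print(len(variants), variants[0][0])  # → 4 (100, 90, 80)
```

Real pipelines would augment with shadow masks, color-temperature shifts, and sensor noise as well, but the principle is the same: vary the conditions in training so they do not trip up the classifier in the field.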