Collaboration Redefines the Human/Robot Relationship

Collaborative robots—robots designed to interact with humans physically in a shared workspace—have been around for decades, but recent advancements in automation technology may transform them from a novelty to a critical element of the automation landscape.

They are expected to become big business soon. A 2018 report from Interact Analysis projects accelerated revenue growth over the next decade: driven by wider availability from mainstream industrial robot vendors and major adoption by manufacturers, collaborative robot revenues are forecast to reach $7.5 billion by 2027 (up from less than $400 million in 2017), accounting for 29% of the industrial robot market. ABI Research similarly projected a $13-billion market in 2027 as Chinese manufacturers come on board.

Despite these optimistic predictions, hurdles to widespread adoption remain. Collaborative robots can be expensive to acquire, and implementing work flows that use multiple collaborative systems can be challenging. These hurdles are not insurmountable, however: cognitive systems, visual programming interfaces, and industrial Internet of Things (IIoT) platforms are making it easier for human operators to interact with and use the robots.

“Over the last 8 or 9 years, we’ve seen tremendous growth in robotics,” said Nicolas de Keijster, manager of sales and marketing at ABB. “There’s really been a step change in how many robots are going into the industry every single day. The path collaborative robots have taken over the last 5 years, while still in its infancy, shows you a little bit where robots will go in the next stage.”

Making Collaboration Reality

Speaking at a panel on collaborative robots held during CERAWeek by IHS Markit, de Keijster said the digitization of equipment is a crucial step toward more-productive operations. Collaborative robots offer the best of both worlds here, allowing dangerous tasks in hostile environments to be performed without exposing human workers while still drawing on those workers’ cognitive, common-sense decision-making skills. The robot does the heavy lifting, but the person does the heavy thinking. De Keijster said this can be extremely valuable in hostile environments such as offshore platforms.


“Think about sharing a task, assembling a product, loading a product so it can do its task; all of that is done in a very collaborative, seamless way. For a lot of people, that means ease of use—not necessarily intending to collaborate fully with the robot but having a robot you can approach, touch, and work with. It allows you to program much easier than with traditional methods,” de Keijster said.

Roger Barga, general manager of robotics and autonomous services at Amazon Web Services (AWS), highlighted some of Amazon’s recent developments in this space. The company supports the open-source Robot Operating System (ROS), which provides developers with the software and libraries needed to debug, test, and deploy robotics applications. It is also working on integrating ROS 2, the newer, industrial-grade version of the framework, with its cloud services. This integration will allow operators to stream video from a drone camera and process the images in Rekognition (Amazon’s cloud-based image-analysis service), stream telemetry data, and run simulations of robot designs.

“When you’re building a robot, you want to test it through simulation. So, with this, you can have a scale-out simulation to test your drone in real-world environments before you put it out in the wild and have it crash,” Barga said. “We think this is going to accelerate the field of robotics.”

Automation and Jobs

In discussing the long-term sustainability of collaborative robots, the panel turned to the question of increased automation and its effect on the workforce. If robots reach the point where they no longer need to collaborate with humans for many tasks, will human workers be necessary?

Another speaker at the CERAWeek panel, PrecisionHawk Chief Executive Officer Michael Chasen, said robotics is heading toward full autonomy in certain environments but, as those environments become more autonomous, companies will be able to apply the technology in other environments that still require some form of collaboration. He pointed to his own company, which specializes in commercial drone remote-sensing applications and data-processing services, as an example. The company’s drones have been used to inspect pipelines and well sites, processing sensor data with proprietary algorithms to help identify potential issues on site.

“A year ago, we had 20 full-time drone pilots on our staff, and today we’re at over 100, even though we’ve actually done a lot of work to autonomize how the drones are flying. It’s an example of how the technology has become more autonomous, but, at the same time, … there’s just more potential deployments of this technology, so it still requires more people at the end of the day,” Chasen said.

De Keijster said he expects only a fraction of the tasks that can be automated will actually progress to full automation, and that industry is still a long way from large fleets of robots outfitted with the problem-solving skills needed to take over the repetitive manual tasks still performed in manufacturing. Barga agreed that full autonomy should not be expected any time soon. He added that AWS has observed an automation dividend when it comes to the workforce.

“If you introduce a robot into production, there are people manufacturing the parts that go into that robot; there are people assembling it into an actual robot; there are developers and engineers who program it; and, once it’s deployed, there are people at the company who have to actually manage it as well as people who repair it and keep it in operation. So these robots create jobs,” Barga said.

Even if increased automation does not have a net negative effect on job totals, it will affect the types of jobs available. A May 2018 report from the McKinsey Global Institute predicts that automation will accelerate the shift in required workforce skills, with demand for technological skills rising 55% by 2030.

This surge, the report said, will affect the demand for basic digital skills as well as advanced technological skills such as programming. The demand for basic cognitive skills, such as data input and processing, will drop by 15% in the same time frame. The demand for physical and manual skills will drop by 14%, though McKinsey estimates they will remain the largest category of workforce skills in many countries.

“We’re flying over oil fields, doing well-pad analysis, we’re using thermal sensors and ground-penetrating radar. We’re getting data that either previously wasn’t available or people were just hiring people to walk the fields and eyeball things. Now, we can fly drones with high-end sensors and we can write AI that can really identify potential problems. We are replacing what is a more boots-on-the-ground, unscientific analysis of data with high-end, high-resolution, machine-learning capabilities that can not only do the analysis today but can also do the predictive analytics of the future,” Chasen said.

What Lies Ahead

As for where collaborative robots go next, de Keijster said scalability is a major hurdle to clear. Manufacturing processes still require a good amount of manual labor, he said, and scaling existing collaborative applications can drive “tremendous” growth in its own right.

Chasen said the future of collaborative robots lies not in the robots themselves but in the sensors installed on them to collect and process data.

“We’re flying drones that cost only $10,000 to $15,000, and we have $200,000 LiDAR sensors on them. You’ve got high-end methane-detection sensors or ground-penetrating radar sensors. It’s a little bit less about the drone robot than it is these incredibly high-end sensors coming to market that give us access to a wider range of data than we’ve ever had before. Taking that data and analyzing it is where we’re coming up with new applications,” Chasen said.

Barga said that “any task that’s dull, dirty, and dangerous” will be ripe for automation applications in the near future. He cited platforms such as Amazon SageMaker, which allows companies to establish machine-learning work flows that help them label and prepare data, choose an appropriate algorithm for processing that data, train the algorithm, and optimize it for deployment in a predictive model.
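Barga’s description maps to a generic label → train → deploy pipeline. The sketch below is not the SageMaker API; it is a minimal, self-contained illustration of those stages, with synthetic labeled data and a simple least-squares fit standing in for a real training algorithm:

```python
# Illustrative sketch (not the SageMaker API) of the work-flow stages Barga
# describes: label and prepare data, train a chosen algorithm, then deploy
# the optimized result as a predictive model.

def label_data(raw):
    """'Label' raw sensor readings: pair each reading with a known outcome.
    Here the ground truth is synthetic: y = 2x + 1."""
    return [(x, 2.0 * x + 1.0) for x in raw]

def train(labeled):
    """Fit a one-variable least-squares line (the 'chosen algorithm')."""
    n = len(labeled)
    sx = sum(x for x, _ in labeled)
    sy = sum(y for _, y in labeled)
    sxx = sum(x * x for x, _ in labeled)
    sxy = sum(x * y for x, y in labeled)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

def deploy(model):
    """Wrap the trained parameters in a predict function for use in the field."""
    slope, intercept = model
    return lambda x: slope * x + intercept

readings = [0.0, 1.0, 2.0, 3.0, 4.0]
predict = deploy(train(label_data(readings)))
print(round(predict(10.0), 6))  # noise-free data, so this recovers 2*10 + 1 = 21.0
```

In a managed platform the same stages run at scale, but the shape of the work flow (prepare labeled data, fit, package the fitted model behind a prediction endpoint) is the same.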

The platform is already being used to predict 3D volumes of rock density by combining 3D seismic data and well logs to create rock-property data at every point within a seismic frame. Barga also cited an unnamed operator that uses the platform to predict well failure from a limited set of variables. He said the next step for collaborative robots is reaching a point where operators can put a device in the field to capture data for a predictive model that can then be applied on site quickly.
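The operator’s actual model is not public. As a toy illustration of the underlying idea, fusing two data sources into one rock-property predictor, the sketch below fits a two-feature least-squares model; the features are hypothetical synthetic stand-ins for a seismic attribute and a well-log measurement:

```python
# Toy illustration (not the actual SageMaker work flow): fuse two data
# sources, a seismic attribute and a well-log measurement, into a single
# rock-property predictor via two-feature least squares (normal equations).

def fit(features, targets):
    """Solve the 2x2 normal equations (A^T A) w = A^T y for weights w."""
    a11 = sum(s * s for s, _ in features)
    a12 = sum(s * l for s, l in features)
    a22 = sum(l * l for _, l in features)
    b1 = sum(s * y for (s, _), y in zip(features, targets))
    b2 = sum(l * y for (_, l), y in zip(features, targets))
    det = a11 * a22 - a12 * a12
    w1 = (a22 * b1 - a12 * b2) / det
    w2 = (a11 * b2 - a12 * b1) / det
    return w1, w2

# Hypothetical training points: (seismic amplitude, well-log reading) pairs
samples = [(1.0, 2.0), (2.0, 1.0), (3.0, 3.0), (1.5, 2.5)]
density = [0.5 * s + 0.25 * l for s, l in samples]  # synthetic ground truth

weights = fit(samples, density)  # recovers (0.5, 0.25) on noise-free data
```

Predicting density at every point of a seismic volume amounts to evaluating the fitted weights over a grid of feature pairs, which is the kind of embarrassingly parallel job a cloud platform handles well.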

“If your business question changes, you can write the code, test the simulation with no investment of capital, test that the model works, and then be able to deploy that over the air to the robot to be able to carry out that task. I think you’re going to see this develop collectively. You’re going to see us figure out how to better collect data with sensors, build predictive models, and put the intelligence out on the edge where you can make an informed decision. And, as the business challenges change, you’re going to be able to redeploy those rapidly,” Barga said.

Seismic Density Prediction With Amazon SageMaker

