With cyber attacks in the oil and gas industry growing in scope and sophistication, the need for a healthy security culture within organizations is as important as ever, according to a cybersecurity expert.
In a presentation held by the SPE Digital Energy Technical Section, Betsy Woudenberg examined recent incidents and described how the oil and gas industry can develop the culture needed to combat cyber espionage. She is a former US Central Intelligence Agency case officer and the founder and chief executive officer of the security consulting firm IntelligenceArts.
Handling cybersecurity requires an understanding of the landscape of bad actors who pose a threat to an organization. Woudenberg split these actors into six categories: criminals, vindictive insiders, ideological extremists, ethno-nationalist hackers, rogue corporations, and nation states. She said each group has its own motivations and capabilities, and that to develop a strong security culture, a company must determine where its interests align with those motivations and capabilities. Businesses, particularly those in the energy industry, create value, and that value attracts threats.
“The reason why we’re getting attacked cyberwise and every other way is because we are involved in businesses that create value,” Woudenberg said. “This is a pretty simple thought, but you’d be surprised at how many people are shocked that they’re being hacked. Well, you are creating value, you’re moving value, you are moving money around, you’re buying and selling.”
Woudenberg brought up several examples of attacks in which China was involved over the past decade. The first attack took place in 2008, when attackers took exploration data from oil fields around the world and auction-planning data from Marathon, ExxonMobil, and ConocoPhillips. This attack coincided with China’s first attempt to purchase liquefied natural gas on the open market, and the country was involved in deals with each of the three companies during this time.
The second, Operation Shady RAT, was a long-term campaign that involved the theft of negotiation plans, supervisory control and data acquisition (SCADA) configurations, designs, and schematics from a natural gas wholesaler based in the United States. The third attack, dubbed Night Dragon, focused on specific energy industry targets in the US, Taiwan, Kazakhstan, and Greece from late 2009 through early 2011.
Woudenberg said she believes the latter two attacks occurred after China realized the value of the information it acquired in 2008. Night Dragon coincided with specific ongoing oil and petrochemical deals that China had with each country in question, while the push for SCADA configurations came at a time when the country was building several ambitious pipeline projects.
One simple way organizations can combat cyber attackers is to monitor the information made public about their operations. Woudenberg said employees should watch for spear phishing, a fraud attempt in which hackers seek access to confidential data through email spoofing. These emails typically contain personal information meant to fool their targets into accepting them, so employees should be aware of the kind of information a hacker could easily access, she said.
“See what’s out there about you,” Woudenberg said. “Look at your company’s websites. See if you’re described with your title. See if you are mentioned in press releases about having worked on key projects. If you are interviewed, if you are mentioned somewhere, if you get an award, … if that information is made public, that is information hackers can use to shape an approach to you to try to get into your system.”
Maintaining boundaries in public and private discourse can also help thwart hacking attempts. Boundaries, Woudenberg said, are a matter of context: what an individual does, where that individual is, and who that individual is with. It is important to keep context in mind when discussing a company’s operations, as privacy is no longer a guarantee. Woudenberg illustrated this point by discussing the November 2014 hack of Sony Pictures Entertainment. She said that employees need to know where their boundaries lie in corporate email and decide, consciously and thoroughly, what they will and will not discuss.
While monitoring information and maintaining boundaries are helpful measures, they are not foolproof. Woudenberg said organizations must accept that some potentially harmful information will become public and, with hacking attacks becoming more complex, must remain especially vigilant.
“The issue is not that we are dumb,” she said. “The issue is that they are clever and getting cleverer. It’s very, very clever social engineering. What we must do is recognize that, recognize how much information there is out there, and recognize what it is about us that they can use for this, and what makes it easier for them to shape clever approaches to us.”
A healthy security culture requires openness within an organization among management, staff, and security personnel. Everyone in the organization must accept personal responsibility for their own decisions, and they must be willing to give and accept feedback. Woudenberg said this is especially true when a security team makes a recommendation that may hinder the staff’s ability to do their jobs properly. Employees, she said, should point out the flaws in a security initiative rather than ignore the initiative altogether.
“The fear from security is that people will say, ‘Uh huh, I will’ and then they don’t because they know that they can’t get done what they need to get done, and so they will disregard the advice. It’s much better to actually tell security that to their face. … In a healthy security culture, there is feedback and suggestion and change on both ends. The security’s idea and picture of how everything works gets better, and you get better direction that’s in tune to the way you actually work,” Woudenberg said.