Exclusive Content
11 Aug 2016

European Commission Strives Toward Reasonable Shale-Gas Regulation

Following years of deliberation, the European Union (EU) released a recommendation on unconventional hydrocarbons and a related communication in 2014. Although these documents are not legally binding on member states, they are nevertheless of great significance because they indicate, for the first time, the current and likely future stance of EU institutions on the regulation of unconventional hydrocarbons. This paper traces the origins and development of these documents, which provide vital clues for the road ahead in European shale-gas regulation.

Recent scientific studies have named groundwater contamination, irresponsible disposal of flowback, the repercussions of significant land use, and increased emission of greenhouse gases as the main potential threats of shale-gas extraction. The current European legal framework on environmental protection, consisting mainly of directives and regulations, contains gaps and does not cover these issues comprehensively. Thus, the EU recently took action to develop shale-gas-specific regulation in order to close the identified gaps in the existing general framework.

Because the existing secondary-law norms were elaborated at a time when shale-gas extraction was virtually unknown in Europe, one would suspect that their provisions do not sufficiently cover the specific potential threats of this technique. Indeed, there are a number of issues. Probably the most important is that environmental impact assessments (EIAs) are not compulsory for shale-gas projects. Although member states have the right to require an EIA for specific, individual shale-gas projects, this discretion does not appropriately match the level of potential environmental hazard posed by shale-gas extraction.

The paper does not engage in an analysis of the pre-existing EU regulatory framework but focuses on the EU’s efforts to close the previously identified gaps in that framework. To this end, the EU introduced nonbinding, soft-law measures, the 2014 Shale Gas Recommendation and the 2014 Shale Gas Communication, to create a level playing field among all member states.

The paper highlights the main features of the recommendation and the communication and considers whether they are sufficient to close the gaps in the EU secondary-law framework. Overall, the author concludes that these measures go a long way in addressing the perceived gaps, although they do not succeed in closing all of them.

Despite that rather favorable assessment, the recommendation and the communication have been criticized for their legal guise as nonbinding soft-law norms. Some scholars expressed the fear that individual states could simply ignore the recommended measures if the measures did not fit in with their respective agendas on shale-gas extraction. This peculiarity, they argued, could lead to a “race to the bottom” of environmental standards, as one member state may try to undercut the others on environmental standards in order to attract investors.

However, this paper concludes that the described race to the bottom would be a rather short one and would not put environmental standards in the EU in any real danger. The existing environmental directives and regulations of the EU constitute the ultimate bottom line, below which member states are not allowed to operate. Because of the high standard and elaborated nature of this bottom line, there currently is no real danger that environmental standards in Europe will be lowered to any significant extent. Moreover, member states would be ill-advised to take a chance and simply ignore these recommendations. A considerable number of legally binding EU directives started their existence as recommendations. In line with this history, the 2014 Shale Gas Recommendation explicitly threatens member states with the introduction of legally binding norms if the EU is not satisfied with the domestic implementation of the recommendation. Thus, it is not unlikely that the recommendation could turn into an EU directive or regulation.

Although the 2014 Shale Gas Recommendation and the 2014 Shale Gas Communication go to some length in addressing current gaps, they do not cover all of them. Even more importantly, the 2014 measures are not legally binding on member states. However, this paper concludes that the nonbinding legal character could be an advantage because it provides member states with the greatest possible leeway to implement shale-gas regulation that is tailored to their individual needs.

Both documents recommend a set of measures and operating standards to member states that wish to engage in shale-gas extraction, in order to create a level playing field among those states. The measures indeed go some way toward closing some of the pre-existing gaps. The framework urges member states to carry out a strategic environmental assessment before issuing licenses that may lead to shale-gas extraction. The 2014 Shale Gas Communication contains a pledge by the EU to look into the issue of a specific best-available-technique reference document for shale-gas extraction under the Mining Waste Directive. This action is designed to ensure that waste is appropriately handled and treated and that the risk of water, air, and soil pollution is minimized.

Moreover, the framework reinforces the monitoring requirements under the Water Framework Directive and the Groundwater Directive. Baseline studies of shale-gas sites with regard to water, soil, and air quality, among other issues, should be conducted; their results should be benchmarked against future results of comprehensive monitoring exercises. Furthermore, the framework calls upon member states to apply the provisions on environmental liability to all activities taking place at a shale-gas-extraction site. This request explicitly includes strict liability for greenhouse-gas emissions and excessive use of land, which currently do not fall under the scope of the Environmental Liability Directive.

However, the European Commission failed to close some other gaps. Most notably, it called upon member states to ensure that an EIA is carried out for each shale-gas project but took no action to insert shale-gas projects into Annex I of the EIA Directive, a move that would have made EIAs obligatory for all shale-gas projects at the EU level. By simply passing the ball back to member states, the EU did not adequately address the main gap in EU EIA legislation.

However, the 2014 framework on shale-gas extraction does not actually implement the described measures but merely recommends that member states take them into account. The framework has been molded into a recommendation and a communication, secondary EU-law measures with no direct binding force.

The nonbinding nature of these EU measures on shale-gas extraction became the main point of criticism. It was argued that nonbinding legislation is an ineffective way to create a level playing field for shale-gas extraction among all member states because individual states are allowed to ignore the measures outlined in the recommendation if the measures do not fit in with their respective agendas. This could lead to a race to the bottom of environmental standards, because one member state could try to undercut another on environmental-compliance costs for foreign investors. However, this race to the bottom could not last indefinitely because the existing environmental-law framework of the EU constitutes the bottom line for member states.

A review of the effectiveness of the framework is to be conducted within 18 months of its coming into force. Depending on the outcome of this review, the commission will determine whether further, more-stringent regulatory action on shale-gas extraction is required. In fact, this is the way in which a considerable number of directives came into force in the past.

Ultimately, the nonbinding character of the 2014 Shale Gas Recommendation is in some respects an advantage. The principle of subsidiarity, under which member states should take responsibility for matters that can be decided at their level, is honored. Subsidiarity must also be viewed in the context of proportionality. The principle of proportionality requires the use of nonbinding instruments (e.g., recommendations) in EU environmental legislation, wherever possible.

10 Aug 2016

Service Company Explores Pathways To Make Driving Inherently Safer

In risk management, an inherently safer approach implies an attempt to eliminate, or at least to reduce the severity and likelihood of, incident occurrence through careful attention to fundamental design and layout. This paper examines whether this approach can be applied, and be effective, in managing transportation safety, an area in which, historically, most of the responsibility for safe driving has been placed on the individual driver and less on the design of the transportation system and the features of the equipment.

As is often the case with change management, this undertaking was motivated by a tragic motor-vehicle accident in Saudi Arabia that resulted in three fatalities: two employees and a third-party driver. Transportation-management systems were implemented and in place, including a contractor-selection process, a journey-management program, defensive-driving training, and in-vehicle monitoring systems, but, as sometimes happens, compliance with planning and execution requirements was inadequate. The accident-investigation findings uncovered a number of gaps that existed in the transportation-management system and that eventually led to the catastrophic event. These revelations, coupled with the vision that “all motor-vehicle accidents are preventable,” presented an opportunity to revisit the way transportation safety was managed. The entire life cycle of the journey was reviewed and reorganized, from the planning stage to the journey’s completion. Such an approach posed a challenge to the company definition of “preventability” for motor-vehicle accidents, which states that a preventable accident is one in which the driver could have driven (but failed to do so) in such a manner as to identify an accident-producing situation soon enough to take reasonable and prudent action to avoid the accident. This definition places the primary responsibility for preventing a vehicle accident on the driver and the driver’s ability to anticipate road hazards, assess the risks, and take actions to avoid the accident through the ability to challenge the process, including questioning the need for, or the timing of, the journey itself.

Instead of the instinctive quick-fix reaction of placing responsibility solely on the driver, the new perspective dictated that responsibility for preventing accidents lies with the company management and its ability to create a system that would comprehensively combine the management of all transportation aspects under one umbrella, including:

  • Human factors and driving behaviors
  • Journey management, with all necessary reviews and approvals
  • Vehicle speed and other driving characteristics
  • Vehicle condition and conformance to standards

Inherently Safe Driving-System Framework
What makes a system robust and inherently safe, and what must it look like? In the world of computer science, robustness is defined as “the ability of a computer system to cope with errors during execution” and also as “the ability of an algorithm to continue operating despite abnormalities of input or calculations.” To build a robust operating system, computer companies study many possible inputs and input combinations, program against every point of possible failure, and make the system intelligent enough to handle all possible error states. The goal for a new system is for it to be robust enough to monitor proactively (without direct involvement of humans), prevent any known or assumed compliance failures or violations, and enforce compliance during driving. The pillars of the conceptual inherently safe and robust driving system are intelligence, visibility, compliance, and proactivity, which rest on the foundation of independence and automation.

Intelligence. This is the ability of the fleet and journey-management software to build connections independently and automatically between the movements of vehicles and applicable journey plans, run compliance checks against their approval levels, run compliance checks against driver competencies, and highlight and send any potential breaches to designated personnel for them to audit.

Visibility. This is the ability to provide accurate, real-time information on movements of vehicles, journeys being undertaken, and their associated information such as driver, vehicle, and trip progress. In addition to the operational data, which is required to monitor execution, the system is required to provide visibility of current trends in various driving aspects (e.g., at-risk driving behaviors, journey-management breaches, fleet utilization, and night driving).

Compliance. This is the ability to ensure compliance of all elements of the driving process (i.e., driver, vehicle, journey route, and plan), either before or during the journey, to the standards within the preapproved criteria. The system must be set up to prevent selection of an unfit driver or vehicle for an intended journey, and, if the journey must progress according to a preapproved route, any deviations must be identified and corrected immediately.

Proactivity. This is the ability of the system to prevent potential breaches through predetermined controls or checkpoints. Examples of this would be the inability to select an approved light-vehicle driver for a heavy-vehicle trip (even if a person possesses a commercial-vehicle driving license) or the ability to alert a driver to stop and rest at predetermined intervals.
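The compliance and proactivity pillars amount to a set of rule checks that run before a journey is released. As an illustration only, with all class names, fields, and rules invented for the example rather than taken from the paper, such a pre-journey gate might look like the following sketch:

```python
from dataclasses import dataclass

@dataclass
class Driver:
    name: str
    licenses: set                  # licenses held, e.g., {"light", "heavy"}
    approved_vehicle_classes: set  # company approvals; may be narrower than licenses

@dataclass
class Vehicle:
    vehicle_class: str             # "light" or "heavy"
    inspection_passed: bool

@dataclass
class JourneyPlan:
    driver: Driver
    vehicle: Vehicle
    approved: bool                 # managerial approval already granted?

def compliance_check(plan: JourneyPlan) -> list:
    """Return a list of violations; an empty list means the journey may proceed."""
    violations = []
    if not plan.approved:
        violations.append("journey plan not approved")
    if plan.vehicle.vehicle_class not in plan.driver.approved_vehicle_classes:
        # Proactivity: holding a commercial license alone is not enough;
        # the driver also needs a company approval for this vehicle class.
        violations.append("driver not approved for vehicle class")
    if not plan.vehicle.inspection_passed:
        violations.append("vehicle failed inspection")
    return violations
```

With this structure, a driver who is licensed for heavy vehicles but company-approved only for light vehicles is blocked from a heavy-vehicle trip, which mirrors the proactivity example above: the system refuses the selection instead of relying on a person to catch it.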

Independence and Automation
This driving system rests upon two primary features—automation and independence. Automation is intended to minimize the human-to-system interaction, whereas independence implies freedom from operational factors that may influence the safe operation of vehicles.

Independence. Day-to-day priorities influence operational activities. Deadlines must be met, and, often, these priorities conflict with values. It takes integrity and commitment to follow the rules, but, as is commonly recognized, the human factor is often less reliable than other factors. To avoid possible conflicts of interest, provide uncompromising independence from operational factors, and provide sufficient available resources, a new department, the Transportation Office, was created. The entire purpose of this department is to oversee all transportation aspects of employees and contractors in support of the company’s work activities. This office directly manages all transport vehicles, drivers, Road Journey Management Center operations, and the Journey Management Plan process, and it performs regular audits of the entire system.

The Transportation Office has responsibility and full authority for final approval of any journey to take place. Even if a journey has been approved by the driver’s manager, it will still require approval from the Journey Management Center.

Automation. Fleet-Management Improvements. In-vehicle monitoring systems are used to control compliance with speed limits and to monitor and correct drivers’ behaviors (e.g., harsh braking and harsh acceleration). However, it was determined that the system could provide more proactive control points to improve the fleet-management process for earlier detection of possible noncompliance and intervention.

Improvements were made to make the system less dependent on drivers’ attitudes toward their own safety. The system automatically alerts and, if required, enforces the expected behavior, and it provides new data for trend analysis and further improvement.

e-Journey Management. Making improvements in the management of drivers’ behaviors and vehicle movements was an important step toward the “zero motor-vehicle accidents” vision but was not enough to eliminate vehicular incidents. Failures in journey execution were a common cause, responsible for a large percentage of motor-vehicle accidents. To address this issue, a project was created to develop a solution focused on “management by exception” through the integration of fleet management and automated journey monitoring. This electronic system is designed to be sufficiently intelligent to monitor vehicle movements constantly and verify that their execution complies with preapproved conditions. If any breach is identified, the system alerts Road Journey Management Center personnel for immediate intervention.

Passive Controls. Considering the risks of possible rollovers, a decision was made to reinforce vehicles with rollover protection. All company-owned vehicles must meet international automotive safety and quality standards, including applicable safety and crush tests by manufacturers, and must be currently safe to use.

Driver Training. A competence-management concept was adopted regarding driver training. The driver training program has been revised to address critical defensive-driving fundamentals, company-specific driving hazards, and safe-driving expectations. The Core Defensive Driving course covers 25 specific defensive-driving skills, and a student must demonstrate not only academic knowledge of the defensive-driving material but also practical mastery of the defensive-driving skills taught.

9 Aug 2016

The Influence of Communication About Safety Measures on Risk-Taking Behavior

Risk-taking behavior is an important contributing human factor in incidents and is notoriously difficult to influence. Anecdotal evidence suggests that people have a hard-wired optimal perceived level of risk: people compensate for risk-reducing measures by behaving in a riskier fashion until the desired level of risk is reached again. This study examined the effect of the number of protective shields, and of uncertainty about that number, on the risk-taking behavior of participants.

The main aim of safety research is to identify ways to prevent accidents and to ensure the safety of workers. Human error—or, in other words, unsafe behavior—has been found to be a major cause of accidents, and its elimination, therefore, is a prime goal for improving safety. The human factor is most effectively addressed by tackling the organizational system instead of focusing on incorrect actions by individuals. An effective strategy is to increase the level of protection or the number of safety barriers. The concept of the safety barrier features most prominently in the Swiss-cheese metaphor of accident causation. The Swiss-cheese model describes accidents as being caused by unchecked hazards that are allowed to cause losses. A series of barriers is placed between the hazard and that which may be harmed. The barriers keep the hazard under control and prevent it from causing harm. However, these barriers are always less than 100% adequate and contain weaknesses or holes. The barriers, therefore, often are compared to slices of Swiss cheese. Unlike real Swiss cheese, the holes in the barriers are dynamic and open and close at random. When these holes in the barriers are aligned, a path is created, leading to a potential accident.

Intuitively, one would assume that safety improves proportionally both to the protection measures taken and to the improvements in the design of such measures. The more protective equipment given to the workers, the safer they will be, either because of a reduced risk of accident or because such measures mitigate the effects of accidents. This approach assumes that human error arises from unintended actions such as memory lapses and attention failures. However, error can also be attributed partly to intended actions such as risk taking. The question addressed in this paper is the extent to which people’s risk-taking behavior is influenced by their awareness of the numerous preventive interventions in place. The pivotal issue is whether people adapt their risk-taking behavior as a result of their awareness of the number, and of the effectiveness, of the barriers in place.

Risk Taking
Alteration of behavior is a recurring theme in the safety literature and is described in a variety of ways—“risk compensation,” “risk homeostasis,” or the Peltzman effect. When people feel safer, they tend to take greater risks. People do not want to reduce risk to an absolute minimum but, rather, to optimize it. People are willing to accept a certain level of risk if risky behavior (e.g., breaking a barrier in the Swiss-cheese model) comes with benefits.

There is virtually no behavior without a certain measure of risk attached to it. Therefore, the challenge is to optimize rather than to eliminate risk. This optimum, also known as the target level of risk, is the level that maximizes the overall benefit. Previous studies suggest that people constantly compare the amount of risk they perceive with their target level of risk and that they will adjust their behavior in order to eliminate any discrepancies between the two. This psychological mechanism constitutes a case of circular causality.

The mechanism is similar to a thermostat, where there are fluctuations in the room temperature but where such fluctuations are averaged over time; the temperature will remain stable unless set to a new target level. The risk homeostasis theory (RHT) transfers the homeostatic effect of a thermostat to risk behavior. RHT posits that, similar to a thermostat that has a target temperature, people have a target level of risk. People will change their behavior in order to maintain their target level of risk.
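The thermostat analogy can be made concrete with a toy feedback loop. The numbers and the linear risk model below are illustrative assumptions, not taken from the RHT literature; the point is only that added protection gets compensated by riskier behavior until perceived risk returns to the target:

```python
def adjust_behavior(perceived_risk, target_risk, speed, step=1):
    """One homeostatic step: speed up when feeling safer than the target,
    slow down when feeling more exposed (all units are arbitrary)."""
    if perceived_risk < target_risk:
        return speed + step
    if perceived_risk > target_risk:
        return max(0, speed - step)
    return speed

def settle(target_risk, protection, speed, steps=20):
    """Iterate the loop, with perceived risk modeled crudely as
    speed minus the level of protection in place."""
    for _ in range(steps):
        speed = adjust_behavior(speed - protection, target_risk, speed)
    return speed
```

Under this toy model, with a target risk of 3 and no protection the loop settles at speed 3; adding two units of protection is fully compensated by driving two units faster, leaving perceived risk exactly where it started, which is the homeostatic prediction tested in the experiment below.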

Research Questions
Previous research has given some indications that people compensate for safety measures such as barriers or shields by behaving in a riskier fashion. However, besides such anecdotal evidence, no systematic research has been carried out to consider the effect on behavior of informing people of the number of safety barriers in place for their protection. This paper sought to answer two questions:

  • Do people indeed compensate for greater layers of, or more effective, protection by behaving in a riskier fashion?
  • How do people behave when they are uncertain about the number of shields of protection?

The Experiment

Fig. 1. Screenshot of the game.

A side-scrolling videogame was custom made for this experiment. It required the player to navigate a small spaceship through an asteroid field, with asteroids moving from right to left after materializing randomly on the y-axis. The spaceship was controlled with the arrow keys on the keyboard. The up and down keys moved the spaceship up and down, and the right and left keys increased and decreased the speed, respectively. Five different speed levels ranged from 1 (default) to 5 (maximum). The number of shields left was indicated, as can be seen in Fig. 1. Whenever the ship crashed against an asteroid, the player lost a shield. This process continued until there were no shields left; a collision at that point would end the game. In circumstances where participants did not know the number of shields the ship had, a question mark was displayed in front of their spaceship. The game measured the time a player spent on each shield and the total number of points a participant scored. Points were gained by staying alive. The faster a participant flew, the more points he or she gained per second. Such bonus points acted as an incentive for participants to increase speed in order to achieve higher scores.
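The incentive structure can be summarized in a few lines. The exact point values are not given in the paper, so the sketch below simply assumes that points per second equal the current speed level (1 to 5):

```python
def total_score(segments):
    """Points for a run, given (speed_level, seconds) segments.

    Assumption for illustration: a player earns speed_level points per
    second, so flying faster trades collision risk for a higher score.
    """
    return sum(level * seconds for level, seconds in segments)
```

Ten seconds at the default speed would earn 10 points, while the same ten seconds at maximum speed would earn 50; this gradient is the pull toward risk taking that the experiment relies on.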

To test whether the layers of protection, and uncertainty about their number, had an effect on the risk-taking behavior of participants, an experimental design was chosen. This design allowed for the determination of causal relationships. The experiment focused on the relation between the number of shields (maximum five) and the level of risk taking. A total of 104 students participated in the experiment. Participants were randomly assigned to two out of six possible conditions:

  • Condition 1—Zero shields
  • Condition 2—One shield
  • Condition 3—Two shields
  • Condition 4—Three shields
  • Condition 5—Four shields
  • Condition 6—Unknown number of shields (actually four shields)

The data were analyzed, and univariate analyses of variance (ANOVAs) were used to evaluate trends across conditions for four hypotheses.

Hypothesis 1. Mean speed differs across conditions: The more barriers participants are exposed to, the greater the degree of risk they take in flying.

A univariate ANOVA was calculated to see if the average speed was different for the varying conditions. Condition 6 was excluded. A significant difference was found, with a significant upward linear trend. Hypothesis 1 was confirmed.

Hypothesis 2. Mean time spent per shield differs across conditions: The fewer shields participants are exposed to, the more time they spend per shield.

A univariate ANOVA was calculated to see if the average time spent per shield was different for the varying conditions. Condition 6 was excluded. A significant difference was found, with a significant downward linear trend representing the data best. In Condition 5, players spent less time per shield than in Condition 1. Hypothesis 2 was confirmed.

Hypothesis 3. In conditions of uncertainty, participants fly more slowly than in conditions where they know how many barriers they are exposed to.

A repeated-measures ANOVA with a Greenhouse-Geisser correction determined that the average speed of play throughout Condition 6 differed statistically significantly between shields. A significant upward linear trend represented the data best. Hypothesis 3 was confirmed.

Hypothesis 4. In conditions of uncertainty, participants spend more time per shield than in equivalent conditions where they know how many barriers they are exposed to.

A repeated-measures ANOVA with a Greenhouse-Geisser correction determined that the average time played per shield throughout Condition 6 did not differ statistically significantly between shields. Hypothesis 4 was not confirmed.
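The raw data are not reproduced in the paper, but the core computation behind each univariate test, the one-way ANOVA F statistic, is compact enough to sketch in pure Python (the sample data in the test are invented for illustration):

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over a list of samples, one list per
    condition: between-group mean square divided by within-group mean square."""
    k = len(groups)                                 # number of conditions
    n = sum(len(g) for g in groups)                 # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2 for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

A large F relative to the F distribution with (k − 1, n − k) degrees of freedom is what “a significant difference was found” refers to above; the linear-trend tests additionally weight the condition means with evenly spaced contrast coefficients.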

When participants entered the game with five shields, they played in a significantly riskier fashion than when they entered with only a single shield. This is a highly relevant finding, considering that costly risk-assessment techniques may be of little value if improvements in safety systems are outweighed by the risks introduced by changes in operator behavior. However, removing safety features might not be a very ethical move. One suggestion could be to hide protection mechanisms from system operators until needed. This would create a feeling of uncertainty or ambiguity among workers concerning their safety, which could be accompanied by a communication strategy emphasizing this uncertainty. Putting a greater number of layers of protection in place was not rendered completely ineffective by increased risk taking. Although the limitations of this study should be recognized, organizations might reconsider the practice of informing their employees of the number of safety measures that are taken. Preserving ignorance among employees concerning the enhanced protection in place creates a stronger safety buffer because it reduces risk-taking behavior and improves employees’ efforts to make sure that the presumed last layer holds.

8 Aug 2016

Making an E&P/Fisheries Management Plan Work in Ghana With Multiple Stakeholders

Ghana’s fishers and coastal communities have raised concerns over the effects of offshore oil exploration and production relating to the giant Jubilee field. In 2014, the Ghana Environmental Protection Agency initiated an independent study of marine conditions as a key step forward, with the endorsement of the Ministry of Fisheries. The study included extensive participation from fishing groups, oil companies, government officials, nongovernmental organizations, and other interested parties, and produced a number of recommendations. This paper describes the process and results of the multistakeholder approach to solving marine-zone conflicts.

Following the discovery of the Jubilee field in 2007, the rapid development and launch of offshore commercial oil production have attracted major oil and gas companies, with huge capital injections into the national economy as well as sorely needed social investments and developments in communities of the Western Region and other sectors.

Offshore-resource development has raised expectations and has fueled speculation and confusion about the actual environmental effects of the oil and gas activities. A resulting backlash, including blame of oil and gas activities for myriad problems associated with fishing and the marine environment—such as declining fisheries and poor landings, frequent beaching of whales, algae nuisance, presence of tar balls, and contamination of fish—indicated a clear need for scientific study and research-based action to reveal the actual environmental situation and trends.

Independent Study
An independent study was commissioned in 2014 by Ghana’s Environmental Protection Agency and Kosmos Energy Ghana to explore stakeholders’ environmental concerns and evaluate the extent to which marine environmental challenges could be linked to or mitigated by oil and gas industry action. The study focused on six major themes:

  • Fishery and fishing decline
  • Whale mortality
  • Algal blooms
  • Tar balls
  • General marine environmental conditions
  • General coastal socioeconomic conditions

After a review of nearly 200 studies, extensive stakeholder consultation, and independent analysis of existing research and data, the joint international/Ghanaian study team found that some of the concerns raised were not directly attributable to the offshore oil and gas activities, while others appeared to be. The study made 13 general recommendations for addressing these major issues and concerns, whether attributable to oil and gas activities or not. The recommendations were prioritized further to facilitate specific immediate actions and included

  • Management plans to mitigate the effect of an exclusion zone
  • Strengthened capacity for governance of fishing activities
  • Additional measures to minimize harm to whales
  • Marine-noise study and management practices
  • Algae source study and management plan
  • Tar-ball fingerprinting analysis and management plan
  • Continuous improvement in waste management
  • Continuous improvement in oil-spill prevention and response
  • Integrated, participatory, and transparent baseline data compilation
  • Additional measures to minimize effects on fishing activities

Marine Fisheries Advisory Committee and Action Plans
To ensure the effective implementation of adopted recommendations, the Ministry of Fisheries and Aquaculture Development (MoFAD) established a Marine Fisheries Advisory Committee (MFAC) with the mandate of formulating sectorwide action plans.

The action plans that were developed are the result of focused efforts, research, and consultation by the members of the MFAC. Each of the action plans represents a brief and focused response to the priority recommendations raised by the independent study. The plans are intended to be strategic and high level, providing description of key activities and objectives. Appropriate institutions/organizations were identified to perform the next step of detailing the stated actions for implementation where feasible. Otherwise, contracting arrangements and options have been indicated for selecting contractors or consultants to elaborate on the actions for execution.

MFAC assumes responsibility for coordinating the implementation of the action plans, per its charter, under the auspices of an Inter-Ministerial Oversight Committee (IMOC) through MoFAD. MFAC’s overarching mission (under the direction of IMOC) includes supporting MoFAD and other ministries in

  • Ensuring strategic coexistence of oil and gas and fisheries sectors
  • Ensuring harmonious use of marine space and seabed
  • Promoting intersectoral management of marine resources
  • Coordinating information sharing and decisions on fishing and other industries that use marine resources

Membership of MFAC reflected the view that successful marine-sector management would require an inclusive strategy, ensuring that a broad array of interests was represented.

The action plans are intended to inform a 2-year initial design, launch, and implementation period, which will be followed by review, realignment, and follow-on activities coordinated by MFAC.

The seven broad areas covered by the action plans are

  • Governance of the marine sector
  • Stakeholder awareness on oil-development and marine-resource issues
  • Marine environment research in support of resource protection
  • Marine spatial planning for integrated management
  • Marine- and coastal-data generation and use
  • Land-based sources of contaminants and degradation of coastal waters
  • Livelihood diversification in coastal fishing communities

The action plans include objectives and activities for each of the seven focus areas, as well as scope, timeline, implementation strategy, and potential lead and supporting organizations. A summary version of the integrated action plans follows.

Governance.
Primary Objectives.

  • To ensure coastal- and marine-policy coordination is placed at the highest ministerial level
  • To facilitate integrated sector management
  • To promote streamlined regulatory functions
  • To facilitate effective institutional working relationships and collaboration
  • To promote efficient access to sector data

Major Milestones and Activities.

  • Formation of new, intergovernmental coordinating bodies
  • Establishment of a marine-database center

Stakeholder Awareness.
Primary Objectives.

  • To develop evidence-based information targeted at specific stakeholders
  • To disseminate the information
  • To facilitate prompt reporting of environmental challenges, such as beached whales or sighting of tar balls

Major Milestones and Activities.

  • Carrying out awareness activities and dissemination through public forums, focus-group discussions, seminars, public announcements, video documentaries, radio or television programs, websites, and other means
  • Developing material and delivering training

Marine-Environment Research.
Primary Objectives.

  • To introduce conservation practices in support of improved fisheries and sustainable fishing efforts
  • To ensure continuous monitoring of cetaceans and investigation of causes of their mortality in Ghanaian waters
  • To intensify investigation on the beneficial uses of green algae

Major Milestones and Activities.

  • Designation of potential marine parks, conservation areas, and spawning and feeding grounds
  • Fish stock assessment and study and introduction of closed seasons
  • Cetacean migratory monitoring and documentation
  • Tar-ball sampling, documentation, and reporting

Marine Spatial Planning.
Primary Objectives.

  • To determine and designate areas for streamlined multiple use
  • To promote integrated management of marine resources and space
  • To ensure peaceful coexistence between fishers and the industry
  • To publicize the national Marine Spatial Plans (MSPs)
  • To enact appropriate legislation to back the MSPs

Major Milestones and Activities.

  • Review and study of existing designated areas
  • Specialized site studies
  • Development of a safe-sea-access strategy
  • Stakeholder review of prospective MSPs and sites

Marine and Coastal Data.
Primary Objectives.

  • To create a centralized coastal and marine database
  • To facilitate consistent time-series monitoring
  • To support national use of the data generated (both baseline and monitoring)
  • To conduct trend analysis to interpret events in the marine and coastal sector

Major Milestones and Activities.

  • Baseline survey on fisheries value chain
  • Coastal socioeconomic survey
  • Marine-environment baseline data collection (e.g., water quality and chemical indicators, fish tissue analysis, and chemical constituents)

Land-Based Sources of Contaminants/Coastal Degradation.
Primary Objectives.

  • To establish the extent of land-based pollution loading on coastal waters
  • To support preparation of district land-use schemes and local plans

Major Milestones and Activities.

  • Sampling and monitoring of coastal waters for chemical pollution
  • Developing district land-use planning schemes focused on agricultural use and waste management
  • Waste management, including waste segregation and recycling awareness

Coastal Livelihood Diversification.
Primary Objectives.

  • To enhance access to education
  • To promote higher education for children, particularly in fishing communities
  • To promote complementary livelihood opportunities and skills development

Major Milestones and Activities.

  • Comprehensive scholarship scheme
  • Implementation of access-to-education projects
  • Development of livelihood-skills projects

It remains to be seen how successful Ghana’s multistakeholder approach to marine and fisheries management will be; the research and stakeholder-informed action plans have just been launched. But the effort has been noteworthy in its explicit recognition of the need for government leadership, industry support, and civil-society cooperation and collaboration in order to tackle a critical environmental and socioeconomic challenge. Ghana’s marine environment and fisheries are of paramount importance to its coastal population, and a successful collaborative model could represent an approach that has global applicability.

7 Aug 2016

New Optical Gas-Imaging Technology for Quantifying Fugitive-Emission Rates

Optical gas-imaging (OGI) technology has been developed and can be used to detect leaks of volatile organic compounds (VOCs) from process equipment. Using OGI to detect leaks is more effective than using the US Environmental Protection Agency’s Method 21 because OGI is visual, making detection faster, and can survey an area instead of one component at a time. Although OGI can be very effective in detecting leaks, it does not provide a quantitative measure of leak rate (LR), hindering its adoption as a true alternative to Method 21. This paper describes the development of quantitative OGI (QOGI) technology.

Method and Preliminary Results
Approaches have been proposed to establish a quantitative relationship between the pixel intensity difference with and without a plume (ΔI) and the product of concentration in ppm and path length in meters (ppm·m) for a gas column represented by a pixel in the infrared (IR) image for a given temperature differential (ΔT) between ambient air and the background. This quantitative relationship has been confirmed with a study showing that there is a monotonically increasing relationship between ΔI and concentration for a uniform black background that was temperature controlled. That study’s data also showed that ΔI increases as the temperature of the background increases for a specific gas concentration.

The working principle of QOGI can be described briefly as follows:

  • IR images of a leak are analyzed for intensity on a pixel-by-pixel basis.
  • Each pixel represents a column of hydrocarbon vapor between the IR camera and the background.
  • Pixel contrast intensity (ΔI) is defined as the intensity difference at the pixel level between the background with and without the absorption because of hydrocarbon molecules.
  • ΔI is a function of the temperature difference between the background and the plume (ΔT).
  • At a given ΔT, the intensity is proportional to the number of hydrocarbon molecules in the vapor column.
  • The LR is reflected in both the pixel intensity and the number of pixels that have a ΔI above a certain threshold; conversely, the combination of intensity and pixel count determines the LR.

On the basis of this methodology, a computer program has been developed that captures raw IR data from an IR camera and analyzes it for LR. The IR camera must be radiometrically calibrated to make it capable of measuring temperature at the pixel level. To analyze the IR images, the user must also enter an estimate of ambient temperature and distance from the component being tested into the IR camera. All other variables required for determining LR are preprogrammed. With the captured IR images and the two user-provided input parameters, the program will calculate the mass leak rate in pounds per hour (lbm/hr).
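The pixel-level workflow described above can be sketched in code. The threshold, calibration constant, and ΔT/distance scaling below are placeholders of my own, since the paper does not publish its calibration; the sketch only illustrates the stated principle of summing above-threshold pixel contrast.

```python
def estimate_leak_rate(delta_i, delta_t, distance_m,
                       threshold=5.0, k_cal=1.0e-4):
    """Illustrative QOGI-style LR estimate (lbm/hr).

    delta_i    -- 2D list of per-pixel contrast values (ΔI)
    delta_t    -- background/plume temperature difference (ΔT)
    distance_m -- camera-to-component distance, entered by the user

    k_cal, the threshold, and the exact scaling are assumed
    placeholder values, not the proprietary calibration."""
    if delta_t == 0:
        return 0.0
    # Pixels with contrast above the threshold are treated as plume;
    # at a fixed ΔT, each pixel's intensity is proportional to the
    # number of hydrocarbon molecules in its vapor column.
    column_sum = sum(v for row in delta_i for v in row if v > threshold)
    # Distance squared accounts for the area each pixel subtends.
    return k_cal * column_sum / abs(delta_t) * distance_m ** 2

# Example: a 4x4 patch of uniform contrast 10 at ΔT = 5, 3 m away
rate = estimate_leak_rate([[10.0] * 4] * 4, delta_t=5.0, distance_m=3.0)
```

With only the ambient-temperature and distance inputs supplied by the operator, everything else in such a pipeline would be preprogrammed, matching the description above.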

Fig. 1—Accuracy of QOGI on the basis of 80 tests for propane.

Work to date has measured the component leak rate using QOGI technology on accurately controlled releases, with the focus on propane. Flow rate, or LR, was set using a calibrated mass-flow controller. The IR camera was positioned approximately 10 ft away from the release point. All of the tests performed to date (80 total) were conducted in an outdoor, open-air environment. The types of backgrounds tested included a uniform temperature-controlled metal board, a building wall, and gravel. These tests were conducted in sunlight and in shade, in ambient temperatures from 37 to 95°F, in relative humidity from 50 to 90%, and in various moderate wind conditions. Because the true LRs were known in these tests, the accuracy of this method can be assessed by comparing the true LR and the LR measured by QOGI (Fig. 1). Eight LRs were tested, represented by eight pairs of bars in Fig. 1. The green bars represent the true LRs, with the number indicating the rate. The purple bars represent the average LR measured by the QOGI method. The red error bars represent ±1 standard deviation from the average. Within these 80 tests, the measured LRs deviated from the true values by -17 to 43%.
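The statistics behind each pair of bars can be sketched as follows; the measured values here are hypothetical stand-ins, not the paper's data, and serve only to show how the average, the ±1-standard-deviation error bar, and the percent deviation are formed.

```python
import statistics

def percent_error(measured, true):
    """Deviation of a measured LR from the true LR, in percent."""
    return 100.0 * (measured - true) / true

# Hypothetical repeat QOGI readings at one controlled release rate
true_lr = 1.0                              # lbm/hr, set by the mass-flow controller
measured = [0.92, 1.10, 1.31, 0.86, 1.05]  # illustrative values only

avg = statistics.mean(measured)    # height of a purple bar
sd = statistics.stdev(measured)    # half-length of a red error bar
errors = [percent_error(m, true_lr) for m in measured]
```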

A limited number of tests have also been performed for methane and ethylene. Leak rates were determined for these materials using IR response factors (RFs) developed on the basis of the IR spectra of methane and ethylene relative to the spectra of propane. Measured LRs used RFs developed from these known spectra (vs. direct RF measurement) and indicate good agreement between the true and measured rates for the set of tests conducted.

The accuracy of the QOGI method as discussed previously pertains to a limited range of conditions, and more-comprehensive tests are planned to characterize QOGI accuracy under a broad range of environmental conditions. The initial results are encouraging, especially when comparing measurement accuracy with inherent uncertainties in Method 21. The uncertainties associated with the current Method-21-based methodology come from three potential sources:

  • Measured screening values (SVs)
  • Compound and instrument-specific RFs
  • Correlation equations that are applied to the SVs to estimate emissions rate

The SV is a concentration measurement that uses a probe to examine a component to determine the maximum concentration for a small set of components that potentially could leak. Concentration is not proportional to the LR but is presumed to be for estimating LR by use of Method 21. Factors such as the geometry of the leaking component, the pressure inside the equipment, wind speed, and atmospheric turbulence will affect the concentration measurement (or SV reading). For a small leak area (i.e., from a single point), the concentration measured by use of Method 21 will be much higher than that for a more-diffuse leak. This will produce significantly different SVs with Method 21, even if the leaks are controlled to the same rate.

Another source of error for Method 21 is the RF for each compound in the gas leak, which corrects detector differences for each compound in the emitted gas. The portable detector used for Method 21 surveys is calibrated with one compound, but actual material leaking may be a different compound or a mixture of compounds. To determine the true concentration of the leaked compound, an RF is applied to account for differences between calibration gas and the emitted gases. RF is a predetermined ratio between the reading of the calibration gas and the gas in question.

The EPA has compiled RFs for approximately 200 compounds. The RFs can vary by an order of magnitude for different compounds or from one instrument type to another. Per the EPA’s 1995 Protocol, if the RF is less than 3, no adjustment is required. Because an unadjusted reading can thus understate the true concentration by up to a factor of 3, the measured SV could carry an error of up to 200% even when the protocol is followed. In addition, if the RF does not reflect the actual mixture of compounds emitted, additional error in the estimate of LR is introduced.
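The arithmetic behind the RF cutoff can be illustrated with a short sketch; the reading and RF values below are hypothetical, and the function is a paraphrase of the stated rule, not the protocol text.

```python
def corrected_sv(sv_reading, rf, epa_cutoff=3.0):
    """Apply a compound response factor to a Method 21 screening
    value: per the 1995 protocol as described in the text, no
    adjustment is made when RF < 3; otherwise the reading is
    scaled by the RF. (Illustrative logic only.)"""
    if rf < epa_cutoff:
        return sv_reading        # reading used as-is
    return sv_reading * rf

# A detector calibrated on one compound reads 1,000 ppm for a gas
# whose true concentration is RF times the reading:
reading = 1000.0
rf = 2.9                         # just under the cutoff, so no adjustment
true_conc = reading * rf
# Error of the unadjusted SV relative to itself approaches 200%
# as the RF approaches the cutoff of 3:
error_pct = 100.0 * (true_conc - corrected_sv(reading, rf)) / corrected_sv(reading, rf)
```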

Even if the SV is perfectly accurate, the potential for error exists when correlation equations are applied to the SV to estimate mass emission rate. The correlation equations were developed on the basis of field tests in which SVs were determined by Method 21 and actual mass emission rates were determined with another technique. The correlation between these paired data sets was not very strong, as indicated by low R2 values from 0.32 to 0.54. As a result, the ratio of LR predicted by these correlation equations to the measured LR ranges from approximately 0.2 to greater than 4. As such, errors in the LRs estimated with the EPA protocol for Method 21 could be in the range of -80 to 300% when all of the potential sources of error are propagated.
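The quoted -80 to 300% range follows directly from the ratio bounds. The sketch below shows that conversion, plus a power-law correlation of the general form such equations take; the coefficients are placeholders of my own, not the published EPA values for any equipment class.

```python
def correlation_lr(sv_ppm, a=1.5e-6, b=0.73):
    """Hypothetical power-law correlation of the form LR = a * SV**b;
    a and b here are assumed placeholder coefficients."""
    return a * sv_ppm ** b

def error_range_from_ratios(ratio_low=0.2, ratio_high=4.0):
    """Convert the reported ratio of predicted to measured LR
    (approximately 0.2 to greater than 4) into a percent-error range."""
    return (100.0 * (ratio_low - 1.0), 100.0 * (ratio_high - 1.0))

low, high = error_range_from_ratios()   # the -80% and +300% bounds
```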

With Method 21, the concentrations are measured and emission rates are estimated. In comparison, QOGI directly measures mass LRs. In the tests reported in this paper, the errors in the QOGI results are substantially smaller than those that would be expected from application of Method 21.

It has been demonstrated, with initial but compelling data, that QOGI is technically feasible. QOGI directly measures emission rates. This is fundamentally different from Method 21, which estimates emission rates using concentration measurements, SVs, RFs, and correlation equations. With a QOGI commercial product, operators have to enter only ambient temperature and the distance from the leak site into the IR camera. The QOGI product can then capture the IR images for approximately 30 seconds and provide the operator with a measurement of the mass emission rate. Consequently, it is expected that QOGI will be able to significantly reduce the time to complete a survey while providing more-accurate measurement of emission rates.

This article, written by Special Publications Editor Adam Wilson, contains highlights of paper IPTC 18471, “New Optical Gas-Imaging Technology for Quantifying Fugitive-Emission Rates,” by Hazem Abdel-Moati, ExxonMobil Research Qatar; Jonathan Morris and Yousheng Zeng, Providence Photonics; Petroula Kangas, ExxonMobil Chemical Europe; and Duane McGregor, ExxonMobil Research and Engineering, prepared for the 2015 International Petroleum Technology Conference, Doha, Qatar, 7–9 December. The paper has not been peer reviewed. Copyright 2015 International Petroleum Technology Conference. Reproduced by permission.

6 Aug 2016

Operational Risk: Stepping Beyond Bow Ties

This paper presents the multiple-physical-barrier (MPB) approach to operational (or process) risk, an extension of the common bow-tie technique for identifying risk. Bow ties identify a variety of barrier types and help communicate safety principles that link causal factors and subsequent actions to a specific event. By narrowing the focus to physical barriers and by developing success paths that enable each barrier to perform its safety function, the MPB approach moves further toward a systematic approach to operational-risk management.

Introduction—Operational Risk, Bow Ties, and Physical Barriers

Fig. 1—Example bow-tie analysis for a well kick while drilling.

Operational Risk. One of the more elusive issues in the upstream oil and gas industry is the understanding of process safety or process risk—especially how it overlaps with industrial (or personal) safety—and the types of tools needed to assess and manage it. An important part of this hinges on the role that barriers play in the analysis and what constitutes a barrier. Some companies consider training to be a barrier, others consider certain meetings to be barriers, and still others consider safety procedures themselves to be barriers. Indeed, there is little practical agreement among companies as to how process risk is assessed, managed, and communicated. As a result, there can be similarities, but, ultimately, no two process-risk assessments from different companies look the same.

Several different barriers are shown in the bow-tie diagram in Fig. 1. Barrier types there include the well-control program, mud checks, fill-ups, and escalation barriers.

Bow-Tie Analysis. Bow-tie analysis has been widely used in the offshore oil and gas industry as a technique for communicating safety issues and safety control measures. Bow-tie analysis is event based; it seeks to tie causal factors and subsequent actions to a specific event, such as a kick. Bow-tie diagrams help teams better understand the sequences that can lead to serious process or operational risks. They also identify mitigating actions that can be taken to reduce the consequences of a major event.

The MPB Approach—A Pathway to Success
The MPB approach was developed in collaboration with the upstream oil and gas industry. It takes a step beyond bow ties toward a more-direct and -systematic understanding of operational risk so that operators can design their operations to be successful. In so doing, risk is systematically identified and evaluated and can be incorporated into the management system to help ensure the safety of offshore operations.

This paper posits that operational risk stems from the breach, removal, or failure to properly install or maintain a required physical barrier. If all required physical barriers are in place and effective, then there will be no operational safety incidents. If all of the cement-plug barriers, fluid-column barriers, and blowout-preventer barriers had been effective, there would not have been any of the major accident events in the Gulf of Mexico, including explosions, loss-of-well-control events, and major environmental spills. Operational risk is fundamentally about establishing and maintaining MPBs.

Physical barriers are designed, constructed, operated, and maintained to ensure that they can perform under adverse conditions. In many cases, multiple physical barriers are required so that, in case one barrier fails, another is in place to achieve the safety function (e.g., contain hydrocarbons). More broadly, the MPB approach reflects the concept that the number of physical barriers should be commensurate with the risk of the associated activity.

The focus of the MPB approach lies with two leading questions:

  • What are the physical barriers required for the operation at hand?
  • What is needed to ensure that these barriers succeed in meeting their safety functions?

These questions marry principles from two very different industries (nuclear and maritime). A focus on physical barriers is foundational to the nuclear-safety industry, while the ability to diagram and trace how critical systems function (e.g., performance qualification standards) forms a key part of training for engineers in the US Navy and the US Coast Guard. Both perspectives were adapted, and templates were developed to diagram this approach as a success path.

It is this understanding of success paths, especially when applied to the physical barriers, that paves the way toward systematically elucidating the risks. It is important to visualize what must be successful in order to understand what can fail. In effect, this approach is designed to increase operational awareness with the aim of managing operational risk more effectively.
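One way to picture the barrier/safety-function/success-path hierarchy is as simple data structures. The sketch below is an illustration only; the barrier names, success paths, and conditions are hypothetical examples in the spirit of the well-control bow tie, not content from the paper.

```python
from dataclasses import dataclass, field

@dataclass
class SuccessPath:
    """A set of conditions (automated or human) that must all hold
    for a barrier to perform its safety function."""
    name: str
    conditions: dict = field(default_factory=dict)  # condition -> satisfied?

    def succeeds(self) -> bool:
        return all(self.conditions.values())

@dataclass
class PhysicalBarrier:
    name: str
    safety_function: str
    success_paths: list = field(default_factory=list)

    def effective(self) -> bool:
        # The barrier performs its safety function as long as at
        # least one of its success paths is intact.
        return any(p.succeeds() for p in self.success_paths)

# Hypothetical well-control barriers:
fluid_column = PhysicalBarrier(
    "Fluid column", "Maintain hydrostatic overbalance",
    [SuccessPath("Mud program", {"mud checks current": True,
                                 "hole kept full": True})])
bop = PhysicalBarrier(
    "Blowout preventer", "Seal the wellbore on demand",
    [SuccessPath("BOP readiness", {"function tested": True,
                                   "accumulator charged": False})])

# Operational risk arises wherever a required barrier is not effective:
at_risk = [b.name for b in (fluid_column, bop) if not b.effective()]
```

Representing success paths this way makes visible exactly what must hold for each barrier to succeed, which is the question the MPB approach asks.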

This success-path model is straightforward and provides a number of benefits, including:

  • It is a systematic mechanism for getting at the root cause of operational safety risks that can lead to major accidents. The top-down approach starts at the highest levels first and then enables drill-downs to whatever level of detail is needed to identify the safety problem or match the available data.
  • It provides a risk-informed communications framework for communicating with rig workers, senior executives, regulators, and everyone in between. Rig workers can identify their roles within the success paths and readily understand how their actions are integral to maintaining the success of the barrier. At the other end of the spectrum, for example, executives are sometimes faced with making decisions regarding new technologies, and key details may not be fully understood. This approach is well-suited to bring them up to speed in many of the technical details.
  • A success-path approach enables decision makers to understand the key points required for success and then participate in the discussion about risks and safety. Further, it provides a consistent and rigorous basis for defending the decisions that have been made, whether to senior executives or third parties. The foundations of this approach have been demonstrated to hold up in legal situations.
  • It also serves as an important training tool that enables students to grasp the key operational safety issues. Each physical barrier can be systematically analyzed to provide the foundation needed to manage the operational working environment safely.

The value of the MPB approach is that it steps beyond the bow-tie analysis techniques by placing the focus directly where the risk is—namely, on the physical barriers, their safety functions, and the success paths (both automated and human) that are needed to ensure the success and safety of the operation.

The hierarchy of physical barrier, safety function, and success path is not a coincidence. This chain of cause-and-effect logic forms the basis of operational-risk management for a system, a rig, a well, or a facility. Ultimately, however, it is the role of the operational plan or management system to call out strategies for maintaining the success paths.

The MPB approach is sufficiently intuitive for everyday use yet powerful enough for large-scale integration. When it comes to process (or operational) safety on offshore oil and gas facilities, the devil is in the details, but the MPB approach guides its practitioners to find and identify those details systematically. The benefits are not only for the practitioners but also for guiding the entire operational team on a path toward intuitively understanding the safety implications of their roles and implementing a successful operation.

This approach also positions operational-risk management to be quantified at some point in the future. When reliability quantification is incorporated, the safety significance of any component, system, or set of human actions can be compared and evaluated numerically.
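A minimal sketch of what such quantification could look like, assuming independent barriers and illustrative failure probabilities (neither the independence assumption nor the numbers come from the paper):

```python
from math import prod

def prob_all_barriers_fail(failure_probs):
    """Under an independence assumption (a simplification), the
    probability that every physical barrier fails on demand is the
    product of the individual per-demand failure probabilities."""
    return prod(failure_probs)

# Three hypothetical barriers (cement plug, fluid column, BOP)
# with assumed per-demand failure probabilities:
p_loss_of_containment = prob_all_barriers_fail([1e-2, 1e-2, 1e-3])
```

In practice, common-cause failures make independence optimistic, which is one reason such quantification needs the careful success-path decomposition the MPB approach provides.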

5 Aug 2016

How the Petroleum Industry Can Learn From the Ebola Crisis of 2014

The Ebola crisis of 2014 was one of the worst infectious disease outbreaks in recent history. It also occurred in a region with endemic medical risks and poor medical infrastructure. These two factors make it an important learning exercise for the global petroleum community. This paper reviews the Ebola outbreak from the viewpoint of an onshore and offshore petroleum operator, providing insight into the threats the outbreak actually presented by looking past the media hype and examining its organizational effects.

West Africa experienced the most severe Ebola virus disease (EVD) outbreak ever recorded. The most-affected countries are Liberia (10,672 cases as of 16 August 2015), Sierra Leone (13,494 cases), and Guinea (3,786 cases). Other affected countries include Mali, Nigeria, and Senegal in Africa; and Italy, Spain, the United Kingdom, and the United States, although at much lower levels. Pictorial representation of the number of cases and deaths in affected countries as of 5 July 2015 is shown in Fig. 1.

Fig. 1—Number of deaths and people affected by Ebola.

The World Health Organization on 6 August 2014 declared this Ebola outbreak in West Africa to be a public health emergency of international concern.

Lessons Learned
The West Africa Ebola outbreak has been the largest, longest, and most complex since the virus was discovered in 1976. It has had the highest number of cases and deaths ever reported for Ebola.

A functional health system is a prerequisite for any coordinated preparedness for and response to any possible outbreak. The 2005 revision to the International Health Regulations (IHR) is a legally binding agreement whose purpose is “to prevent, protect against, control, and provide a public health response to the international spread of disease in ways that are commensurate with and restricted to public health risks, and which avoid unnecessary interference with international traffic and trade.” None of the most-affected countries were compliant with the IHR, and this surely delayed the timely identification of the disease, the setting up of contact tracing and adequate surveillance measures, and the early implementation of infectious control measures in healthcare settings.

This epidemic served as a reminder of the possible negative consequences of globalization (i.e., rapid spread of infection across continents and oceans, putting the entire world at risk). The West Africa Ebola outbreak also reconfirmed that infectious diseases cannot be easily cordoned off to one country or continent. Rather, an outbreak will have an immediate and critical global effect if immediate prevention and control measures are not put in place.

In addition, countries with weak health systems and poor health infrastructure cannot withstand the effect of such rapidly spreading epidemics. In such situations, the country’s health systems will collapse, leading to more and more deaths because many patients with other diseases (e.g., malaria and HIV/AIDS) would not approach the clinics for fear of being exposed to the epidemic. This could lead to economic shutdown of the affected countries, leading to humanitarian crises. Thus, there is a significant need to strengthen and restructure basic public health systems in these countries, including primary healthcare facilities, laboratories, surveillance systems, and critical care facilities.

Another important lesson learned is the value of collaborating with the media in order to provide correct information. This is essential to avoid or control the spread of panic in the community. Erosion of the population’s faith in government and international agencies can be fatal for both sides.

Lessons Learned for Companies Operating in Tropical Hot Spots at Risk of Zoonotic Infections
Companies operating in tropical hot spots—typically at high risk of zoonotic infections—that are interested in business continuity must vigilantly understand the health and political context in which they operate in order to keep their employees safe and protected.

From a preparedness perspective, companies should undertake a thorough review of the capability of the country’s national health and veterinary system at the beginning of the project, supplemented by periodic reviews. A detailed health-impact assessment of the project should be conducted before the start of the project and for each major project expansion. This assessment should include a review of zoonotic infections and not be exclusively based on the epidemiology of the diseases already present in the country. Companies should develop flexible response plans informed by the characteristics of the disease or an outbreak and not merely based on fixed triggering factors. The response to the Ebola epidemic, in fact, could not rely on trigger matrices developed for other possible outbreaks because a single case would require a substantial and immediate response plan.

When developing a strategy to protect its premises and employees, the company should keep in mind that the regular health, safety, and environment (HSE) emergency-response-plan (ERP) triggers will not always be appropriate in such infectious-disease outbreaks. This was one of the biggest stumbling blocks encountered in this Ebola outbreak: getting HSE managers to understand that the response to an infectious-disease epidemic is very different from the response to any other threat handled through the normal HSE ERP and traditional triggers.

Risks Involved for Oil and Gas Companies: Offshore Suspected Cases
First Scenario. The biggest risk for oil and gas companies operating in affected countries is having a symptomatic Ebola case offshore. A person who is completely asymptomatic, and who travels from an affected country after unprotected exposure, may very well become symptomatic while offshore. This would cause fear and panic among the rest of the team and expose the medical personnel available offshore until the diagnosis is confirmed either way. In such a scenario, the best prevention and control measure would be to isolate the suspected case. The medical staff members should protect themselves with the correct personal protective equipment but should still minimize physical contact with bodily fluids of the suspected case until help arrives from a specialist team.

The company should ensure that there are dedicated and trained teams available at the project sites that will be able to respond to such offshore emergencies almost immediately. This specialist team should be ready to move the suspected case, most likely by boat (helicopter providers will be reluctant to respond in such cases), and immediate directions should be given to clean and disinfect the offshore facilities as soon as possible.

Second Scenario. The company is operating in a nonaffected country, and people from affected countries are working offshore and there is a suspected case on the rig. The chances are that there will be no systems in place to deal with such a patient. There may be no laboratory facilities available, so even excluding EVD will be complex.

First, preventing this situation from happening should be the main objective. This can be achieved through training of all the workers who travel to and from affected countries. They should understand how to mitigate the risk and why it is so important to inform the employer about any potential contact with an Ebola-risk case. They should then be allowed to stay at home for 21 days and monitor their health before going offshore.

A second important consideration is that no symptomatic individual from an affected country should be allowed to go offshore. Consideration should also be given to whether the company will allow people from affected countries to go offshore before they have completed a 21-day window period (in the case of EVD) outside of the affected country. The situation may turn out to be complex and controversial, depending on the rank of the individual or workforce and the level of understanding about the disease.

Although, at the time this paper was written, the current outbreak was still not completely over, systems are now in place in affected countries to ensure that the same uncontrolled spread of EVD among humans seen in 2014 will most likely never happen again. The lessons learned from this outbreak have definitely sensitized the world to the fact that the spread of EVD among humans becomes almost immediately a global threat and cannot remain confined to a country or a continent. The key is to respond rapidly and effectively at the early stages of the spread. This early-response system was already tested in Nigeria and Mali, where medical infrastructure is more or less the same as in the three heavily affected countries. But, because of the rapid response of the Nigerian and Malian governments, supported by the international community and nongovernmental organizations, the ongoing spread among humans was stopped very effectively, with only a few cases reported.

This was the biggest Ebola outbreak ever, with many lives lost, including healthcare workers in the line of duty. However, the long-term benefit from this outbreak is the development of vaccines that will save many more lives in the future. Such an unprecedented and uncontrolled outbreak is highly unlikely to occur again. These vaccines are still going through various test trials but thus far have demonstrated very positive results.

4 Aug 2016

HSE Conference Highlights Past Progress, Future Challenges

In April of this year, SPE held its 25th-anniversary Health, Safety, Security, Environment, and Social Responsibility (HSSE-SR) conference in Norway. It is incredible to think about the progress this industry has made since the very first event was held in 1991. The theme this year was fitting: Sustaining Our Future Through Innovation and Collaboration. The technological solutions that have been introduced have helped us be not only more efficient but also safer and more environmentally friendly. The challenges we face now, and into the future, include more oversight from all of our stakeholders—from the governments to the public—all of whom grant us the license to operate.

Tom Knode

We currently face the acute challenge of sustaining our performance in a low-price environment. The industry is responding by introducing proven lean techniques to become more capital efficient. At the same time, the demands on HSSE-SR functional professionals have never been greater, with regulations being created or updated at a rapid pace. The public has an increased focus on our performance. This means the HSSE-SR representatives in our industry must become increasingly savvy, both technically and operationally. Functional leaders must have a firm grounding in the HSSE-SR risks inherent to the operations as well as a good understanding of the business and financials. This knowledge will enable companies to prioritize and direct their focus and resources to the right places.

The journey to an injury- and incident-free operation has many paths, and progress is being made. Process safety is becoming a routine part of our structured programs and management systems. The ongoing discussions around unconventional development are driving understanding of everything from the potential for injection-induced seismicity to the role of government in regulating drilling and completions. Companies are better at integrating social issues into their overall risk picture and are working more proactively with communities to ensure their license to operate. We are seeing more focus on the emissions from our operations. Because of climate-change concerns, governments are requiring that more be done, which is driving companies to develop new technologies to help identify and fix point sources.

And, finally, there is always the human element. It once was very common to see an incident investigation in which the root cause was listed as someone not following a process. Our interactions with high-reliability organizations are shifting the understanding of human factors. At the aforementioned SPE conference, technical experts on human factors, including researchers and pilots for major airlines, discussed how organizations can take the human element into account as they build their equipment, processes, and procedures. This helps us look at failures (and successes) through a brain-based lens and understand how to reduce both the likelihood of errors and the severity of their outcomes.

3 Aug 2016

Ohio Study Tries To Pin a Number to Earthquake Risk

Research and development firm Battelle is working on a new induced-seismicity study that aims to help wastewater disposal well operators in Ohio stay on the good side of state regulators.

The company says the study, expected to be completed later this year, will be the first to quantify a disposal well’s potential to cause an earthquake.

Concern over disposal wells in Ohio was brought to the fore when a series of earthquakes jolted residents of Youngstown, Ohio, in late 2011. A disposal well near downtown was blamed for the tremors, and regulators responded by shuttering that well along with several located nearby. This is the scenario Battelle hopes its study can prevent from happening again.

“It’s really a risk-based mapping study,” said Srikanta Mishra, a senior research leader and energy fellow with Battelle. “The idea is to show that, within the state of Ohio, there are some areas that should be avoided and there are some areas that could have potential for wastewater disposal without representing any adverse risk to the communities.”

Several independent operators in Ohio are forming a joint industry project to help fund the study and, though the study is incomplete, Battelle has already identified a number of locations in the state where those companies may want to think twice about drilling a new disposal well.

Mishra said the high-risk areas have low storage capacity, tight rocks that limit the rate of injection, and close proximity to fault zones. Researchers are also weighing other risk factors, including the depth of the target formation and the history of natural seismic activity in the area.
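The article names these screening factors but not how Battelle combines them. Purely as a hypothetical illustration, not the study's method, a weighted score over normalized factors might look like the sketch below; every weight, cutoff, and normalization in it is invented.

```python
def site_risk_score(storage_index, injectivity_index, km_to_fault,
                    depth_km, quakes_nearby):
    """Toy screening score in [0, 1] built from the factors named in the
    article. Inputs are normalized so that higher always means riskier:
    storage_index and injectivity_index run 0-1 (1 = ample storage or
    easy injection), km_to_fault is distance to the nearest mapped
    fault, depth_km is target-formation depth (deeper assumed riskier,
    i.e., closer to basement faults), and quakes_nearby counts
    historical natural events in the area. All scalings are invented."""
    factors = {
        "low_storage":     1.0 - min(storage_index, 1.0),
        "tight_rock":      1.0 - min(injectivity_index, 1.0),
        "fault_proximity": max(0.0, 1.0 - km_to_fault / 10.0),
        "depth":           min(depth_km / 5.0, 1.0),
        "seismic_history": min(quakes_nearby / 10.0, 1.0),
    }
    weights = {
        "low_storage": 0.2, "tight_rock": 0.2, "fault_proximity": 0.3,
        "depth": 0.1, "seismic_history": 0.2,
    }
    return sum(weights[k] * factors[k] for k in factors)
```

Under this toy scoring, a site with little storage, tight rock, and a fault 1 km away scores far higher than one with good storage 20 km from the nearest fault, which is the qualitative ranking a risk-based mapping study is after.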

A map of eastern Ohio shows the estimated wastewater fluid storage capacity of the region’s subsurface. Researchers are hoping to use these data to help disposal well operators know which parts of the state are most susceptible to injection-induced seismicity. Image courtesy of Battelle.

These data points are used to create a fluid flow analysis model that predicts the pressure buildup in a formation as a result of a specific volume of injected waste water. “Then we couple that with a geomechanics model that will say ‘If this is the pressure, then that will translate to this level of stress change, which can produce an earthquake or seismic activity of this magnitude,’” Mishra said.

This means that operators will also be able to use the data to determine how much wastewater they can inject into a particular well or area before they are likely to trigger a seismic event. This would allow those companies to self-impose limits on injection in order to avoid regulatory action.
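The article gives the logic of the coupled model but not its equations, so the sketch below stands in with textbook-style placeholders: a confined-aquifer storativity relation for the fluid-flow step, a constant Coulomb coupling factor for the geomechanics step, and made-up numbers throughout. None of it comes from the Battelle study.

```python
RHO_G = 1000.0 * 9.81  # water density (kg/m^3) x gravity (m/s^2): Pa per metre of head

def pressure_rise(volume_m3, storativity, area_m2):
    """Fluid-flow step (toy version): head rise from injecting a volume
    into a confined layer (V = S * A * dh), converted to pressure in Pa."""
    return RHO_G * volume_m3 / (storativity * area_m2)

def coulomb_stress_change(dp_pa, coupling=0.6):
    """Geomechanics step (toy version): the fraction of the pore-pressure
    rise assumed to destabilize a critically stressed fault."""
    return coupling * dp_pa

def max_safe_volume(critical_stress_pa, storativity, area_m2, coupling=0.6):
    """Invert the two steps: the largest injected volume that keeps the
    Coulomb stress change below the chosen critical threshold."""
    dp_allowed = critical_stress_pa / coupling
    return dp_allowed * storativity * area_m2 / RHO_G

# Example with placeholder numbers: a 0.1-MPa critical stress change,
# storativity of 1e-4, and a 10-km^2 pressurized footprint
vmax = max_safe_volume(1e5, 1e-4, 10.0e6)
```

A real model would replace the uniform-pressure step with a transient flow solution and the constant coupling with a poroelastic fault model, but the self-imposed injection limit the article describes falls out of exactly this kind of inversion.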

Mishra said some of the work that remains to be done includes filling in data gaps from areas of the state where there is a low density of disposal wells and thus less geologic information.

The results of the study could have a big effect on the future of Ohio’s disposal well business because the state not only handles its own waste water but also imports much of the waste water generated in neighboring Pennsylvania. Because of Pennsylvania’s strict regulations, there are only 11 disposal wells in the state compared with more than 200 in Ohio.

Mishra said similar studies could be carried out for other parts of the country but Ohio was selected to be the first because Battelle had already built up a large database of geologic information from an earlier government-funded study on the state’s CO2- and wastewater-storage potential.

2 Aug 2016

A New Reality for Training and Safety Technology

If you were at the Offshore Technology Conference in Houston this year, you may have noticed the increasingly popular trend of exhibitors using virtual reality (VR) headsets to engage with attendees.

The unique visual and immersive qualities that make these devices such great marketing tools are also what some in the oil and gas industry say make them such powerful training tools.

VR technology has come a long way since its clunky ancestors were first introduced in the 1990s. Thanks to high-resolution screen technology and powerful gaming engines, the latest generation has been deemed ready for prime time by technology experts. Equally important, VR has become affordable; some of the high-end devices now sell for only a few hundred dollars.

An illustration depicts what an oilfield worker sees in his field of view while using augmented reality glasses. Technology developers from various sectors say the technology enhances productivity and can improve safety. Image courtesy of Optech4D.

This confluence of capability and cost is why Vincent Higgins left his job as a senior-level industry consultant to become the founder and chief executive officer of Optech4D. The 4-year-old Houston-based startup develops custom training programs that recreate oilfield and facility environments inside the VR devices.

“I had an idea around simulation that I thought could really be of value,” Higgins said, adding that with VR “you have this visceral experience of being there and it’s as if it is actually happening. That adds a level of learning that you could never get in any other situation.”

The company also creates programs for a similar technology called augmented reality (AR). Instead of presenting users a computer-generated world, AR gives people an enhanced version of the real world complete with contextual information about where they are, where they are going, or what is in front of them.

Analysts are predicting that 2016 will be a watershed year for both VR and AR. Global market research firm Gartner said in a recent report that more than 1.4 million devices will be sold this year—a tenfold increase over last year—and that by 2017 the number of shipped units will jump to more than 6 million.

According to its website, Optech4D has so far done work for oil companies Eni and Shell along with the industry construction giant Bechtel and rotating equipment supplier Dresser-Rand. For one of its operator clients, Optech4D designed a VR-based helideck officer training program that performed so well the company has begun certifying trainees onshore instead of spending millions each year on flying them offshore to become certified.

Higgins noted that VR training allows supervisors to digitally track every move a trainee makes and to place trainees in scenarios too dangerous to recreate in the real world, such as crashing a helicopter onto the deck of an offshore rig. Workers can also plug into the VR devices wherever they are for training on demand.

One of the company’s next projects is to develop a well control VR simulation that would serve as an alternative to the more expensive well control schools. Down the road, Higgins wants to simulate well cementing and routine operations such as the inspection and maintenance of blowout preventers.

Many of those in this space like to point out that younger workers are not only most in need of training but also the group most likely to embrace this new approach. Because they grew up playing the very video games that helped drive the technology to commercial readiness, Higgins said, young people intuitively adapt to VR. “They get in there, and they learn almost immediately using virtual reality,” he added.

In terms of adoption and development, VR currently has the edge, but AR is not far behind. AR technology can be used on computer tablets or with the emerging group of smart glasses that tend to look like safety goggles on steroids. With the glasses, AR gives users a hands-free ability to follow instructions or seek guidance while performing a task.

For example, a field technician equipped with AR glasses loaded with gigabytes of information could walk up to a broken compressor unit and, with a glance to the left, be shown an easily readable and scrollable repair manual.

A look to the right, and now the worker sees a 3D model of the compressor that can be virtually disassembled with a simple hand gesture to show all the internal bits. The worker can even play an instructional video specific to that compressor model before turning back to the repair job.

If the machinery has sensors or an Internet-of-Things device, then the AR glasses could allow the worker to see temperature, pressure, and other safety-critical data before even touching the compressor. And, if there is a camera built into the glasses, the worker can have an impromptu video chat with a supervisor whenever trouble arises. Higgins envisions a future in which AR technology enables a single equipment expert to collaborate remotely with 50 or even 100 field technicians a day.

The big idea is that all of this will cut down on mistakes and downtime. Last year, the world’s largest aircraft manufacturer, Boeing, which coined the term AR, presented a study it did with Iowa State University that backs those claims. The study showed that untrained university students using AR to assemble a mock airplane wing got it right on the first try 90% of the time and worked 30% faster than the groups who did not use AR technology.

With the hardware becoming widely available, the focus will soon turn to the software developers who will ultimately be charged with making VR and AR devices indispensable tools of our modern age. But to achieve ubiquity, those developers need companies from all sectors to start using the technologies and help uncover their most valuable applications.

This chicken-and-egg situation reminds Higgins of the time when another smart gadget was first rolled out into the market before eventually taking the world by storm. “I waited in line in 2007 to get the first iPhone, and I used it mainly for email and texting,” he said. “Now, I use it for 50 or 60 different things because the apps became available. It’s become part and parcel of everything we do.”


27 Jul 2016

OESI Forum Sounds the Alarm for Offshore Safety

Registration is now open for the Ocean Energy Safety Institute’s (OESI’s) Alarm Management Forum, which will be held 24 August at Maersk Training in Houston. Current confirmed speakers represent Maersk Training, Transocean, the International Association of Drilling Contractors, Shell, American Airlines, Texas A&M, Schlumberger, the National Aeronautics and Space Administration, the National Academy of Sciences, PAS, and ProSys.

The forum will provide the opportunity to share and discuss best practices in the management of alarms for safer offshore operations and tackle questions such as

  • What process is used to manage the prioritization of alarms?
  • How does the ocean energy industry deal with alarm flooding, alarm rationalization, and other alarm issues?
  • What are the best practices of other industries that have had to deal with alarm management issues?
  • What research and development efforts are under way to help alarm systems increase situational awareness and support proper decision making offshore?

Small-group sessions will discuss alarm management metrics and topics that could be included in the future development of an alarm management specification for the ocean energy industry. The forum will produce proceedings that include the presentations, discussions, current and best practices, ideas and tasks to help continuously improve the safe management of alarms, and a framework of ideas to help develop an alarm management specification.


27 Jul 2016

Post-Deepwater-Horizon Research Consortium Announces First Project Awards

The Texas OneGulf Center of Excellence has announced more than USD 2 million in research projects to address priority problems affecting the health and well-being of the Gulf of Mexico and those who depend on it. Texas OneGulf is led by the Harte Research Institute (HRI) for Gulf of Mexico Studies at Texas A&M University-Corpus Christi.

The Texas OneGulf Center of Excellence has announced more than USD 2 million in research projects to address priority problems affecting the health and well-being of the Gulf of Mexico. Photo credit: Getty Images.

These projects, funded by the Office of the Governor, represent the first major allocation of research dollars from the Texas OneGulf consortium, which was created after the Deepwater Horizon oil spill to direct funding in support of programs, projects, and activities that restore and protect the environment and economy of the Gulf Coast region. The projects tackle a variety of issues that directly affect the Gulf of Mexico and its residents, including studying the effect of red tide blooms on human health and the health care infrastructure and using underwater gliders to search the coast for hypoxic dead zones.

“We are very appreciative of the governor’s support of Texas OneGulf as it has allowed us to fund these diverse and innovative projects,” said Larry McKinney, director at HRI. “What happens in the Gulf of Mexico affects the health and economic wellbeing of Texas citizens on a daily basis.”

A consortium of nine Texas institutions, Texas OneGulf is a unique multidisciplinary team of marine science, socioeconomic, and human health researchers united to promote collaborative research and problem-solving actions.

The projects, helmed by a variety of institutions across Texas, are

Gulf of Mexico Report Card Prototype for Texas
Texas A&M University-Corpus Christi, USD 550,000
Collaborating institutions: Harwell Gentile and Associates and the University of Delaware
This project will develop a prototype Gulf of Mexico report card by evaluating the overall ecosystem health of the Texas Gulf Coast. Workshops of scientists, stakeholders, and Texas environmental managers will convene to identify the pressures and stressors that impinge on coastal Texas ecosystems and define long-term sustainability goals.

Restoring and Enhancing Structurally Complex Nursery Habitat To Enhance Reef Fish Populations
Texas A&M University at Galveston, USD 223,752
Collaborating institutions: Texas A&M University-Corpus Christi and The University of Texas Rio Grande Valley
This project will develop a structurally complex nursery habitat using both natural and man-made materials to improve the early life survival and recruitment success of reef-dependent fish and gather baseline biological information on the fishery benefits of creating and enhancing these habitats in the northwest Gulf of Mexico.

Isotope Geochemistry of Texas Coastal Waters
Texas A&M University-Corpus Christi, USD 220,365
Collaborating institution: Texas A&M University
Texas has 400 miles of coastline, and growing evidence shows extensive areas of hypoxia (critically low oxygen) as well as a buildup of nutrients within this complex coastal ocean of bays, estuaries, and barrier islands. This project will use an underwater glider to conduct sampling, providing early- and late-summer overviews of carbon and nitrogen source variations in the coastal Texas water column, and will examine how these variations contribute to water column hypoxia. This project will complement sampling scheduled for summer 2016 under a separate grant.

Developing a Predictive Ecosystem Model for the Lower Laguna Madre
The University of Texas Rio Grande Valley, USD 213,956
Collaborating institution: Texas State University
This project will develop an ecological modeling system for sustainable management of the Lower Laguna Madre, a data-poor yet ecologically important region of the Gulf of Mexico.

The Marine Microbiome as a Sentinel for Ecological Health and Resiliency
The University of Texas Medical Branch, USD 186,224
Collaborating institution: Texas A&M University at Galveston
This project will establish a baseline of diversity and species composition in microbial communities (microscopic populations of bacteria, fungi, algae, and other microorganisms) in near-shore Gulf of Mexico environments and monitor changes associated with oil pollutants.

Texas OneGulf Center of Excellence Pilot Project Program
Texas A&M University Health Science Center, USD 150,000
Collaborating institution: The University of Texas Medical Branch
The Texas OneGulf Disaster Research Response Program will create, for the first time, an infrastructure to support disaster research response encompassing environmental, human health, and economic assessment capabilities. The project will provide seed money for pilot projects that can be deployed rapidly to assess the effect of disasters along the Texas Gulf Coast in real time.

Socioeconomic Indicators for Coastal Community Disaster Response and Resilience
Texas A&M University-Corpus Christi, USD 125,060
This project will identify socioeconomic indicators that can be used in disaster response assessments by bringing together leading expertise in this area to populate a searchable database of indicators for community and human well-being, working with the Gulf of Mexico National Estuarine Research Reserves to apply these in a local context, and publishing online and in print a guide to socioeconomic indicators for disaster response and community resilience.

Red Tide Data Integration Project
Texas A&M University-Corpus Christi, USD 103,650
When harmful algal blooms such as red tide algae grow and disintegrate along the Texas coast, their neurotoxins may become aerosols, causing adverse effects that can significantly increase emergency room traffic and visits to doctors. The Texas Harmful Algal Blooms (HAB) Data Integration Project will bring together Texas researchers with expertise in HABs and medical researchers familiar with data on the effects of HABs on humans, to better prepare first responders, emergency rooms, and the medical system to respond to red tide events and minimize human health risks.

Texas OneGulf Network of Experts Communications
Texas A&M University-Corpus Christi, USD 81,390
Collaborating institution: Amazee Labs
This grant will develop and implement a communication strategy that includes the existing Gulf of Mexico web portal GulfBase.org to enhance the ability of the Texas OneGulf Network of Experts (TONE) to inform all Texas stakeholders about its capabilities and expertise. TONE is a network of more than 150 Texas experts in human health, science, marine policy, and related fields convened to work to tackle Gulf problems. The goal of this tool is to facilitate communications among researchers, policy makers, and the general public.

Impact of Environmental Criminal Enforcement on Disaster Response
University of Houston Law Center, USD 80,251
Collaborating institution: Texas A&M University-Corpus Christi
This study will aid future responses to environmental incidents and releases in the Texas Gulf region by shedding light on the true risks of environmental enforcement after disasters and offer suggestions on how best to promote effective and speedy disclosure and cleanup in light of those risks. This research will assemble a database of all major industrial disasters in the United States since 2000, focusing on incidents that have occurred in the Texas Gulf region.

Species Identification Training for Effective Monitoring and Management of HABs
Texas A&M University at Galveston, USD 60,000
Collaborating institution: Gulf of Mexico Coastal Ocean Observing System
Effective monitoring and management of HABs, overgrowths of algae that can affect ecosystems and human and animal health, rely on accurate and timely identification of the species involved. However, many of those trained in this specialty are either retired or retiring. This program will provide critical comprehensive training in identification and taxonomy for scientists, technicians, and managers.