5 Nov 2015

Beyond the Headlines: What Is All This Talk About Emissions?

Emissions are in the air and in the headlines every day. Whether the discussion is on the human health effects of pollution, greenhouse gas (GHG) emissions and their effect on climate change, or the misrepresented emissions performance of a diesel engine, the bottom line is that there is a global focus on emissions.

Emission discussions are also in the forefront of the oil and gas industry. In 2012, the US Environmental Protection Agency (EPA) mandated new emission rules for our industry dealing with volatile organic compounds (VOCs), hazardous air pollutants (HAPs), and, most recently, methane. On 18 August, the agency proposed additional measures (EPA 2015) that supplement the earlier regulations and that “together will help combat climate change, reduce air pollution that harms public health, and provide greater certainty about Clean Air Act permitting requirements for the oil and natural gas industry.”

The rules set limits on emissions. Emissions that cannot be captured must be combusted at a minimum destruction efficiency of 95%. Destruction efficiency is not as extensive a measure of performance as combustion efficiency; however, the move to performance-based legislation is a positive one and will almost certainly lead to improved air quality and a healthier relationship with communities in close proximity to oil and gas development. Within the US, various states have enacted, or are in the process of enacting, legislation specific to their local situation, on the proviso that their requirements are not weaker than the federal rules.

In Colorado, for example, much of the industry activity occurs in the vicinity of communities. The state regulator is moving toward mandating devices that not only combust efficiently, but also fully enclose the combustion to eliminate the visibility of the flare. It is also spot inspecting facilities with special cameras that detect uncombusted hydrocarbons, which are a clear indicator of poor performance. Companies that are emitting uncombusted hydrocarbons will be subject to considerable fines.

North Dakota has put a rule in place requiring the industry to find an acceptable method of using a significant, measurable portion of a well’s associated gas; otherwise, oil production from that well will be capped.

There are other situations in which the new EPA requirements, combined with state rules, will result in improved operations.

The existing 2012 EPA legislation covered natural gas wellsites, production gathering and boosting stations, natural gas processing plants, and natural gas compressor stations. The proposed EPA legislation requires the reductions of methane and VOC emissions from hydraulically fractured oil wells and extends the emission reduction targets downstream covering equipment in the natural gas transmission segment.

In view of this regulation, two questions are central to the emissions issue.

  • Is there a technology that can deliver results cost-effectively and address the community concerns about emissions?
  • Is it possible to comply with the new rules in the current low oil and natural gas price environment?

Venting and Flaring
In the early days of petroleum exploration, associated gas was not considered a useful product because of the difficulties in transporting it to markets and the low price it received. As a result, gas was simply burned off at the well or vented into the atmosphere. Unsophisticated means of combustion persisted from those early years until well after the first patent for the flare stack was submitted by Exxon Research in 1951.

Even today, flaring and venting continue in locations where local markets and gas transportation infrastructure are lacking, or where the gas itself is of low volume or contaminated with other incombustible gases and therefore uneconomic and impractical to conserve. We are unfortunately again in a period of low oil and gas prices, hence any technology solutions have to make economic sense.

Venting of natural gas, whose key constituent is methane, has a significant effect on air quality and climate change. Natural gas also contains VOCs and HAPs, which affect air quality and human health. According to the EPA, methane is the second most prevalent GHG emitted from human activities in the US, and nearly 30% of these emissions come from oil production and the production, transmission, and distribution of natural gas.

The global warming potential (GWP) of methane is 25 times greater than that of carbon dioxide, and venting methane releases more than nine times as much carbon dioxide-equivalent GHG, on a tonnage basis, as combusting it cleanly, as shown in Table 1.

Table 1 also shows the effect of combustion efficiency on the amount of GHG emitted from methane combustion, especially poorly combusted gas. Quantification of the GHG emissions during gas flaring is difficult because the measurement of the efficiency of a flare is problematic. The ultimate efficiency of a flare is governed by many factors: composition and heat content of the gas; size of the entrained liquid droplets in the gas stream; direction of the wind, especially if the gas is blown away from the ignition source; velocity of the gas at the flare tip, etc. All these factors are continuously changing, hence there is no single efficiency number that can be applied universally.
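The roughly nine-to-one venting-vs.-combustion ratio and the effect of combustion efficiency can be reproduced from the molecular weights and the GWP figure cited above. A minimal sketch, assuming the 100-year GWP of 25 and that combustion converts methane completely to carbon dioxide:

```python
# CO2-equivalent emissions from venting vs. combusting 1 tonne of methane.
# Assumes the 100-year GWP of 25 cited in the article and complete
# conversion of CH4 to CO2 at 100% combustion efficiency.
GWP_CH4 = 25.0
MW_CH4, MW_CO2 = 16.04, 44.01  # molecular weights, g/mol

def co2e_per_tonne_ch4(combustion_efficiency: float) -> float:
    """Tonnes of CO2-equivalent emitted per tonne of methane handled."""
    burned = combustion_efficiency          # fraction converted to CO2
    vented = 1.0 - combustion_efficiency    # fraction escaping as CH4
    return burned * (MW_CO2 / MW_CH4) + vented * GWP_CH4

venting = co2e_per_tonne_ch4(0.0)   # pure venting: 25 t CO2e
flaring = co2e_per_tonne_ch4(1.0)   # clean combustion: ~2.74 t CO2e
print(venting / flaring)            # ~9.1, the "more than nine times" figure
print(co2e_per_tonne_ch4(0.95))     # a 95%-efficient flare: ~3.86 t CO2e
```

The same function shows why the 95% destruction-efficiency floor matters: each percentage point of uncombusted methane adds far more CO2-equivalent than the combusted fraction does.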

For this reason, the EPA has taken the approach of encouraging improved and measurable combustion technology that can be tested to show a destruction efficiency of 95%. The ability to measure performance and address the problem with facts creates a comfort level for communities that express concerns about the health effects of oil and gas emissions. Public concerns have delayed industry activity or caused requests for moratoriums on oil and gas activity. Measurable performance standards create clarity and help develop the social license to operate.

Certain industry participants have also taken the initiative to conduct their projects in ways that exceed regulatory requirements by using the best available technology or best practices. In all cases, new technologies are used to effect performance changes, gradually replacing practices that have a negative effect on the environment and/or the communities that live in close proximity to oil and gas developments.

Increasing Combustion Efficiency
Technology that combusts at 100% combustion efficiency not only exists but is readily available. Combustion efficiency measures a device’s ability to reduce hydrocarbons to carbon dioxide and water vapor. Waste gases, and associated gas from oil production that is impractical to conserve, are combusted to the extent that the impact on the environment is significantly reduced.

From the public perspective, this type of combustion results in no odors or visible smoke because of its high efficiency. Regulating agencies thus have a measurable technology that can be audited for performance. Although this seems a standard all jurisdictions should strive for, only the US has incorporated such performance measures into its requirements. For example, if the 2011 uncontrolled methane emissions of 6.2 Bcf from hydraulically fractured oil well completions (EPA 2014) were cleanly burned at 100% combustion efficiency, GHG emissions from this source would be reduced by 89% at a cost of less than USD 0.40/ton.

Similarly, a business case can be made in dealing with VOC emissions and pollutants known as air toxics—in particular, benzene, toluene, ethylbenzene, and xylene (BTEX) resulting from natural gas dehydration. These pollutants are reduced to benign carbon dioxide and water vapor through clean, efficient combustion technology using 60% to 80% less fuel gas than a traditional flare, thereby significantly reducing GHG emissions and operating costs. The reduction in operating costs typically delivers a payout in fewer than 6 months on the capital investment.

Our analysis has indicated that based on the estimated 38,000 dehydrators in operation in the US, GHG emissions can be reduced by approximately 340 million tons at a cost of less than USD 1.65/ton over a 10-year period while providing an effective solution to VOC and BTEX emissions. Additionally, some clients are using the waste heat from the combustion process to keep the water in vapor form, thus eliminating condensing equipment, water storage, and trucking and disposal costs.

Taking a holistic approach highlights an opportunity to use the waste heat generated from clean combustion for process, building heat, or water vaporization/treatment. If the natural gas is of sufficient quality and volume, then it could also be used to generate power with reciprocating engines for the site or the grid. If the volume or quality of the gas is poor, then the waste heat from clean combustion can be used to generate site power with an organic Rankine cycle engine.

Additionally, there is an opportunity to treat the wastewater stream thermally with the waste heat generated from the combustion of the associated gas. Typical wells that are not served by pipeline infrastructure may produce on the order of 50 B/D of oil, 250 B/D of water, and 150–200 Mscf/D of associated gas. Even at elevated oil prices, the economic operation of such a well is challenging. As oil prices drop below USD 50/bbl, the water handling expense alone will exceed the oil revenue. This is the case when relying on trucking and deep well injection of the produced water.

Although there are elaborate methods to desalinate the entire stream, the economics are not yet suitable for this type of well. Using the excess thermal energy from the associated gas combustion under careful, controlled conditions, up to 85% of the produced water volume is vaporized. The remaining 15% of the original volume is carefully diverted to storage and then disposed of by deep well injection. By reducing the volume, the economics of flowing this well become positive. This method requires combustion with a device in which the heat is contained and can be transferred effectively. Note that the 85% water volume in this example is returned to the ecosystem instead of being lost to deep well injection.
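A rough sketch of the economics for the example well: the production rates come from the passage above, while the USD 10/bbl water-handling cost and USD 45/bbl oil price are illustrative assumptions (the vaporizer’s own rental and operating costs are not modeled here).

```python
# Example well: 50 B/D oil, 250 B/D water (figures from the article).
# Water-handling unit cost and oil price are illustrative assumptions.
oil_rate, water_rate = 50.0, 250.0   # bbl/day
oil_price = 45.0                     # USD/bbl, below the USD 50 threshold
water_cost = 10.0                    # USD/bbl trucked and injected (assumed)
vaporized_fraction = 0.85            # share of produced water driven off

oil_revenue = oil_rate * oil_price                               # USD/day
handling_truck_all = water_rate * water_cost                     # USD/day
handling_after_vap = water_rate * (1 - vaporized_fraction) * water_cost

print(oil_revenue, handling_truck_all)  # handling alone exceeds oil revenue
print(handling_after_vap)               # disposal burden after vaporization
```

Under these assumed unit costs, trucking and injecting all 250 B/D costs USD 2,500/day against USD 2,250/day of oil revenue, while vaporizing 85% of the water cuts the disposal burden to USD 375/day, which is the sense in which the well’s economics turn positive.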

In other words, this combustion technology is able to use waste heat to return water, a valuable resource, to the environment, where it will become part of the water cycle and be available for future use.

Of course, combined heat and power (CHP) is also readily accessible to technology providers that generate and are able to control heat. In short, a better way of doing things opens up a number of positive options.

For the sake of discussion, let us say that your oil well is burning associated gas and nearby landowners are complaining about black smoke and odors. You have just purchased a new combustion device. After 3 months of better air quality and no complaints, you decide to manage your produced water differently. You currently spend USD 2,100/day on water transportation and disposal, and you can rent a vaporizer for USD 1,100/day. This well generates USD 900,000 yearly in oil revenue, and you have just added roughly USD 300,000 yearly to this well’s bottom line without even looking into replacing the diesel generator with a CHP option.
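The arithmetic behind that bottom-line figure is straightforward; the USD 300,000 in the text implies roughly 300 operating days per year, which is the assumption made here:

```python
# Daily figures from the example: trucking/disposal vs. vaporizer rental.
trucking_cost = 2_100.0    # USD/day, current water transport and disposal
vaporizer_rent = 1_100.0   # USD/day, vaporizer rental

daily_saving = trucking_cost - vaporizer_rent   # USD 1,000/day
operating_days = 300                            # assumed, to match the article
annual_saving = daily_saving * operating_days
print(annual_saving)                            # ~USD 300,000/yr added
```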

Emission regulation will continue as the public becomes increasingly concerned with the potential health impact. The rules continue to get more stringent. The challenge for our industry is to find ways to effectively comply with the measures, especially in this low oil price environment. We believe the rules create clarity for both the industry and the community and will enable meaningful discussions toward creating the social license to operate.

There is a strong business case for change. Doing things more efficiently will cost less, and moving projects forward without public delays will also see oil production flowing sooner. It is a matter of knowing which technologies are available to resolve these new challenges. Technology is already here to provide for regulatory compliance, improved air quality, social acceptance, and true economic value.

EPA. 2014. Oil and Natural Gas Sector: Hydraulically Fractured Oil Well Completions and Associated Gas during Ongoing Production. US Environmental Protection Agency, Office for Air Quality Planning and Standards, Washington, D.C. (April 2014).

EPA. 2015. EPA’s Air Rules for the Oil & Gas Industry. Proposed Climate, Air Quality and Permitting Rules for the Oil and Natural Gas Industry: Fact Sheet. US Environmental Protection Agency, Washington, D.C. (18 August 2015).

Audrey M. Mascarenhas is president and chief executive officer of Questor Technology. She has worked in the energy and environment industry for more than 33 years, starting her career with Gulf Canada Resources. At Questor, she has focused on technology solutions to eliminate flaring and venting and the opportunity to utilize the energy for power generation and water treatment. Mascarenhas served as an SPE Distinguished Lecturer during 2010–2011 and currently serves on SPE’s Distinguished Lecturer Committee and Communication & Energy Education Committee. She holds a BS degree in chemical engineering from the University of Toronto and an MS degree in petroleum engineering from the University of Calgary.

John Sutherland is chief operating officer (COO) of Questor Technology. He joined Questor in 2008 and was instrumental in developing the company’s engineering and technical solutions team. Sutherland became COO in 2014 and, prior to that, held technical and managerial positions during his 26-year career with various exploration and production companies and the Alberta provincial energy regulator, AER. He is a graduate of the British Columbia Institute of Technology and the University of Calgary with degrees in mechanical engineering.


SPE Effluent Discharge Management Workshop Addresses Standards and Regulations

The SPE Trinidad and Tobago Section recently hosted an Applied Technology Workshop (ATW) on oil and gas effluent discharge management in Port of Spain, Trinidad and Tobago.

The event brought together petroleum and petrochemical industry professionals involved in generating effluent discharges to the receiving environment, regulators, and those involved in the design, construction, operation, and maintenance of treatment systems for effluent discharge.

The goals of the workshop were to

  • Explore regional and international legislation as applied to effluent discharge in the oil and gas industry and compare with the local regulatory framework in Trinidad and Tobago.
  • Define the compliance issues of water pollution rules in Trinidad and Tobago specific to the oil and gas industry.
  • Obtain the industry best practice for effluent treatment and disposal with emphasis on large oil and gas drilling and production operations both onshore and offshore.
  • Explore the available technology and its applicability to achieve compliance.

There was a consensus that urgent collaborative action is needed among all stakeholders. The path forward received endorsement from the country’s regulators (the Ministry of Energy and Energy Affairs [MEEA] and the Environmental Management Authority [EMA]), operators, and service providers. The areas of focus in the workshop included legislative reform, use of applicable technology, and availability of resources.

Legislative Reform
Participants discussed the Trinidad and Tobago legislation pertaining to effluent discharges. There were also presentations on experiences setting effluent discharge standards in the North Sea using the Convention for the Protection of the Marine Environment of the Northeast Atlantic or OSPAR Convention, for possible use as a framework.

The topics of discussion about legislative reform centered on the following:

Are our current water pollution rules relevant and appropriately framed in the current environment?
The consensus was that there is an opportunity to review the current Water Pollution Rules, created in 2001 and amended in 2006. For example, ambient water-quality standards are needed alongside discharge standards. Some presenters deemed the current discharge standards stringent compared with those of other jurisdictions such as OSPAR. OSPAR uses a holistic, risk-based approach to regulating effluent through mixing zones and impact-based ambient water-quality standards rather than technology-based discharge parameters. To develop new ambient water-quality standards, a baseline study must be conducted on the receiving environment of Trinidad and Tobago.

Are our standards applicable to all sectors?
The discussions highlighted the need for industry-specific effluent discharge standards. There are different challenges and available technologies within the various industrial sectors that require specific regulations. The consensus was that one set of regulations is ineffective for all industrial sectors. The Trinidad and Tobago Bureau of Standards’ “Specification for the Effluent from Industrial Processes Discharged into the Environment” can be used in the development of oil industry-specific regulations.

Is compliance monitoring and reporting efficient and effective?
Participants suggested an opportunity for improvement with regard to compliance monitoring and reporting. Current environmental impact assessments focus on specific project sites (for example, a single well) rather than a cumulative assessment of an affected area. The use of strategic environmental assessments vs. specific environmental impact assessments under existing legislation should be considered.

How do we repair the disconnect between the certificate of environmental clearance (CEC) rules and water pollution rules?
Operators cited inconsistencies in recent CEC rules regarding which effluent discharge parameters are to be regulated. In the past, the CEC rules specified the effluent parameters to be regulated. In recent times, the rules require all parameters in Schedule II to be met, which may pose a challenge to attaining the goals with the existing technology by using the “end-of-pipe” discharge criteria.

Is there room for fiscal incentive for onshore and offshore operators?
Operators shared technical and financial issues associated with effluent treatment. With the retrofitting of offshore operations, for example, there are space restrictions, high cost of new technology, and logistical challenges. Bringing existing waste onshore for treatment also poses logistical and operational challenges. It was suggested that financial incentives be made available to operators for treating effluent closer to the source.

How can we speed the implementation of legislation?
Changing legislation can be a lengthy process in Trinidad and Tobago. However, representatives of the Bureau of Standards suggested that regulations may be used as a viable option instead of changing the legislation.

Should drill fluids and cuttings be regulated under the water pollution or waste management rules?
The discussions suggested that this is an area for further work because there were two aspects for consideration: offshore discharge and onshore treatment of drill cuttings.

Use of Available Technology
Service providers gave presentations on existing and emerging technologies for effluent treatment. In summary, available technologies may be selected on the basis of the quantification of toxic components in the effluent and the treatment goals. There were presentations on ceramic membrane technologies and macroporous polymer extraction (MPPE) technology as examples, with international case studies from different regulatory regimes.

The MPPE case study was presented as an option for treating both dispersed and dissolved hydrocarbons in produced water streams with limited pre- and post-treatment requirements. A case study was also presented on the treatment and reuse of black and gray water for onshore drilling.

The following summarizes the discussions on technology:

  • Best available technology should be used in the development of standards.
  • Operators should indicate what technologies are currently in place and what can work through shared lessons learned.
  • There is potential to reuse and recycle liquid effluent (produced water and onshore drilling waste water).

Availability of Resources
There were discussions about infrastructure challenges such as laboratory services and availability of human resources.

The following summarizes the discussions about resourcing:

  • Laboratory accreditation. There is a requirement for the infrastructure to be available to support the legislation with regard to effluent sampling and testing. Attendees shared concerns with the repeatability of effluent test results provided by various laboratories in Trinidad and Tobago. There is an opportunity to improve laboratory testing and reliability of results through a coordinated quality assurance program by an appropriate body such as the Bureau of Standards.
  • Human resources competency. There is a need for qualified and competent effluent treatment professionals at the operator and regulatory levels. Discussions revealed a gap in the competencies and skills of professionals. There is a need for specialists in industrial ecology and water treatment. Most of the local graduates are environmental management professionals. It was suggested that the local universities play a role by offering programs based on the needs of the industry.
  • Staffing levels. Staffing within the regulatory bodies needs attention to support legislation related to permitting and enforcement.
  • Use of data resources. The EMA is the repository for all environmental data. It was suggested that these data be made available in a database to be more effectively used and incorporated into geographic information systems and mapping.

A committee or technical working group should be established to set the framework for managing future processes by incorporating the workshop’s recommendations. The group should be formally set up at a governmental level by a cabinet-appointed committee. Representatives of the committee and working group should comprise regulators, industry representatives, the Bureau of Standards, the Institute of Marine Affairs, environmental consultants, and service providers. A technical working group that was formed in 2013 should be expanded to include new focus areas explored in the workshop.

The terms of reference for the committee should be established to include targets and milestones. OSPAR, for example, provides a template under which a covenant on environmental management was signed by company leaders and regulators.

The terms of reference should contain the following at a high level:

  • Review of legislation and recommendations for changes and improvements
  • Risk-based rather than end-of-pipe discharge standards
  • Use of strategic impact assessments vs. environmental impact assessments
  • Development of industry-specific water-quality standards (impact-based rather than parameter-based)
  • Identification of best available technology and lessons learned for effluent management
  • Laboratory accreditation standards
  • Resourcing (increasing competency standards)

It is recommended that the MEEA be the lead facilitator of the working group because it regulates the country’s energy industry and is well poised to bring all stakeholders together.


Paper Chronicles Advancements in Derivation of Occupational Exposure Limits

A paper published recently in the Journal of Occupational and Environmental Hygiene examined how exposure-response estimation has been used to derive occupational exposure limits. What follows are two reviews from ExxonMobil scientists of the paper “Historical Context and Recent Advances in Exposure-Response Estimation for Deriving Occupational Exposure Limits” by M.W. Wheeler, A.J. Bailer, and C. Whittaker.

The first review is by Min Chen, an associate biostatistician, and the second is by Silvia I. Maberti, an industrial hygienist and exposure scientist, both of whom are with ExxonMobil.

Comments and review are welcome. Please send your comments to the HSE Now editor.


Quantitative risk assessment is required to characterize and disclose risks so that the resulting occupational exposure limit (OEL) better reflects the hazards involved and achieves an explicit low level of residual risk. This paper reviews several exposure/response modeling methods available for quantitative risk assessment (QRA). The recommendations are appropriate. I agree with the authors that the benchmark-dose (BMD) approach may extract more information from the data when BMD analysis is suitable for the data. However, it is likely that there are some endpoints and data sets that are not amenable to modeling with the BMD approach. In such instances, a no-observed-adverse-effect-level/lowest-observed-adverse-effect-level (NOAEL/LOAEL) approach must be used.

In animal toxicology studies, the exposure/response relationship is generally well-characterized. The sources of significant uncertainty include differences in species, routes and duration of exposure, and the relative potency of similar exposures in humans.

In epidemiology studies, no species extrapolation is needed when conducting risk estimation. However, the exposure concentrations may need to be reconstructed historically and estimated. Confounders and effect modifiers may need to be included in exposure/response models. The model uncertainty and exposure uncertainty would need to be considered.

Various exposure/response assessment techniques are used to select the point of departure (POD), the exposure associated with observed risks within or just below the range of observed data. Once the POD and target risk estimate are determined, the approach used for establishing the OEL will depend on organizational policies and other considerations.

The NOAEL is the highest experimental exposure at which there is no statistically or biologically significant change in the outcome of interest relative to responses in unexposed individuals. The LOAEL is the lowest dose or concentration that has been shown biologically or statistically to change the outcome of interest relative to responses in unexposed individuals.

NOAEL/LOAEL ignores the shape of the exposure/response curve, is constrained to be one of the levels of exposure selected in the experiment, and depends on the number of replications at each level.

Generally, NOAEL/LOAELs should only be used for OELs if the data are not adequate for exposure/response analyses.

PODs From Exposure/Response Models and the BMD Approach
Exposure/response models use all of the information in the exposure/response relationship to predict risks. Models that do not fit the data adequately should not be used. Dichotomous data require at least one dose group whose response differs from both the background rate and 100%. If the BMD, the dose associated with a specified change in the response, far exceeds the maximum experimental dose, the NOAEL may be the only viable option.

POD From Model Averaging of the BMD
The average-model method accounts for uncertainty in model selection. The average-model approach creates a weighted average of the exposure/response curve from the candidate models, in which the weights are based on how well each model fits the data. The average-dose approach takes its weights from model selection criteria such as the Akaike information criterion and the Bayesian information criterion. The average-model approach has better statistical properties than the average-dose approach. To set the OEL when using the average-model approach, the exposure concentration should be chosen directly at the level of risk specified for the OEL.
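The goodness-of-fit weights described above are typically Akaike weights, w_i = exp(-Δ_i/2) / Σ_j exp(-Δ_j/2), where Δ_i is each model's AIC minus the minimum AIC over the candidate set. A minimal sketch of forming them; the model names and AIC values are illustrative, not from the paper:

```python
import math

# Akaike weights: w_i = exp(-delta_i / 2) / sum_j exp(-delta_j / 2),
# where delta_i = AIC_i - min(AIC). Illustrative AIC values for three
# hypothetical candidate dose/response models.
aic = {"logistic": 102.4, "probit": 103.1, "weibull": 108.9}

best = min(aic.values())
raw = {m: math.exp(-(a - best) / 2.0) for m, a in aic.items()}
total = sum(raw.values())
weights = {m: r / total for m, r in raw.items()}

# The averaged exposure/response curve is then sum_i w_i * f_i(dose);
# better-fitting (lower-AIC) models dominate the average.
print(weights)
```

Note how the weights sum to one and decay quickly with AIC difference, so a model fitting much worse than the best contributes almost nothing to the average, which is why model checking of the whole candidate set still matters.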

POD From Semiparametric/Nonparametric Models and the BMD
The Bayesian semiparametric method in the paper uses a flexible spline construction for BMD analyses. One can include prior information on the incidence of the response in historical controls. The informed choices should be addressed. It also requires the choice of spline basis functions located at specific knot locations. To set the OEL when using the semiparametric approach, the exposure concentration should be chosen directly at the level of risk specified for the OEL.

Average-model and semiparametric/nonparametric methods are recommended for estimating risks at low levels if the data are adequate for exposure/response analyses. However, I would like to point out the importance of model checking for the average-model method. If all the models considered for averaging are poor, combining them will not do much good. The average-model method addresses model uncertainty within the selected class, but it does not address the uncertainty of model class selection. The average-model method can never completely avoid the selection dilemma, even though model averaging is a valuable approach to the problem of accounting for model uncertainty.



The reviewed document is part of a series of manuscripts discussing the various aspects of the process to develop OELs, including description and selection of the dose/response curve, selection of uncertainty factors, risk assessment, OEL setting, and their use in risk management. The manuscript is centered on the estimation of the dose/response relationship. It briefly describes and compares traditional and new methodologies used in dose/response modeling of animal and human data and advocates for the application of QRA and the BMD approach.

The first step in the OEL-setting process is to develop a dose/response curve on the basis of the available data for the health effect of interest. This curve is used to estimate a threshold dose or POD below which toxicity is not expected. Various dose/response assessment techniques are used to select a POD, the exposure associated with observed risks within or just below the range of observed data. Accounting for human sensitivity and responses on the basis of extrapolation of such data has proved to be difficult and is the source of inconsistencies in the approach to setting OELs.

Advances in analytical methods applied in toxicological and epidemiological studies have allowed for increased understanding of the basis for human variability in sensitivity. On the other hand, more powerful statistical analysis tools allow for better data selection and description of the selection process for the POD, uncertainty factors, and risk levels in a quantitative manner. The quantitative approaches described by the authors can enable a more-transparent and -systematic way to set OELs.

Approaches to POD Estimation
The authors describe the estimation of the POD on the basis of the NOAEL/LOAEL approach, in which the highest exposure without a significant effect is selected as the NOAEL. As a result, this approach is highly dependent on the design of the study (e.g., the number of test subjects used, dose spacing, and endpoint) for its ability to incorporate biological information in the curve. If the doses are properly chosen, one of them will represent the NOAEL and another will represent the LOAEL. If the number of individuals is limited, the adverse effect might not be observed at all, thus yielding an artificially high POD.

As opposed to the traditional NOAEL approach to selecting the POD, the authors propose the BMD approach, which uses data from the entire dose/response curve for the critical effect. The selection of a dose/response curve is limited by the type of data, in addition to the need for more-realistic dose/response models, such as biologically based, mode-of-action, or semiparametric models. In the BMD method, a mathematical model is fitted to all the dose/response data, allowing for incorporation of biological information in the estimation of the POD associated with a predefined level of response. Because the POD is not based on a single experimental dose, the BMD approach is less dependent on the study design and can be calculated for doses outside of the observed range. On the other hand, BMD analysis is considered inappropriate for data sets with small dosing groups.
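The BMD idea can be illustrated with a toy logistic dose/response model; the fitted parameters below are hypothetical, not taken from the paper. The benchmark dose for 10% extra risk (BMD10) is the dose at which extra risk over background reaches 0.10:

```python
import math

# Hypothetical fitted logistic dose/response model, P(response | dose).
# The intercept and slope are illustrative assumptions.
a, b = -3.0, 0.5

def p(dose: float) -> float:
    """Probability of response at a given dose under the logistic model."""
    return 1.0 / (1.0 + math.exp(-(a + b * dose)))

# Extra risk: (p(d) - p(0)) / (1 - p(0)). Solve extra_risk(d) = 0.10
# by inverting the logistic: dose = (logit(target) - a) / b.
background = p(0.0)
target = background + 0.10 * (1.0 - background)
bmd10 = (math.log(target / (1.0 - target)) - a) / b
print(bmd10)   # the dose at which extra risk over background is exactly 10%
```

Unlike a NOAEL, which must be one of the tested doses, the BMD10 here falls wherever the fitted curve crosses the 10% extra-risk level, which is what makes the approach less dependent on dose spacing.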

The authors describe the application of sophisticated statistical tools to estimate the BMD. When multiple models are used, their ability to describe the data can be compared and the best model selected; alternatively, an average dose can be estimated from all the models, or an average model can be estimated with goodness-of-fit weights. One of the advantages of model averaging is the ability to take model uncertainty into consideration; however, the selection of the models to be included is crucial to obtaining an estimate representative of the data. A detailed and transparent approach must be presented when applying highly sophisticated computational tools that require input on priors and selection criteria, for example.
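As a rough illustration of the model fitting described above, the sketch below fits a simple quantal-linear model to hypothetical dichotomous dose/response data by maximum likelihood and solves for the BMD at a 10% extra-risk benchmark response. The model choice, doses, and response counts are all invented for illustration; they are not the paper's models or data.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical quantal data: dose, subjects per group, number responding
dose = np.array([0.0, 10.0, 50.0, 100.0, 200.0])
n = np.array([50, 50, 50, 50, 50])
k = np.array([2, 4, 11, 20, 34])

def neg_loglik(params):
    """Binomial negative log-likelihood for a quantal-linear model:
    P(d) = g + (1 - g) * (1 - exp(-b * d)), with background g."""
    g, b = params
    p = g + (1.0 - g) * (1.0 - np.exp(-b * dose))
    p = np.clip(p, 1e-9, 1 - 1e-9)  # guard against log(0)
    return -np.sum(k * np.log(p) + (n - k) * np.log(1.0 - p))

fit = minimize(neg_loglik, x0=[0.05, 0.005],
               bounds=[(0.0, 0.99), (1e-9, None)], method="L-BFGS-B")
g_hat, b_hat = fit.x

# Extra risk: (P(d) - g) / (1 - g) = 1 - exp(-b d); set equal to BMR = 0.10
bmr = 0.10
bmd = -np.log(1.0 - bmr) / b_hat
print(f"background = {g_hat:.3f}, slope = {b_hat:.5f}, BMD10 = {bmd:.1f}")
```

In practice, a lower confidence limit on this estimate (the BMDL) would also be computed, by profile likelihood or bootstrapping, and the whole exercise would be repeated across candidate models before any averaging.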

Strengths and Limitations
The paper discusses the different approaches used to estimate the POD and provides examples of application of such models in setting OELs. The background section briefly describes some of the methods used and the challenges of applying these models to toxicological or epidemiological studies with continuous or dichotomous exposures. It provides information on common biases present in epidemiological studies, their consequences, and how to address them. However, the authors limited the description of the models used for BMD estimation and the key considerations for their application to toxicological or epidemiological data with continuous or dichotomous endpoints. The authors did not address other approaches such as the threshold of toxicological concern, the T25, the TD50, or the margin of exposure approaches. It must be noted that the BMD approach is the most widely used and accepted method, after the NOAEL.

The paper provides two hypothetical data sets and uses several different modeling approaches (e.g., BMD, Bayesian model averaging, and semiparametric modeling) to estimate the OEL for a particular endpoint. The POD based on the NOAEL approach is used as a baseline for comparison of the quantitative models contrasted in this study. The estimated BMDs were, in general, similar to but higher than the NOAELs, while the BMDL10s were closer to, if not lower than, the NOAELs. These results are within the expected outcomes discussed in similar studies. The authors reiterate the importance of the decisions made along each step of the modeling process but provide little discussion or guidance on the application of the various approaches to different types of data or on the data requirements for the models used.

Regardless of the approach followed, model selection must be consistent with mechanistic considerations about carcinogenicity or the endpoint of interest. The authors describe some of the challenges presented when choosing models that are not biologically relevant, which might lead to the selection of unrealistically low (or high) BMDs. This underscores the importance of proper selection of the models and statistical tools to be used, as well as of following a predefined selection process.

Discussion and Conclusions
For noncarcinogenic substances, the BMD method is proposed as a preferred alternative to the NOAEL/LOAEL approach because it allows for the calculation of a specific and measurable response rate. Similarly, the BMD is preferred for threshold carcinogens, for which the concentrations or doses that induced tumors in chronic studies are typically used. On the other hand, for nonthreshold carcinogens, where the endpoint is not dichotomous, modeling data from animal or epidemiological studies may be preferable.

Several software packages are available for BMD calculations. Although using the models may seem to be an easy task, the interpretation of the results is not trivial, requiring engagement of subject-matter experts in toxicology and statistics. Most importantly, expert judgment is still required to address the hazard-characterization issues in risk assessment, such as selection of the applicable uncertainty factors for the calculation of an OEL.

QRA allows for better characterization and disclosure of risks, so the resulting OEL reflects the hazards better, because it can provide information on the uncertainties associated with the data and identify factors contributing to uncertainties in the risk estimates. Furthermore, consistent application of the models allows for comparison across experiments and effects. A standardized approach allows for effective processing of toxicological and epidemiological data across multiple data sets as well as for comparisons across exposures and outcomes. On the other hand, these approaches can be cost-prohibitive because they require significant amounts of data, computational capability, and information to interpret the results correctly.

There is a need for the development of improved guidance for risk communication on the basis of probabilistic assessment techniques applying the wide variety of models and approaches available. This should include communication of the types of uncertainty and their relation to statistical variability, imprecision, and the use of confidence intervals. Additional guidance will enable improved and proper utilization of benchmark responses by the various science disciplines using OELs for risk management.


5 Nov 2015

PetroTalks Collection Continues To Grow

In a further advancement of SPE’s core mission to disseminate information, three new videos have been added to the PetroTalks collection. Modeled after the popular TED Talks format, PetroTalks are videos of speeches and talks presented by industry leaders on topics at the forefront of current discussion.

The continuing growth comes as the SPE Foundation (SPEF), which is tasked with financially supporting the mission of SPE International (SPEI), has committed to support PetroTalks as part of its Distinguished Lecturer webinar efforts.

“Among the offerings the foundation supports are the traditional Distinguished Lecturer section visits and the Distinguished Lecturer Web events. Recently, SPEI approached the foundation to ask whether funds set aside for the latter purpose could also be used for PetroTalks,” said Kate H. Baker, SPE Foundation president and 2004 SPE president. “The foundation trustees were pleased to say yes.”

The PetroTalks currently available address the issues of sustainability, social responsibility, and risk assessment. “At the moment, and perhaps because of SPEI’s desire to emphasize the HSSE-SR agenda, PetroTalks are weighted toward health, safety, social responsibility, and subjects in the water/energy nexus relative to topics covered by the Distinguished Lecturer programs,” Baker said. “But there is no reason why that content distinction should persist. PetroTalks are expected to offer SPE members the opportunity to hear from subject-matter experts on topics of current concern to the industry, whatever those might be.”

DeAnn Craig, SPEF past president, was president when the foundation made the decision to fund webinars. “The SPEF has long been a supporter of the Distinguished Lecturer program, and webinars, including PetroTalks, are a more-efficient and -cost-effective way of delivering the latest technology to our members in their offices,” she said. “With the archived material, the latest technology is delivered on the engineer’s schedule. And, there is no cost to the member.”

The three new PetroTalks now available are by Michael Oxman, partner and consultant for Acorn International; Kevin Preister of the Center for Social Ecology and Public Policy; and Doug Bannerman, head of corporate responsibility at Statoil North America.

Watch Oxman speak about the management of above-ground risks here.

Watch Preister speak about the importance of community engagement here.

Watch Bannerman speak about corporate social responsibility here.

23 Oct 2015

The Growth of the HSE Discipline and Its Role in SPE

22 Oct 2015

SPE Partners With IOGP for Outstanding Young Professional Award

To inspire the next generation to get involved and solve problems through innovation, SPE and the International Association of Oil and Gas Producers (IOGP) have collaborated to launch the IOGP Outstanding Young Professional Award. The award will recognize achievements of E&P professionals who have fewer than 10 years of experience and have demonstrated outstanding talent, dedication, and leadership in at least one aspect of health, safety, security, the environment, and social responsibility. The winner will be announced at the SPE Health, Safety, Security, Environment, and Social Responsibility (HSSE-SR) Conference and Exhibition, which will be held in Stavanger 11-13 April 2016.

How to Apply
To nominate someone, you must be an SPE member and have more than 10 years of experience. Nominations are open to all young professionals, both SPE members and nonmembers, who have fewer than 10 years of experience. Nominees should:

  • Be well-respected and in good standing within the community
  • Serve as a role model for other young professionals
  • Demonstrate noteworthy professional and personal achievement
  • Demonstrate commitment to excellence and proven leadership
  • Exhibit expertise, passion, and the ability to inspire others

Nominators should include the nominee’s CV and complete the application form here. The deadline for nominations is 4 November.

The award committee, which consists of SPE members and IOGP officials, will select five finalists by 1 December. Each finalist will then be asked to submit a short video presentation in the style of a TED talk (no longer than 5 minutes and no larger than 1 GB) that addresses the issue, “How innovation in HSSE-SR can make the oil and gas industry more sustainable and more acceptable to the wider world.” Creativity is encouraged in the making of the videos. All videos will be due by midnight on 8 January, and the award finalists will be notified on 1 February.

At the Conference
The winner will be announced at the SPE HSSE-SR Conference in Stavanger and receive:

  • IOGP Outstanding Young Professional Award certificate and trophy
  • Complimentary registration to the conference
  • A one-year membership to SPE
  • An invitation to join the award and young professional committees for the 2018 SPE HSSE-SR Conference
  • Recognition on the IOGP website and newsletter and on the SPE conference website

The 25th anniversary of the SPE HSSE-SR Conference and Exhibition will bring together experts from all over the world to share new ideas, process improvements, technological advancements, and innovative applications to enhance HSE performance. The theme for this year’s conference is “Sustaining Our Future Through Innovation and Collaboration.”

Read more about the conference here.

19 Oct 2015

Executing Sustainable Development at a Sensitive Amazon Basin Area—Decommission and Abandonment

In Ecuador (circa 2006–07), PetroOriental, operator of Blocks 14 and 17, concluded a large exploration project with the construction of the Batata 2 well pad and the drilling of two exploratory wells, pursuing the confirmation and development of oil reserves identified during the seismic work begun in 2003 by EnCanEcuador (a subsidiary of EnCana Corporation) in the northern region of Block 14. All of this development occurred within Yasuní National Park, one of the most biodiverse natural areas in South America and also the traditional land of peoples living in voluntary isolation (the Tagaeri and the Taromenani).

The early planning and execution of this project included the development of a long-term strategy and a set of high-standard social and environmental practices to minimize the company’s exposure to social and environmental liabilities and to guarantee the success of this challenging project. Unfortunately, by the first half of 2007, after drilling the previously mentioned exploratory wells, the prospects were declared noncommercial, triggering the process for decommissioning and restoration of the Batata 2 well pad. The loop was closed, and the success of the project was dimmed by the lack of results.

This paper summarizes some of the most important elements of the planning and execution of the Batata 2 well pad by means of a retrospective analysis, with the intent of identifying and highlighting some of those practices and presenting them to our colleagues in the industry. These examples of life-cycle application are our contribution to the discussion, because we believe that life-cycle assessment and life-cycle management are closely linked to sustainable development, a journey that the oil and gas industry is just starting to make.

Read the full peer-reviewed paper here (PDF).

7 Oct 2015

Symposium Examines Complexity of Arctic Exploration

On 24 September, University of Houston Energy and ExxonMobil presented the first event of the 2015–16 Energy Symposium Series: Critical Issues in Energy, “Arctic Drilling: Untapped Opportunity or Risky Business?”

The symposium, held at the University of Houston, was moderated by Richard Haut, director of energy production at the Houston Advanced Research Center. The speakers were Kevin Harun, Arctic program director at Pacific Environment; Jed Hamilton, senior Arctic consultant at ExxonMobil Upstream Research Company; Bob Reiss, an American author and consultant on Arctic issues; and Peter Van Tuyn, managing partner at Bessenyey & Van Tuyn based in Anchorage, Alaska.

Watch the symposium here.

6 Oct 2015

OPITO Releases List of Candidates for Annual Safety Awards

OPITO, the Offshore Petroleum Industry Training Organization, has announced the short list of finalists for its Annual Global Oil and Gas Workforce Safety Awards.

INSTEP/Petronas in Malaysia, Oil Spill Response Limited (OSRL) in the UK, and McDermott in Dubai are vying for the Employer of the Year title, while Megamas Brunei, Petrofac Training Services, and PT Samson Tiara in Indonesia have made the final three in the Training Providers category of the awards.

The awards recognize companies that best demonstrate their commitment to building a safe and competent workforce through OPITO standards.

The winners will be announced at the OPITO Safety and Competency Conference (OSCC) on 3 November 2015 at the Dusit Thani Hotel, Abu Dhabi. Now in its 6th year, this global annual event is supported by headline sponsor Shell.

Expected to bring together approximately 500 senior figures from industry, government, regulators, and training providers from around the globe, the event will explore how the industry maintains competency and continues to keep its people safe in a lower-oil-price environment.

OPITO Group Chief Executive Officer David Doig

OPITO group chief executive officer David Doig said, “We have been thoroughly impressed with this year’s entries, the most we have ever received. It’s great to witness an increasing number of organizations looking to continually improve their safety training and develop competency.

“The OSCC Awards is our way of recognizing and rewarding their ongoing efforts to ensure a safe and competent workforce. It’s vital that we do what we can to maintain this momentum and not make any compromises when it comes to safety, even in sub-USD-50 oil.”

Entries were judged on how they have effectively adopted OPITO standards, the number of staff trained, geographical location, and examples of how the standards have shown a tangible improvement in safety and competence in the workplace.

In March this year, INSTEP/Petronas was named the world’s first approved technical qualification centre to receive OPITO accreditation for maintenance training in mechanical, electrical, instrument, and control disciplines. Established in 1981, INSTEP/Petronas was set up primarily to train skilled technicians and operators to meet the rapid growth of the global petroleum industry. The organization has since trained more than 10,000 technical staff in Malaysia.

OSRL first gained OPITO approval in 2007 for its competency management system. The organization has worked closely with OPITO to adapt its training programs to meet the needs of a growing workforce across an extended geographical reach. OSRL has seen a number of improvements in safety, competence and risk reduction since adopting OPITO’s global standards.

As an OPITO approved training centre, McDermott actively participates in OPITO development forums and training provider advisory groups. The organization recently adopted OPITO’s IMIST standard to ensure that every offshore worker is equipped with the necessary safety awareness and training to reduce risk and ultimately reduce the number of incidents.

In 2000, Megamas Brunei became the first training provider in the world to deliver the OPITO Tropical BOSIET and has since delivered the course in more than 20 countries. The organization recently celebrated 25 years without a lost-time incident and received international recognition from NEBOSH for its outstanding contribution to health and safety.

Petrofac Training Services established a 5-year health, safety, environment, and quality strategy to drive health, safety, and environmental improvement in line with OPITO standards. The organization recently developed a Competent Person Profile program to ensure that staff skills and behaviors meet the competencies required to deliver OPITO courses. The program is currently being delivered in Aberdeen and will be rolled out across the globe for fire and marine training.

In 2004, Samson Tiara became the first OPITO-approved safety training provider in Indonesia. The organization has played a pivotal role in the growth and adoption of OPITO standards across Indonesia in an effort to educate government institutions, oil and gas regulators, and industry employers on the benefits of OPITO approved training.

OSCC is the only global event focused on safety and competency in the oil and gas industry. The event was introduced to bring operators, contractors and the supply chain together with training organizations to provide a forum for improving standards of safety and competency that protect the workforce and the industry’s reputation.

Read more about the OSCC here.

25 Sep 2015

Understanding Communities: A Key to Project Success

Many factors can influence public perception of the oil and gas industry and the projects it develops. Increasingly, public acceptability can make or break the license to operate. Engineers and other technical leaders frequently view the problem as one that can be overcome with public education.

But education can only be successful if the industry first achieves a level of trust within the community. Building trust requires developing an understanding of the community. One of the more important approaches is for operators to understand the needs and expectations of the communities where they operate and for project teams to educate themselves about the expectations of the local community that will be affected by a project.

Each community is unique; the challenges are multifaceted. SPE HSSE-SR Technical Director Trey Shaffer and SPE PFC Technical Director Howard Duhon will present Understanding Communities: A Key to Project Success during a topical luncheon at SPE’s Annual Technical Conference and Exhibition (ATCE) in Houston. Shaffer and Duhon will explore some notable industry efforts in community engagement and education as well as critical success factors for effective community engagement.

The PFC/HSE topical luncheon will be from 1215 to 1345 on 29 September at the George R. Brown Convention Center, in Bush Grand Ballroom A.

Learn more about ATCE and register here.

11 Sep 2015

Simplification: A Moral Imperative

As early as 500,000 years ago, man was using fire to light his cave. This was a very inefficient source of light, yielding about 0.6 lm-h per 1,000 Btu of energy.

A step change improvement occurred about 40,000 years ago with the burning of animal fats and oils. Candles became common about 4,000 years ago, but burning wax to get light was also inefficient, yielding only 4 lm-h per 1,000 Btu.

This type of resource was also expensive. It has been estimated that a common man would have had to work an entire day to afford a few minutes of light. Unless you were wealthy, night was a dark and dangerous place.

It was thousands of years before the next significant improvement occurred when sperm whale oil came on the scene in about 1700, yielding 10 times as much light per Btu of energy at a much lower cost. A day’s work would buy 4 hours of light. A downside was that many men died while harvesting whale oil, and, after 150 years of its use as a fuel for lighting, the sperm whale was nearing extinction.

The oil industry saved the sperm whale. The discovery of significant quantities of oil in Pennsylvania and elsewhere in the 1850s and beyond and the development of drilling and refining methods created a much lower-cost and more abundant source of energy. One day of labor yielded 75 hours of light.

The next and most dramatic improvement was the development of electric light, yielding 4,000 lm-h per 1,000 Btu. One day of work earned 10,000 hours of light.

Light was available to the common man in nearly unlimited quantities.

People who are fortunate enough to live in developed countries enjoy unlimited light, which is not the case everywhere in the world. Availability of affordable energy is perhaps the largest divider between the haves and have-nots today.

The Complexity of Light
For the end user, switching on a light bulb is much simpler than lighting a fire. But the systems behind the bulb are complex. To get light from an electric bulb the following are needed:

  • Mining for fuel (gas, coal, oil, and uranium)
  • Power plants to generate the electricity
  • Mining industries to obtain raw materials for light bulbs, wiring, and other components
  • Transmission and distribution systems to deliver the generated electricity to homes and businesses
  • Light bulb manufacturing, distribution, and retail sales
  • Electrical wiring systems in buildings
  • An advanced political/social system that enables all of the above

The benefits of light are derived from complex systems that are mostly hidden from view.

Strength is Also Weakness
Oil has had a much greater impact on the world than simply providing light. With the age of oil have come cars, trains, planes, modern medicine, and plastics. Life expectancy in 1850 was less than 40 years. Since then, the discovery and use of oil have enabled the innovations and advancements that have added 40 years to our life expectancy.

Systems theory guarantees that there will be some downside to this kind of success story; every great strength is also a great weakness. The success of the oil industry also poses its greatest challenges, one of which is to keep it going.

The world has changed because oil/energy has been cheap for most of the past 160 years. And the world has become dependent on oil.

But keeping energy affordable is going to get harder.

Although there is a great deal of oil and gas in the world, much of it is expensive to produce, as has been painfully apparent over the past few months.

The world is not lacking in oil, but it may be lacking in oil that can be affordably mined—unless we change the way we mine it.

How Much Oil Do We Need?
The current glut of oil supply is unlikely to last long. We need to find a lot of oil to keep this world humming.

The world consumes about 94 million BOPD. Historically, consumption has increased by about 1 million BOPD annually. The production decline rate of existing wells is about 5 million BOPD annually. Therefore, we need to bring on 6 million BOPD of new production every year just to stay even. That is the production equivalent of Saudi Arabia every 2 years, 60 major deepwater developments annually, or the production equivalent of six North Dakotas annually.
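The arithmetic above can be checked directly. The script below uses only the column's own figures, in million barrels of oil per day (MM BOPD); the implied sizes of the comparisons (Saudi Arabia, a deepwater development, a North Dakota) are back-calculated from the column's ratios, not independent data.

```python
# Figures from the column, in million barrels of oil per day (MM BOPD)
demand_growth = 1.0   # annual increase in world consumption
base_decline = 5.0    # annual production decline from existing wells

# New production needed each year just to stay even
new_production = demand_growth + base_decline
print(new_production)  # 6.0 MM BOPD

# Implied sizes of the column's comparisons
saudi_arabia = new_production * 2     # "Saudi Arabia every 2 years" -> ~12 MM BOPD
per_deepwater = new_production / 60   # "60 major deepwater developments" -> ~0.1 MM BOPD each
per_north_dakota = new_production / 6 # "six North Dakotas" -> ~1 MM BOPD each
```

Note that the 5 MM BOPD decline figure is itself roughly a 5% annual decline on a 94 MM BOPD base, which is why the replacement burden compounds rather than shrinking as new wells are added.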

And this has to be done in an environment in which the oil industry faces a great deal of opposition and in which many areas are off-limits to oil production. If it is true that the easy oil has been found, then extracting the vast quantities of new oil needed will be increasingly difficult and expensive.

Moral Case for Simplicity
In the February and April issues of Oil and Gas Facilities, I wrote articles about complexity and the SPE Complexity Work Study Group. At the beginning of the study, I viewed complexity as an economic issue that affects project viability and profitability. Now, I see it as a moral issue as well.

World political stability and the economic progress needed to pull impoverished people into the middle class depend on affordable energy.

Eventually, the world is likely to transition to renewable energy, but it will not happen soon. For at least the next couple of decades, the affordable energy source must be largely oil and gas.

We seem increasingly incapable of delivering projects successfully. Obviously, part of the problem is that our projects are inherently more complex than they used to be. For example, a complex project is necessary to develop a complex reservoir in a new deepwater basin with no infrastructure.

But we also add unnecessary complexity. To be successful going forward, we must do a better job of managing inherent complexity and we must shed the baggage of unnecessary complexity.

Sources of Complexity—Systems Theory
I have discussed the sources of complexity in previous columns, which include

  • Inherent technical complexity
  • Inherent social/political complexity
  • Standards, specifications, and regulations
  • Decision making
  • Design team preferences
  • New technology
  • Safety culture

Another source of complexity, which is largely hidden from view, is the effect of increases in the size of project teams. Although it is recognized that large teams and multiple teams may create interface issues, the challenges are greater than most people realize.

Early in my career, when I started working on small developments in the Louisiana swamps and shallow waters offshore, projects were less complex: one facilities engineer generally understood the whole project. Surprises were few and usually inconsequential. Everyone with whom I interacted was located in the same building, across town, or at most, a short flight away. All construction was taking place within driving distance.

Designs were simpler, too. For example, control systems were defined on simple loop sketches.

I recently listened to an interview of an Apple executive. He was asked why the iPhone was being built in China: Why not spend a little more to build them in the United States?

Paraphrasing his response, he said that it would be impossible to build them in the US because it would be impossible to organize the requisite skills in one place as the company can do on the massive manufacturing campuses in China.

I immediately related to that comment. I know little about the building of an iPhone, but I know that we are hampered in our industry by the need to coordinate with design and construction groups spread all over the world. I do not think that we fully understand the impact of having to coordinate the work of hundreds of people in multiple teams on a global scale.

More People, Less Productivity
When we have more work to do, we often add more people to the team or teams to the project. But adding people does not necessarily result in increased productivity.

Baron and Kerr, in their 2003 book Group Process, Group Decision, Group Action, described the factors involved as

Team Production = Team Potential – Coordination Losses – Motivation Losses

Although team potential may increase linearly with the number of people on the team, the increases in coordination losses and motivation losses are nonlinear. A point is reached at which adding more people results in more losses than gains.
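A purely illustrative toy model makes the tipping point concrete: potential grows linearly with head count, while coordination losses grow with the number of pairwise communication links, n(n−1)/2. Every coefficient below is invented for illustration; it is not a calibration of Baron and Kerr's relation.

```python
# Toy model of Team Production = Potential - Coordination Losses - Motivation Losses.
# All coefficients are invented for illustration only.
def team_production(n, per_person=1.0, coord_cost=0.02, motivation_cost=0.01):
    potential = per_person * n                       # linear in head count
    coordination_losses = coord_cost * n * (n - 1) / 2  # pairwise links: n(n-1)/2
    motivation_losses = motivation_cost * (n - 1)    # grows slowly with team size
    return potential - coordination_losses - motivation_losses

# Find the team size at which net production peaks
best = max(range(1, 101), key=team_production)
print(best, round(team_production(best), 2))  # prints: 50 25.01
```

With these made-up coefficients, production peaks at 50 people; the 51st person costs more in coordination and motivation losses than he or she adds in potential. The exact peak is an artifact of the chosen numbers, but the shape of the curve is the point.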

What do these losses look like in practice? Some examples that I have experienced are

  • A process engineering team designs compression systems to be run in parallel, but the control systems engineer does not include load sharing in the design. As a result, the compression systems will not run in parallel.
  • A vendor uses an outdated version of the piping and instrumentation diagrams (P&IDs), and the wrong grade of pipe is installed.
  • The subsea team misinterprets the topsides chemical design pressure as the operating pressure and undersizes an umbilical tube.
  • The commissioning engineer adds a valve to the P&IDs (for isolation during commissioning), but the process engineer deletes it from the next version of the diagrams because he cannot see a reason for it.
  • A section of pipe rack reserved for future expansion is used for minor field-run piping.

We light the world. In the future, we must learn to light it more simply and efficiently.

PFC Program at ATCE
The discussion of complexity will be continued in greater depth during the Projects, Facilities, and Construction (PFC) dinner on 28 September at the SPE Annual Technical Conference and Exhibition (ATCE) in Houston.

The PFC-related offerings at the ATCE have increased over the past few years. This year’s program features the best lineup yet:

  • Technical paper sessions (Flow Assurance, New Technology, and Field Experience)
  • Special sessions (Gas Scrubber Design and Validation for Robust Separation Duty, Error Analysis and Uncertainty in Flow Assurance and Facility Design, and Managing the Future Impact of Current Cost Cutting)
  • Training courses (Water Treating for Hydraulic Fracturing, Separator Design, and Understanding Communities)
  • Topical luncheon (Understanding Communities)

Speakers at the training course and topical luncheon will discuss community engagement, an important topic.

Many of you may still believe that educating the public will melt away its opposition to the oil industry. This is utterly incorrect. To defuse community opposition, we need to understand the communities in which we operate.

Attend ATCE, meet up with old friends, and make a few new ones.

Read more about ATCE and get tickets to the PFC dinner here.

Howard Duhon is the systems engineering manager at GATE and the SPE technical director of Projects, Facilities, and Construction. He is a member of the Editorial Board of Oil and Gas Facilities.

11 Sep 2015

BSEE Presents Program To Determine Best Available, Safest Technologies

In cooperation with industry, the Bureau of Safety and Environmental Enforcement (BSEE) has developed over the past months a data-driven and transparent program for determining best available and safest technologies (BAST). On 12 November, BSEE will present the BAST Determination Process at an event in Houston.

“One of the ways to ensure safety and reduce risk in outer continental shelf areas is through the use of available critical technologies that have been determined to be the best available and safest. The requirement encourages innovation and continuous improvement and guarantees development in the safest, most responsible manner,” wrote Doug Morris, BSEE chief of offshore regulatory programs, on the bureau’s Website.

BSEE and industry stakeholders have worked closely to develop the process that informs and enables the evaluation and determination of BAST in the offshore environment. The Houston event will be an opportunity to hear from the regulator on the path forward for BAST. The process will be presented by BSEE representatives Doug Morris, Joe Levine, and others.

The event will be from 0800 to 1200 at the Hilton Houston North. Registration is free, but seating is limited to 300.

Register for the presentation here.