Formation evaluation

Core-Analysis Elephant in the Formation-Evaluation Room

The variable data quality of core analysis, the sensitivity of results to different test methods, poor reporting standards, and the reluctance of some vendors to share experience and expertise have contributed to basic mistakes and poor data quality. In many cases, an inconsistent or inappropriate approach to the design, management, and interpretation of the core-analysis program has been adopted and exacerbated by the conflicting requests from the end users. In combination, approximately 70% of legacy special-core-analysis-laboratory (SCAL) data are not fit for purpose. A core-analysis-management road map was designed to increase the value from core-analysis investments by enabling a more-proactive, more-coherent, and more-consistent approach to program design and data acquisition.

Introduction

In volumetric hydrocarbons-in-place estimation, the volume of stock-tank oil initially in place in a reservoir is calculated from the gross rock volume, the net-to-gross ratio, porosity, water saturation, and the oil-formation-volume factor. The gross rock volume and the gross component of the net-to-gross-pay ratio are the primary responsibilities of geophysicists and geologists. The reservoir engineer is responsible for the oil-formation-volume factor from pressure/volume/temperature experiments. The petrophysicist is responsible for net reservoir thickness, porosity \(\phi\), and water saturation. Data input relies principally on logs, but log interpretation must be calibrated or verified by measurements on cores. For example, net reservoir thickness normally is defined by a permeability cutoff, and high-resolution permeability data are available only from core analysis. Porosity interpretation (e.g., from density logs) should be verified by, or calibrated to, stressed-core porosity data. Resistivity logs require Archie’s cementation and saturation exponents, m and n, respectively, to determine water saturation quantitatively in clean formations. These exponents are measured on cores. Water saturation can be determined directly by extracting water from cores with Dean-Stark methods, or indirectly from core-derived capillary pressure measurements.
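The volumetric relationship described above can be sketched numerically. All input values below are illustrative assumptions, not data from the paper:

```python
def stoiip_stb(grv_rb, net_to_gross, porosity, sw, bo):
    """Stock-tank oil initially in place (STB) from the standard
    volumetric equation: GRV x (N/G) x phi x (1 - Sw) / Bo."""
    return grv_rb * net_to_gross * porosity * (1.0 - sw) / bo

# Hypothetical inputs: 500 million reservoir bbl gross rock volume,
# 80% net/gross, 22% porosity, 35% water saturation, Bo = 1.25 rb/STB.
n = stoiip_stb(grv_rb=500e6, net_to_gross=0.80, porosity=0.22, sw=0.35, bo=1.25)
print(round(n / 1e6, 1))  # STOIIP in millions of STB -> 45.8
```

Each core-calibrated input (porosity, water saturation) enters this product directly, which is why the laboratory artifacts discussed below propagate straight into the in-place volume.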

Importance of Core Data and Data Issues. Core analysis is the only direct and quantitative measurement of the intact-reservoir properties, and it should provide the foundation for formation evaluation. However, the author found an unfortunate negativity about the value of core data. In a multiple-laboratory comparative SCAL study, it was found that some core-service contractors provided very-poor-quality data on some of the tests, and it was concluded that some of the laboratories do not have in-house quality-control protocols and report data just as acquired. Other studies have found inexplicable relative-permeability-data discrepancies between commercial laboratories testing exactly the same core material and fluids. Too often, core-analysis programs are ill-considered, badly designed, and poorly supervised, and the results are only crudely integrated with other well and reservoir data. The results, in terms of data acquired, often are unrepresentative or even contradictory. A conservative estimate from review and audit of more than 30,000 SCAL measurements of different vintages indicates that approximately 70% of the data are unfit for purpose because of their unreliability, inapplicability, or inappropriateness.

Elephant in the Room. The idiom “elephant in the room” refers to a problem or uncertainty that few people want to discuss. The following examples illustrate that small and generally unreported laboratory artifacts and measurement uncertainties have a significant influence on two key petrophysical-data inputs: the Archie water-saturation equation and capillary pressure measurements.

Core-Data Uncertainties

Archie defined a fundamental set of equations that establish the quantitative relationships between porosity, formation resistivity Rt, formation-water resistivity, and water saturation of reservoir rocks. Both formation resistivity and porosity are obtained from log data, but porosity logs should be verified from core measurements made at representative stress. The m and n values are obtained from formation-resistivity-factor (F) and resistivity-index (I) tests on cores. Many petrophysicists must rely on legacy SCAL data of varying vintage, frequently measured at ambient conditions.
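Archie’s relationships can be combined into a single water-saturation estimate. The sketch below uses the common forms F = a/φ^m and I = Rt/Ro = Sw^−n; the input resistivities and porosity are illustrative assumptions:

```python
def archie_sw(rw, rt, phi, m=2.0, n=2.0, a=1.0):
    """Water saturation from Archie: Sw = (a * Rw / (phi**m * Rt))**(1/n)."""
    f = a / phi**m                 # formation resistivity factor, F = a/phi^m
    ro = f * rw                    # fully brine-saturated rock resistivity
    return (ro / rt) ** (1.0 / n)  # invert I = Rt/Ro = Sw^-n

# Hypothetical log/core inputs: Rw = 0.05 ohm-m, Rt = 20 ohm-m, phi = 0.20.
sw = archie_sw(rw=0.05, rt=20.0, phi=0.20)
print(round(sw, 3))  # 0.25
```

Because m and n sit in exponents, the biased legacy values discussed below shift the computed Sw disproportionately, which is why their laboratory provenance matters.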

Porosity Input. In the formation-factor test, porosity often is measured at stress in conjunction with core resistivity Ro and is used to estimate porosity-compaction factors for log calibration. In a common test protocol, the sample is saturated in brine under unconfined conditions and, after resistivity stabilization, is loaded into the test core holder. Because air is resistive and compressible, it must be removed from the annulus between the plug and the sleeve, the end stems, and the measurement system so that the system is filled with brine before loading. The sleeve-conformance pressure (SCP) and sleeve-conformance volume (SCV) of brine in the plug/sleeve annulus should be satisfactorily established for each test plug so that appropriate corrections can be made to determine the correct pore-volume reduction at stress. The confining pressure is increased in small increments, and the pore-volume expulsion and core resistivity are recorded. The SCP and SCV normally are pinpointed by an inflection of the slopes of resistance or expelled-volume vs. confining-pressure curves.

The volume/stress data rarely are reported, and many laboratories assume the same SCP (and hence SCV) for every sample irrespective of the plug shape and surface topology. Even when data are available, the interpretation of SCP and SCV can be subjective. Unfortunately, the effect on the porosity measurements is significant. The example in Fig. 1 shows stress-normalized porosity (the ratio of porosity at stress to unconfined or ambient-condition porosity) as a function of confining stress for samples from the same formation. The only difference was the test laboratory. Laboratory A determined SCP and SCV for each sample. Laboratory B assumed an SCP (and corresponding SCV) that were too low, which resulted in an apparently lower porosity at stress. When the Laboratory-B results were used to calibrate the density-log interpretation in this gas reservoir (before the Laboratory-A results), porosity was underestimated by 7% and gas initially in place by 4%.

Fig. 1—Effects of SCV uncertainty on stressed porosities.
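A minimal sketch of why the SCV interpretation matters: expelled brine must first be corrected for the annulus volume before the remainder is attributed to pore-volume compaction. All volumes below are hypothetical, not the study’s data:

```python
def stressed_porosity(phi_amb, bulk_vol, v_expelled, scv):
    """Porosity at stress, neglecting bulk-volume change for simplicity:
    expelled volume net of the sleeve-conformance volume (SCV) is taken
    as the pore-volume reduction."""
    pv_amb = phi_amb * bulk_vol
    pv_stress = pv_amb - (v_expelled - scv)
    return pv_stress / bulk_vol

# Same raw expulsion data, two SCV interpretations (values hypothetical):
phi_correct = stressed_porosity(0.20, 70.0, v_expelled=1.2, scv=0.5)
phi_low_scv = stressed_porosity(0.20, 70.0, v_expelled=1.2, scv=0.1)
# An SCV assumed too low attributes annulus brine to pore compaction,
# biasing the stressed porosity low -- the Laboratory-B behavior in Fig. 1.
```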

Archie m and n Input. Excess surface brine also can have a significant effect on Archie’s m and n values. Although this is not an issue for tests carried out at stressed conditions (above SCP), in ambient-condition F measurements brine clings to the surface of the plug after saturation. This surface brine must be removed; otherwise, it will provide a conduit for current flow, making the measured resistivity too low. A film of surface brine of only 2.5-µm thickness can result in an underestimation of F of nearly 30% in a low-porosity sample. If F (and Ro) are too low, then the ambient-condition resistivity index (Rt/Ro) and n will be too high. Although these effects are eliminated if both F and I tests are measured at stress, petrophysicists often must work with legacy ambient-condition data. At ambient conditions, surface brine produces an average negative error of 30% in m and a positive error of 15% in n.
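The surface-brine artifact can be modeled as a thin conductive annulus in parallel with the pore network. The plug radius, film thickness, and true F below are assumptions chosen to illustrate the low-porosity case, not measurements from the paper:

```python
import math

def apparent_formation_factor(true_f, radius_cm, film_um):
    """Measured F when a brine film of the given thickness coats a
    cylindrical plug. The common sigma_w / length factor cancels, leaving
    parallel conductances proportional to area/F (pores) and film area."""
    plug_area = math.pi * radius_cm ** 2                     # cm^2
    film_area = 2.0 * math.pi * radius_cm * film_um * 1e-4   # um -> cm
    return 1.0 / (1.0 / true_f + film_area / plug_area)

# Hypothetical tight plug: true F = 1000 (phi ~ 3% with m = 2),
# 1-in.-diameter plug, 2.5-um surface film.
f_meas = apparent_formation_factor(true_f=1000.0, radius_cm=1.27, film_um=2.5)
underestimate = 1.0 - f_meas / 1000.0  # ~0.28, i.e., nearly 30%
```

The same film on a high-porosity plug is a much smaller fraction of the total conductance, which is why the error concentrates in low-porosity samples.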

Water Saturation. Grain loss from plug handling during testing can result in considerable uncertainties in the calculation of saturations from gravimetric measurements. A loss in weight during a drainage experiment might be interpreted as a loss of water such that the calculated water saturation is much less than the true value. For a 16%-porosity 140-g plug, a grain loss of only 2% dry weight translates to a 20-saturation-unit error in water saturation for a gas/water test. The grain-loss error is magnified for an oil/water system because the fluid-density difference is smaller. The goal of grain-loss correction is to predict the fluid-filled pore volume at each stage of handling and test procedures and to validate these estimates, where possible, by use of measured data. Unfortunately, such corrections can be subjective.
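The grain-loss arithmetic can be checked with a short sketch. Grain density and brine density are assumed (2.65 and 1.0 g/cm³); on these assumptions, the error is of the same order as the tens of saturation units quoted above:

```python
def grain_loss_sw_error(dry_mass_g, porosity, loss_frac,
                        rho_grain=2.65, rho_brine=1.0):
    """Apparent water-saturation error when lost grains are misread as
    expelled water in a gravimetric gas/water test (gas density neglected)."""
    grain_vol = dry_mass_g / rho_grain         # cm^3 of solids
    bulk_vol = grain_vol / (1.0 - porosity)    # cm^3 bulk
    pore_vol = porosity * bulk_vol             # cm^3 pore space
    lost_mass = loss_frac * dry_mass_g         # grams of grains lost
    return lost_mass / (rho_brine * pore_vol)  # fraction of pore volume

err = grain_loss_sw_error(dry_mass_g=140.0, porosity=0.16, loss_frac=0.02)
# err is roughly a quarter of the pore volume -- tens of saturation units.
```

For an oil/water test, the denominator shrinks to the oil/brine density difference, so the same grain loss produces an even larger saturation error, as the text notes.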

Mercury-Injection Capillary Pressure. Historically, mercury-injection capillary pressure data were the principal source of core-derived saturation/height functions. Before the mid-1990s, nearly all mercury-injection capillary pressure tests were made on core-plug samples (1- or 1.5-in. diameter × 2- to 3-in. length) by use of manual equipment at injection pressures up to 2,000 psi. The nonwetting- and wetting-phase saturations were determined from the injected-mercury volumes and the core-plug helium pore volume. Today, virtually all measurements are made with automated high-pressure equipment that is capable of injection pressures of up to 55,000 psi. These instruments were designed specifically for pore-size-distribution tests on papers, catalysts, porous materials, and ceramics, not for capillary pressure curves on core plugs. The sample chambers (penetrometers) are size limited to approximately 10 mL (Fig. 2), but typical core-injection samples are approximately 5-mL bulk volume. Therefore, for a 20%-porosity sample, the chip-sample pore volume is less than 1 mL, compared with a pore volume of 14 mL for a 1.5- × 2.5-in. plug. Because the effects of volume errors on saturation estimated from immersion bulk volume and helium grain volume on small samples are large, most laboratories inject mercury to define the total mercury-filled pore volume. This process requires pressures in excess of 25,000 psi.

Fig. 2—High-pressure-mercury-injection-equipment sample penetrometer.
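The sample-size contrast above is simple geometry. The sketch below reproduces the approximate pore volumes for a 1.5- × 2.5-in. plug and a ~5-mL chip, both at an assumed 20% porosity:

```python
import math

IN_TO_CM = 2.54  # inches to centimetres (1 cm^3 = 1 mL)

def pore_volume_ml(bulk_vol_ml, porosity):
    """Pore volume as the porosity fraction of bulk volume."""
    return bulk_vol_ml * porosity

# 1.5-in.-diameter x 2.5-in.-long plug (dimensions from the text):
plug_bulk = math.pi * (0.75 * IN_TO_CM) ** 2 * (2.5 * IN_TO_CM)  # ~72 mL
plug_pv = pore_volume_ml(plug_bulk, 0.20)  # ~14 mL, as quoted
chip_pv = pore_volume_ml(5.0, 0.20)        # ~1 mL for a penetrometer chip
```

With barely 1 mL of pore space, fixed volumetric errors in immersion bulk volume and helium grain volume become a large fraction of the measurement, which is why laboratories resort to very-high-pressure injection to define the filled pore volume.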

In certain formations, high-pressure mercury injection appears to cause distortion of the capillary pressure vs. water-saturation curves. In the example detailed in the complete paper, mercury-injection tests were run with conventional manual equipment on 1.5-in. plugs, whereas tests on the chip samples were run on 4- to 6-mL specimens. In both cases, the test plugs were unconfined (3D injection). The distortion phenomena involved are not yet understood clearly. They may be related to sample-size percolation dependencies, but they appear worse on samples containing clay-filled micropores, which normally are not accessed by mercury at 2,000 psi. Continued higher-pressure injection appears to damage the pore system progressively and permanently, which has a significant effect when the end user generates saturation/height curves from high-pressure-mercury-injection data. The high-pressure-mercury-injection data on plug chips form a separate population, and if applied unquestioningly in isolation, they result in misleadingly optimistic hydrocarbon saturations.

Core-Analysis-Management Road Map

The chance to acquire new core data provides an ideal opportunity to minimize uncertainties in key core-derived model inputs. The key questions that should be asked before embarking on this process are:

  • Are there areas of concern, anomalies, or suspicious data in the database that need to be resolved?
  • How closely do the core, log, and test data agree for the well in question and the reservoir in general?
  • What core-analysis tests are actually needed?
  • Is the contractor’s interpretation correct?
  • Can operators improve on the laboratory interpretation?

Planning and Program Design. Coring and core analysis often are poorly planned. An ill-considered test-program design can result in underused, poorly appreciated, and misapplied core-data results. Proper planning and supervision of a core-analysis program can reduce data redundancy. Petrophysicists; geologists; and reservoir, drilling, and completions engineers all have a role in the planning, which must include the laboratory that will perform the work.

Core-Analysis Contacts. Recurring themes in many core-laboratory audits are the need to improve communication between the laboratory and the client as well as client education in core-analysis performance and interpretation. In particular, vendors feel that they are too often faced with conflicting and contradictory instructions from within the operator’s different discipline functions, and they would prefer to deal with a single, knowledgeable, core-analysis contact who understands the applications and limitations of core-analysis tests. This single contact should ensure that the data-quality requirements are maintained and, more importantly, that the data are fit for purpose. The client contact is the liaison between the client’s different subsurface disciplines and the laboratory and is accountable for laboratory supervision and real-time quality control.

Real-Time Quality Control. Regular monitoring of vendor performance and the provision and checking of experimental data can ensure that any problems or unusual, anomalous, or inconsistent results are identified as soon as possible so that they can be rectified before the test program is completed. Thereafter, it is too late, and costs often are incurred in retesting, which can lead to largely undeserved but lingering resentment about laboratory performance. Contractor supervision and quality control will ensure that complete records of the test methods and procedures together with laboratory data are available to provide a complete audit trail. SCAL reports must include a detailed description of the work performed and the equipment and procedures used, along with details of the methods used by the laboratory in analyzing the data. Understanding the core-plug history is essential in quality control of the SCAL data—particularly in formations sensitive to stress cycling and rock/fluid incompatibilities—but deciphering plug records from standard SCAL reports can be challenging.

Laboratory/Client Relationships. In the author’s experience, laboratories have enthusiastic, committed, and highly experienced senior-management teams, but they tend to be reactive, rather than proactive. This tendency is not helped by the traditional master (client)/servant (laboratory) relationship. Engaging the laboratories through regular meetings and laboratory visits during ongoing projects makes them more aligned with, and more involved in, the client/stakeholder objectives, and they understand better the importance of the data in field-development planning and decisions.

Conclusions

Laboratory artifacts can have a significant effect on petrophysical interpretation. Nevertheless, with experience, learning, and appropriate diagnostic tools, these uncertainties are recognizable and manageable. A proactive synergistic core-analysis-management strategy can deliver high-quality data by developing a more-effective relationship between the end user and the data-acquisition laboratory. The benefits include

  • Improved communication and learning.
  • A better understanding of core-analysis procedures and methods.
  • A more-coherent and -consistent approach to data acquisition.
  • A reduction in uncertainties and data redundancy.
  • A full data-audit trail, which yields better equity and unitization positions, and easier and more-efficient presentation of core-analysis plans and results to partners.
  • Added value from core-analysis investments.

This article, written by Senior Technology Editor Dennis Denney, contains highlights of paper SPE 158087, “The Core-Analysis Elephant in the Formation-Evaluation Room,” by Colin McPhee, SPE, Senergy Limited, prepared for the 2012 SPE Annual Technical Conference and Exhibition, San Antonio, Texas, 8–10 October. The paper has not been peer reviewed.