Tuesday, June 19
Our ability to integrate models of reservoirs, wells, and facilities has improved markedly in recent years and is now standard practice across many common workflows. As a new generation of simulation capabilities emerges, driven by advances in hardware and software, the long-held aspiration of comprehensive modelling of business decisions now seems achievable. Compared with only a few years ago, the variety, quality, and frequency of data available to improve reservoir and production flow models have increased dramatically, and with these improvements have come many "data-driven" modelling techniques that in some cases are displacing the historical approach of using physical models. This session will explore how the industry is applying system-wide modelling to operations, optimization, and forecasting, where techniques such as analytics, uncertainty integration, event recognition, and machine learning can dramatically improve efficiency and business results.
Initial reservoir model building activities have focused on distributing sub-seismic-scale properties across reservoirs based on log data from a limited number of sampled locations, resulting in large initial uncertainties. With modern instrumentation and better technologies, large amounts of information are now gathered locally over refined time scales. Though spatially limited, these data are collected over long time spans from a variety of sources, potentially including permanent pressure gauges, SCADA units, DTS, and DAS. This session will discuss methodologies and algorithms used to integrate such high-frequency data, build models, and refine uncertainties.
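As one concrete illustration of the kind of algorithm the session may cover, the sketch below applies a single ensemble-smoother update to a toy problem: a prior ensemble of an uncertain permeability multiplier is conditioned on one noisy pressure observation from a permanent gauge. The numbers and the linear proxy forward model are hypothetical, chosen only to show how assimilating such data narrows parameter uncertainty.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical prior ensemble of an uncertain permeability multiplier
N = 500
m_prior = rng.normal(loc=1.0, scale=0.3, size=N)

# Hypothetical linear proxy forward model: predicted pressure drop (bar)
def forward(m):
    return 50.0 * m

obs = 55.0       # observed pressure drop from a permanent gauge
obs_std = 2.0    # gauge noise level

# Ensemble-smoother update (single datum, one iteration)
d_pred = forward(m_prior)
C_md = np.cov(m_prior, d_pred)[0, 1]    # parameter/data cross-covariance
C_dd = np.var(d_pred) + obs_std**2      # predicted-data variance + noise
K = C_md / C_dd                         # Kalman-type gain
perturbed_obs = obs + rng.normal(0.0, obs_std, size=N)
m_post = m_prior + K * (perturbed_obs - d_pred)

print(f"prior:     mean={m_prior.mean():.3f} std={m_prior.std():.3f}")
print(f"posterior: mean={m_post.mean():.3f} std={m_post.std():.3f}")
```

The posterior ensemble shifts toward the value implied by the observation and its spread collapses, which is exactly the uncertainty refinement the abstract describes; real applications replace the proxy with a reservoir simulator and the scalar with a full parameter field.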
The compositional fidelity of the fluid description used in production system modelling is often a compromise between adequately capturing the physics and providing sufficient computational efficiency, as projects often require integration with other parts of the production system. However, there is clearly no one-size-fits-all approach: even within a particular integrated model, the compositional detail required in one area may preclude practical run times in another. This session will explore consistent and computationally efficient approaches for handling fluid property predictions across workflows and in various model domains.
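As a minimal illustration of this trade-off, the sketch below lumps a detailed composition into three pseudo-components using mole-fraction-weighted averaging. The component list, mole fractions, and grouping scheme are hypothetical; the point is that lumping reduces the component count (and hence flash cost) while conserving total moles and mass.

```python
# Hypothetical detailed composition: (name, mole fraction, molecular weight g/mol)
detailed = [
    ("C1",  0.60, 16.04),
    ("C2",  0.08, 30.07),
    ("C3",  0.05, 44.10),
    ("C4",  0.04, 58.12),
    ("C5",  0.03, 72.15),
    ("C6",  0.03, 86.18),
    ("C7+", 0.17, 140.0),
]

# Hypothetical lumping scheme: light ends, intermediates, heavy fraction
groups = {
    "LIGHT": ["C1", "C2"],
    "INTER": ["C3", "C4", "C5", "C6"],
    "HEAVY": ["C7+"],
}

def lump(detailed, groups):
    """Lump a detailed composition into pseudo-components.

    Group mole fractions are summed; group molecular weights are
    molar-averaged, so total moles and total mass are both conserved.
    """
    by_name = {name: (z, mw) for name, z, mw in detailed}
    pseudo = []
    for gname, members in groups.items():
        z_g = sum(by_name[m][0] for m in members)
        mw_g = sum(by_name[m][0] * by_name[m][1] for m in members) / z_g
        pseudo.append((gname, z_g, mw_g))
    return pseudo

for name, z, mw in lump(detailed, groups):
    print(f"{name}: z={z:.2f}, MW={mw:.1f}")
```

A delumping step would invert this mapping when results are handed back to a model that needs the full component slate, which is where the consistency questions raised by the abstract arise.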
Detailed well design has typically been the domain of the production engineer, while well placement and reservoir/production system performance have been the domain of the reservoir engineer, each using discipline-specific tools and workflows. As a result, one discipline may be more aware of the costs and risks of complex wells, while the other may better understand the reservoir uncertainties and how advanced completions can affect the value proposition. Beyond the implementation of the field development plan, establishing the right data acquisition and surveillance plan, reservoir management strategy, and operating guidelines is critical to the long-term success of the asset. This session will explore how the disciplines can work together efficiently to design and operate wells, ensuring value delivery throughout the life of the field.
Wednesday, June 20
Industry experience shows that value from assets is maximized when facilities are designed using integrated models that consider the physical responses of the reservoir and wells over the lifetime of the asset. Elimination of artificial boundary conditions in the physical and mathematical models, along with an appropriate solution formulation, are two key elements that enable optimal designs. With this bigger picture in mind, appropriate model building techniques and fluid consistency allow robust optimization algorithms to be deployed. This session will explore the current state of the art in vendor-neutral integration of models and their offline and online use in designing facilities and running operations proactively. In addition, the discussion will look at areas for improvement and the adoption of standard practices across the industry.
History matching can be defined as the calibration of our models, in which we address our lack of understanding or inadequate description of the reservoirs. Proper calibration requires feedback from the static model (the "big loop") and can be time consuming. Even with this feedback, the remaining uncertainty can result in multiple acceptable solutions, or may not be able to address some of the unknowns that are critical to the accuracy of forecasts. Topics addressed in this session include shortening the "big loop", calibration integrated with geomechanics and wellbore hydraulics, quantification of the multiple solutions obtained through assisted history matching, forecasting under uncertainty, and the inclusion of new data sources.
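As a toy illustration of why history matching yields multiple acceptable solutions, the sketch below fits a hypothetical exponential decline model to noisy rate data: many (q0, D) parameter pairs match the history within tolerance, and together they produce a forecast spread rather than a single answer. The model form and all numbers are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical toy decline model: rate(t) = q0 * exp(-D * t)
t_hist = np.arange(0, 24)                          # 24 months of history
truth = 1000.0 * np.exp(-0.05 * t_hist)
obs = truth + rng.normal(0, 20, size=t_hist.size)  # noisy observed rates

# Sample candidate (q0, D) pairs from a broad prior
q0 = rng.uniform(800, 1200, size=5000)
D = rng.uniform(0.02, 0.08, size=5000)
pred = q0[:, None] * np.exp(-D[:, None] * t_hist)

# Normalised misfit; keep every parameter set that matches within tolerance
misfit = np.mean(((pred - obs) / 20.0) ** 2, axis=1)
accepted = misfit < 2.0

# Multiple acceptable matches -> a spread of forecasts at t = 36 months
forecast = q0[accepted] * np.exp(-D[accepted] * 36)
print(f"{accepted.sum()} acceptable matches")
print(f"36-month forecast: P10={np.percentile(forecast, 10):.0f} "
      f"P50={np.percentile(forecast, 50):.0f} "
      f"P90={np.percentile(forecast, 90):.0f}")
```

The accepted parameter sets form a correlated ridge (a higher initial rate can trade off against a steeper decline), which is why quantifying the ensemble of matches, rather than reporting a single best fit, matters for forecast accuracy.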
Advanced Integrated Models can be described as strongly coupled physics (reservoir, wellbore, facilities) and economics models. Beyond the complexity of developing efficient integrated models themselves, they give rise to complex optimization problems, especially those related to capital allocation with long-term objectives. The latest developments in optimization and machine-learning algorithms may enable us to solve these problems efficiently while accounting for various uncertainties. Topics that may be addressed in this session include the efficient development of integrated models and their use in robust optimization (e.g., development planning in unconventional resources, facility design for the life of field, near-real-time optimization).
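As a small sketch of robust optimization in this spirit, the toy example below chooses a facility capacity that maximizes the expected value across an ensemble of uncertain deliverability scenarios, rather than designing for a single deterministic case. The economics, units, and numbers are entirely hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical scenarios: uncertain plateau deliverability (kbbl/d)
scenarios = rng.normal(loc=80.0, scale=15.0, size=200)

def npv(capacity, deliverability):
    """Toy NPV: revenue limited by min(capacity, deliverability), minus capex."""
    produced = np.minimum(capacity, deliverability)
    revenue = 3.0 * produced   # hypothetical margin per unit produced
    capex = 1.5 * capacity     # hypothetical cost per unit of capacity
    return revenue - capex

# Robust choice: the capacity with the best *expected* NPV over all scenarios
candidates = np.arange(40, 121, 5)
expected = [npv(c, scenarios).mean() for c in candidates]
best = candidates[int(np.argmax(expected))]
print(f"robust capacity choice: {best}")
```

Designing only for the mean scenario would ignore the asymmetry between overbuilding (wasted capex in every scenario) and underbuilding (lost revenue only when deliverability is high); optimizing the expectation over the ensemble captures that trade-off, and the same pattern scales up when the toy NPV function is replaced by a coupled reservoir-wells-facilities model.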
Advancements in data analytics technologies such as Artificial Intelligence and Machine Learning are making their way into the oil and gas industry and promise to fundamentally improve the way we operate. A robust data foundation is required to enable and maximize the value of these emerging technologies. The session will discuss the use of Artificial Intelligence and Machine Learning for smart data integration, including digitization, streaming, and big data, as well as novel approaches such as pattern recognition and deep learning for integrated design and operations.