Content Management To Support Application and Business Workflow for Efficiency Wins
Since 2000, there has been a realization within the oil industry that applications (end-user programs such as spreadsheets, word processors, and databases) and systems are related. Well, transaction, and product data, along with best-practice and success data, should be integrated into a seamless, streamlined process for process improvement and greater return on investment.
When oil companies merge, they consolidate not only the cultures of the companies, but also the systems of record for geological assets, financial records, well drilling, completion and workover procedures, and vendor contracts. The same consolidation occurs when service companies merge and rationalize their locations, product lines, training programs, and information technology standards. It became apparent that effective data sharing could occur only if the hurdles of incompatible and duplicative processes between divisions, departments, and product lines were rationalized and eliminated. All of this has raised awareness that business process workflow and content management must be integrated for the efficient use of people and applications in a world where global document visibility and around-the-clock processes deliver the quickest results, with the most value, to a demanding business market.
Business Process Workflow
The business process workflow elements are the components required to transact business among the supplier, the manufacturer, and the buyer. The workflow consists of images, documents, forms, and internal or cross-company processes. These could relate to a single event, recurring events, or stop-and-start events. The business process workflow can also include ad hoc events that might occur once or might evolve into a new work process. The business process may need to be independent of the resource, applying to a job role rather than a specific individual.
This article addresses the business process, management, and flow of content for products and services. The business process elements can consist of applications and forms that fulfill a business transaction, such as a datasheet, request for quote, quote, purchase order, order acceptance and delivery forms, field ticket, invoice, and payment. These documents are part of the everyday business process, and for the oil field they are part of recommended practice API RP 3901, created in 2001. This API-PIDX (Petroleum Industry Data Exchange) standard applies a common framework and definition of elements in business documents (Murphy and Barling 2001). Without standards in place at every point in the process, data transactions can conflict with business transactions, and incompatible business actions can require dual data entry across applications.
In 1997, a process was designed to automate the workflow to manage operational data, improve job performance, and collect job-execution data. This workflow automation delivered a process for data collection and allowed access to information, providing a greater level of job execution and quality. The goal was to store all job information, by well, in a single shareable database (Schmitzer and Randall 1997). The post-job data could then be analyzed to identify opportunities for improvement in job procedures and to develop and distribute best practices. An index of incidents to consider when preparing for a job, whether common or unique, would also be available (Feechan 1997).
In 1999, there was an industrywide increase in purchases of servers, routers, and applications in preparation for the year 2000. Systems were being prepared for Y2K, and the vision of seamless integration was not at the forefront. The prevalent thinking at the time was that system integration might never occur, but that corporations should have the infrastructure ready for a time when application functionality could catch up with the vision.
From 1997 to 2000, the greatest value came not from integration but from helping people work more efficiently. Because there is never enough time in a day, employees were tasked to do more and be more efficient. The contradiction was that the industry was forced to become more efficient through applications that enabled companies to perform with fewer resources. This transition occurred when the focus changed from systems and components to the business value that generated the greatest return on investment (Edison 2001).
In the Y2K frenzy, the support for efficient business workflow may have been lost before year 2000, but between 1998 and 2003 applications and web services improved dramatically, changing the need for seamless two-way integration to simple linking of applications through the web. (Web services are applications that share business logic, data, and processes through a programmatic interface across a network—a standardized way to integrate web-based applications using various standards to tag data and describe and list the services available for sharing data among systems.)
The system of record for financial and engineering data was still the system of record, but the data were now available for display in different presentations and applications based on the needs of the user, almost like a personalized subscription of content (Fig. 1).
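The web-services pattern described above can be sketched in miniature: a system of record exposes its data through a programmatic interface as tagged data, and different presentations consume the same payload for different audiences. The well identifiers, field names, and functions below are hypothetical illustrations, not taken from any system described in the article.

```python
import json

# Hypothetical system of record: well data keyed by a unique well identifier.
SYSTEM_OF_RECORD = {
    "W-1001": {"name": "Smith 1", "depth_ft": 9850, "status": "producing"},
    "W-1002": {"name": "Jones 3", "depth_ft": 11200, "status": "drilling"},
}

def well_service(well_id):
    """A stand-in for a web service: return the record as tagged (JSON) data,
    the standardized format that lets other applications consume it."""
    record = SYSTEM_OF_RECORD.get(well_id)
    if record is None:
        return json.dumps({"error": "unknown well", "well_id": well_id})
    return json.dumps({"well_id": well_id, **record})

# Two different "presentations" of the same data for different audiences.
def engineer_view(payload):
    data = json.loads(payload)
    return f"{data['well_id']}: {data['depth_ft']} ft, {data['status']}"

def manager_view(payload):
    data = json.loads(payload)
    return f"{data['name']} ({data['status']})"

payload = well_service("W-1001")
print(engineer_view(payload))  # W-1001: 9850 ft, producing
print(manager_view(payload))   # Smith 1 (producing)
```

The system of record is never duplicated; each consumer simply renders the one authoritative payload to suit its audience, which is the "personalized subscription" idea in minimal form.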
Today, virgin data, if normalized and standardized, can be extracted, mapped, and possibly reformatted to meet the needs of the vendor, engineer, shop employee, or customer. Applications today offer features and benefits that are superior to legacy applications, and a dynamic business requires a systematic process for envisioning, developing, migrating, and decommissioning applications both old and new.
Many things prevent change, stemming from either the cost to change or the cost of not changing. Change is constant, and the cost of not adapting is greater in the long run (Edison 2001). Change might mean supplanting an old system, adding another system, or expending the cost required to normalize virgin data. Cost may inhibit change, but culture could be the greatest driver for lack of change.
Choosing a focus area should include a well-thought-out evaluation aimed at producing the greatest value with the least amount of system recreation, which could make obsolete, or sunset, a current vertical application. Companies that adopt the Microsoft philosophy of application development must be careful not to decommission their most stable application for new applications that offer features and benefits but may be deficient in data after the transition. The decision process must focus on the customer experience, based on speed, accuracy, and relevance. The goal is to provide a solution to a well-defined customer problem.
An alternative use of a current application could be to create data extracts that can be transformed into content viewable in a web browser. The web includes the intranet, the Internet, and the extranet, as well as browser viewing on a mobile device. This data presentation requires security to control access, with the access level defined by the user's role: internal, external, or project-based, according to need. This approach allows data to be structured, extracted, and manipulated to best serve the audience without a major application overhaul. (Structured data are information classified as reports, such as inventory figures, costs, and other statistics presented as spreadsheets and stored in databases. Unstructured data are information not traditionally classified as reports, such as graphics, audio and video clips, and text from files such as newsfeeds, email messages, and word-processing documents.)
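The role-based access described above can be illustrated with a minimal filter over a data extract. The roles, field names, and sample record here are assumptions for illustration only; a real system would draw them from its security model.

```python
# A minimal sketch of role-based access to a data extract. The roles and
# field names below are hypothetical, not from any actual access policy.
ROLE_FIELDS = {
    "internal": {"well", "depth_ft", "vendor", "cost_usd"},  # full visibility
    "external": {"well", "depth_ft"},                        # no commercial data
    "project":  {"well", "depth_ft", "vendor"},              # project-level need
}

def extract_for_role(records, role):
    """Return only the fields the given role is allowed to see."""
    allowed = ROLE_FIELDS[role]
    return [{k: v for k, v in rec.items() if k in allowed} for rec in records]

records = [{"well": "W-1001", "depth_ft": 9850,
            "vendor": "Acme", "cost_usd": 120000}]
print(extract_for_role(records, "external"))
# [{'well': 'W-1001', 'depth_ft': 9850}]
```

The same extract serves every audience; only the filter changes, so no application overhaul is needed to add a new viewer class.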
There are many places to begin these integrations (Fig. 2). Geosciences data can be connected to the earth model, which is connected to the well plan for drilling. The well plan is connected to the previously preferred drilling best practices and events to consider. There is a connection to a contact list for contract information and any previous vendor successes. The financial systems are connected to the procurement systems and include vendor contract and pricing data. There are recommended practices for the delivery of technical information for the drilling and completion of the well. Well ownership transitions from the development or exploration group to the production group, where hydrocarbon production data must justify the investment in asset development. This information flows in multiple directions, from the reservoir management group into the production numbers for the hydrocarbon sales. Eventually, for field development or a possible workover of the well, a previous completion is connected with the future completion. The cycle continues through the sale of the asset or through plug and abandonment.
Throughout this life cycle, the quantity of documents created is astounding, and the ability to find a particular document at a given moment to support an argument is critical. The collection of information should not overburden operations and should be visible and shared. All data, from concept to completion, can be considered digital assets: findable electronic files that are property, including spreadsheets, word-processed documents, artwork, logos, photos, presentations, text documents, voicemail, and email. Like all assets, data must be managed to be useful. This is digital asset management.
Culture paradigms such as “we have always done it that way” may be a hindrance to evolution. Visually mapping the entire process by data components, flow, and owners can be useful. This allows a team to dissect the current process and its parts and to identify areas for improvement. The end result may be an entire culture change, and this approach may help the team see the immediate benefits and ease the transition.
Content is any information that is used to capture, develop, and support the business information process. Content management is any method used to store, preserve, and deliver the information in support of the business processes.
The content process for products and services begins with attributes and documents that are connected to a unique material number. A content management system is needed when there is too much information to collect, manage, and publish manually. When addressing materials, all attributes require standard or common names and naming conventions, including abbreviations, and values that relate to the other materials that are part of a well completion system. These materials may have attributes that describe a specific size, and those attributes have many values. The attribute values can be extracted and entered into a database to be reused in multiple applications.
These data could feed software tools that evaluate dimensions for compatibility within a system, helping to ensure that all components meet the system's requirements. A secondary application of the data values and their attributes could produce a sales datasheet, either as an independent document or as a component document in a system. Numerous feeds to documents such as case histories, technical bulletins, process workflows, and correction-and-prevention programs throughout the life cycle of the product help ensure that data captured for sales and performance are readily available for business development opportunities. This information could be pushed to the business development team or to the valued customer as needed.
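The reuse of attribute values across applications can be sketched as follows: values captured once per material number feed both a dimensional-compatibility check and a sales datasheet. All material numbers, attribute names, and dimensions below are hypothetical.

```python
# Sketch: attribute values captured once per material number and reused by
# two applications -- a compatibility check and a datasheet generator.
# Material numbers, names, and dimensions are invented for illustration.
MATERIALS = {
    "MAT-100": {"name": "Packer",       "od_in": 5.50, "id_in": 2.44},
    "MAT-200": {"name": "Casing joint", "od_in": 7.00, "id_in": 6.18},
}

def fits_inside(inner_mat, outer_mat):
    """Application 1: check that one component's OD clears another's ID."""
    return MATERIALS[inner_mat]["od_in"] < MATERIALS[outer_mat]["id_in"]

def datasheet(mat):
    """Application 2: render the same attribute values as a sales datasheet."""
    m = MATERIALS[mat]
    return f"{m['name']} ({mat}) -- OD: {m['od_in']} in., ID: {m['id_in']} in."

print(fits_inside("MAT-100", "MAT-200"))  # True: the packer clears the casing ID
print(datasheet("MAT-100"))
```

Because both applications read the same stored values, a correction to one attribute propagates everywhere it is used, which is the point of extracting attributes into a database rather than retyping them per document.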
Virgin data are extracted from the system of record “as is” and can have different final destinations. Virgin data can be presented in different ways for different audiences. The principles of content management treat content as data, and the data must be normalized. Content must be differentiated and reusable, and the approval process must be automated. If content is not standardized and normalized, then publishing it to a variety of formats requires its own streamlining effort, and that process is expensive.
A template, a form, or style sheet for a common business process should be used for documents such as quotes, proposals, product brochures, case histories, best practices, assembly and disassembly instructions, and pulling and running procedures.
Forms allow the required input data to be collected more rapidly. Data that have already been entered can prepopulate the next form. Data entered into the forms can also be extracted as a report of the attributes and the products and services supplied. Content management requires addressing the organization and classification of documents based on the business workflow that will best be served by that organization.
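The prepopulation step above can be shown in a few lines: fields already captured on one form carry forward to the next, and the accumulated entries double as a report. The form sequence and field names here are hypothetical.

```python
# Sketch of form prepopulation: data entered on an earlier form fills the
# matching fields of the next form. Field names are invented for illustration.
def prepopulate(form_fields, known_data):
    """Fill a new form's fields from data entered on earlier forms;
    fields with no prior value are left blank for fresh entry."""
    return {field: known_data.get(field, "") for field in form_fields}

# Data captured on a request-for-quote form...
rfq = {"customer": "Acme Oil", "well": "W-1001", "product": "7-in. packer"}

# ...prepopulates the purchase-order form; only "po_number" needs fresh entry.
po = prepopulate(["customer", "well", "product", "po_number"], rfq)
print(po)
# {'customer': 'Acme Oil', 'well': 'W-1001', 'product': '7-in. packer', 'po_number': ''}

# The same entries can be extracted as a report of products supplied per well.
report = [(form["well"], form["product"]) for form in (rfq,)]
print(report)  # [('W-1001', '7-in. packer')]
```

Each form in the chain (quote, purchase order, field ticket, invoice) becomes both a data consumer and a data source, so nothing is keyed in twice.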
Business process workflow defines the path and life cycle of data and forms to execute business operations. Content management or digital asset management is the creation, recording, storing, and currency of data and information used to support the business process. The business process and workflow must be supported by content management. Content management allows critical information to be stored, shared, and accessed globally, providing individual presentation to produce the greatest return on the investment.
Edison, L. 2001. Complex Products and Services Task Group Business Case, version 1.0. Petroleum Industry Data Exchange and American Petroleum Institute.
Feechan, M. 1997. Managing Operational Data Throughout the Life Cycle of an Asset. SPE Petroleum Computer Conference, Dallas, Texas, 8–11 June. SPE-38115-MS. http://dx.doi.org/10.2118/38115-MS.
Murphy, J. and Barling, B. 2001. Sell-Side Product Content: The Key to Supplier Empowerment. The Report on Customer Management. Boston, Massachusetts: AMR Research.
Schmitzer, J. and Randall, J. 1997. Workflow Automation Enhances Job Performance and Improves Job Execution Data. SPE Comp Appl 9(6): 167–170. SPE-38116-PA. http://dx.doi.org/10.2118/38116-PA.
Michael Tunstall is the US contracts manager for Lonestar West, based in Dallas at its corporate US headquarters. Tunstall ended his 34-year career with Halliburton as a senior sales manager for completion tools. He began his career in operations and also served in business development, account leadership, knowledge management, and procurement. In addition to his work in North America land operations, he has worked in the Gulf of Mexico, South America, and China. Tunstall is the SPE regional director for the Mid-Continent North America region. His years of service to SPE have included serving as Dallas Section education chair, Denver Section chair, and on the SPE International Section Activities Committee. He received the Mid-Continent Region Service Award in 2013. In 2012, Tunstall was peer recognized by the Texas Independent Producers and Royalty Owners Association as one of the 14 best engineers in Texas. He holds an associate in arts degree from Houston Community College.
29 October 2018