
Optimising pipeline operations

By using analytics to augment and empower the workforce, pipeline operators can optimise operations and improve asset availability. (Image source: Adobe Stock)


Stuart Parker, Oil and Gas Industry principal, AVEVA, explains how an intelligent pipeline can elevate operational excellence, efficiency and safety

The global pipeline market picture is more nuanced and unpredictable than at any time in history. The industry is presented with complex new challenges, as well as vast opportunities.

A heightened demand for energy security is driving demand for expanded oil and gas pipeline infrastructure. Now, industry executives must balance stakeholder expectations, including affordable and flexible energy, profitability, and environmental sustainability – particularly mandated decarbonisation targets. What’s more, companies must strive to reduce operational costs and their carbon footprints across complex value chains, including a growing ecosystem of business partners.

As a result, more companies are looking to digital transformation to drive effective capacity, not only through CAPEX, but also OPEX investments. By using analytics to augment and empower the workforce, pipeline operators can optimise operations and improve asset availability, creating scalable business models with shorter lead times, all while swiftly responding to market changes.

Today’s forward-looking oil and gas companies are installing intelligent pipeline frameworks to quickly turn massive amounts of data into wisdom that generates business value. By using existing operational data as well as new data sources, companies can take a model-focused approach that puts them on the path to operational excellence.

When organisations successfully execute these digital transformation strategies, teams can uncover opportunities to significantly improve equipment reliability, operational efficiency, safety, and overall business performance.

Not all data is equal

Oil and gas pipeline companies were collecting huge amounts of operational data long before the term ‘Industrial Internet of Things’ (IIoT) was coined. However, turning vast amounts of raw data from SCADA, pipeline applications, ERP systems, and more into contextualised information around equipment and processes is often challenging. Contextualising this data ultimately enables operational improvement.

A wealth of raw data, devoid of context, structure, or quality, rarely pays dividends; moreover, those tasked with utilising that data often find it difficult and cumbersome to extract insights. If users are too slow to develop and implement sustainable solutions, the company will accrue significant lost opportunity costs. When unstructured operational data builds up in data lakes, traditional IT technologies can create more problems than they solve, as businesses must spend more time wrangling data than using it to deliver business value.

Unfortunately, many companies are rushing to layer in new technologies and solutions such as cloud, machine learning, edge, IIoT, and predictive analytics before building the right data and analytics foundation. Adopting these new solutions can potentially deliver new and valuable insights, but pipeline companies must first enact solid data management and analytics strategies. Deploying an enterprise-level, real-time data management platform lays the foundation for future technology success.

Know your data

To produce actionable intelligence, data must be structured and accessible to those who can best use it, particularly subject matter experts who have the knowledge and experience to put data insights into action.

Digital transformation success hinges on having a single source of truth. Operations data must first be standardised and contextualised before it can be analysed and visualised. Comprehensive data management systems can lay the foundation for operations data integration, data validation, and analytics. AVEVA’s PI System, for example, is an agnostic data management platform that combines, abstracts, and normalises disparate data sources from multiple control systems and information silos into one centralised location.

A centralised operations data management platform then applies standardised, templatised tag-naming conventions, and assets are catalogued in a flexible hierarchy. This platform becomes an operational system of record, creating the foundation to democratise insights across any pipeline business model.
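To make the idea concrete, the sketch below shows one way templatised tag names and a flexible asset hierarchy might fit together. The template, asset names, and measurement names are hypothetical illustrations, not an actual PI System naming standard.

```python
from dataclasses import dataclass, field

# Hypothetical shared naming template: every tag follows the same pattern,
# so any asset's data can be located predictably.
TAG_TEMPLATE = "{site}.{station}.{asset}.{measurement}"

@dataclass
class Asset:
    """A node in a flexible asset hierarchy (pipeline -> station -> equipment)."""
    name: str
    children: list = field(default_factory=list)
    tags: dict = field(default_factory=dict)

    def add_measurement(self, site, station, measurement):
        # Register a tag that follows the shared, templatised convention.
        tag = TAG_TEMPLATE.format(site=site, station=station,
                                  asset=self.name, measurement=measurement)
        self.tags[measurement] = tag
        return tag

# Build a small illustrative hierarchy: mainline -> compressor station -> pump.
pump = Asset("PUMP-01")
station = Asset("CS-EAST", children=[pump])
pipeline = Asset("MAINLINE", children=[station])

tag = pump.add_measurement(site="MAINLINE", station="CS-EAST",
                           measurement="DischargePressure")
print(tag)  # MAINLINE.CS-EAST.PUMP-01.DischargePressure
```

Because every tag follows the same template, downstream tools can discover all of an asset's measurements without per-site custom logic.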

Using the data model, companies can accelerate digital transformation by combining operational data into a digital replica of physical assets. This can be enabled by developing a ‘digital twin’ of the entire system using information such as drawings, 3D models, materials, engineering analysis, dimensional analysis, real-time pipeline data, and operational history.

During the operational life cycle, the digital twin is updated automatically, in real time, with current data, work records, and engineering information, to optimise maintenance and operational activities. Engineers and operators can easily search the asset tags to access critical up-to-date engineering and work information in order to diagnose the health of a particular asset.
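A minimal sketch of the diagnostic step described above: searching a twin's tag index for one asset and checking its latest readings against engineering limits. The tag names, limits, and readings are invented for illustration.

```python
# Latest values keyed by hypothetical hierarchical tag names.
latest_values = {
    "MAINLINE.CS-EAST.PUMP-01.DischargePressure": 62.0,  # bar
    "MAINLINE.CS-EAST.PUMP-01.BearingTemp": 88.5,        # deg C
}

# Illustrative engineering limits (low, high) per measurement.
engineering_limits = {
    "DischargePressure": (10.0, 70.0),
    "BearingTemp": (0.0, 85.0),
}

def diagnose(asset_prefix):
    """Return measurements outside their engineering limits for one asset."""
    issues = []
    for tag, value in latest_values.items():
        if not tag.startswith(asset_prefix):
            continue
        measurement = tag.rsplit(".", 1)[-1]
        lo, hi = engineering_limits[measurement]
        if not lo <= value <= hi:
            issues.append((measurement, value))
    return issues

print(diagnose("MAINLINE.CS-EAST.PUMP-01"))  # [('BearingTemp', 88.5)]
```

In this example the pump's bearing temperature exceeds its limit, so the twin flags it for attention before a failure occurs.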

Previously, such tasks would take considerable time and effort, and issues were often missed, leading to failures or pipeline outages. With the digital twin, operational and asset issues are flagged and addressed early on, and the workflow becomes proactive instead of reactive. Pipeline companies can easily benchmark operational performance, such as pipeline throughput and energy consumption, to uncover gaps and improve pipeline efficiencies.
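The benchmarking idea can be sketched as comparing specific energy consumption (energy used per unit of throughput) across reporting periods to surface efficiency gaps. The figures and period names below are invented for illustration only.

```python
# Hypothetical quarterly figures: thousand barrels moved and energy consumed.
periods = {
    "Q1": {"throughput_kbbl": 9_200, "energy_kwh": 1_150_000},
    "Q2": {"throughput_kbbl": 9_050, "energy_kwh": 1_240_000},
}

def specific_energy(p):
    """Energy intensity of pipeline operation, in kWh per thousand barrels."""
    return p["energy_kwh"] / p["throughput_kbbl"]

baseline = specific_energy(periods["Q1"])
for name, p in periods.items():
    se = specific_energy(p)
    gap_pct = (se - baseline) / baseline * 100
    print(f"{name}: {se:.1f} kWh/kbbl ({gap_pct:+.1f}% vs Q1)")
```

Here Q2 moved slightly less product while consuming more energy, so its specific energy rises against the Q1 baseline, the kind of gap a benchmarking view would flag for investigation.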

For more from AVEVA’s Stuart Parker on digitising the pipeline, see the latest issue of Oil Review Middle East.