
Not All Digital Twins Created Equal

EP Editorial Staff | September 1, 2022

A digital twin is a computer simulation of a physical system or piece of equipment that can be used throughout the life of the facility.

Much more than a simple 3D CAD rendering, digital twins can significantly advance operational performance.

By Mark Childs and Walt Ravelo, GE Digital

A digital twin that provides value to the operating capability of a power plant must include physics and empirical-based modeling, product design information, operational data, and failure-based mechanisms. More than just a visual representation of an asset, a digital twin can provide the actionable data necessary to prevent failures, along with analysis and design information needed for prognostic decision making.

Today a digital twin is commonly defined as a software representation of a physical asset, system, or process. It’s designed to detect, prevent, or predict the overall state of a system or piece of equipment within its lifecycle. While this broad definition may suffice at a high level, it doesn’t define the specific characteristics that make one digital twin more valuable than another. 

In reality, a digital twin is a computer simulation of a physical system or piece of equipment that can be used throughout the life of the facility. It should contain four key elements:   

• a mathematical model describing a system or piece(s) of equipment
• a model created with industry-specific data that describes how the system or equipment was built
• measured values from plant instrumentation to simulate system or asset operation and validate the calculated results
• a computational and visual method to analyze those results and gain operational insights. 
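As an illustration only, the four elements above can be sketched as a toy structure. The class, the quadratic pump-curve approximation, and all names here are assumptions made for the sketch, not any vendor’s product API.

```python
# Toy sketch of the four digital-twin elements: a mathematical model,
# design data, measured values, and an analysis method. All names and
# the simple pump-curve formula are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    """Minimal twin of a centrifugal pump: head (m) vs. flow (m^3/h)."""
    rated_flow: float                                  # design-data element
    rated_head: float
    measurements: list = field(default_factory=list)   # measured-values element

    def model_head(self, flow: float) -> float:
        """Mathematical-model element: quadratic pump-curve approximation."""
        return self.rated_head * (1 - (flow / (2 * self.rated_flow)) ** 2)

    def ingest(self, flow: float, head: float) -> None:
        """Record operating data from plant instrumentation."""
        self.measurements.append((flow, head))

    def residuals(self) -> list:
        """Analysis element: measured head minus modelled head."""
        return [head - self.model_head(flow) for flow, head in self.measurements]
```

A measured head that drifts away from the modelled head (a growing residual) is the kind of calculated-vs.-measured comparison the third and fourth elements describe.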

There are also two key elements necessary to properly support a digital twin. First, the data must be accurate and complete: bad data in equals bad data out. Second, both the “as designed” and “as running” states of systems and equipment must be considered during development. When applying artificial intelligence (AI)/machine learning (ML) concepts to a digital twin, even minor deviations can affect output accuracy. While talented operating teams can build workarounds for these conditions, in an ML paradigm such workarounds can introduce an implied bias that undermines real and lasting optimization.           
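A minimal sketch of the data-quality point, assuming simple completeness and range checks before a reading reaches the twin’s model; the tag names and limits are illustrative.

```python
# "Bad data in equals bad data out": screen each sensor reading for
# missing tags and implausible values before the twin consumes it.
# Tag names and engineering limits below are illustrative assumptions.

def validate_reading(reading: dict, limits: dict) -> list:
    """Return a list of data-quality issues found in one sensor reading."""
    issues = []
    for tag, (lo, hi) in limits.items():
        value = reading.get(tag)
        if value is None:
            issues.append(f"{tag}: missing")                       # completeness
        elif not (lo <= value <= hi):
            issues.append(f"{tag}: {value} outside [{lo}, {hi}]")  # plausibility
    return issues

limits = {"flow_m3h": (0.0, 500.0), "disch_press_bar": (0.0, 40.0)}
issues = validate_reading({"flow_m3h": 120.0}, limits)  # pressure tag is absent
```

Readings that fail such checks would be quarantined rather than fed to the model, so gaps in instrumentation show up as data-quality issues instead of silently skewing ML output.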

By this definition, a 3D CAD rendering or a static picture of systems or equipment is not a digital twin. It is merely a visual representation that lacks the power to act on your data. 


ML, Maturity, and Scope

As in all ML concepts, the more current, accurate, and complete the data provided to a digital twin, the better fidelity and accuracy you get as a result. That means that a new digital twin will improve as it collects data and matures into an accurate representation of the systems or equipment. 

For example, in the case of a single analytic, on day one of a digital-twin activation, assumptions have to be made regarding expected performance. At this point, historical data will be used to shape digital-twin output behavior. On day two, the digital twin begins to incorporate real-time operating data that improves accuracy. The digital twin also logs the data, benchmarks asset performance, and identifies anomalies. 

As time goes on, the real-time data feed continues to improve and maintain the accuracy and fidelity of the digital-twin results. As model maturity increases, AI can also be deployed to turn ML outputs into forward-thinking decision making across the plant. Just as ML needs more data and time to mature, the same can be said for AI. 
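The maturation described above can be sketched as a baseline that starts from a day-one assumption and is refined only by healthy real-time samples; the class name and the 3-sigma threshold are illustrative assumptions, not a product algorithm.

```python
# Day one: the baseline is an assumed/historical value. Each subsequent
# real-time sample either refines the baseline (if healthy) or is flagged
# as an anomaly. The 3-sigma limit is an assumed, illustrative choice.

import statistics

class MaturingBaseline:
    def __init__(self, assumed_mean: float):
        self.samples = [assumed_mean]   # "day one": assumption from history

    def update(self, value: float, sigma_limit: float = 3.0) -> bool:
        """Ingest one sample; return True if it is anomalous vs. the baseline."""
        mean = statistics.mean(self.samples)
        stdev = statistics.pstdev(self.samples) or 1.0  # avoid zero spread early
        anomalous = abs(value - mean) > sigma_limit * stdev
        if not anomalous:
            self.samples.append(value)  # only healthy data matures the model
        return anomalous
```

As the sample history grows, the mean and spread converge on the asset’s real behavior, which is the fidelity improvement the paragraph above describes.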

Digital twins do not have to be limited to a single analytic or a single form of data. To truly realize business value, digital twins should not be narrowly focused on siloed solutions that do not integrate and scale. These siloed solutions ultimately require manual manipulation and aggregation of the outputs from other siloed solutions to create a comprehensive view and turn data into information that drives decisions. Systems that require this type of manual effort are typically inefficient, error prone, hard to maintain, and not scalable enough to keep up with growing organizations and technology evolution. 

Complex digital twin

Imagine a digital twin, or system of digital twins, that uses data from multiple systems such as operating rounds, instrument calibration, anomaly-detection systems, tribology, vibration programs, equipment criticality, and business systems (EAM/CMMS, financial systems, and historians). This twin can scale across multiple plants and technology types (solar, wind, battery, hydro, grid, fossil, nuclear).  

If this complex digital twin is modular, enabling deployment using a phased approach, the possibilities are endless. Considering investments in assets that are expected to run for 10, 20, and sometimes 30 years, or fleets that have mixed technologies or mixed life spans, a complex digital twin that integrates and scales with a plant’s needs makes sense. 

Next Steps

Once you have identified where to invest in digital twins, determine how you should deploy them. 

Most of today’s digital-twin software solutions require manual updating. While closed-loop AI/ML solutions allow digital twins to be updated in real time, open-loop solutions remain the industry preference. 

Open-loop solutions with advanced digital twins provide analytics that combine critical data into actionable insights. This allows operating personnel to control the systems while empowering teams to make data-informed decisions. On-premises solutions without AI/ML capabilities let companies run the software on their own network, making any updates to the digital twins the customer’s responsibility. If the solutions are in the cloud, however, the digital-twin model(s) are typically updated either by a service from the solution provider or by in-house teams.  

Whether you’re part of the Energy Transition or Industry 4.0 movement, the development and enhancement of machine interconnectivity, automation, ML, and real-time data are your future. Today, there are a large number of equipment manufacturers that are offering smart equipment packages, integrating low-cost sensor capability, and modifying controllers to meet the needs of digital technology.   

Case study

A digital-twin mathematical model is based on the energy balance in a pump: it analyzes the electrical energy supplied to the pump and converts it to the expected fluid energy leaving the pump through its discharge. This is then compared to the manufacturer’s pump curve. Combining this information with data collected from the running pump’s instrumentation provides the digital twin with the as-operating data to properly simulate pump performance. The simulated performance can then be used to assess expected inputs, such as minimum and maximum flow, high temperatures, lubrication requirements, pressure deviations, and variable-speed drive (VSD) inputs.  
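The energy balance in the case study can be sketched with the standard hydraulic-power relation (fluid power = ρ·g·Q·H); the operating numbers below are invented for illustration, not taken from any real pump.

```python
# Energy-balance sketch for the pump case study: electrical power in
# vs. hydraulic (fluid) power out, giving a wire-to-water efficiency
# that can be compared against a manufacturer's curve point.
# Operating values are illustrative assumptions.

RHO = 1000.0   # water density, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2

def hydraulic_power_kw(flow_m3h: float, head_m: float) -> float:
    """Fluid power leaving the pump discharge, in kW: rho * g * Q * H."""
    flow_m3s = flow_m3h / 3600.0
    return RHO * G * flow_m3s * head_m / 1000.0

def efficiency(electrical_kw: float, flow_m3h: float, head_m: float) -> float:
    """Overall wire-to-water efficiency from measured operating data."""
    return hydraulic_power_kw(flow_m3h, head_m) / electrical_kw

# Measured: 120 m^3/h at 35 m head, drawing 16 kW electrical
eta = efficiency(16.0, 120.0, 35.0)   # roughly 0.72
```

A twin would compare this measured efficiency against the expected value from the manufacturer’s curve at the same flow; a widening gap points to wear, fouling, or off-design operation.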

A digital twin that is fully integrated into the EAM/CMMS and alarm systems can quickly assist operating teams in correcting performance and operational conditions. Expand that across all critical systems and it’s easy to see that digital twins can process hundreds, even thousands, of operating conditions, and analyze numerous deviations, all while providing operating teams with actionable information. 
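How a twin’s output might feed an EAM/CMMS, as the paragraph above describes, can be sketched as simple glue code; the record format and priority rule are hypothetical, not a real CMMS API.

```python
# Hypothetical glue between twin output and an EAM/CMMS: each confirmed
# deviation becomes a work-request record. The dict layout and the
# 10% priority threshold are illustrative assumptions.

def to_work_request(asset_id: str, tag: str, deviation_pct: float) -> dict:
    """Translate a twin-detected deviation into a CMMS work-request record."""
    priority = "high" if abs(deviation_pct) > 10.0 else "routine"
    return {
        "asset": asset_id,
        "description": f"{tag} deviates {deviation_pct:+.1f}% from twin model",
        "priority": priority,
    }

wr = to_work_request("PUMP-101", "discharge_head", -12.5)
```

Applied across hundreds of monitored conditions, this kind of routing is what turns anomaly detections into actionable work for operating teams rather than another dashboard to watch.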

Often, organizations purchase solutions perceived as “easier” or “cheaper,” expecting the savings to fund further investment. With digital twins, taking the perceived easy way out can create tremendous downside for the operation. 

As previously mentioned, silos are detrimental to the ability to train digital-twin models to solve business problems. With some down-level solutions, the need for inefficient manual manipulation of models increases, and there is a higher likelihood that models are not receiving all of the data required to deliver the digital twins’ maximum potential. 

If you start to create a culture of data across business lines and open data sources for model consumption, the entire picture becomes clearer. You can quantify data and how it affects performance. By aligning clear and quality data to your major investments, you are creating the potential to derive tremendous value from digital twins. EP

Mark Childs is an Energy Industry Subject Matter Expert for GE Digital, San Ramon, CA (ge.com/digital). Childs has more than 45 years of energy-industry experience, including various leadership roles in GE’s Gas Power O&M organization, and was a key principal in GE’s Total Plant Optimization organization. 

Walter Ravelo has 32 years of power-industry experience and is currently a Sr. Director leading the GE Digital Solutions Team for the Americas. Prior to this, Ravelo served with GE Engineering Services as an Industry Manager for Power Generation and was the Global Project Engineering Manager for power, oil & gas, aviation, and mining industries for the GE SmartSignal analytics product line.
