Use the Data You Generate
EP Editorial Staff | August 15, 2021
To benefit from AI/ML technology, break down the silos that isolate unused but valuable data.
By Rajesh Ramachandran, ABB Process Automation
Today, the average industrial plant uses less than 27% of the data it generates, according to industry experts at the ARC Advisory Group, Boston (arcweb.com). Typically, the remaining 73% of data—much of it produced by the plant's process-control system as high-frequency operational technology (OT) data—is put in a historian and seldom used.
There are also large volumes of other valuable functional data residing in the company's general business or IT systems, and still more in the engineering (ET) systems, which hold specific design information for various assets. Besides being rarely used, all of this data is normally scattered across separate silos and networks that support little or no cross-referencing.
That's where the golden opportunity lies, and new software platforms can now unlock it by simplifying the convergence and analysis of OT/IT/ET data. The benefits can be impressive: higher production rates from existing assets, less downtime through predictive-maintenance practices, safer operation, reduced energy and raw-material inputs, and lower environmental impact.
By convergence of OT/IT/ET data, we mean bringing previously separate elements together and integrating them so that they work seamlessly. To achieve this, we first accumulate all OT, IT, and ET data in a data lake. Next, we contextualize and store related data in an industry-specific data model, for example one for paper making or plastic extrusion. Then we apply advanced analytics and industrial AI algorithms to identify correlations that were not previously visible.
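The lake-then-contextualize step can be sketched in a few lines. This is a minimal illustration, not a vendor schema: the asset IDs, tags, and field names are all hypothetical, and a real data lake would of course not be a Python list.

```python
# Minimal sketch of OT/IT/ET convergence: raw records from separate
# systems land in a "data lake" (here just a list), then are grouped
# by asset ID into a contextualized model. All asset IDs, tags, and
# values below are hypothetical illustrations.

from collections import defaultdict

# Raw records as they might arrive from each domain
data_lake = [
    {"source": "OT", "asset": "M-101", "tag": "vibration_mm_s", "value": 2.8},
    {"source": "IT", "asset": "M-101", "tag": "last_service", "value": "2021-03-04"},
    {"source": "ET", "asset": "M-101", "tag": "vibration_limit_mm_s", "value": 4.5},
]

def contextualize(records):
    """Group lake records by asset, keyed by source domain and tag."""
    model = defaultdict(dict)
    for rec in records:
        model[rec["asset"]][f'{rec["source"]}.{rec["tag"]}'] = rec["value"]
    return dict(model)

model = contextualize(data_lake)
# The contextualized model holds OT, IT, and ET facts side by side,
# so analytics can compare a live reading against its design limit.
print(model["M-101"]["OT.vibration_mm_s"] <= model["M-101"]["ET.vibration_limit_mm_s"])  # prints True
```

Once related records share one contextual key per asset, the cross-silo correlations the article describes become simple lookups rather than searches across three networks.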
Industrial artificial intelligence (AI) can play a major role in identifying these patterns and making process predictions. The terms AI and machine learning (ML) are often used interchangeably, which can be confusing. AI is the overarching science of making machines and physical systems smarter by embedding artificial intelligence in them. ML is a subset of AI in which systems gain knowledge over time through self-learning, becoming smarter and more predictive without human intervention.
As an example, consider a motor, which is an essential and omnipresent manufacturing asset. In terms of data we have:
• OT data: Motor speed, vibration level, and bearing temperature are typical parameters monitored in real time by OT systems to tell us how the motor is performing. This normally comes from automation system components such as PLCs or a DCS.
• IT data: If we want to see things such as the motor’s maintenance history, when it was last serviced, how much has been spent on repairs, or if the right bearings are in stock, we must find it in various IT systems, usually somewhere in the ERP solution.
• ET data: Information about factors such as whether the motor is within its design speed limits, how much vibration it can tolerate, the safe operating temperature for its bearings, and what the useful bearing life should be all resides within the ET (or engineering-design) systems.
To acquire a holistic overview of the motor, we integrate information from all of these systems and store the relevant pieces in a contextualized data model. This allows us to visualize and achieve optimum equipment operation for the best overall process results.
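The motor example above lends itself to a short sketch: one record per domain, merged into a single contextualized view, then checked against the design envelope. Every value, field name, and limit here is illustrative, not drawn from any real motor specification.

```python
# Hedged sketch of the motor example: OT readings, IT records, and ET
# design limits for one (hypothetical) motor, combined so the live
# state can be checked against the design envelope.

ot = {"speed_rpm": 1480, "vibration_mm_s": 3.9, "bearing_temp_c": 78}   # real-time OT data
it = {"last_service": "2021-02-11", "bearings_in_stock": 2}             # ERP/IT records
et = {"max_speed_rpm": 1500, "max_vibration_mm_s": 4.5, "max_bearing_temp_c": 85}  # ET limits

def health_check(ot, et):
    """Compare each monitored OT parameter against its ET design limit."""
    return {
        "speed": ot["speed_rpm"] <= et["max_speed_rpm"],
        "vibration": ot["vibration_mm_s"] <= et["max_vibration_mm_s"],
        "bearing_temp": ot["bearing_temp_c"] <= et["max_bearing_temp_c"],
    }

motor = {"OT": ot, "IT": it, "ET": et}   # contextualized view of one asset
print(health_check(ot, et))              # all checks True: motor within its design envelope
```

None of these checks is possible from any single silo: the readings live in the OT layer, the limits in the ET layer, and the maintenance context in IT.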
A motor is just one example. In a large manufacturing plant, there can be hundreds of such assets performing different functions and running under different operating conditions with varied design parameters, all with data stored in various systems. Widespread OT/IT/ET integration and contextualization is, therefore, critical to obtaining a complete view of the plant and carrying out valuable analytical tasks that improve operations, asset integrity and performance management, safety, sustainability, and supply-chain functions. What emerges are patterns that accurately predict future behavior, allowing improved process performance.
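To make the "patterns that predict future behavior" idea concrete, here is a deliberately tiny example: fitting a linear trend to recent bearing-temperature readings (OT data) and extrapolating to estimate when the ET design limit would be crossed. Real industrial AI/ML models are far more sophisticated; the readings, the limit, and the linear model are purely illustrative.

```python
# Toy predictive-maintenance sketch: ordinary least-squares trend on
# bearing temperature, extrapolated to the ET design limit. All numbers
# are invented for illustration.

def fit_trend(xs, ys):
    """Ordinary least-squares slope and intercept for y = slope*x + intercept."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

hours = [0, 24, 48, 72, 96]             # time of each reading, h
temps = [70.0, 71.1, 72.2, 73.0, 74.1]  # bearing temperature readings, deg C (OT)
limit = 85.0                            # design limit, deg C (ET, illustrative)

slope, intercept = fit_trend(hours, temps)
hours_to_limit = (limit - intercept) / slope
print(f"At the current trend, the limit is reached in about {hours_to_limit:.0f} h")
```

Even this crude extrapolation only works because OT readings and the ET limit sit in the same contextualized model; an ML system would refine the prediction continuously as new data arrives.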
We have been using AI/ML to deliver a higher degree of prediction accuracy and optimization to operations, processes, and assets. Combining AI with deep industrial domain expertise empowers operators to run their industrial processes more safely, effectively, and sustainably.
There are several barriers—perceived and otherwise—that hinder implementation of advanced analytics. The most common reason for hesitation is perceived complexity: people mistakenly think it is much more difficult to achieve than it is. Another explanation we hear is the incorrect belief that, to use big data, you must make massive capital expenditures because it is an "all or nothing" undertaking. It isn't; you can start with small steps. Other reasons include lack of cooperation between OT, IT, and ET people, and generally slow adoption of new digital tools in many industrial sectors.
The fact is that it is easy to join this digital-mining journey, no matter where you are, using data and signals that are already available in your process control, business, and engineering systems.
As the CDO at ABB Process Automation, Rajesh Ramachandran focuses on implementing new strategies to obtain greater value from vast data silos in manufacturing companies. Based on his extensive experience with big data analytics at eBay, Oracle, and Rolta, combined with ABB’s process control expertise and deep-domain knowledge in dozens of industry sectors, he is driving the creation of new digital platforms that simplify the task of extracting targeted data to successfully predict outcomes and optimize manufacturing processes.