Data Modeling Helps Identify Upstream Root Causes
Grant Gerke | September 19, 2018
In my upcoming October Industrial Internet of Things column, I discuss the mounting pressure on operations and maintenance (O&M) teams to manage data from so many connected devices and pieces of equipment on the plant floor. The column explores how much data should be presented to technicians and how a continuous data thread, or machine modeling, can help identify root causes on a production line.
From the upcoming IIoT column in Efficient Plant’s October Issue:
“Maintenance technicians are starting to push back when plant engineering looks to flood systems with data utilizing the ‘more is better’ mentality,” explains David Wilmer, vice president of manufacturing systems at The Aquila Group, Sun Prairie, WI (www.the-aquila-group.com). “Technicians are requesting sampling intervals be increased from one second to one to five minutes, as the vast majority of data only slows effective analysis.”
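The interval change Wilmer describes is straightforward to sketch. The snippet below is a minimal illustration (not from the column) of downsampling hypothetical 1-second sensor readings to 5-minute summaries with pandas, keeping min/max alongside the mean so short excursions are not averaged away; the sensor name and values are invented for the example.

```python
import numpy as np
import pandas as pd

# Hypothetical: one hour of 1-second vibration readings from a single sensor
idx = pd.date_range("2018-09-19 08:00:00", periods=3600, freq="s")
readings = pd.DataFrame(
    {"vibration": np.random.default_rng(0).normal(0.5, 0.05, 3600)},
    index=idx,
)

# Downsample to 5-minute intervals; keep min/max so excursions aren't lost
summary = readings["vibration"].resample("5min").agg(["mean", "min", "max"])

print(len(readings), "rows reduced to", len(summary))
```

One design note: aggregating rather than simply dropping samples preserves the information technicians actually use (typical level plus extremes) while cutting the row count by a factor of 300.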
However, many companies are more data-mature, farther along in their data strategies than the scenario above suggests. Some manufacturers are employing data-modeling software to connect process and quality data, including key performance indicators (KPIs), to provide better production-line context.
Sight Machine’s Kurt DeMaagd (www.sightmachine.com) describes such a scenario with a food-processing customer in a recent blog post titled “The 4 Best Ways to Use Data to Optimize Continuous Flow Processes”:
Here’s an example of what I mean. One of our food processing customers was looking to drive efficiency and improve quality in its separation processes. Separation, where they divided milk into various products (whey, cheese curds, etc.), was often a bottleneck. The length and efficiency of separation was a function of many batch variables, including:
• Blend of protein and fat in the raw materials
• Temperature of the batches coming out of an upstream process (in this case, a cooker)
• pH of the output from an upstream fermentation process
We were able to create a data model, or digital twin, of the production line that combined equipment/process sensor data with batch and quality data. This model was then used to analyze upstream cooker and fermentation data for specific batches.
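The core of the analysis DeMaagd describes is a per-batch join of downstream outcomes to upstream conditions. The sketch below illustrates the idea with invented batch IDs and readings (none of these numbers come from Sight Machine): merge separation results with cooker temperature and fermentation pH on the batch ID, then correlate to see which upstream variable tracks the bottleneck.

```python
import pandas as pd

# Hypothetical batch records from the separation step (duration in minutes)
batches = pd.DataFrame({
    "batch_id": [101, 102, 103, 104],
    "separation_minutes": [42, 55, 40, 61],
})

# Hypothetical upstream readings keyed by the same batch IDs
cooker = pd.DataFrame({
    "batch_id": [101, 102, 103, 104],
    "cooker_temp_c": [72.1, 69.4, 72.3, 68.8],
})
fermenter = pd.DataFrame({
    "batch_id": [101, 102, 103, 104],
    "ph": [6.4, 6.1, 6.4, 6.0],
})

# The continuous data thread: join downstream outcomes to upstream
# conditions, batch by batch
thread = batches.merge(cooker, on="batch_id").merge(fermenter, on="batch_id")

# Correlate separation time against each upstream variable
corr = thread.drop(columns="batch_id").corr()["separation_minutes"]
print(corr)
```

In this toy data, cooler batches and lower-pH batches take longer to separate, so both correlations come out negative, pointing the investigation upstream rather than at the separator itself.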
From the outside, data modeling can seem challenging, since most manufacturers don’t have data scientists on staff. But creating a digital-twin model and finding the right consulting team for this process modeling can be the right fit, and it can avoid an automation capital expense.
Even in 2018, necessity is the mother of invention.