Part 2 | Case Study Shows Data Management Challenges
Grant Gerke | August 31, 2018
>> Read part 1 of this column on how operators and maintenance personnel are pushing back on too much production data.
Driving efficiency with automation technology is putting pressure on many plant executives to manage much more data within the enterprise, as more sensors are added to processes each year. While many plants have been producing big data for years, acting on this data in real time — data that usually languishes in historians — is a daunting task for large and small manufacturers alike.
According to Jim Wetzel, interim CEO of the Clean Energy Smart Manufacturing Innovation Institute, Los Angeles, CA (cesmii.org), most companies need to hold onto all data. Wetzel worked at General Mills for more than 32 years and spent a majority of his time at the company as the Director of Reliability.
“General Mills collected over 700 billion data points a day and probably 98% of those we never used every day,” says Wetzel in a recent phone interview. “But, I still would collect all this data for troubleshooting purposes and want to have that information there when troubleshooting a line or piece of equipment.”
However, many operations and maintenance (O&M) teams are pushing back on data policies. “Some companies learn the hard way about the true cost of data bloat,” explains David Wilmer, vice president of manufacturing systems at The Aquila Group, Sun Prairie, WI (www.the-aquila-group.com). “These lessons range from maintenance tablets lacking the processing speed to run reports to the misconception that historical data can be viewed and updated in real time on conventional hardware.”
For maintenance teams, the struggle is real, due to legacy devices and, more importantly, the flood of SMS alerts triggered when voltage, amperage, or pressure readings cross predetermined thresholds.
“Without question, modernization projects show responsible data capture of machinery provides a reduction in maintenance cost,” says Wilmer. “However, data saturation can quickly negate the very advantages organizations are looking to achieve.”
The Aquila Group provides training, system integration and consulting services for plant modernization projects. The company’s Green Light Monitoring System gathers machine-level data for OEE measurement and feeds this data to the company’s Dynamic Machine Management (DMM) manufacturing execution system (MES).
In a recent case application by The Aquila Group, Wilmer pointed to an example of pushback by an O&M team when it came to a robotic process and the amount of data needed. For the evaluation, the consulting company produced “3 days’ worth of data, which generated approximately 276,000 records per sensor output, with the manufacturer insisting on sampling x, z vibration, and temperature once a second.”
To counter this, Wilmer suggested a 5-min. write heartbeat, coupled with ISO 10816 triggering when various zones are crossed, such as exit “Good” at .03, exit “Satisfactory” at .07, and exit “Unsatisfactory” at .018.
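The effect of this scheme can be sketched in a few lines of code. The following is a minimal illustration, not Aquila’s implementation: the zone boundaries, function names, and the (time, value) sample format are all hypothetical, chosen only to show how a heartbeat plus zone-crossing trigger collapses a steady 1-Hz stream (3 days × 86,400 s/day = 259,200 raw samples per output) down to a few hundred records.

```python
# Illustrative ISO 10816-style zone boundaries (hypothetical values;
# real limits depend on machine class and units, e.g. mm/s RMS velocity).
ZONES = [(0.03, "Good"), (0.07, "Satisfactory"), (0.18, "Unsatisfactory")]
HEARTBEAT_S = 5 * 60  # write at least one record every 5 minutes

def zone(value):
    """Return the vibration zone name for a reading."""
    for limit, name in ZONES:
        if value <= limit:
            return name
    return "Unacceptable"

def filter_samples(samples, heartbeat_s=HEARTBEAT_S):
    """Keep a (t, value) sample only when its zone changes, or when the
    heartbeat interval has elapsed since the last recorded sample."""
    kept, last_zone, last_t = [], None, None
    for t, v in samples:
        z = zone(v)
        if last_t is None or z != last_zone or t - last_t >= heartbeat_s:
            kept.append((t, v, z))
            last_zone, last_t = z, t
    return kept

# Three days of a steady 1-Hz reading collapse to the heartbeat rate:
steady = [(t, 0.02) for t in range(3 * 86_400)]  # 259,200 raw records
recorded = filter_samples(steady)                # 864 heartbeat records
```

On a machine running smoothly in one zone, only the 5-min. heartbeats are written — 864 records over three days — while any zone crossing is still captured the second it happens, which is consistent with the sub-903-record result Wilmer describes below.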
“From there, we did an impact study and found the manufacturer could reduce its monitoring to less than 903 records per output, recording only when values changed,” says Wilmer. “Even with 903 records, this would still be outside actionable levels, but far better than 276,000.”
Efficient Plant’s View: The big takeaway is that outside monitoring companies can provide the experience to help you avoid excessive operating costs and keep pace with your competitors.