metamorworks/iStock/Getty Images

Five Ways to Rethink the Production Data Discussion

July 26, 2019
Understanding the data generated closest to potential quality issues can better position your plant to make the right Industry 4.0 investments.

When I meet with discrete manufacturers to discuss plant floor quality and efficiency, the conversation always turns to Industry 4.0, what exactly it is, and how to deal with all the equipment involved. Many of these mid- to large-size companies already have a significant investment in ERP, MES, SCADA, and IoT devices, but the standardization of the data collection strategy can vary from plant to plant, even within the same company.

These systems are excellent at providing production reporting and analytics, but what happens when you isolate quality issues to a particular work cell or machining area? What diagnostic capabilities are present to determine root cause within manufacturing operations? And what upstream stations are contributing to quality issues?

This is where the collection of high-speed "part production" data, generated by each cycle of a process or test station on the line, can truly add value by tying into the production reporting data. This ultimately enhances the ROI of the entire top-to-bottom manufacturing software stack.

Part production data ranges from scalar pass/fail results produced by whatever limited data collection and analysis capability is embedded in the station, to machine vision inspection images and related data, to the more granular data collection and analysis (such as digital process signature analysis) that can only be achieved by upgrading the station with a third-party process monitoring and data analytics system.

The word “upgrade” can raise a red flag with manufacturing managers and engineers. Automotive OEMs and their big suppliers are of course accustomed to integrating a host of equipment and software from third-party suppliers into their lines, but the tone of the conversation may take an abrupt turn if they feel that new technology investments undermine existing ones.

Here are five ways to reframe this discussion.

1) Collapse the Cycle

Making the most of part production data is about collapsing the cycle (and the cost) of having to reengineer a product if a line is suffering from high scrap and rework rates. This fits in with the Industry 4.0 value message from the PLM vendors by allowing the comparison of as-designed details to as-built data.

2) Dig for the Right Data

Everybody has a different handle on how to make the most of the data at hand, whether in SCADA, MES, ERP, PLM, or other systems, to deal with production issues.

Each of these layers has its strength and role to play. Each can raise a red flag when a certain issue arises at a certain point, but none provides the diagnostic means to quickly trace and profile the root cause when that quality issue is cropping up on the line.

Only the part production data, the data generated closest to the source of the quality issue, can. These data provide near real-time insight into what is happening through each cycle of a production process or test.
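To make the idea concrete, here is a minimal sketch of why per-cycle part production data is more diagnostic than a single pass/fail bit. All field names, stations, and limits below are illustrative assumptions, not any vendor's actual schema:

```python
# Hypothetical sketch: a per-cycle record carries the measured scalars
# behind a station's pass/fail decision, so an out-of-limit value can be
# traced to a specific parameter rather than just "fail".
from dataclasses import dataclass

@dataclass
class CycleRecord:
    part_id: str
    station: str
    peak_pressure_kpa: float   # example scalar captured each cycle
    decay_rate_kpa_s: float    # example scalar captured each cycle

# Illustrative acceptance limits for each monitored scalar
LIMITS = {
    "peak_pressure_kpa": (180.0, 220.0),
    "decay_rate_kpa_s": (0.0, 0.5),
}

def diagnose(record: CycleRecord) -> list:
    """Return the names of scalars outside their limits (empty list = pass)."""
    failures = []
    for name, (lo, hi) in LIMITS.items():
        value = getattr(record, name)
        if not lo <= value <= hi:
            failures.append(name)
    return failures

rec = CycleRecord("P-1042", "leak-test-3",
                  peak_pressure_kpa=199.0, decay_rate_kpa_s=0.9)
print(diagnose(rec))  # flags the decay rate, not just a generic "fail"
```

A plain pass/fail scalar would report only that part P-1042 failed; the per-cycle record shows which measured parameter drifted out of limits, which is the starting point for root-cause work.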

3) MES, ERP, and the Rest Are Not at Risk

This emphasis on part production data should not be seen as a threat to existing or legacy enterprise software investments. The data management and analytics systems that enable this level of Industry 4.0 insight are typically additive. They should be considered complementary, not grounds to replace or switch off other platforms that work just fine doing what they were designed to do.

4) Focus on Cost Savings

The net effect should be cost savings. I talk with automotive OEMs that five or eight years ago were all about big and expensive software-based infrastructure projects. Now their focus has become more granular with the need to save money and reduce costs. They are asking what new technologies will allow them, for a relatively modest cost, to get more value out of their existing enterprise platforms.

5) Start Without Large-Scale Deployments Out of the Gate

Sure, modern MES and ERP systems are still on the radar for many manufacturers. These are by their nature big systems with complex implementations. By comparison, deploying the means to make more effective use of part production data is a more modular and modest affair.

A common approach is to tackle a specific bottleneck on the production line, such as a leak test. With leak testing, we typically run into two types of issues: repeatability and reliability problems with the test itself, or high failure rates caused by some untraced issue with an upstream station.

Regardless of which scenario you face, in-process testing and digital process signature analysis can quickly trace root cause and provide the data baseline with which to prevent the issue from recurring. The point is that once you have a successful pilot/proof of concept using your own data, to solve a real problem that was costing the plant money, it's much easier to secure broader buy-in from the team.
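The core idea behind digital process signature analysis can be sketched in a few lines: build an envelope from known-good cycles, then flag where a new cycle's curve leaves that envelope. Real systems are far more sophisticated; the function names, sample data, and tolerance margin here are purely illustrative:

```python
# Hedged sketch: a "signature" is the sampled curve of a process value
# (e.g., pressure) over one cycle. Good cycles define per-sample bounds;
# excursions outside those bounds localize where in the cycle a problem
# occurred, hinting at the process phase (and station) responsible.

def build_envelope(good_cycles, margin=0.1):
    """Per-sample (min, max) bounds from known-good signature curves,
    widened by a tolerance margin (an assumed, illustrative value)."""
    envelope = []
    for samples in zip(*good_cycles):
        lo, hi = min(samples), max(samples)
        envelope.append((lo - margin, hi + margin))
    return envelope

def out_of_envelope(cycle, envelope):
    """Sample indices where the cycle's signature leaves the envelope."""
    return [i for i, (v, (lo, hi)) in enumerate(zip(cycle, envelope))
            if not lo <= v <= hi]

good = [[0.0, 1.0, 2.0, 2.1], [0.1, 1.1, 2.1, 2.2]]
env = build_envelope(good)
suspect = [0.0, 1.0, 3.0, 2.1]   # excursion in the middle of the cycle
print(out_of_envelope(suspect, env))  # -> [2]
```

Because the excursion is localized to a point within the cycle rather than a single end-of-cycle pass/fail bit, engineers can correlate it with what the process was doing at that moment, which is what makes root-cause tracing fast.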



About the Author

Tim Williams | Vice-President of Global Sales

Tim Williams is Vice-President of Global Sales at Cincinnati Test Systems. He has worked extensively with major industrial companies and manufacturers on the challenges of data utilization from the plant floor and deriving real business value from Industry 4.0.