Manufacturers are generating more data, faster, and from more aspects of their operations than ever before. The key to harnessing all that data to produce higher-quality products more quickly and efficiently, and to speed up decision-making, is an intelligent platform paired with good data governance strategies. So agreed Sid Verma, General Manager of the Manufacturing/IIoT Division at Hitachi Vantara, and Mike Lashbrook, Vice President of the Esys Division and Digital Solutions at robotics automation company JR Automation, during an Executive Dialogue at the MLC’s Rethink 2021 Summit this week.
An M4.0-ready intelligent platform should have several key components, they said. It must enable operators to understand the data being generated and present it in a form that enables users to create analytics or decision models. It must also be able to provide intelligent output based on those models.
To make the system work enterprise-wide, it must be able to talk to both the OT and enterprise sides, said Verma. It must also be scalable and fault-tolerant so it can self-correct, and its user interface must be consistent with the way people naturally work and think. It should also be flexible and easy to maintain.
This all sounds good, but how does it work for manufacturers who are still in the process of automating their legacy systems? The increasing need to improve data quality across all the key performance indicators is driving the push toward intelligent platforms, said Lashbrook. “We have an enormous amount of data and intelligence being collected from connected equipment and smart sensors…the system needs to be able to ensure that, if there are small changes in the system, you can adapt on the fly and still get good quality.”
The future, he added, will be a completely integrated digital twin system that can quickly enable the operator to come up with the production model they need to move forward without disrupting the system.
There are challenges to scaling these platforms to where they need to be to achieve a full M4.0 operation, they acknowledged. “It’s a journey for us as a platform company to learn and adjust to be able to provide that value that the OT world needs,” said Verma. IT companies like Google make it look easy because everything on their side is IT-enabled and the protocols are clean. On the OT side, however, it gets more complicated, depending on the age of the assets and the volume of data involved. While internet companies can simply collect all that data and scale it, following their example on the plant floor becomes too costly and difficult.
The main challenge is the explosion of data being generated, which session moderator, MLC Co-Founder, Vice President, and Executive Director David R. Brousell called “the 400-pound gorilla in the room.” According to MLC survey data, manufacturers expect up to a 500% increase in data volumes over the next two years as they become more connected.
“Just collecting data on the OT side does not work for us” in the same way it works for a Google, said Verma. “We have seen horror stories where people spent their entire IT budget just collecting data because they didn’t know where to start.”
“Step one has to be stepping back and working with the operational focus. What are those KPIs whose operational efficiencies you want to improve? Then we have to make sure that we collect data around those, not just collect everything,” Lashbrook said. The approach is to focus on the value you’re hoping to create, then collect data associated with that value and figure out which aspects of the legacy systems need adjusting. “In the future, we can look at creating connected systems from day one. But for now, we have to work through these challenges.”
Verma agreed that companies should work upfront to develop a business priority, and a business use case that has an associated ROI, then bring in the engineering expertise. “If you are looking for predictive maintenance, let’s target the most critical failure that can happen. Then let’s try to collect the data to address just that particular failure mode. That way we limit the cost of the solution and the value goes back to the business.”
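Verma’s “target one failure mode” approach can be illustrated with a minimal sketch. The tag names and the failure-mode map below are illustrative assumptions, not details from the dialogue; the point is that a plant subscribes only to the handful of signals an expert links to the chosen failure, rather than streaming every tag off the line.

```python
# Hypothetical sketch: collect only the data tied to one targeted failure mode,
# instead of collecting everything. All tag names here are invented examples.

FAILURE_MODE_TAGS = {
    "bearing_wear": ["motor1.vibration_mm_s", "motor1.temp_c"],
    "belt_slip": ["conveyor2.speed_rpm", "conveyor2.load_pct"],
}

def select_tags(all_tags, failure_mode):
    """Keep only the plant-floor tags relevant to the targeted failure mode."""
    wanted = set(FAILURE_MODE_TAGS[failure_mode])
    return [t for t in all_tags if t in wanted]

# Everything the connected equipment could report...
plant_tags = [
    "motor1.vibration_mm_s", "motor1.temp_c", "motor1.current_a",
    "conveyor2.speed_rpm", "press3.cycle_count",
]

# ...versus the small subset actually needed for the bearing-wear use case.
print(select_tags(plant_tags, "bearing_wear"))
# ['motor1.vibration_mm_s', 'motor1.temp_c']
```

Limiting collection this way keeps storage and integration costs proportional to the business case, which is exactly the ROI discipline Verma describes.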
Many companies don’t have data scientists on staff to help analyze the data, but that shouldn’t stop them, said Verma and Lashbrook. “The first phase of deployment for Industry 4.0 is to get that expert knowledge from people in quality and maintenance using older systems and automate that information,” said Verma.
For example, if a technician hears a telltale noise before a motor fails, install an acoustic sensor that can detect that noise and trigger an alert automatically. Once the models start showing more accuracy, companies can begin to layer in data science on the areas that are most business-critical. “That has been our recipe for delivering incremental value and Industry 4.0 at a much lower cost profile,” he said.
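The acoustic-sensor example amounts to encoding an expert’s rule of thumb as a simple threshold check, no data science required. The sketch below assumes a sensor that reports sound levels in decibels; the threshold, the persistence window, and the sample values are all illustrative, not figures from the article.

```python
# Hypothetical sketch: automating a technician's "listen for the noise" rule.
# Threshold and window values are invented for illustration.

NOISE_ALERT_DB = 85.0    # level the technician associates with impending failure
SUSTAINED_SAMPLES = 3    # require the level to persist, to ignore one-off spikes

def check_motor_noise(readings_db):
    """Return True if the acoustic level stays at or above the expert threshold
    for SUSTAINED_SAMPLES consecutive readings."""
    run = 0
    for level in readings_db:
        run = run + 1 if level >= NOISE_ALERT_DB else 0
        if run >= SUSTAINED_SAMPLES:
            return True
    return False

normal = [62.1, 63.0, 61.8, 64.2, 62.5]
failing = [63.0, 84.9, 86.2, 87.5, 88.1]

print(check_motor_noise(normal))   # False
print(check_motor_noise(failing))  # True
```

Once a rule like this proves itself on the most business-critical equipment, it becomes the labeled baseline on which more sophisticated models can later be layered.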
Lashbrook added that, if you don’t have the expertise in-house, bring in partners who can fill the gaps. For longer-term solutions, look to hire people with the new skillsets you will need.