
ML Journal

Dialogue: Developing Pella’s Data Driven Culture

Creating a culture where all employees understand the importance of data in driving operational improvements and value is crucial to the future of manufacturing, believes Pella's Head of Data Science and Advanced Analytics, Jacey Heuer.

“What drives me is being part of that blank canvas and having the opportunity to create something with data analytics that is so game changing not only to Pella, but to the whole industry.”

Jacey Heuer, Head of Data Science and Advanced Analytics, Pella Corporation

Pella Corporation was founded in 1925 when lumber business owners Peter Kuyper and his wife Lucille acquired a small manufacturer of roll-up insect screens called the Rolscreen Company and moved the business to Pella, Iowa. As Pella approaches its centenary in 2025, the company has grown into an award-winning $2 billion enterprise with 10,000 employees and 17 manufacturing plants across the U.S., making it one of the largest window and door manufacturers in the country.

In our latest interview with an industry thought-leader, Pella Corporation’s Head of Data Science and Advanced Analytics, Jacey Heuer, talks to the MLC’s Executive Editor Paul Tate about transforming a mature corporation in the digital age, the need to build a solid data foundation to help reimagine traditional processes, harnessing data analytics and AI technologies to create new insights, and creating a culture of ownership where every employee understands the true value of data to improving the future of the organization.

Q: What excites you most about your role at Pella?
A: It’s the nascent nature of data and analytics in the manufacturing industry, especially in advanced data science, machine learning, and AI environments. Historically, there have been data, statistics, and optimization initiatives in many different forms, like Six Sigma, Lean, and traditional business intelligence systems, which have been around for decades. That’s the precursor to what we’re trying to create now with the next generation of data-driven solutions.

My focus is on how we accelerate to that next level of leveraging our data at Pella to drive greater economic value and greater impact across the enterprise – from customer experience and marketing, to improving the P&L and optimizing throughput capacity in manufacturing operations.

“The challenge is how do we tear the traditional processes down to introduce a new data solution that drives improvement? That’s the greatest part in all this. It’s re-imagining the way things are done.”


A lot of this stems from how we collect data. In the past, there’s been almost a taboo around adding individual transactions to data resources. Traditionally we’ve tried to eliminate that transactional data to create more efficiencies and productivity. Yet, all those transactions are data-rich sources of what’s really happening as materials or products go through a facility.

This is one area where my team is now helping to change that perspective because, in a data-driven world, those transactions are valuable to us to help us continue to develop greater optimization, more insights from AI and machine learning, and better outcomes. What drives me is being part of that blank canvas and having the opportunity to create something with data analytics that is so game changing not only to Pella, but to the whole industry. There’s a lot of excitement and opportunity in that.

Q: How are Pella’s data analytics and data governance activities structured within the organization?
A: The corporate Data Analytics and Insights team, which I help to lead, is around 14 people right now and it spans core roles in data science, data analytics, data engineering, data governance, and data quality.

Our vision is a hybrid model where each plant still owns some of their data and analytics, but we at least have a governance or advisory role in how those plants and different facilities are leveraging and using that data. In fact, we prefer to use the term data enablement, as opposed to governance, because it has a more positive connotation.

We’ve also established a corporate Data Council, and a layer of data stewards who are the front-line owners of data analytics in different facilities and are our main conduit to make sure that we’re developing and rolling out solutions in an effective manner.

The Data Council meets on a quarterly basis and includes directors and senior level leaders who provide executive sponsorship and buy-in, oversight, guidance around prioritization and what’s most important, and help steer how we drive forward. These leaders act as a megaphone for us and go down into their functions to help drive the change in mindset around data generation and use.

Further down, we also have the layer of data stewards. These are people on the frontline who may have developed a legacy system, or are in our continuous improvement environment, or in our engineering space, and are the true data owners who interact with data day-to-day. We have a roster of these data stewards, and we meet with them on a monthly basis to continue to develop a culture of data ownership and an understanding of the importance that data needs to play as we move to the next generation of Pella as a data-driven organization.

Q: What are the main data analytics initiatives now underway at Pella?
A: There are two key initiatives, both ultimately targeted at improving the customer experience. The first is a smart factory initiative. We have two plants that are developing different elements of a smart factory. One plant is focused more on automation, so autonomous vehicles, robotics, and those kinds of technologies. The other plant is focused more on process optimization. There’s a lot of cross-over between those, but that’s their starting focus.

The other key initiative is reliable delivery. We have a goal of “industry best” reliable delivery at Pella. So, we are working on how we ensure on-time delivery by fully leveraging the data and applying predictive modeling using AI and machine learning technologies.

“If you don’t have a focused diligent investment into the type and frequency of the data you’re collecting, your AI and machine learning outputs are likely to struggle and perhaps fail.”


Within both of those initiatives is a focus on people: understanding individual capabilities and skills so we can optimize the roles they are working on in the factory and make sure they are placed in the best positions to help drive optimization.

Q: How are you integrating the data generated by legacy systems into the new analytics framework?
A: Legacy systems historically tend to be production databases where the data only exists for a short length of time before it’s purged. We’re now applying more event-driven monitoring technologies where we can monitor each system in more detail, and anytime a new event occurs, we take that data and move it into our data lake environments to create a history of all the data. We’ve introduced technologies using application programming interface (API) layers that essentially provide one entry point for all those legacy systems. This helps us gain greater access to the legacy systems and move the data more easily to locations where data scientists can work with it, or machine learning can happen, which is mostly in our cloud environment.
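
To make the pattern concrete, here is a minimal sketch of event-driven capture from legacy systems into a data lake through a single API entry point. It is illustrative only: the endpoint, the event fields (occurred_at, event_id), and the storage layout are assumptions for the example, not Pella’s actual implementation.

```python
import json
import os
import urllib.request

# Hypothetical single API entry point fronting the legacy systems.
API_BASE = "https://api.example.internal/legacy-events"

def fetch_new_events(since: str) -> list:
    """Poll the API layer for events recorded after `since` (an ISO-8601 timestamp)."""
    with urllib.request.urlopen(f"{API_BASE}?since={since}") as resp:
        return json.loads(resp.read())

def land_in_data_lake(event: dict) -> None:
    """Write the event to a date-partitioned raw zone so history is preserved
    rather than purged along with the production database."""
    day = event["occurred_at"][:10]                    # e.g. "2024-05-01"
    folder = f"datalake/raw/legacy_events/dt={day}"
    os.makedirs(folder, exist_ok=True)                 # local stand-in for cloud object storage
    with open(f"{folder}/{event['event_id']}.json", "w") as f:
        json.dump(event, f)

if __name__ == "__main__":
    for evt in fetch_new_events("2024-05-01T00:00:00"):
        land_in_data_lake(evt)
```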

On top of that, from a front-end consumption standpoint, we’re developing our own internal applications so we can take the output of our AI data science models and feed it into those applications where we have more control over the user interface and experience for the people who need the information. That helps bridge that gap from the legacy to the modern, and helps develop the people, the resources, and the skills we need to make the transition occur.

Q: Once you’ve gathered all this data and analyzed it to give you new insights, how do you then operationalize those insights to drive better decisions?
A: That’s a great question. This is where the network of data stewards and the Data Council becomes crucial. Ultimately, they’re the people who are helping us connect with the users on the shop floor, the line managers, and the production managers, who are going to be consuming these outputs.

And it’s also about how we deliver that information to the users. Think about your phone, or Facebook, or Google. There’s a whole team of individuals focused on your user experience. Historically, when you are getting a model or AI solution into production, a lot of its downfall can be because of the user experience. It can make a cool prediction, but as a user, it’s not always intuitive to use. Or there are other reasons that might limit a person’s willingness to adopt it. We’re now leaning into that aspect of how we actively engage with our users to make sure we’re creating the right user experience so that our outputs are meeting that need. That’s helping to drive a lot of consumption and adoption.

Q: How are you harnessing AI technologies to help drive this data-driven transformation?
A: AI and machine learning are fueled by data. Like a car, or a tractor, or any big piece of machinery, it’s the fuel that makes it productive. It’s the same with AI. That data foundation is where a lot of our focus has been initially, and on how we continue to influence the type of data we’re creating. I mentioned event-driven technology solutions, for example. That stream of event data coming in means that we’ve now got much more clarity from when an order enters our system, to every event along its journey, to when it exits. We can see the timing and the context around that journey. We’re now able to build predictive models that are much more accurate on when an order reaches different milestones, to ensure it can be delivered on time and in full to the customer. That’s an area where we’ve had a big initial impact, building predictive models that drive scheduling and the decision points along the way.

On the smart factory side, for example, we’ve got a big focus on process optimization. There can be somewhere between 1,000 and 5,000 carts moving around our door facility at any time, which is where our automation focus on smart factory is happening. We’ve now applied RFID tags to the carts so we can collect passive signals as they move through the facility. We’re then leveraging that data to improve route optimization within our plants because we are able to establish a clear linkage between the impact of delays at certain points in that process and lost units created that day. There’s clear value in that use case. The next stage is to understand how we can best scale that across our 17 plants.
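
As a rough illustration of the cart-tracking analysis described above, the sketch below derives dwell times between two checkpoints from passive RFID reads and correlates daily delays with lost units. All names and numbers are invented; a real pipeline would run over the plant’s event stream. It assumes Python 3.10+ for statistics.correlation.

```python
from statistics import correlation  # Python 3.10+

# Invented RFID reads: (cart_id, checkpoint, seconds since shift start).
reads = [
    ("cart1", "saw", 100), ("cart1", "paint", 700),
    ("cart2", "saw", 120), ("cart2", "paint", 1500),
]

def dwell_times(reads, start, end):
    """Seconds each cart spends between two checkpoints."""
    times = {}
    for cart, checkpoint, ts in reads:
        times.setdefault(cart, {})[checkpoint] = ts
    return {cart: cps[end] - cps[start]
            for cart, cps in times.items() if start in cps and end in cps}

print(dwell_times(reads, "saw", "paint"))   # {'cart1': 600, 'cart2': 1380}

# Daily aggregates over five shifts (invented): mean saw->paint dwell vs. lost units.
avg_delay = [620, 840, 560, 990, 700]
lost_units = [3, 7, 2, 9, 4]
print(correlation(avg_delay, lost_units))   # strength of the delay-to-lost-units linkage
```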

“One of the key leadership skills for the future is greater trust in the individuality of all the team members, all the workforce and allowing them the freedom to create, to think, and to make decisions.”


It’s that kind of data foundation that’s going to fuel many more AI use cases for us as we go forward. Of course, understanding the intrinsic value of having a strong data foundation is something that’s not always intuitive for everyone to grasp. That’s an area where our storytelling has continued to evolve for our leadership and our plants about why investment in those technologies is going to serve us in the long-term. If you don’t have a focused diligent investment into the type and frequency of the data you’re collecting, your AI and machine learning outputs are likely to struggle and perhaps fail.

Q: What key challenges still keep you awake at night?
A: One of the core challenges is cultural. We’re trying to change the mentality so that everyone in the company has a greater understanding that every point of data, every action that someone on the shop floor or in manufacturing takes, either does, or will eventually, generate some data point that’s going to be crucial input to helping improve their lives, improve the P&L, and improve our impact as an organization as we go forward.

Another challenge, but a good challenge, is how we solve some of the technical aspects of creating a data-driven organization and re-imagine the processes that currently exist. We can develop models, develop solutions, and push them to the five-yard line, ready to hit the end zone, but then we come up against a wall of the way things have always been done. People say, “Well, this is the process”. The challenge is how do we tear that traditional process down to introduce a new data solution that drives improvement? That’s the greatest part in all this. It’s re-imagining the way things are done. In a hundred-year-old organization with a hundred years of legacy and processes in place, that’s what keeps me up at night in many ways.

“Eventually, anybody who interacts in the organization and takes an action will be able to understand how it impacts the bigger network of what’s happening.”


Q: Looking ahead, what would you highlight as the greatest challenges and opportunities for the manufacturing industry as a whole for the rest of the decade?
A: I think there’s a great opportunity to create a truly connected network of data within a manufacturing organization, so that if a decision is made on one side of the business, people can see the impact that decision has way over on the other side. Eventually, anybody who interacts in the organization and takes an action will be able to understand how it impacts the bigger network of what’s happening. For example, if I make a decision to reorganize some physical part of a plant, I can see the expected impact on the other plants that are outside my immediate location. It’s a connected network of understanding, supported by a focus on the data foundation and driven by investments and front-line engagement in creating that data foundation.

Q: What key leadership skills, attributes, and roles do you feel that senior industry executives now need to lead successfully in an increasingly data-driven world?
A: One of the key leadership skills for the future is greater trust in the individuality of all the team members, all the workforce and allowing them the freedom to create, to think, and to make decisions. As people get more transparency and more understanding of the connectedness of every decision they are making, their knowledge of the true scale of impact is going to increase, so they should be able to make better informed decisions and take better informed actions. In an established legacy organization, it’s a different mindset to allow individuals that freedom and capability. From a leadership standpoint, it’s about being okay with that flexibility and that freedom.

Q: Finally, if you had to focus on one thing as a watchword or catchphrase for the future of manufacturing, what would that be?
A: It’s intelligence augmentation. I like that term because I think it captures, in those two words, the role that artificial intelligence and machine learning and data should play, and will play, in organizations in the future. It also highlights that humanity, people, are not going to go away, and that we need to ensure there’s teamwork between those people and the new technologies that will come our way. Intelligence augmentation captures that well.  M

FACT FILE: Pella Corporation
HQ: Pella, Iowa
Industry Sector: Windows and Doors
Revenues: $2.17 billion
Net Income: (Privately held)
Employees: 10,000+
Presence: USA
Production Sites: 17+
Website: www.pella.com

EXECUTIVE PROFILE: Jacey Heuer
Title: Head of Data Science and Advanced Analytics, Pella Corporation
Nationality: American
Education: BA degree, Business Administration – Finance & Economics, Wartburg College; Master’s degree, Business Analytics and Data Analytics, Iowa State University; MBA, Iowa State University
Languages: English
Previous Roles Include:
– Data Scientist, Pella Corporation
– Data Science Author, Pluralsight
– Data Scientist, Brownells, Inc.
– Senior Pricing Strategist, MidAmerican Energy
– Assistant Vice President, Situs RERC
– Financial Analyst, The Members Group
Other Industry Roles/Awards/Board Memberships:
– Golden Key International Honour Society, Iowa State University
– Iowa Technology Leadership Institute member

About the authors:

Paul Tate is Co-founding Executive Editor and Senior Content Director of the NAM’s Manufacturing Leadership Council.


ML Journal

Case Study: The Future Starts with Citizen Data Scientists

How IBM is developing the next generation of transformation-drivers by turning supply chain SMEs into Citizen Data Scientists 

Company Fact File

Name: IBM Corporation
Sector: Information Technology and Services
HQ location: Armonk, NY
Revenues: $10 billion plus
Employees: 5,000+
Web url: www.ibm.com

When COVID-19 hit, IBM Supply Chain found that its digital transformation journey had enabled the resiliency needed to tackle major disruptions. That awareness led to the decision to further accelerate their digital transformation projects. To keep up with the demands of exploiting emerging technology while preparing for the future, IBM launched a first-of-a-kind transformational upskilling initiative designed to democratize data while empowering individual business technologists (SMEs). This transformative initiative is a prime example of the mantra with which Ron Castro, IBM Vice President and Chief Supply Chain Officer, constantly challenges his organization: “innovate anywhere, use everywhere.”

The initiative, dubbed the Citizen Data Scientist (CDS) Certification program, was designed to help supply chain SMEs develop the data science skills they need to make data-driven decisions in a fluid business environment — and then roll it out to the rest of the company to engage employees in successful process improvement throughout every part of the organization.

In just the first year of the CDS program, developed by IBM Supply Chain in partnership with the IBM Data Scientist Profession Team, the company was already seeing results. The first supply chain SME cohorts to go through the program’s structured upskilling and reskilling curriculum in data science, data wrangling, and data visualization delivered more than $1.6M in efficiency and productivity improvements and inventory savings from improved decision-making.

All this in just six months, and with zero financial investment. And the program has only continued to grow ever since in both the breadth of its content and its reach across the company.

What Is a Citizen Data Scientist?

As manufacturing moves into M4.0, the role of big data is only getting bigger — and more ubiquitous throughout operations. Data scientists, who usually have a master’s or PhD, are essential. However, there may not be enough of them to go around, and even if there are, their level of business expertise may not be sufficient in a specific domain.

Hence the rise of the Citizen Data Scientist. According to Gartner, a citizen data scientist is “a person who creates or generates models that leverage predictive or prescriptive analytics, but whose primary job function is outside of the field of statistics and analytics. They bridge the gap between those doing self-service analytics as business users and those doing advanced analytics as data scientists.”

IBM’s Citizen Data Scientist digital credential is designed to create more of these data-savvy employees so they can bring the power of data analytics to aspects of their daily jobs without having to rely on those who call data science their profession. At IBM, the CDS certification also is integrated into the company’s career path, so it serves as a starting point for those who might be interested in becoming a full-fledged data scientist.

Starting with an Innovation Foundation

The CDS certification program was established on a foundation of innovation that would encourage continuous transformation. One bedrock of this foundation was to reimagine and transform people’s analytical skills using IBM and open-source technology to retain and create a high-performing, engaged culture. Another was to leverage the reskilled SMEs to reimagine processes, then apply AI and Cloud technologies to implement new processes to accelerate transformation.

“The CDS certification program was established on a foundation of innovation that would encourage continuous transformation.”


“We wanted to certify people who could really work on data to bring our broader population into IBM’s digital transformation,” says Matthias Graefe, IBM’s Director of Supply Chain Transformation. But it isn’t enough to just offer one-off classes in, for example, Python programming, which would likely result in the creation of isolated applications and solutions. The company also wanted to ensure that its latest crop of Citizen Data Scientists has an understanding of cybersecurity, because a talent for programming doesn’t automatically mean the person understands how to ensure the programming they’re creating is also compliant, Graefe says. “We knew we wanted to achieve both sides of it — enabling people in a supportive way, while also creating a community and governance around what they are doing.”

Taking the CDS Challenge

Brenda Berg, CDS Program Manager, advises that the CDS program requires more than just watching some videos and taking a test to earn the digital credentials. It may not be as difficult as becoming a full-fledged data scientist, but the Citizen Data Scientist certification still is no walk in the park. In fact, only 70% of the first year’s CDS trainees made it through the program. The nature of the learning, the uninterrupted time commitment, and demanding day jobs are what make the program challenging.

To help keep candidates on the learning path, the program is highly structured and includes having a technical leader, a program manager, and a project manager agree to facilitate the process. Each candidate also is assigned a mentor to help them develop the data scientist mindset and build the new ecosystem together with professional data scientists. Employees who chose to participate could leverage their allocated IBM global training program learning hours to gain data science knowledge and skills.

CDS candidates initially were chosen by their managers because it takes a true commitment from both the participant and their manager, says Giovanna Benetti, Digital Supply Chain Transformation Leader. The self-paced training is done in parallel with participants’ day-to-day jobs, so both the managers and the participants need to ensure that they can complete the program while still getting their daily work done.

The data science mentor role also is a key one, she adds. Mentors meet regularly with CDS participants to assist with coursework understanding and application, and to provide guidance on the required Citizen Data Scientist project plan to complete the certification. Now that the program has an established, and growing, bank of certified Citizen Data Scientists, the idea is that some of those who have been through the program can now turn around and assist new trainees within the CDS community. Or, as Anita Toth, Global Supply Chain Manufacturing Operations and Integration Manager, says, “We now have started to eat our own cooking. The people who have graduated from the CDS program are now applying their skills back to the business, either vertically in their own domain or cross-organizationally. We have some Citizen Data Scientists who just graduated one year ago who now are guiding people who are currently in the program.”

“A key component to the program is that participants also must apply what they are learning to real-world projects.”


Another key component to the program is that participants also must apply what they are learning to real-world projects. These projects generally fall into three buckets. One is what IBM calls “big rocks,” which are multidomain, large-scale projects that are important to the organization overall. “It’s a benefit for employees to work on these big rocks projects, because they can both experience different functional areas, and are able to contribute on projects that have a big impact,” says Graefe. Then there are smaller projects, or “pebbles,” which generally fall within a specific business or functional area, and even smaller, more specific “sand” projects. Of course, the big rocks projects result in the most material savings — and the highest visibility. As an example, a 2020 CDS participant’s pebble project resulted in a $50K inventory savings for his business unit. After CDS graduation, he partnered with CDS multidomain colleagues to drive a services and manufacturing synergy project. Over the past two years, this advanced deep learning and machine learning model has contributed over $10 million in inventory savings and cost reductions.

The key to successfully completing the program is for the candidate to have the support of their manager or supervisor, and their entire team. Erin Thalacker, CDS Project Manager, also advises that it’s vital for candidates to connect early and often with their data science mentors so everyone understands what business problem they’re trying to solve through data science, to make sure they have access to the data and understand it, and to get help navigating any roadblocks they may hit along the way.

The program still is tough, but those who make it through feel more prepared to be the technologists IBM will need for a future in which automation and data analysis will continue to be increasingly key for most job roles. Having a mentor also is a big boost for many who go through the program, as is having a community of fellow Citizen Data Scientists who not only understand what they are doing, but also can share solutions to common challenges across functionalities. Another perk? Added visibility. “We have a strong sense of community across Supply Chain, and to be able to make an impact in that community is a strong driver within the program,” says Graefe.

Scaling Up

As important as it was to quickly reinvent the supply chain, the program also was designed to be scalable throughout the company by basing the CDS on data science methodology, tools, and data trainings that were domain-agnostic. This means that the program can be replicated for any other part of the organization that wants to jump-start its transformation journey with reskilled, resilient Citizen Data Scientist SMEs.

This is already starting to happen. The original Citizen Data Scientist certification program has been leveraged to pilot additional Citizen Process Designer, Citizen Automator, Citizen Front End and Backend Developer programs.

There was a strong use case already built-in for expanding the program, says Graefe. For example, as IBM continues its drive toward automation, there is a need to create more bots. But those bots need to be programmed so that they add to the greater good — otherwise, they could just create chaos. Also, he says, there was a need to use a common platform and tool set so people can share successful components as they scale up the automation. “If we want to eliminate, simplify and automate, and to drive automation everywhere into our workflows, we need to expand the Citizens program,” he adds.

“The program was designed to be scalable throughout the company by basing the CDS on data science methodology, tools and data trainings that were domain-agnostic.”


“If you need to develop an application, you need people who understand the business requirement, can write the code, and you also need people who can design the UI for the application on the front end. Then you have the automators on the back end. Also, now it’s not just about developing code; it’s about working on platforms, often with low-code applications — you often don’t need deep coding skills now. We wanted to bring all of them under one Citizens umbrella.”

Next up is a pilot program with another business unit that works closely with Supply Chain to see if they can “lift and shift” to make the CDS program work for their specific needs. The team also is close to implementing an overall system to manage the projects the prospective Citizen Data Scientists will be working on so innovations can be shared on platforms with communities who could link those innovations to other areas.

“Our next step is to scale up the CDS to Innovation Technology and get through the pilot, then open it up to the broader IBM community,” says Graefe. “We are creating a community of those who finish their projects and graduate, who then are offered the opportunity to become Citizen Technologists and work across transformation projects to co-create across the supply chain and, hopefully, across IBM in the future.”  M

About the author:
Sue Pelletier, a contributing editor with the Manufacturing Leadership Journal, is a seasoned writer/editor with experience in online, social media, e-newsletter, tablet app, book and e-book, and print publications.


ML Journal

The Journey to Analytics Maturity

How can manufacturing organizations progress from descriptive to prescriptive analytics on their journey to data analytics maturity?

TAKEAWAYS:
How the benefits from data analytics increase exponentially as organizations progress along the Analytics Maturity Curve.
The importance of data governance and quality at every stage.
Driving data analytics by combining IIoT with data from IT and OT systems.

In today’s business world, data is the currency that allows stakeholders to successfully exchange information. Operating without it is like flying blind. Understanding how to use the data that’s available is essential.

Data analysis is also a critical aspect in the creation of measurement metrics and performance indicators that align with business objectives. This involves a series of processes, including data collection, cleansing, definition, and processing. Nevertheless, the reliability of the insights generated ultimately depends on the quality of the underlying data. Data analysis is only as valuable as the data it’s based on, and despite advancements in technology, ensuring data quality remains a persistent challenge.

When organizations emphasize the value of data over gut instincts, they can optimize their understanding of business operations by leveraging data analytics to automate processes, reduce defects, and increase velocity. When doing so, they may move from describing the status of operations to predicting outcomes. As the process matures, they can even prescribe conditions to improve operations. Organizations can track their journey to improving analytics using an Analytics Maturity Curve.

This article explores how businesses can leverage data analytics to gain numerous benefits, with the advantages increasing exponentially as they advance along the Analytics Maturity Curve. It also underscores the importance of data governance and management at every stage of the analytics journey. Various use cases are also included to highlight how data analytics has proven invaluable in real-world scenarios, demonstrating the tangible benefits of a mature analytics program.

Definition and Operationalization

The Analytics Maturity Curve is a model that outlines the different stages of growth and development an organization undergoes as it improves its use of data and analytics to drive better business outcomes (Fig 1). While progressing along the curve doesn’t imply that one type of analytics is superior to another, it does indicate that businesses must focus on developing a solid foundation in descriptive and predictive analytics before moving on to prescriptive analytics.

Another essential aspect of operationalizing data analytics is recognizing that true data mastery lies at the point of data collection. Organizations should aim to make data analytics easily accessible from the top floor to the shop floor, ensuring that relevant insights are available to the appropriate personnel throughout the organization, from plant managers to CEOs.

“Data analysis is only as valuable as the data it’s based on, and despite advancements in technology, ensuring data quality remains a persistent challenge.”


The Analytics Maturity Curve involves several steps, starting with data collection from various sources like ERP and business systems and the Internet of Things (IoT). The collected data is then cleansed by removing errors, duplicates, and irrelevant entries. This clean data can be used for descriptive analytics, using visual reports with drill-down capabilities to understand past or present events, patterns, and trends.
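
For instance, a minimal cleansing pass might look like the sketch below; the rows and rules are invented to show the shape of the step, not a production pipeline.

```python
# Minimal cleansing pass over invented ERP rows: drop duplicates and bad entries.
rows = [
    {"order_id": "A1", "qty": 10},
    {"order_id": "A1", "qty": 10},     # duplicate
    {"order_id": "A2", "qty": -5},     # error: negative quantity
    {"order_id": "A3", "qty": 7},
]

seen = set()
clean = []
for row in rows:
    key = (row["order_id"], row["qty"])
    if key in seen or row["qty"] <= 0:  # skip duplicates and erroneous entries
        continue
    seen.add(key)
    clean.append(row)

print(clean)   # [{'order_id': 'A1', 'qty': 10}, {'order_id': 'A3', 'qty': 7}]
```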

The next step involves using the data to create predictive algorithms and applying statistical methods to determine the significance of relationships between parameters. AI, ML, and statistical programs can help in this process, and the accuracy of the predictions is tested by measuring the variance between forecast and actual values.
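
As a simple, hedged illustration of that variance check, the snippet below computes the mean absolute percentage error (MAPE) between forecast and actual values; the weekly figures are invented.

```python
def mape(actual, forecast):
    """Mean absolute percentage error between actual and forecast values."""
    return 100 * sum(abs(a - f) / a for a, f in zip(actual, forecast)) / len(actual)

actual   = [120, 135, 150, 160]   # e.g. units produced per week
forecast = [115, 140, 145, 170]
print(f"MAPE: {mape(actual, forecast):.1f}%")   # ~4.4%: forecasts track actuals closely
```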

The prescriptive stage of the Analytics Maturity Curve is the most important and the most difficult. In this step, organizations not only predict what can happen, but also prescribe optimal actions to achieve desired outcomes. By predicting end states and defining recipes based on these outcomes, organizations can automate processes and achieve their goals. However, the accuracy of prescriptive analytics must be validated using simulation optimization, which requires data models, operations research, and AI to refine the process.
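
The toy sketch below illustrates the prescriptive step under stated assumptions: it searches candidate machine settings against a simulated throughput model and prescribes the best one. The simulator here is a made-up stand-in; in practice it would be a validated model fitted from plant data.

```python
def simulated_throughput(speed, temperature):
    """Made-up simulator: good units per hour as a function of settings.
    A real model would be fitted and validated against plant data."""
    defect_rate = 0.02 + ((speed - 55) ** 2) / 10_000 + abs(temperature - 180) / 2_000
    return speed * (1 - min(defect_rate, 1.0))

# Grid of candidate settings to evaluate.
candidates = [(s, t) for s in range(40, 71, 5) for t in range(160, 201, 10)]
best = max(candidates, key=lambda c: simulated_throughput(*c))
print("Prescribed (speed, temperature):", best)
```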

Fig 1. The Analytics Maturity Curve

Source: Softtek

Advanced Analytics in Industrial Organizations

Advanced analytics has become increasingly important in industrial organizations, with technology advancements driven by the collection and analysis of data from the industrial internet of things (IIoT). This technology has a range of applications, from monitoring assets and server uptime to improving product quality through the monitoring of measured variables. By leveraging IIoT platforms, manufacturers can collect time-series data and use rich drag-and-drop dashboards, along with cognitive automation and robotic process automation, to enable advanced analytics.

One example of advanced analytics is the development of AI modules in cognitive automation that combine hyper-automation with real-time data ingestion, analysis, and conversion into structured data. This approach reduces noise, makes correlations, finds root causes, detects anomalies, and provides intelligent analysis and automated actions.

Sophisticated algorithms can detect seasonality and identify anomalies, triggering a self-healing process to address issues as soon as they are detected.
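
One minimal way such detection can work is seasonal differencing plus a z-score threshold, sketched below; the series, period, and threshold are illustrative assumptions, not a production algorithm.

```python
from statistics import mean, stdev

def seasonal_anomalies(series, period=7, z_threshold=3.0):
    """Flag points that deviate sharply from the same phase of the prior cycle.
    (Differencing also echoes each anomaly one period later.)"""
    diffs = [series[i] - series[i - period] for i in range(period, len(series))]
    mu, sigma = mean(diffs), stdev(diffs)
    return [i + period for i, d in enumerate(diffs)
            if sigma > 0 and abs(d - mu) / sigma > z_threshold]

# Weekly-seasonal sensor readings with one injected spike (invented data).
readings = [10, 12, 11, 13, 12, 30, 14] * 4
readings[19] = 55   # the anomaly
for idx in seasonal_anomalies(readings):
    print(f"anomaly near t={idx}: trigger the self-healing runbook")
```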

“The effective management and governance of data is crucial for organizations to ensure data quality, compliance, and security along their journey to analytics maturity.”


Advanced analytics also benefits from a human-in-the-loop process to enable feedback through reinforcement learning. This approach reduces false positives and ensures that hyperparameter configurations are tuned to respond to changes and learn new patterns.

Data scientists and analysts can also create self-service solutions to monitor assets, automate processes, and contextualize their predictive analytics. These solutions can have various use cases, leveraging predictive analytics to identify potential outcomes and prescriptive analytics to determine the right course of action based on those outcomes. In this way, industrial organizations can keep operations running 24/7 while predicting downtimes or failures to improve business outcomes.

Data Management and Governance

The effective management and governance of data is crucial for organizations to ensure data quality, compliance, and security along their journey to analytics maturity. This is achieved by developing a robust governance framework that guides and directs the objectives of data analysis. To achieve this, organizations must first establish key metrics that evaluate the characteristics of the data, for example the following (a brief sketch of computing several of these follows the list):

Data Quality

●  Percentage of data test coverage
●  Number of data incidents
●  Percentage of data models with checks

Productivity and Engagement

●  Number of data test failures and resolutions
●  Number of users of data dashboards

Uptime

●  Percentage of data models updated
●  Data model run time
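
A minimal sketch of computing a few of these metrics from a model inventory appears below; the record fields are invented for illustration.

```python
# Invented inventory of data models and their test metadata.
models = [
    {"name": "orders",    "has_checks": True,  "test_failures": 1, "updated": True},
    {"name": "inventory", "has_checks": False, "test_failures": 0, "updated": True},
    {"name": "shipments", "has_checks": True,  "test_failures": 2, "updated": False},
]

total = len(models)
pct_with_checks = 100 * sum(m["has_checks"] for m in models) / total
pct_updated = 100 * sum(m["updated"] for m in models) / total
failures = sum(m["test_failures"] for m in models)

print(f"Models with checks: {pct_with_checks:.0f}%")   # 67%
print(f"Models updated:     {pct_updated:.0f}%")       # 67%
print(f"Data test failures: {failures}")               # 3
```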

To design an effective governance framework for data, businesses can refer to guidelines from organizations such as the Data Management Association (DAMA). DAMA provides guidance on several areas of data management, including data quality and security, storage and warehousing, data modeling and architecture, and verification of data integration and interoperability.

However, an effective framework for data management isn’t complete until organizations consider the infrastructure and cybersecurity requirements for data management. Finally, as organizations progress in their data analytics journey, ways of working will change, and change management cannot be ignored.

The chart below (Fig 2) illustrates a foundational governance framework for data management, from data source identification to data ingestion, warehousing technologies, and the use of analyzed data. This framework serves as a guide for organizations to incorporate governance at every stage of data management.

Fig 2. The DAMA Framework

Source: Data Management Association

Use Cases in Manufacturing

Product Life Cycle Management: Data analytics can be used to optimize the product mix of manufacturers by leveraging historical organizational data and market data. Optimization algorithms can predict market needs and identify new product offerings that meet those needs. Delisting underperforming products, strategic listing of underrepresented SKUs, simplifying the supply chain, and leveraging volumes in procurement can all lead to significant improvements in financial performance.

Inventory Optimization and Demand Management: Data models can predict inventory levels required to meet the desired product mix. The algorithms can be fine-tuned to optimize inventory by monitoring inventory levels and production, reducing changes in purchase order quantities, and providing a run-rate for production managers to consider for steady state operations. Demand can also be forecasted and optimized to improve cost and revenue influencers.
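
As one simplified, concrete instance of such a model, the sketch below computes a reorder point with safety stock from demand history; the 95% service-level factor and all figures are illustrative assumptions.

```python
from math import sqrt
from statistics import mean, stdev

daily_demand = [42, 38, 51, 47, 40, 45, 44, 39, 48, 43]   # invented demand history
lead_time_days = 5
z = 1.65                                                   # ~95% service level

mu, sigma = mean(daily_demand), stdev(daily_demand)
safety_stock = z * sigma * sqrt(lead_time_days)
reorder_point = mu * lead_time_days + safety_stock

print(f"Safety stock:  {safety_stock:.0f} units")
print(f"Reorder point: {reorder_point:.0f} units")
```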

Client Sentiment and Resource Management: Advanced analytics tools such as natural language processing and sentiment analysis can be used to monitor social media messages, chatbot conversations, smart device interactions, and survey responses to understand client buying practices. These insights help manufacturers better understand and service customer requirements, as well as improve human relations and work conditions in an industrial environment.
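
A toy sketch of the sentiment piece, assuming a simple keyword approach; production systems would use trained NLP models, and the word lists and messages here are invented.

```python
# Tiny rule-based sentiment scorer; real systems would use trained NLP models.
POSITIVE = {"great", "fast", "love", "reliable", "easy"}
NEGATIVE = {"broken", "late", "slow", "hate", "leaky"}

def sentiment(message: str) -> int:
    """Positive score > 0, negative < 0, neutral 0."""
    words = message.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

messages = [
    "Love the new windows, installation was fast",     # invented survey snippets
    "Delivery was late and one frame arrived broken",
]
for msg in messages:
    print(sentiment(msg), msg)
```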

Fig 3. Data Analytics Use Cases in Manufacturing

Source: Softtek

Next Level Benefits in Manufacturing

The manufacturing industry relies on metrics like cycle time, asset uptime, and work queues to plan workloads and machine operations. However, data analytics can take these efforts to the next level, helping reduce defects, improve quality, increase efficiency, and save time and costs. Some of the benefits include:

  • Enhanced product quality and efficiency: Real-time data from multiple sources across the supply chain and factory, combined with machine learning and visualization tools, can deliver insights to optimize performance and production based on market demand.
  • Cost reduction: Data analytics can help manufacturers reduce the number of tests required to enhance product quality, resulting in cost savings. By designing targeted inspection plans based on the quality of lots shipped to customers, manufacturing costs can be decreased while improving quality.
  • Improved supply chain management: As supply chains become more complex and generate a larger volume of data, effective data management frameworks like those defined by DAMA can help organizations manage and govern data effectively.
  • Improved demand projection: Big data analytics can help determine and project the demand for a particular product, allowing companies to manufacture products based on forecasts provided by analytics tools and eliminate or reduce production downtime and losses.
  • Machine-level traceability and compliance measurement: Manufacturing analytics software can improve asset management, increase asset lifetime, improve asset availability, and prevent unplanned downtimes. Data analytics can also improve assembly line efficiency through, for example, a pinpoint defect scan to eliminate defects and increase quality and productivity.
  • Customer satisfaction: By tracking products after sale, manufacturers can understand customer responses and reduce condition-based monitoring. Big data analytics can help avoid recall issues, ultimately leading to increased customer satisfaction and product and brand reputation.

Measuring Progress

To maximize the benefits of data analytics, organizations must carefully design their data management and governance framework and align functional metrics and KPIs with their objectives. After all, “What doesn’t get measured, doesn’t improve.” Measuring with intent is essential. Ultimately, data mastery and analytics require a clear understanding of business objectives and how to leverage data insights to achieve them.

About the authors:


Krishnan Venkat is Director of Supply Chain Consulting at Softtek.


Gonzalo Martín Vargas is Advanced Analytics Global Offer Manager at Softtek.


ML Journal

M2030 Perspective: More Robots, Better Jobs. But How?

Advanced robotics could create better manufacturing jobs, increase productivity, and help solve workforce shortages by 2030. But there’s no guarantee. 

TAKEAWAYS:
How can U.S. factories increase robot adoption from 10% to 50% by 2030?
Automation will inevitably change workforce roles and create different challenges.
Three key guidelines for positive-sum automation.  

At a family-owned factory in northeast Ohio, business is good. The floor is bustling. Rows of lathes and mills are spitting out a variety of precision aerospace components. Their specialty, like many medium-sized manufacturers in the region, is to produce high-mix, low-volume machined parts. But there’s a problem.

The factory could be growing much faster. It has the demand to add more machines. It could even add another shift. But it can’t find the people. In surveys and interviews with manufacturers, this has become a common refrain: we’re ready to grow, but we can’t recruit and retain talent fast enough.

The workforce challenge started long before COVID. Between 2010 and 2019, job openings in manufacturing more than tripled. And projections suggest the tight labor market for production work is here to stay. Given the aging manufacturing workforce, the Manufacturing Institute and Deloitte anticipate that the manufacturing labor shortage will only worsen.

Looking ahead to 2030, with policymakers and multinationals eager to invest in the future of American manufacturing, how can manufacturers of all sizes overcome the workforce bottleneck and grow?

Is Automation the Answer?

Some factories are betting on automation as the answer. The idea is simple: if you can’t find people to do the job, train machines to take over routine tasks. The vision might be that by 2030, manufacturers can substantially increase their output without growing their workforce – all by adding new technologies.

In our research with MIT’s Work of the Future Initiative studying how dozens of factories deploy new technologies and adjust their workforce, we’ve seen that this vision just doesn’t compute. To achieve what we call “positive-sum automation” – technology adoption that improves productivity as well as flexibility – firms need to ask more of their workers. Adopting new technologies doesn’t solve a manufacturer’s workforce challenges. It just changes them.

Imagine a factory can’t find a machinist, so they buy a robot. In principle, the robot could load and unload parts from a machine, freeing a machinist up to operate more machines at once. The robot certainly promises productivity gains, but they aren’t guaranteed. In the short term, the robot will require more labor and more skills to be deployed on the factory floor.

“Adoption of automation technologies has been comparatively low at U.S. factories. Approximately 10% of U.S. plants have any robots at all – and only half of those report using robots at any scale.”


Before the robot can operate efficiently, the production team will need to re-engineer the process of tending the machine and program the robot to perform the task as efficiently as, or more efficiently than, an operator. It’s a process of trial and error that could take months of tweaking, pulling the firm’s leaders away from their day jobs. Even if the factory calls on external help from an integrator to program the robot and fine-tune the process, they will need to build up the internal skills to fix the robot when it goes down – or reprogram it if the process changes.

The Robot Gap in U.S. Factories

Adoption of automation technologies has been comparatively low at U.S. factories. Only one in three factories report having specialized software. Approximately 10% of U.S. plants have any robots at all – and only half of those report using robots at any scale. One explanation for low adoption is the complexity and capabilities it takes for firms to integrate these technologies and make them work.

Consider programming a collaborative robot to tend a machine. The task is well within the capabilities of current commercial robot systems. However, one way or another it will often take 3x or even 10x the investment of the robot hardware to make the system functional and robust. One large firm with a team of engineers can implement and scale the system across dozens of machines in three months. Another small firm without an engineer has to plan for the effort on the margins, tasking a machinist without prior programming experience to spend a year teaching themselves to be a robot programmer and integrator. It’s a risk for a firm to make that kind of investment.

Even among manufacturers that adopt robots and use them to automate tasks on the floor, workforce challenges don’t disappear. They just change forms. Studies from a variety of countries show that when firms adopt robots, they end up hiring more people and becoming more productive and competitive.

Jobs Are Changing

At more automated factories, there are more jobs – and the jobs are different. In our work, we’ve seen that when firms adopt new technologies, it changes what it means to be an operator, or an assembly worker. New skills are required. The cadence of the job changes. One study shows that the impact of automation can extend beyond the production environment. At the firm level, it shows that more automation is associated with fewer middle managers.

In short, automation has the potential to create better manufacturing jobs. This is important in a tight labor market, particularly when the wage premium for manufacturing work has vanished over the past four decades. In 1960, an American without a college degree could make 40% more on average than they could outside a factory. But by 2020, that worker without a college degree was making only 2% more in manufacturing. Alternatives in retail and logistics have become far more attractive.

“What would it take to increase robot adoption to 50% of American factories by 2030, and ensure that the factories with robots are recruiting their new workers into high-paying jobs?”


Back at the Ohio aerospace supplier, creating better jobs and more flexibility has motivated them to automate. Several years ago, they invested in a collaborative robot, which they call Jeff, to load and unload a lathe in the back corner of the factory. The owner summarized their approach, roughly: “we know we need to offer higher wages to recruit more workers. And we can’t pay higher wages unless we automate.” Although it was a struggle at first to keep the robot working and train their workers to support it, the factory has stuck with it over several years and plans to scale up new technology elsewhere on the floor.

Positive-Sum Automation

We can imagine U.S. manufacturing in 2030 with automation enabling increased productivity as well as higher-wage, higher-quality manufacturing jobs attracting a new generation of manufacturing workers. But it’s no guarantee. What would it take to increase robot adoption to 50% of American factories by 2030, and ensure that the factories with robots are recruiting their new workers into high-paying jobs?

Our research suggests three keys for firms to achieve the positive-sum automation for which they’re striving:

1. Invest in shop floor entrepreneurs to identify and exploit automation opportunities. When firms ask for feedback from their workforce, not everyone is going to participate. Typically, only a small share of workers are prepared to implement new ideas and process improvements. These individuals are typically close enough to the routine processes that they know how the floor operates, but removed enough from the line that they can understand how the manufacturing systems work. We see enormous potential in these shop floor entrepreneurs if their managers give them the resources and the time to experiment with new technologies.

Since deploying automation technologies takes time and trial-and-error, these entrepreneurs need the runway and the risk capital to succeed. We frequently encounter manufacturers without a clear ROI calculation when they purchase a new piece of equipment. This isn’t necessarily a problem. It can be an opportunity for firms to enable their most creative and resourceful personnel to find new applications for a technology that they might not have imagined before purchasing it.

2. Redesign line work to increase flexibility and unleash creativity. Managers have long expected patience and precision from their production workers charged with routine tasks. Operating a machine or working in assembly can be a high-stress environment where every break or distraction is costly. But automation can make this work more flexible and lower stress, which can, in turn, attract more people.

The goal of automation is not always to increase the productivity of an individual worker – one worker supervising more and more machines – but also to give individual workers more time to identify ways to make the process better. In our research, we’ve seen the introduction of new technologies associated with more opportunities for bottom-up process improvement when line workers have the flexibility to step back and analyze a process when they’re not stuck repeating it. The next generation of machinists can have more space to innovate if their employers empower them to do so.

3. Use software to squeeze the most out of hardware investments and measure progress. Generating benefits from automation goes beyond setting up a robot to load and unload a machine. That robot and the machine it’s tending can be most productive when they’re connected to software that measures their performance and can allocate work most efficiently across the equipment and capabilities that a factory has. For most people, scheduling is an extremely challenging job. In hospitals and in air traffic control environments, research finds that doing these tasks well often requires decades of experience, and only a small share of people can thrive.

Even at the factories we visit with the most advanced automation in equipment, we still see gaps in their software system. Of course, they might have a sophisticated ERP that helps determine what needs to be made, or even dashboards on whether machines are hitting their targets, but rarely is this software system integrated in a way that can accurately spot inefficiencies and measure productivity improvements in each section of the factory. Factories investing in software and software expertise are building the infrastructure and the capabilities to make advances in automation, flexibility, and job quality possible.
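
A minimal sketch of the kind of allocation logic such software performs: greedily assign jobs to whichever machine frees up first, then report utilization against the overall makespan. Machine names and job times are invented.

```python
import heapq

# Invented jobs (name, minutes) and machines.
jobs = [("bracket", 30), ("housing", 45), ("flange", 20), ("shaft", 60), ("plate", 25)]
machines = ["lathe-1", "lathe-2", "mill-1"]

free_at = [(0, m) for m in machines]        # min-heap of (time machine frees up, machine)
heapq.heapify(free_at)
busy = {m: 0 for m in machines}

for job, minutes in sorted(jobs, key=lambda j: -j[1]):   # longest jobs first
    t, m = heapq.heappop(free_at)                        # earliest-available machine
    busy[m] += minutes
    heapq.heappush(free_at, (t + minutes, m))

makespan = max(t for t, _ in free_at)
for m, minutes in busy.items():
    print(f"{m}: {100 * minutes / makespan:.0f}% utilized over a {makespan}-minute makespan")
```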

A Spirit of Experimentation

Successful adoption of robotics and automation can promote higher wages and higher quality manufacturing jobs and improve a firm’s competitiveness. However, the research shows that these gains cannot be achieved through capital investment alone. The success stories require firms and workers to bring a spirit of experimentation and openness to redesigning work and training for new software skills.    M

About the Authors


Dr. Ben Armstrong is Executive Director and a research scientist at MIT’s Industrial Performance Center, where he co-leads the Work of the Future initiative.



Dr. Julie Shah is the H.N. Slater Professor of Aeronautics and Astronautics at MIT, where she leads the Interactive Robotics Group at the Computer Science and Artificial Intelligence Laboratory (CSAIL). She is also co-lead of MIT’s Work of the Future Initiative and a member of the MLC’s Board of Governors.


ML Journal

The Hammer Versus Nails Strategy for Data Analytics

Taking a pragmatic approach to data analytics by focusing on business goals rather than technology allows companies to deliver incremental value.

TAKEAWAYS:
Many developments in AI/ML for manufacturing can be leveraged now.
The challenge is driven by people, technology, and process.
An agile and pragmatic business-centric approach can allow for tangible benefits without the high upfront investment.

Manufacturers continue to exist in a complex juxtaposition: needing to invest significantly in technology to improve business operations, while still navigating difficult supply chain issues that require keeping costs down and focusing on retaining talent. Given a choice, the natural tendency is to pick the known path of least resistance. But if the choice requires a difficult decision, sometimes no choice at all is the most likely outcome.

The use of data to drive better and faster business decisions is a necessary component of any executive strategy. Data is the next elixir in the quest to improve operational margins in the same way that cloud infrastructures allowed businesses to grow and transform over the last decade.

The use of machine learning to examine and augment the decision-making process is a similar and exciting extension of data-driven decision-making. For example, on the shop floor AI is used to reduce scrap by identifying defective products as well as recommending changes to machine and process settings to prevent future issues. AI is also becoming heavily integrated in asset-intensive facilities to analyze sensor data and prevent downtime by alerting operators to potential issues. Beyond asset performance management, companies can also leverage shift and production information to improve workforce planning and management. With all the challenges in supply chains, having visibility into future issues also allows for improved spare parts management.

“It is important to remember that the explosion of data use requires an ever-increasing focus on governance strategies as well as security.”

 
The benefits of better understanding the business through the prism of data are not confined to factories alone. Think of how important it is to manage a successful organization by having visibility into cash flow or being able to forecast sales orders and expected production needs. Beyond the internal business, external customers are also increasingly requesting more transparency into supply chains. The expectation is that product passports will be the next great investment in infrastructure.

The illustration in Fig. 1 highlights the importance of thinking about data analytics across multiple operations as a way to drive the business forward.

Fig. 1: Data Analytics Opportunities in Manufacturing

It is important to remember that the explosion of data use requires an ever-increasing focus on governance strategies as well as security. Data needs to be protected and carefully curated for safe and effective business purposes. Most companies have multiple business solutions in a variety of cloud and non-cloud systems and the investment in managing, consolidating, and reporting for those systems will only increase. At any given time, there are also ongoing IT projects to move, change, update, or replace existing systems in many organizations. The impact on business operations can be significant, especially as the workforce must continue learning and adapting to changing technology and business conditions, combined with talent retention challenges as new employees struggle to adjust to old systems.

It is perhaps no surprise that this can be an overwhelming combination of factors for manufacturers to cope with.

The Hammer Approach

The best-case scenario is to be able to plan out your business operations as a digital blueprint and track materials through finished goods to customer orders (and their use) so that you can leverage the latest innovations to drive quick, proactive decision making.

This is a significant effort and best done in the initial stages of a digital transformation when there are fewer moving pieces. The company can then focus on automating more of its manufacturing systems. Once a company has a digital twin of its key systems, it is easier to make process and technology decisions in harmony with business priorities. Investing in IoT sensors and vision systems to augment your software and infrastructure setup then helps prepare the way for future innovation capabilities.

While building data models, companies should also consider the different personas who will need access to the systems, their data access rights, and what kinds of decision or action data need to be captured to improve the intelligence of the entire business operation.

Other considerations center on how to manage a data mesh of interconnected systems to reduce data duplication and retention. It is certainly more effective to invest in the right analytics when you start from a cleaner slate.

It is easy to get caught up in technology and forget business objectives.

“Once a company has a digital twin of its key systems, it is easier to make process and technology decisions in harmony with business priorities.”

 

Abraham Maslow’s famous observation, “If the only tool you have is a hammer, you tend to see every problem as a nail,” best describes the problem with the hammer approach: an over-reliance on technology rather than solutions.

If IT infrastructure and software drive strategy and investment, organizations feel the burden of trying to rationalize the investment by seeking out very specialized, potentially time-consuming, and expensive challenges. They end up investing in Internet of Things (IoT) sensors and storage infrastructure without a clear plan for maximizing their use. They can also resort to streaming analytics dashboards as a way to visualize information without a clear business driver. There is a great temptation to invest in additional hardware and streaming devices, like cameras, to justify the data architecture. The hammer approach is most common in organizations that are understaffed or have limited experience with data driving decisions.

The hammer approach treats technology as both the first and only consideration in data analytics.

The Nails Decision Strategy

The “nails” are business challenges or goals. All organizations have them and should expect them to change as often as needed. Nails are shared focus areas that executives want to move forward as part of a plan to improve the business incrementally.

It is important to be thoughtful and organized, and to treat innovation as a “fail forward” process. Due to the complexity of modern supply chains and the plethora of operations systems, there is a need for top-down executive alignment on an agile, business-benefit-driven transformation plan. Change management is easier when there are quick wins with tangible benefits. Priorities often change faster than the time it takes some IT projects to complete, so it is best to focus on quick time to value.

The six pillars shown in Fig. 2 are important guiding principles for employing innovation in manufacturing. Companies can apply them to drive analytic decision making when picking their business goal “nails”.

Fig 2: Six Pillars of Analytics Decision Making

For machine learning-driven analytics, the availability of data is essential. But since perfect or complete data is challenging, if not impossible, to obtain, the best approach is to reverse engineer goals from pragmatic business outcomes. A successful approach is to compartmentalize the type of data used in decision making: evaluate and identify where data usefully drives expected business outcomes and where it does not, then apply it where it fits rather than taking a big-bang approach.

While manufacturers have invested in sensors to track equipment performance, there are also pragmatic models that can use asset maintenance history to improve insights into maintenance scheduling versus simple time-based maintenance plans. Applying gradual data enrichment processes allows companies to gently bring in new and additional information where there is sufficient business benefit.
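As a concrete illustration of such a pragmatic model, the minimal sketch below estimates mean time between failures from a CMMS work-order export and derives a preventive maintenance interval from it. The dates and the one-standard-deviation rule are illustrative assumptions, not a prescription.

```python
from datetime import date
from statistics import mean, stdev

# Illustrative work-order history for one asset: dates of corrective,
# failure-driven maintenance events exported from a CMMS.
failure_dates = [date(2022, 1, 10), date(2022, 5, 2),
                 date(2022, 8, 29), date(2023, 1, 6)]

# Days elapsed between successive failures.
intervals = [(b - a).days for a, b in zip(failure_dates, failure_dates[1:])]

mtbf = mean(intervals)      # mean time between failures, in days
spread = stdev(intervals)   # variability of those intervals

# Pragmatic rule of thumb: schedule preventive work one standard
# deviation ahead of the mean rather than on a fixed calendar interval.
recommended_interval = max(1, round(mtbf - spread))
print(f"MTBF: {mtbf:.0f} days; suggested PM interval: {recommended_interval} days")
```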

Engaging experienced plant supervisors and operators in leveraging any analytics, so that work automation and processes improve, is an important tenet of maximizing the use of these tools.

A Pragmatic Approach

In this challenging environment, most companies rightfully hesitate to invest significantly in changing their mental approach to leveraging data for business decisioning. The approach each business takes will depend on its culture and priorities, but addressing individual challenges and analytics requirements one at a time is an easier and faster way to realize priorities. Quick wins can engage all departments of the business in a more collaborative manner, and the investment decisions then tend to be less burdensome.

There is a need to prioritize data management strategies to leverage new analytics innovations in manufacturing and supply chains. How companies go about it does not need to overwhelm their focus.

Focusing on concise business goals (nails) and investing in quick wins is a better approach than building a technology first (hammer) solution.  M

About the author:
Sandeep Anand
is Senior Director of Decision Analytics and Science Platform at Infor. He has over 15 years of experience building and delivering AI/ML solutions, including solutions around yield/scrap, supply chain improvements, and smart asset management strategies. He leads the AI/ML practice at Infor, a leading enterprise cloud solutions provider.

 

ML Journal

The Data Monetization Wave Picks Up Speed

The generative AI tool ChatGPT has raised the competitive stakes, requiring manufacturers to embrace the discipline with greater urgency. 

TAKEAWAYS:
The myths and realities of data monetization.
A framework for identifying and measuring the potential value of direct and indirect data monetization opportunities.
How generative AI is a game changer for data analysis, valuation, and monetization.   

In the Manufacturing Leadership Council’s recent research, Manufacturing in 2030 Survey: A Lens on the Future, 84% of respondents said they expect the pace of digital transformation to accelerate. That means data—more of it and more opportunity to create value from it.

Extracting value from manufacturing data has been rising up the industry agenda in recent years—fueled by success stories from leaders such as Navistar International Corporation, which used to rely on miles traveled or time since the last service appointment to develop vehicle maintenance schedules. By introducing new capabilities to analyze sensor data from 375,000 connected vehicles, Navistar has helped vehicle owners reduce maintenance costs by up to 40%.

The industry’s mushrooming volume of data is reason alone to be thinking about data monetization. Then ChatGPT entered the conversation in late 2022—raising the stakes. With generative AI, a user can now formulate a question and feed it into the model, which queries multiple data sources—potentially even integrated data, such as that from a CMMS system. The output provides an explanation of what the problem could be, the tools/parts needed, and a step-by-step explanation of how to fix the problem. This saves significant time and costly trial and error, becoming a source of value. The very human nature of the interaction addresses one of the big challenges with which manufacturing has been grappling: how to equip its workforce with skills to use data in a digital world.
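To make that interaction pattern concrete, here is a hypothetical sketch of the flow described above. The `fetch_work_orders` and `llm_complete` functions are placeholders for whatever CMMS export and generative AI service an organization actually uses; they are not real APIs.

```python
# Hypothetical sketch: combine CMMS context with a user's question and
# ask a generative model for a diagnosis and repair steps.

def fetch_work_orders(asset_id: str) -> list[str]:
    # Placeholder: in practice, query the CMMS for recent records.
    return ["2023-03-02: replaced drive belt",
            "2023-04-11: abnormal vibration noted"]

def llm_complete(prompt: str) -> str:
    # Placeholder for a call to a generative AI service.
    return "Likely belt misalignment; tools, parts, and steps follow..."

def diagnose(asset_id: str, question: str) -> str:
    history = "\n".join(fetch_work_orders(asset_id))
    prompt = (f"Maintenance history for {asset_id}:\n{history}\n\n"
              f"Question: {question}\n"
              "Explain the likely problem, the tools/parts needed, "
              "and step-by-step repair instructions.")
    return llm_complete(prompt)

print(diagnose("PUMP-07", "Why is discharge pressure dropping?"))
```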

“The time when generative AI will be able to fully ingest and use a company’s data as well as what it scrapes from the web isn’t far off.”

 

Competition among Microsoft, Google, and others will only accelerate generative AI capabilities as well as business interest in using them. We can expect to soon see it embedded in workplace technologies, enterprise resource planning systems, and other business applications. That means it’s not a question of if, but when generative AI will be able to fully ingest and use a company’s own data and what it scrapes from the web. While AI communities are eagerly anticipating this, so should data-rich manufacturers because it significantly increases the potential for turning data into value.

This is, of course, just one facet of data monetization. But it underscores the vast potential and pace at which things are changing. If your organization isn’t yet in the game, it’s time. We’ll cover some basic principles to shape discussions about data monetization and some ideas for jump starting or reenergizing your organization’s efforts.

What Data Monetization Is and Isn’t

There are a lot of myths regarding data monetization, and it is important to recognize them as the misconceptions they are.

We’ll touch on a few of these, but the first one is key. From our point of view, data monetization is not just about selling data; it’s about using data internally or externally to generate new value streams.

West Monroe defines data monetization as the process of generating new and innovative measurable value streams from available data assets.

There are several key words in that definition. First, data monetization is a process, not a one-time activity. Applying a product mindset—one that focuses on delivering value rather than milestones—is important. A value stream is about generating measurable benefits. If you aren’t connecting the dots from business value to the data used, then you can’t really claim that you’re monetizing the data. Finally, available data assets is not just about the data inside your four walls. It also includes social media data, partner/supplier data, customer data, and open data sources, among others. Data monetization is about harvesting content to enrich and enhance your own data and make it that much more marketable and usable.

Two Types of Data Monetization

We’ve identified about a dozen data monetization patterns. These generally fall into two categories—indirect and direct. Indirect data monetization focuses on internal business processes that generate measurable returns. Direct data monetization involves externalizing data in return for some type of commercial consideration.

Most manufacturers that have pursued data monetization focus on indirect opportunities. Some have made good progress; for example, with AI/ML models that can predict an outcome (will the machine go down?) or aid decision-making (should we replace or repair?). With new capabilities to analyze data, Harley-Davidson was able to predict machine failures with a very high degree of accuracy, thus reducing unplanned downtime and increasing production capacity 8-10%.

The emergence of generative AI creates bigger and better opportunities for indirect monetization and for direct monetization due to the breadth of data now valuable outside the company. As manufacturers consider new use cases, the need for third-party data will increase, making data (both volume and variety) more valuable on data exchanges.

Naturally, this also raises new questions about data ownership, including who owns the data scraped by the model and whether and how they should be compensated. Consider, for example, the data produced and captured by manufacturing equipment used in a factory: Is that the equipment OEM’s data or the manufacturer’s data?

“Data monetization isn’t just about selling data; it’s about using data internally or externally to generate new value streams.”

In any event, keep in mind that it is not the generative AI model that has value. The value is in the data itself and the productivity of using it more effectively to produce insights, content, or other commercial benefit. And that brings us to packaging.

Just like any product, you can sell raw material in some form. In the case of data, it can be shipped in bulk via FTP or provided through an API. But one of the most common misconceptions is that a company can only monetize raw data. That isn’t the case. Because of its unique aspects, data can be packaged in numerous ways. In fact, data products get more valuable when you travel down the list in the figure below, using data to enrich other data or create analytics or insights or custom data products or even integrate data into suppliers’ or partners’ systems. Just as with other raw materials, the more you process data, the more expensive and exclusive—and valuable—it becomes.

To put this in relatable terms, think about wheat. Few consumers (outside of a small number of processors) go to the fields to buy raw wheat. Most go to the store to buy wheat that’s processed into flour. Or, further along the value chain, they buy bread that has been baked from the processed flour or, a step further, a sandwich. At each step, the product is more consumable. The market becomes smaller, the cost becomes greater, but the value increases as you combine a raw material with other raw materials to make a product.

There are literally dozens of types of manufacturing data that may have value in some form of packaging along the spectrum above.

Getting Started or Back on Track

Whether you’re just starting or have explored data monetization but stalled, your organization will need an approach grounded in creating both momentum and value. Again, this is a process, not a one-time activity. It will also require a dedicated leader or team to own and support the process. West Monroe breaks data monetization into about a dozen discrete steps that fall into three basic phases:

1. Generate and prioritize ideas
2. Define the use case or data product requirements and features, collaboratively with stakeholders inside and outside of the business
3. Engineer, introduce, learn from the results and feedback, and improve continuously using rapid iterations to shorten the time to value

If this looks familiar, it is. It comes directly from well-honed R&D and product management playbooks.

Ideation workshops or exercises should start with your organization’s business drivers and identify those where you have the most potential for creating impact with data and analytics. Get a cross-functional group in a room to bring as many perspectives to the table as possible. And aim to develop as many ideas as possible.

One way to frame an ideation exercise is to identify situations where it would be valuable to have more prescriptive, predictive, or diagnostic insight. In manufacturing, considerable effort still goes into reporting on the past—what was sold or how costs fluctuated last quarter. What’s more valuable is understanding why you only sold that much, how much you’re going to sell next quarter, or how you could sell even more.

Another potential starting point is to identify data with the greatest potential value inside and/or outside your organization. Characteristics of highly monetizable data include such things as degree of control or ownership, uniqueness (others do not have data like it), meaningful context, security, accuracy, and availability. If you have data that meets many of these characteristics, compartmentalize it and make sure you begin treating and managing it as an asset, even as you develop your strategy for monetizing it.

Take inspiration from what other organizations are doing, both in and beyond the manufacturing sector. We recommend referencing Data Juice for real-world stories, including several from consumer and industrial products organizations.

Finally, look for other business or IT initiatives already underway that can help in gathering, preparing, or using data in new ways. This is often a way to accelerate new initiatives that may otherwise be challenging to get off the ground.

“Characteristics of highly monetizable data include degree of control or ownership, uniqueness, meaningful context, security, accuracy, and availability. ”

 

As you begin to prioritize ideas, employ a feasibility assessment that considers factors such as complexity, cost, and magnitude of benefits. One way to compare and rank ideas is to plot them according to impact and complexity. Those with low complexity and high impact are candidates for rising to the top of the priority list.
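As a minimal sketch of that ranking step, the snippet below scores ideas on simple 1-5 impact and complexity scales (values illustrative) and sorts them so that low-complexity, high-impact candidates rise to the top.

```python
# Illustrative ideation backlog scored in a workshop; 1-5 scales assumed.
ideas = [
    {"name": "Scrap-rate dashboard",        "impact": 4, "complexity": 2},
    {"name": "Predictive line maintenance", "impact": 5, "complexity": 5},
    {"name": "Automated order reporting",   "impact": 3, "complexity": 1},
]

# Rank by impact minus complexity: high impact, low complexity first.
ranked = sorted(ideas, key=lambda i: i["impact"] - i["complexity"], reverse=True)
for idea in ranked:
    print(f'{idea["name"]}: impact={idea["impact"]}, complexity={idea["complexity"]}')
```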

Be prepared for bottlenecks
There will always be challenges—cultural, technical, poor data quality or governance, and privacy and legal considerations (perceived and real) among them. We often see organizational issues—including core business priorities that don’t encourage data monetization, accounting standards that don’t (yet) recognize data as an asset, and lack of experience and skills—as some of the biggest hurdles. At least some of these—such as building the right skills and foundational data integration, master data management, and storage or computing capacity—will require investment. It may pay to bring in expertise in areas such as data literacy, data management, and change management to accelerate key changes and avoid costly missteps.

And be ready to measure
As the old adage goes, you can’t manage what you don’t measure. Organizations tend to manage things like data volume and speed. Few measure data quality characteristics such as potential value, business relevancy, cost, impact on business performance, market value, and impact on the organization. Research shows that only 11% of organizations know the cost of their data, only 12% calculate the value of their data assets, and only 21% measure the business impact of data quality improvements. Further, only 4% have developed ways of measuring data value in monetary terms with an assigned dollar value, only 7% are now beginning to measure data value against data-driven services, and nearly a third don’t have measures in place to value the increasing volumes of data that digital technologies create.

If you’re going to proceed down the path of data monetization, you will need an approach for measuring the value. We recently worked with a client, a leading manufacturer of agricultural machinery, to develop an approach for understanding the return on its more than $100 million invested in data and advanced analytics capabilities and to justify new investments. This effort employed our thought leadership on the economics of information—or “infonomics”—to create a practical framework that the company is using to measure the value potential of digital and data products.

Ride the Wave
With the sudden emergence of generative AI, we believe people will someday remember 2023 in the same way we remember getting on the World Wide Web for the first time. This development has brought new attention to data and its potential value in all sectors, including data-rich manufacturing. We’ve seen interest in this sector pick up considerably in the past several months and expect that it will only intensify as new capabilities emerge. If you are just beginning down the path toward data monetization or have started but stalled, now is the time to get focused. Manufacturers that catch this wave and ride it will find themselves in a good position to benefit.  M

About the authors:

Doug Laney, Innovation Fellow, Data & Analytics Strategy, West Monroe

David McGraw, Senior Manager, Consumer & Industrial Products, West Monroe

Tim Wrzesinski, Director, Technology, West Monroe

ML Journal

Achieving Data Mastery through Literacy and Standardization

Setting clear objectives and ensuring cohesiveness can enable manufacturers to be deliberate and intentional in executing a data strategy.

TAKEAWAYS:
Manufacturers must first determine an objective for data mastery and prioritize potential quick wins as the first step.
Data from customers, suppliers, and the shop floor must be harmonized to standardize inputs, KPIs, and other important metrics.
Manufacturers must also assess their teams’ data literacy and then determine what training is necessary to develop organizational cohesiveness.

Manufacturers understand that data will be foundational in developing increasingly efficient factories of the future, an essential tool to guide better decision-making at every level of the business. As digitization becomes more commonplace on factory floors and data becomes progressively central to operations, manufacturers need to be ever more intentional about how they tap into that data.

That intentionality can be more challenging than it seems on the surface. There are two foundational efforts that can help manufacturers on this front: setting clear objectives for how the business wants to use data to its fullest extent and ensuring teams across the organization have a cohesive level of data literacy. Both efforts enable the organization to be more deliberate in execution.

The first mission, determining an objective, may seem straightforward enough. But with how ubiquitous data has become throughout manufacturing operations and production processes, teams may find it challenging to rank their priorities. While predictive machine learning processes may be appealing, for instance, implementing such processes can be more time-consuming and challenging than, say, identifying manual processes where data might help to increase workers’ efficiency. Identifying potential quick wins should be top of mind for leadership teams assessing how to improve their data strategy.

The company’s data maturity level will play an important role in setting this objective. Manufacturers that find themselves in the earlier stages of weaving data analytics throughout their operations will likely have different goals than those already using more advanced, predictive data capabilities.

Harmonizing Data Sources

Customer data, supplier data, and data generated on the shop floor all converge to create an enormous amount of potential for manufacturers looking to make their operations exponentially smarter and more efficient. But if those data sources aren’t harmonized to speak the same language, it will be difficult to harness that information in a meaningful way.

Data-driven decision making is at the heart of the Industry 4.0 journey, and connectivity among machines, products, employees, suppliers, customers and processes across the value chain is the key to unlocking the value of this data.

“Manufacturing leadership teams need to identify which functions of the business could benefit the most from data and then talk to those teams about their biggest pain points.”


Companies need to standardize various inputs, key performance indicators and other metrics so teams can manage, compare, and report on data cohesively throughout various business functions. This allows for cross-training, improvement across different manufacturing sites and standardization of best practices for broader, companywide benefit. Especially for middle market and smaller manufacturers that don’t currently compare metrics between facilities, standardization is an important step in improving data management and governance.

Here are some specific ways companies can standardize their data across various sources:

  • Establishing data owners: Ensuring data inputs are consistent starts with establishing ownership for various data sources. Manufacturers should determine which individuals and/or teams essentially own which data channels and how they can be good stewards of that data. This is also a key effort in standardizing and harmonizing KPIs.
  • Implementing a data catalogue or dictionary: Having a reference guide is crucial to boosting the accessibility of data and becomes more important as an organization builds out its analytics capabilities. Such a catalogue can ensure teams use their time more efficiently and have a common understanding of data and where it is coming from (a minimal sketch of one entry follows this list).
  • Equipping teams to succeed: If one facility uses different metrics than another and the business wants to switch to using standard metrics across the organization, leadership teams need to understand the implications for employees and provide training as appropriate. We’ll look at workforce implications more in-depth below.
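To make the catalogue idea concrete, here is a minimal sketch of what a single data dictionary entry might capture. The field names and values are illustrative, not a standard schema.

```python
# One illustrative data dictionary record; a real catalogue would hold
# one such entry per metric, input, or KPI.
downtime_minutes_entry = {
    "name": "downtime_minutes",
    "description": "Unplanned downtime per shift, in minutes",
    "source": "Line PLC event log, aggregated nightly",
    "owner": "Plant operations team",   # the established data owner
    "unit": "minutes",
    "refresh_cadence": "daily",
}
```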

Bridging the Data Literacy Divide

Hand in hand with standardizing data across production facilities and other sources, manufacturers need to ensure their employees’ data capabilities and literacy are consistent across teams, locations and the broader ecosystem, which may involve suppliers and other partners. It’s common for there to be a data literacy divide among teams, and for some employees to feel more comfortable leveraging data than others.

An assessment can be a critical first step in determining team members’ data literacy levels and can allow leadership teams to see what work and training can bring everyone onto a level playing field.

“Manufacturers of all sizes need to hone their workforce’s data mastery skills and streamline data across operations before they can tap into more advanced capabilities.”


Once such an assessment has taken place, there are several important steps organizations should take to bridge the data literacy divide:

  • Education and buy-in: Some employees may not see the full potential of data within the organization. This is where leadership teams need to educate their workforce on why harnessing data matters for the future success of the company. Getting specific with examples here can be useful; for instance, explaining how adopting consistent downtime codes across facilities can make employees’ jobs easier is one way to communicate the value of data literacy and cohesive data mastery across teams.
  • Training: Once leadership teams have buy-in from employees, organizations need to assess what types of training employees need and how their needs may vary. A tailored approach can be useful here to bridge the gap between perspectives; some team members may see data as the main way to solve every problem, some may be skeptical about its practical uses, and some may need to experience using data analytics tools to solve problems in real time in order to see their full potential.
  • Assessing needs throughout the business: While it’s important to make training available widely, manufacturing leadership teams also need to identify which functions of the business could benefit the most from data and then talk to those teams about their biggest pain points. Finance, supply chain operations, and shop floor production operations — especially those using Industrial Internet of Things devices and connected infrastructure — are some areas organizations may benefit from zeroing in on.

The Bigger Picture

Teams that are in the thick of training employees on data or implementing new technologies might lose sight of how to track and communicate the return on investment of those efforts to senior leadership. Determining that ROI, however, can ensure all stakeholders understand the importance of data literacy, tools, and training to making the overall business more efficient.

Manufacturers of all sizes need to hone their workforce’s data mastery skills and streamline data across operations before they can tap into more advanced capabilities. Setting clear intentions and objectives is the critical first step to increasing an organization’s data analytics maturity level and strengthening data governance throughout operations.  M

About the authors:

Ravi Bodla is a Data and Analytics Director at RSM US LLP.

Jacob Friess is a Data Analytics Supervisor at RSM US LLP.

ML Journal

Unlocking Data-Driven Manufacturing’s Power

The challenge of decoding and using data to drive strategic outcomes and unlock value 

TAKEAWAYS:
Data has the power to optimize and scale manufacturing operations, enable the workforce, and boost performance.
Manufacturers have struggled to build a business model that captures data’s value and seamlessly integrates data into day-to-day functions.
Organizations that empower workers to be part of their data transformation strategy can gain a competitive edge. 

Effective use of data is increasingly important for manufacturing organizations. The EY CEO Imperative Survey revealed 70% of manufacturing sector respondents see digital innovation as a transformation driver for their companies.¹ However, few manufacturers have been able to successfully navigate this complicated domain. Many continue to struggle with connecting equipment, systems, and factories. Others amass data lakes that require extensive effort to extract meaningful insights. Ongoing growth through acquisition has exacerbated the challenge, as global organizations often acquire companies at different stages of technological development. With disparate machines, systems, siloed architectures, and data models, time is spent trying to capture and make sense of data rather than taking actions that drive business needs.

The need to effectively use data has never been greater – or more urgent. Ongoing global disruptions are driving more on-shoring and near-shoring of manufacturing in markets with higher-cost labor, which demand more productivity. At the same time, changing customer expectations and unreliable material sources are intensifying the need for agility. These challenges are compounded by ongoing labor and skills shortages that highlight the need to utilize, attract and retain talent by creating more engaging and impactful worker experiences.

“The need to effectively use data has never been greater – or more urgent”


Today’s market requires a data-driven manufacturing approach. The focus is on capturing the right data, modeling it for effective insights, and empowering workers with the information they need to optimize day-to-day operations. Additional priorities include upskilling personnel and enhancing work processes in ways that strengthen performance and boost value.

Leading with value

At its core, data-driven manufacturing is about driving business outcomes from intelligent data. The focus is on connecting and capturing data from the manufacturing floor to the enterprise layer, linking that data to unique business requirements, and then identifying actionable insights that can improve overall business competencies.

Making the shift from a reactive to a proactive data-driven operation typically requires investment in connecting legacy equipment, mapping infrastructure, integrating systems, and developing an end-to-end data and cloud strategy. Priority use cases will need to be defined and operating systems will likely need to be upgraded. This effort should be underpinned by well-thought-out architecture, data management processes, and orchestrated change management processes. It is about managing data as a product, and supporting each role, function, and level on that individual’s personal journey map. All of this must also be connected to tangible business outcomes, not just KPI improvements.

Companies that are unable to link data and technology investments to business results often feel trapped in “pilot purgatory.” They are always searching for answers, but find it difficult to justify the additional investments needed to scale. Traditional operational excellence programs may yield improvements, but the lack of sufficient data and insights likely prevents these improvements from being fully sustained over time. Cost-control programs that have been in place for some time may see their benefits begin to plateau. The missing piece to maximizing these benefits could be data-driven insights.

“Companies that are unable to link data and technology investments to business results often feel trapped in ‘pilot purgatory’”


Being able to effectively identify, monitor, and realize the value derived from intelligent data investment requires more than identifying costs and tracking KPIs. It starts with a fundamental shift in thinking: data-driven insights are essential enablers of the strategies that have already been laid out, not siloed IT, engineering, or operations programs that compete for resources. Organizations must begin to connect the dots to see how these digital enablers create new capabilities that support better business outcomes, and how those outcomes translate to achieving strategic business objectives.

EY recently worked with a global industrial products manufacturer that was struggling to justify infrastructure and system upgrades. These investments were necessary to capture and display real-time overall equipment effectiveness (OEE) visibility in and across their plants. However, business leaders were unable to see how it connected to a core strategic pillar of protecting market share in increasingly competitive markets. By developing a business case that linked the investments to a new enabler (OEE visibility), to new capabilities (proactive problem solving), to a business outcome (increasing labor productivity to offset increasing raw material prices), and to helping achieve the strategic objective (enabling prices to stay low to prevent competitors from taking market share), they were finally able to align on the importance of the program. As a result, the first pilot site was able to properly identify and categorize issues that had never been captured, which accounted for nearly 75% of all production losses. Now, they are systematically working on actions to eliminate these losses.

Navigating the application and data modeling landscape

Once there is alignment on data’s value potential, the complicated work begins of figuring out how to collect the data and use it effectively. Organizations need to develop a strategy to capture data from brownfield or greenfield sites, interconnect siloed systems, develop end-to-end cloud and data management strategies, and develop new business workflows. It begins with a data readiness assessment for each facility, mapped across the standardized IT and OT architectures to gain visibility of the data readiness gaps. With advancements in edge, IoT, and OT cybersecurity solutions (which run across all systems), data gaps can be bridged and meaningful data can be extracted from different machines and systems. Data ingestion happens via IoT protocols (MQTT, API, OPC-UA, etc.) and traditional proprietary protocols, and is integrated with the organization’s IT enterprise layer.
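As an illustration of the ingestion step, here is a minimal sketch of subscribing to telemetry over MQTT using the open-source paho-mqtt client (the 1.x API is assumed). The broker address, topic hierarchy, and payload fields are hypothetical.

```python
import json
import paho.mqtt.client as mqtt  # paho-mqtt 1.x API assumed

BROKER = "broker.plant.example"      # hypothetical edge broker
TOPIC = "plant1/line3/+/telemetry"   # hypothetical topic hierarchy

def on_message(client, userdata, msg):
    # Each payload is assumed to be a small JSON telemetry reading.
    reading = json.loads(msg.payload)
    # Hand off to the enterprise layer (historian, data lake, etc.).
    print(msg.topic, reading.get("temperature_c"), reading.get("vibration_mm_s"))

client = mqtt.Client()
client.on_message = on_message
client.connect(BROKER, 1883)
client.subscribe(TOPIC)
client.loop_forever()
```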

With business use cases at the forefront, a modern technology data stack (data capture, ingestion, compute, and storage), and powerful ML/AI capabilities, intelligence can be extracted to deliver actionable insights. The data analytics and reporting can be visualized using modern business intelligence or the organization’s preferred tools. Cybersecurity plays a huge role and is embedded across all levels of the architecture, ensuring proper authentication and mitigating the risk of unnecessary data replication.

“Data-driven manufacturing provides those unique capabilities where data insights can happen at different layers of edge, cloud, or on-premises systems”


Data-driven manufacturing provides those unique capabilities where data insights can happen at different layers/stages of edge, cloud, or on-premises systems. The data can be processed in real-time, near-real-time or can follow a cold path route followed by ML/AI processing for insights, depending on the use case. Every customer’s data-driven initiatives are different. Comprehensive tools, advanced technologies and use case accelerators can be brought into the digital journey.

In another use case, EY recently worked with a global pharmaceutical manufacturer to assist in standing up its own connected analytics capabilities. The manufacturer built data pipelines for a rapid proof-of-concept at one facility, deploying to end users with training and validation to refine the solution before scaling to the broader network. A focus on connectivity, data modeling, and analytics enabled the manufacturer to provide the right information to the right people at the right time. All investments were supported by processes that required value to be planned and proven on a small scale before being rolled out enterprise-wide.

This led to a nearly 40% increase in product yield and a 60% reduction in unplanned production stops.

Keeping people at the center

Even with a solid business case and state-of-the-art technologies, real value and improvement lies in employees’ hands, hearts, and minds. It is critical to keep people at the center of all data-driven initiatives, and manufacturers must maintain focus on how new capabilities will upskill and enable workers at all levels. New research indicates that giving specific focus to a series of complex human factors can increase the probability of success to more than 70%.² Each member of the organization will play at least one role as a data generator, data analyzer, or data consumer. Depending on the employee’s function, responsibilities and roles, outlining the journey map of how these new data capabilities will impact them and including them in each step of the process will be critical for adoption and sustainment.

“Almost 60% of manufacturers surveyed by the NAM are creating or expanding internal training programs to address skills shortages through adaptive upskilling”


In a study by EY and The Manufacturing Institute, research showed industry leaders recognize that increasing business performance by asking more from their existing workforces is not enough.³ Seventy-four percent of leaders surveyed indicate that the skills needed for manufacturing jobs are changing rapidly and 82% shared they are seeking new and innovative ways to invest in the careers of their workforce. The study also found that almost 60% of manufacturers surveyed by the National Association of Manufacturers (NAM) are creating or expanding internal training programs to address skills shortages through adaptive upskilling. These adaptive skill programs are giving manufacturers new ways to educate and align their manufacturing operations to data-driven transformation journeys. The goal is to keep pace with the speed of innovation, create iterative value, and evolve based on the new insights that data provides to the organization.

Summary

Data is emerging as the strategic currency of the digital age. With rapidly growing demand for modernized manufacturing systems, companies can use data-driven problem solving to gain a competitive advantage that will expedite their digital transformation journey.⁴ Data is a critical component in informing an organization’s path and building an organization that can meet the market’s needs and expectations. It can help companies see their way through the challenges and complexities of doing business in today’s world. It can also provide a plan built around creating value, leveraging digital tools, and engaging people at each step of the journey toward fulfilling the organization’s mission.  M

Article references
1. Why industrial companies need to lead business model innovation, www.ey.com/en_us/advanced-manufacturing/why-industrial-companies-need-to-lead-business-model-innovation
2. The CIO Imperative: Is your technology moving fast enough to realize your ambitions?, www.ey.com/en_us/consulting/tech-horizon-survey
3. How adaptive skills can play a pivotal role in building the manufacturing sector of the future, www.ey.com/en_us/advanced-manufacturing/the-manufacturing-institute-adaptive-skills-study
4. How do you harness the power of people to double transformation success?, www.ey.com/en_gl/consulting/how-transformations-with-humans-at-the-center-can-double-your-success

 

About the authors:

Manan Bawa: Senior Manager, Technology Consulting, Data & Analytics, EY
Manan Bawa is part of the data and analytics team with a strong focus on the advanced manufacturing and mobility sectors. With a 13+ year career in managing key products and technical transformations, Bawa has delivered superior results for billion-dollar projects and high-profile organizations. He is a proven, versatile leader known for spearheading emerging technologies, developing sustainable and smart factories, and producing cost-effective deliverables within the advanced manufacturing, power and utilities, energy, CPG, oil and gas, tech, retail, and mobility sectors.

Robert Calloway: Americas Manufacturing and Mobility Technology Leader, EY
Robert Calloway brings 33 years of experience to help organizations develop and execute digital transformation journeys that leverage data and technology to both optimize business operations and create value and growth.


Zakir Hussain: Partner, Americas Data Leader, EY
Zakir Hussain leads a best-in-industry team to deliver technology implementations for his clients across sectors. For more than two decades, Zakir’s clients have benefited from his team’s experience and expertise in technology and data modernizations – uniquely positioning each to advance and accelerate their growth initiatives.


Greg Wagner: Data-Driven Manufacturing Solution Leader, EY
Greg Wagner is a transformation leader at EY who specializes in implementing data-driven manufacturing capabilities with clients. He possesses a background in Lean Six Sigma and quality, asset reliability and integrity, operational risk management and digital manufacturing. Wagner has more than twenty years of consulting and industry experience working in leadership, operations and technical functions within the chemicals, oil and gas and specialty manufacturing industries.

ML Journal

How Best Practices in Data Analytics Drive Maintenance Maturity

Insights that convert into actions can improve equipment reliability, span the enterprise, and boost the bottom line.  

TAKEAWAYS:
Using data and analytics for condition monitoring can eliminate unplanned downtime and allow for improved equipment reliability.
Centralizing maintenance roles and utilizing remote support can help manufacturers alleviate labor scarcity and maximize their technical teams.
Machine health monitoring can trigger corrective actions to be scaled across multiple production lines or multiple sites. 

Manufacturers are acutely aware of how machine health affects production throughput, particularly plants operating in a throughput-constrained mode. Without sustainable equipment uptime, schedules are missed, orders go unfulfilled, revenue is lost, and unplanned labor and repair costs are incurred.

A significant factor impeding the achievement of operational goals is the widespread, protracted shortage of asset reliability and maintenance talent. Fortunately, technology can alleviate this challenge.

The burgeoning depth and breadth of condition monitoring analytics technologies offered by countless solution providers aim to eliminate unplanned downtime. The core value of machine condition monitoring is twofold: (1) drive best practices in reliability and maintenance; and (2) mitigate the skills gap now and into the future. Its primary goal is harvesting analytical insights directly from machines to identify the opportune time to service degrading critical equipment and components — not too early, nor too late.

“Gaining maximum value requires putting the data and analytics to good use.”

 

Mastering how condition data is harnessed, analyzed, and operationalized is key. With digitalization boosted by the industrial IoT, such as wireless condition monitoring sensors, plants can collect and centralize unprecedented quantities of real-time, streaming asset condition and performance data for analysis, along with batches of intermittently connected or locally captured data. Gaining maximum value from this approach requires putting the data and analytics to good use.

Operationalizing Data Analytics

Many plants lack the internal resources to leverage machine condition monitoring for troubleshooting and repairs, let alone navigate the complexities of developing scalable analytics solutions. The prior model of staffing qualified, dedicated maintenance specialists at every manufacturing facility is no longer sustainable.

Industrial service providers are responding to this reality by adapting and optimizing their own technology and methods, while developing new ways to provide actionable remote support to the factory floor. Over the past 12-15 years, the aging manufacturing talent pool and increasing demand for domestic manufacturing compelled leading service companies to refocus on centralizing core roles, such as predictive maintenance specialists, reliability engineers, and other high-level positions.

This development is driving a cultural shift in manufacturing. Internal plant maintenance teams are not inherently inclined to ask for help, but once they begin accepting virtual support, they can transition from their singular plant focus and leverage their skills out to other locations.

Machine health monitoring is also a paradigm shift for the remote support providers, who now actively pull condition data for analytical insights and translate it into action on the factory floor. For instance, when an alert is received at a centralized technology center that an asset condition threshold has been breached at a certain plant, an expert can reach out to the appropriate individual on the factory floor to provide crucial guidance remotely, in layman’s terms.

“The core value of machine condition monitoring is twofold: (1) drive best practices in reliability and maintenance; and (2) mitigate the skills gap now and into the future.”

 

Rather than details such as the frequency (hertz) and amplitude of the waveform, what a technician really needs to understand is, for example, that a specific coupling appears to be coming loose. In this example, that would mean they need to take that machine out of service, validate that the motor meets its alignment specifications and everything is in safe working order, and then torque the coupling to its proper specification. Additional details or step-by-step instructions can be provided by the remote service expert to the asset-facing technician via a mobile device.

Moreover, this new approach to monitoring and analytics is driving significant innovation in maintenance automation processes. A simple example is instead of regularly pumping grease into an asset, data analytics from that piece of equipment can trigger a command to an auto-lubricating device bolted onto the machine to automatically inject the correct amount and type of grease into the equipment. This optimizes the process while also eliminating the need to involve a maintenance technician.
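A simplified sketch of that trigger logic follows. The threshold value, sensor read, and lubricator command are illustrative placeholders for vendor-specific interfaces.

```python
# Illustrative condition threshold; real limits come from the asset's
# monitoring program, not this constant.
VIBRATION_ALERT_MM_S = 7.1

def read_vibration(asset_id: str) -> float:
    return 7.4  # placeholder: would poll the condition monitoring sensor

def command_auto_lubricator(asset_id: str, grams: float) -> None:
    print(f"{asset_id}: injecting {grams} g of grease")  # placeholder

def check_and_act(asset_id: str) -> None:
    level = read_vibration(asset_id)
    if level > VIBRATION_ALERT_MM_S:
        # Condition breach: trigger the bolted-on lubricator instead of
        # dispatching a technician for routine greasing.
        command_auto_lubricator(asset_id, grams=2.0)

check_and_act("CONVEYOR-12")
```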

Figure 1 – Accelerating Maintenance Maturity

The Evolution of Maintenance Philosophies

Maintenance philosophies have changed markedly over the years (Figure 1). For a long time, maintenance meant running equipment until it breaks, and then fixing it, though this caused excessive unplanned downtime. When preventive maintenance practices emerged, time- and usage-based plans and routes were developed to keep the equipment in better condition, though the risk of downtime from over- or under-maintenance was still a concern.

Condition-based maintenance practices flourished when skills gaps intensified the need for labor efficiency. Having sensors continuously monitor and measure machine condition parameters and trigger alerts when a given threshold is met allows scarce labor to be applied to situations that actually need attention.

“Internal plant maintenance teams can utilize virtual support to transition from a singular plant focus and leverage their skills out to other locations.”

 

The latest realm of maintenance maturity is predictive maintenance — applying condition analytics and prognostics to predict when an asset is likely to fail, forecast the asset’s remaining useful life, and plan maintenance processes, people, and parts with precision to get the maximum life out of the equipment and components. Ideally, this approach leverages not only real-time condition monitoring data, but also historical trends and contextual information such as environmental conditions, process output, and maintenance histories.
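One common prognostic idea behind such remaining-useful-life estimates can be sketched in a few lines: fit a trend to a degradation signal and extrapolate to a failure threshold. The readings, the linear model, and the threshold below are illustrative assumptions, not a production prognostic.

```python
import numpy as np

hours = np.array([0, 100, 200, 300, 400])        # operating hours
wear = np.array([0.10, 0.14, 0.19, 0.23, 0.28])  # illustrative degradation signal
FAILURE_THRESHOLD = 0.50                         # assumed failure point

# Fit a linear degradation model and extrapolate to the threshold.
slope, intercept = np.polyfit(hours, wear, 1)
hours_at_failure = (FAILURE_THRESHOLD - intercept) / slope
rul = hours_at_failure - hours[-1]
print(f"Estimated remaining useful life: {rul:.0f} operating hours")
```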

Continuous condition monitoring analytics are central to the highest level of maintenance maturity, known as precision maintenance. It allows the work to be predicted and stretched out for as long as possible, maximizing the utilization of time and labor while maintaining the highest asset availability and throughput opportunity.

Extending Maintenance Optimization

Condition monitoring analytics enable plants to optimize their asset maintenance practices and migrate to an enterprise-wide predictive maintenance philosophy. Even non-critical assets that remain on route-based preventive maintenance schedules can benefit from predictive analytics. This is because manufacturing facilities frequently operate identical or similar types of equipment across multiple lines within a plant, and across any number of sister plant locations.

Specifically, any machine health finding that triggers a corrective action can be leveraged across other assets that have, or are developing, a comparable condition. Additionally, the knowledge gained from condition analytics and root cause identification can help to refine maintenance plans and intervals for all equivalent equipment. Extrapolating the findings to additional equipment in this manner optimizes both preventive maintenance schedules and predictive maintenance activity, further reducing unplanned downtime and extending the mean time between failures.

“The knowledge gained from condition analytics and root cause identification can help to refine maintenance plans and intervals for all equivalent equipment.”

 

A good case in point is when a specific lubrication issue at a plant was identified as the root cause of unplanned downtime. A little digging into their maintenance processes revealed that the preventive maintenance plans did not include the appropriate tasks to prevent the lubrication condition. That deficiency affected not just that single piece of equipment, but all equipment of that type, across all facilities. The results of that initial finding led to adjusting all the maintenance plans for 26 pieces of equipment across 12 facilities, increasing efficiency and preventing similar failures.

Reaping Strategic Benefits

The advantages of mastering this strategy are manifold. Consider a plant that purposely overheats aggregate to make sure it is completely dry. It may consume twice as much natural gas as is really needed to properly manufacture the product. From a financial perspective, passing that extra cost on to the consumer creates competitive challenges for the manufacturer. From a sustainability perspective, excess emissions and gas transport requirements create a larger carbon footprint. With proper temperature monitoring, such challenges can be avoided.

Here are three of the primary benefits of operationalizing condition monitoring analytics:

  • Planning and scheduling: Having access to machine health data analytics enables data-driven prioritization of work planning and scheduling for consequential labor and cost efficiencies. Online condition monitoring allows issues to be addressed as they arise, based on detectable early warning signs, with well-planned and strategically timely corrective actions. Since time- and cycle-based maintenance plans are conducted regardless of an asset’s actual condition, changes go undetected between routes, and random failures are routinely missed.
  • Equipment reliability: Continuous machine health monitoring allows plants to move quickly on resolving predicted failures, maximizing uptime rather than reactively repairing failed equipment or components. Monitoring can reveal sudden large changes such as those attributed to a process change or machine crash, or gradual changes over time, including micro trends. By implementing AI and machine learning algorithms grounded in statistical process control, the degree of change can be tracked on an ongoing basis, and insights can be gleaned to determine reliability engineering needs and root causes (a minimal control-chart sketch follows this list).
  • ESG and sustainability: Actively monitored and well-maintained equipment tends to operate in a highly efficient and effective manner. Likewise, when environmental conditions such as humidity, temperature, and cleanliness of the air are continuously monitored and proactively maintained, the benefits to human and machine health are great. Conversely, improperly maintained equipment running under production load is more prone to energy waste, raw material waste, scrap, quality defects, minor stops, equipment outages, or catastrophic failure.
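As referenced in the list above, here is a minimal control-chart sketch in the spirit of statistical process control: it establishes limits from a known-good baseline and flags readings that drift beyond them. The readings and three-sigma limits are illustrative.

```python
from statistics import mean, stdev

# Vibration readings from a known-good baseline period (illustrative).
baseline = [2.1, 2.0, 2.2, 2.1, 2.3, 2.0, 2.2]
center, sigma = mean(baseline), stdev(baseline)
upper, lower = center + 3 * sigma, center - 3 * sigma

# New readings are checked against the control limits as they arrive.
new_readings = [2.2, 2.4, 2.9, 3.3]
for i, x in enumerate(new_readings):
    if not (lower <= x <= upper):
        print(f"Reading {i}: {x} outside control limits "
              f"({lower:.2f}, {upper:.2f}) -> investigate degree of change")
```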

Consider Monitoring as a Service

Many plants want to move to a condition-based mindset but lack the engineering and development capabilities in maintenance and IT to pull together a robust package on their own. Choosing remote condition monitoring as a service may be the answer for plants that are short on time and internal technical talent.

Industrial service providers with extensive experience in predictive technologies are adept at designing, implementing, and managing fully integrated condition monitoring analytics solutions. Those with the added capability of providing centralized, remote support for hundreds of manufacturers in multiple industries are uniquely well versed in operationalizing analytics in plants and across plant sites.  M


About the author:
Micah Statler
is the Director of Operations at Advanced Technology Services.
