Infrastructure maturity does not always correlate with organizational intelligence. Manufacturers must distinguish the ability to codify meaning from the ability to understand implications. By Roger Thomas

It’s an awkward time. You’ve made some Manufacturing 4.0 infrastructure investments and seen gains in automation and data collection. Your plant-floor equipment is now Internet of Things-enabled, and data is streaming into a Hadoop cluster. You contracted with a data scientist who keeps talking about Bayesian methods, and your boss is impressed with the M4.0 milestones you’ve reached. But when you step back, you must acknowledge the returns are more incremental than exponential. You’re not unlocking new insights. You’re catching kiln temperature variances faster but not getting to root cause. You feel you’re keeping up with the industry, but something’s missing.

At home, I have two teenagers, ages 15 and 17, and a tweener who is 12. While all three contribute to the drama of our household, the middle school tween years have consistently proved to be our greatest challenge. The deceptive thing about tweeners is that while many have hit their growth spurt and started looking the part of young adults, they’re still kids. In conversations and disagreements, I constantly find myself reminding my wife and myself that they’re still kids. Cognitive development requires something more than a growth spurt. It requires experience.

Advanced studies in cognitive brain development explain much of this. While the brain itself reaches its physical size relatively early, cognitive development happens over many years. It starts with the amygdala, where emotions live, while the prefrontal cortex (responsible for planning and patience) doesn’t fully develop until the post-teen years. I draw the parallel with M4.0 initiatives because most benchmarks and maturity models tend to focus on the physical IoT infrastructure, data storage, and processing power. I see three primary reasons for this: these elements represent the key M4.0 enabling technologies; physical deployments are easier to measure, track, and declare success with; and pursuing specific, measurable M4.0 business benefits requires more than just infrastructure.

Let’s review three different maturity models to explore these points.

Benchmarks and Maturity Models 

The three different maturity models I’ve seen replayed throughout my career are: The Technology Checklist, It’s a Journey, and the Hybrid Methodology.

› Model 1: The Technology Checklist
As it sounds, the Technology Checklist model sizes your maturity against the technology itself. For M4.0, the movement ties not only to the explosion of data, but more importantly to the ability of the masses to harness that data. Predominantly, companies do this through advances in IoT connectivity, compute power, and advanced analytics. These technology elements all feed on the data explosion to deliver results back to the business.

The Technology Checklist benchmarks your organizational acuity in the areas of IoT mobility, cloud-enabled compute power, and advanced analytics. That said, how you are leveraging the technology is more difficult to assess. So this keep-it-simple model focuses on benchmarking how well your technology keeps up with the Joneses.

› Model 2: It’s a Journey
A second approach is to focus on the journey, similar to what Accenture has done with HfS Research. This model points to a nice end game for what success looks like and plays into the theme of starting small while thinking big. The challenge comes with the wholesale approach of going big without built-in feedback loops on value creation. The methodology answers the question of what success looks like from a technology and capability matrix, but not from a process and organizational perspective. I’m sure those points may work in consulting engagements, but the line of sight to value is not inherent in the model.

› Model 3: Hybrid Methodology
Layering the two models above creates a more complex yet more sophisticated model. Capgemini published such a model that accounts for corporate structure and functional areas under an evolving scale of technical sophistication. Factoring all of that together determines your level, from beginner to digital master.

This model does a good job of incorporating the complexity and change management considerations in these implementations. Especially when faced with a new set of technologies and trends, benchmarks can simplify a path forward into digestible chunks.

Collecting the Dots 

The risk with these models is that more data does not mean more insight. To the contrary, we now have studies showing how confirmation bias creates situations where more information leads people to make worse decisions. Like my tweener, who looks the part of a young adult, these IT systems lack cognitive maturity.

To find better answers, you must first start asking better questions. What was the question you aimed to solve with your M4.0 journey? Was there a question? Or was it simply an executive mandate, perhaps something you agreed to pursue as part of an executive Management Business Objective?

Successful transformations look beyond the benchmarks. They employ benchmarks to understand the industry direction and then they find the contrarian view. Here is what everyone else thinks. Here are the patterns. And here is how we can take advantage of those patterns. Here is how this plays into or against our unique positioning. Benchmarks are universal. Effective transformations are personal and must reflect your culture.

An Alternative Maturity Model 

The chart on the next page (Figure 4) captures an alternative view of M4.0 maturity.

  • Standing still will leave you vulnerable.
  • Classic benchmark methodologies will take you to the right.
  • A focus on process and lean thinking will move you north.
  • A blending of technology with purposeful understanding will lead you to advanced levels of M4.0 wisdom.

While this model may look simple, the path forward will leave many executives frustrated. Similar to middle school, cognitive development for your operations can be a bit complicated due to the following:

  • The path of progress does not follow a straight line. As tweeners actively work to piece together their world view, what feels like backward steps may later prove to be a giant leap forward. Life is not simple for tweeners. To the contrary, what would leave me most concerned about my tweener is the absence of challenges.
  • Introspection becomes more meaningful than external benchmarks. Tweeners struggle with peer pressure and fitting in. Such external benchmarks more commonly become instructive lessons on what NOT to do rather than lighting a path to discovery.
  • You cannot easily predict new sources of innovation. Despite fears and social anxiety, kids can only grow through new experiences. The most painful moments of middle school are often looked back upon as the most instructive.

The M4.0 struggle to deliver returns is real. Below are two case study examples of companies that did it right.

› Case Study 1: Scientific Inquiry for Process Manufacturing
A century-old process manufacturer with tight margins and a scrap issue asked a question: were there any upstream conditions and measurements that could be used to predict or improve first-pass yield rates?

Their slurry mix required a precise blend of additives to pass a key durability yield test. With too much additive, they eroded margins through excessive raw material costs and risked making the product unusable. With too little, they produced a weak end-product and scrapped days of output until the mix issue could be identified and resolved. Rather than blindly diving into an M4.0 deployment, the operations team tightly focused on the yield question independent of any external frameworks. Here is how they navigated this process (see Figure 5):

    • Step 1: Agree that the status quo is not sustainable. For this process manufacturer, a combination of scrap waste and lost opportunities pressed this business challenge to the forefront.
    • Step 2: Hypothesize on what world class looks like. When starting with this question, people tend to think in terms of simplicity. The machines just work, the orders arrive as forecasted, and the stress levels drop. This process manufacturer needed a way to predict first-pass yield upstream from the point of scrap.
    • Step 3: Think along the ‘Understanding Axis.’ You need to know the leverage points and time fences within your organization and production lines. This process manufacturer knew a time fence existed before the kiln. Identifying product issues past the kiln meant scrap expenses and lost production output.
    • Step 4: These first three steps now feed into a focused M4.0 deployment. Rather than thinking about the technology first and foremost, you are targeting technology to solve a specific business need. You can now look to the previously mentioned technology frameworks through the lens of your business problem:
    • IoT Mobility: Equip production-line sensors before the kiln.
    • Mass Storage: Cloud storage for historical data sets but some onsite storage required for real-time processing.
    • Advanced Analytics: Ability to find real-time correlations that predict scrap.
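The steps above can be sketched as a minimal analytics loop. Everything here is illustrative: the sensor variables, the synthetic data, and the choice of a simple logistic-regression classifier are assumptions for the sketch, not the manufacturer’s actual setup.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical upstream sensor readings captured before the kiln
# (e.g., additive-ratio deviation, slurry viscosity, mix temperature).
n = 2000
X = rng.normal(size=(n, 3))

# Synthetic ground truth: scrap risk rises when the additive ratio
# drifts from its target in either direction (too much or too little).
scrap = (np.abs(X[:, 0]) + 0.3 * rng.normal(size=n) > 1.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, scrap, random_state=0)

# Train on absolute deviations so the model flags drift in either direction,
# identifying at-risk batches inside the pre-kiln time fence where
# intervention is still cheap.
model = LogisticRegression().fit(np.abs(X_train), y_train)
print(f"holdout accuracy: {model.score(np.abs(X_test), y_test):.2f}")
```

The point of the sketch is the ordering: the business question (predict first-pass yield before the kiln) defines what the sensors measure and what the model predicts, rather than the technology dictating the question.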

Ironically, the key success factor in all of this lies in how you define success. Many organizations go through the same motions and end up with similar M4.0 footprints. However, it is only a disciplined focus on business results that will make your efforts a success. The defining question is whether you are adding value or just adding noise.

By taking a data-driven, results-focused approach, this process manufacturer was able to predict yield issues with much greater certainty. Not only did this drive down scrap, but it was also projected to deliver about $1 million in annual recurring savings across raw materials, energy consumption, and labor.

“Successful transformations look beyond the benchmarks. Effective transformations are personal and must reflect your culture.”

› Case Study 2: Incubation Results Lab for Aerospace Manufacturing
An aerospace manufacturer had simultaneously invested in ‘experience’ and ‘technology’ initiatives over the years. The challenge was that these two investment initiatives existed in different camps. Their IT-led data science group was chartering all sorts of IoT and open-source innovations but not connecting effectively with the business needs.

This was a classic case of embarking on an M4.0 journey in search of a business goal: all very impressive technology and timelines, but very squishy in identifying meaningful returns to the business.

One of their manufacturing executives tried a different approach by engaging SAS on a Results Project. The aerospace manufacturer would provide the data and share a challenge; SAS would look for patterns in the data that might provide some root-cause insight into a nagging quality issue. Without fully understanding the context of the data, the SAS team was able to generate three types of findings:

  1. Confirm hunches on what the business suspected but did not have the data to justify addressing.
  2. Introduce new findings that the business had never suspected to be an issue.
  3. Disprove accepted truths of the business that had been misguiding their underlying root cause approach.
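A toy version of this kind of context-free pattern scan might look like the following. The variable names, the synthetic data, and the simple correlation ranking are all invented for illustration; a real engagement would use far richer methods than this.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Hypothetical anonymized process variables shared without full context.
data = {
    "var_a": rng.normal(size=n),  # a suspected driver (the hunch)
    "var_b": rng.normal(size=n),  # never suspected by the business
    "var_c": rng.normal(size=n),  # an "accepted truth" driver
}

# Synthetic quality signal: actually driven by var_a and var_b, not var_c.
defect = data["var_a"] + data["var_b"] + 0.5 * rng.normal(size=n)

# Rank variables by absolute correlation with the defect signal.
ranking = sorted(
    ((name, abs(np.corrcoef(col, defect)[0, 1])) for name, col in data.items()),
    key=lambda kv: kv[1],
    reverse=True,
)
for name, r in ranking:
    print(f"{name}: |r| = {r:.2f}")
```

On this synthetic data the scan surfaces all three finding types: it confirms the hunch (var_a), introduces a never-suspected driver (var_b), and undercuts the accepted truth (var_c), which lands at the bottom of the ranking.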

In the end, the Results Project generated more than 25 recommendations targeted at improving yield. After implementing the recommendations, the yield jumped by more than 70% in only the first six months, resulting in over $4 million in savings.

What was different? For a sliver of the M4.0 investment and timeline, the business found over 10x the returns. The difference came with a ruthless focus on results for the business. Where is the wisdom in this haystack of technology? Your only hope of finding it is to target it from the outset.

Based on the successful insights and ROI generated from this first round, the sponsoring executive took the savings and invested them in 10 more Results Projects. This effectively became an incubation lab for M4.0 innovation. The key, again, was how he defined success.

Manufacturing Wisdom 

Both case studies show the power of seeking a higher level of understanding from the outset. Rather than focusing on infrastructure rollouts, the focus was on new business learnings. From this raised level of understanding comes a new vantage point from which to influence subsequent initiatives. Envision a turbulent stream where, through M4.0 discovery initiatives, you can lower the water level to reveal stepping stones on a new path to wisdom.

There is a tendency to revere new technologies for their own sake, like a magical cure-all for your manufacturing challenges. My reason for embracing the concept of wisdom is that it is innately human, which means seeking wisdom can be a natural tendency for any role, even in organizations otherwise focused on using technology for technology’s sake. When deploying new automation and algorithms, we must distinguish the ability to codify meaning from the ability to understand implications. Whatever findings technology surfaces will be a function of how it was programmed and the data that trained it. This process may reveal new insights and spell things out in a manner you could not have imagined, but only humans can correlate meaning across the connections and rationalize the implications.

Without this focus on the human element in decision making, M4.0 initiatives may simply add to our current dashboard complexity. Just as tweeners struggle to make sense of their changing world, growth spurts and data influxes need context if they are to yield cognitive insight and organizational intelligence.

Here is a quote I remind myself of with any sort of technology initiative: “To attain knowledge, add things every day. To attain wisdom, remove things every day.”
— Lao Tzu

In your quest for M4.0 progress, are you adding to the complexity of your organization? Or is there an intentional aim for winnowing away the information overload and finding that path to organizational wisdom? M