Everything you need to know about Generative AI in Manufacturing

The next leap in bottom-line efficiency will be data-driven

Bottom-line efficiency is the gold standard for measuring manufacturing productivity. Simply put, it identifies the percentage of manufacturing time that is truly productive. A score of 100% means you are manufacturing only good units, as fast as possible, with no stop time. So, where will the next major leap come from? Data, of course.

Bottom-line efficiency is, always was, and always will be, at the heart of manufacturing

As the manufacturing world shifts toward high-mix, high-volume manufacturing, a unit's journey from raw material to a viable product grows longer, with more and more steps. Traditional optimization methods focus on each step, making it as good as possible and dedicating ever-growing effort to increasingly marginal improvements. Low yield drives up product prices and cuts manufacturing gross margins, leaving the manufacturer exposed to market volatility. At the same time, stiff competition drives the race for excellence forward, forcing all manufacturers to actively seek better and faster solutions and to double down on improving each step.

Traditional approaches try to improve each step

Manufacturers invest massive amounts of money buying the best machinery and employing the best engineers. While there is sound logic underlying this approach, the results are, by definition, marginal. A multi-step manufacturing line is typically composed of components and machines from various vendors, integrated at different times and coming from different areas of expertise. It's only natural to let domain experts handle their respective areas, and to focus energy on what is perceived to be the point of highest impact.

But how effective is it really to improve each step?

Let's do some math. Imagine a 5-step process, each step with a 95% pass rate (or stage yield).

The overall yield is 0.95⁵ ≈ 77%.

The Pareto principle teaches that improving an already good solution is exponentially harder, but even so, let's make the effort. First, we improve step #1's pass rate to 98%. The overall yield goes to 80%, still well below the desired overall yield. Improving step #2's pass rate to 98% as well takes the overall yield to 82%, and so on. Immense effort leads to small improvements.

Even if all the steps reach 98%, the overall yield would be 90%.
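
For readers who want to check the numbers, here is a minimal sketch of that calculation in plain Python (the pass rates are just the illustrative ones quoted above):

```python
# Overall line yield is the product of the per-step pass rates, so
# improving one step at a time moves the needle only slightly.
from math import prod

def overall_yield(step_yields):
    """Multiply the per-step pass rates to get the overall line yield."""
    return prod(step_yields)

print(f"Baseline (5 x 95%): {overall_yield([0.95] * 5):.0%}")              # ~77%
print(f"Step #1 at 98%:     {overall_yield([0.98] + [0.95] * 4):.0%}")     # ~80%
print(f"Steps #1-2 at 98%:  {overall_yield([0.98] * 2 + [0.95] * 3):.0%}") # ~82%
print(f"All steps at 98%:   {overall_yield([0.98] * 5):.0%}")              # ~90%
```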

Let that sink in.

A manufacturer can continue to improve each step, but the glass ceiling cannot be broken.

[Image: bottom-line efficiency]

Focusing on the step eventually tops out, but the bigger opportunity lies in focusing on the unit

There are far more units than there are machines, and big numbers allow for big improvements built on top of data. Each unit, good or otherwise, can tell the manufacturer a lot about what works, and what doesn’t — it provides a valuable, rich source of information directly linked to bottom-line efficiency. By virtue of sheer numbers, the potential carried by manufactured units is the stepping stone for the next leap.

Data generated by the unit can explain a lot about its operation and its chance to be a good unit 

Makes sense, right? Sometimes it's hard or even impossible to directly measure the target parameter, and then indirect measurements (assembly data, electrical measurements, process data, etc.) can do a lot to provide insights. But when possible, direct measurements always outperform indirect ones. So, if the goal is to produce as many good units as possible, the information produced by the units themselves is the right direct measurement.

Adding data sources paves the way for leaps in performance

Rich data is a major driver of performance. When several sources of independent data are combined, the gain is inevitable, or as Aristotle put it:

“The whole is greater than the sum of its parts”

This is particularly true for data. It is a well-known statistical fact that independent sources of data (unit data and machinery data, for example) average out the noise while enhancing the signal, improving the overall signal-to-noise ratio (SNR). This SNR is the measure by which bottom-line efficiency can be improved: any increase in knowledge is translatable into efficiency and dollars.
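
As a rough illustration of that claim, the sketch below uses synthetic NumPy data (stand-ins for real unit and machinery measurements, not any particular line's data) to show that averaging two independent noisy views of the same underlying signal raises the SNR of the combined estimate:

```python
# A rough illustration: averaging independent noisy views of the same signal
# reduces the noise variance while keeping the signal, raising the SNR.
# The "unit" and "machine" arrays below are synthetic stand-ins for real data.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
signal = np.sin(np.linspace(0, 20 * np.pi, n))         # shared underlying signal

unit_view = signal + rng.normal(scale=0.5, size=n)     # unit data: signal + independent noise
machine_view = signal + rng.normal(scale=0.5, size=n)  # machinery data: signal + independent noise
combined = (unit_view + machine_view) / 2              # simple fusion by averaging

def snr_db(observed, true_signal):
    """SNR in dB: signal power over the power of the residual noise."""
    noise = observed - true_signal
    return 10 * np.log10(np.mean(true_signal**2) / np.mean(noise**2))

print(f"unit data alone:    {snr_db(unit_view, signal):.1f} dB")
print(f"machine data alone: {snr_db(machine_view, signal):.1f} dB")
print(f"combined:           {snr_db(combined, signal):.1f} dB")  # ~3 dB higher
```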

[Image: bottom-line efficiency]

Manufacturers are sitting on a massive oil well

Let's continue with the example above: if a single unit goes through 5 steps, each adding 10 to 1,000 data points, and there are 100K units per year, then up to 500M data points are added per year (5 steps × 1,000 points × 100K units at the high end). A dataset of 500M points (not to mention a continuously growing one) is nothing short of an oil well for data-driven models and can provide a unique path to break through the traditional ceiling.
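
The same back-of-the-envelope arithmetic in code, treating the step count, points per step, and yearly unit count above as illustrative assumptions:

```python
# Back-of-the-envelope data volume using the illustrative numbers above.
steps = 5
points_per_step = 1_000      # high end of the 10-1,000 range
units_per_year = 100_000

points_per_year = steps * points_per_step * units_per_year
print(f"{points_per_year:,} data points per year")   # 500,000,000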

The data is already there — it’s a shame not to use it

If data is the new oil, manufacturers have already struck the motherlode. A critical mass of manufacturing processes has been digitized, generating a paperless trail of collected data. On top of that, many manufacturers realized years ago the importance of data and have already put in place data-collection capabilities that run from the production floor all the way to a data lake. Automated processes generate huge amounts of data; not using it is a waste.

Data-driven techniques provide a holistic view of the product

As mentioned, the sheer amount of data is enormous, so the task of analyzing it for insights is daunting, bordering on impossible. Naturally, when faced with a large-scale task, we divide and conquer: we break it down into smaller, more manageable problems and solve them one by one. But this approach inevitably hits a glass ceiling (or settles into a local minimum, for the data-science nerds) and leaves a lot of money on the table. Luckily, data science is built for big-data problems. Data-driven models are specifically designed to ingest and process massive amounts of data in a short amount of time; in fact, their core competency is to holistically solve the task at hand. These models take in all the available data and make predictions based on all of it, giving a holistic view of the product and the means by which to improve bottom-line efficiency.
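
To make that concrete, here is a minimal sketch (scikit-learn on synthetic data; the layout of one row of per-step measurements per unit is an assumption for illustration, not a description of any specific line or product) of a single model that ingests measurements from every step at once and predicts whether a unit will end up good:

```python
# A minimal sketch: one row of features per unit, concatenating measurements
# from every manufacturing step, and a single model predicting pass/fail.
# All data here is synthetic; real features would come from the data lake.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_units, n_steps, feats_per_step = 20_000, 5, 10

# Synthetic stand-in: each unit gets n_steps * feats_per_step measurements.
X = rng.normal(size=(n_units, n_steps * feats_per_step))
# Synthetic "good unit" label, loosely driven by features from several steps.
y = (X[:, 0] + 0.5 * X[:, 12] - 0.8 * X[:, 37]
     + rng.normal(scale=0.5, size=n_units)) > 0

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

model = GradientBoostingClassifier().fit(X_train, y_train)
print(f"holdout accuracy: {model.score(X_test, y_test):.2%}")

# The model sees every step's data at once, so its predictions (and feature
# importances) reflect the unit's whole journey rather than a single step.
```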

In conclusion

Yield is king. The end goal is to manufacture only good units, so it stands to reason to focus on what the units themselves bring to the table. The units prove to be a rich source of valuable information that can be leveraged holistically to improve bottom-line efficiency.