The Internet of Things, the Industrial Internet, Smart Manufacturing, Industry 4.0: How many of these have you heard of? How many can you define? And why do they matter to you as a manufacturer?
I’ll argue they are all pretty much the same. They are centered on the digitization of the plant floor – that is to say, more and better usable data. And the more data that is accessible from the shop floor, the bigger the big data story for manufacturers.
You may not know that, decades ago, some industries began recording the details of plant floor activities into process historians – a bit like black-box flight recorders. The problem was that single factories generated tens of thousands of data elements. Analyzing those data was like trying to quench your thirst with a firehose.
The only real use for that old process historian was after-the-fact analysis – working out what happened once something went wrong. But with today’s sophisticated analytics software, we are no longer drowning in that data. We are surfing it.
Big Data for Performance Innovation
When people think of big data, they often think of marketing or polling. But historical data is equally powerful in manufacturing. It can give insight into machine performance in a single plant, the same product produced across plants, or even how an equipment manufacturer’s machinery is performing across different customers.
How could big data help you if you were that equipment manufacturer? Imagine you have machinery installed in factories around the globe. All those machines are sending performance information back to you: electricity consumption, motor heat, number of problem-stops. Suddenly, you have an incredible volume of useful data. You can compare performance and common problems across your entire install base.
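That cross-fleet comparison can be sketched in a few lines. This is a purely illustrative example – the site names, field names, and figures are hypothetical, not drawn from any real system – showing how an equipment maker might flag a customer site whose machine draws far more power than its peers:

```python
# Hypothetical sketch: benchmarking machine telemetry across an install base.
# All records and field names below are illustrative.
from statistics import mean

# Performance reports sent back from customer sites
records = [
    {"site": "Lyon",   "model": "FX-200", "kwh": 41.2, "problem_stops": 3},
    {"site": "Osaka",  "model": "FX-200", "kwh": 39.8, "problem_stops": 1},
    {"site": "Austin", "model": "FX-200", "kwh": 55.6, "problem_stops": 9},
]

# Average electricity consumption across the whole install base
fleet_avg_kwh = mean(r["kwh"] for r in records)

# Flag any site consuming well above the fleet average
outliers = [r["site"] for r in records if r["kwh"] > 1.2 * fleet_avg_kwh]
print(outliers)  # the Austin machine stands out from its peers
```

The same pattern extends to motor heat, problem-stops, or any other measurement the machines report back.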
From new product innovation to machinery fine-tuning to proactively contacting customers as it becomes clear that problems need to be addressed, the potential for big data in manufacturing is limitless.
Near-real-time feedback is transforming manufacturing. And big data makes it happen.
By coupling incoming real-time data with the learning from historical data, it’s possible to predict when things are going wrong so as to anticipate rather than react. For example, if a machine deviates from the normal electrical load pattern during a production cycle, you know there could be an issue. And if you know that, you can address the issue before the machine malfunctions.
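The idea above can be reduced to a minimal sketch: learn what “normal” looks like from historical load readings, then flag incoming readings that deviate from it. This is a simplified illustration using a basic standard-deviation threshold – the readings and the three-sigma cutoff are assumptions for the example, not a prescription:

```python
# Minimal sketch: flag a machine whose electrical load deviates from
# its historical pattern. Readings (kW) are illustrative.
from statistics import mean, stdev

# Historical load readings for one machine during a production cycle
history = [12.1, 11.8, 12.3, 12.0, 11.9, 12.2, 12.1, 12.0]

mu, sigma = mean(history), stdev(history)

def is_anomalous(reading: float, threshold: float = 3.0) -> bool:
    """Return True if a reading sits more than `threshold`
    standard deviations from the historical mean."""
    return abs(reading - mu) > threshold * sigma

print(is_anomalous(12.1))  # within the normal pattern -> False
print(is_anomalous(14.5))  # well outside the pattern -> True
```

Real systems use far richer models, but the principle is the same: the historical data defines the expected pattern, and the live stream is checked against it before anything breaks.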
This real-time element is what makes those buzz words I referred to above worth their buzz. In manufacturing, big data is as much about what is happening now as it is about what happened yesterday or last week.
One Big Thing and Many Little Things
Forward-thinking property and casualty insurance firms started to use predictive analytics in the early 1990s, when someone noticed a considerable correlation between a driver’s credit score and the likelihood of motor vehicle accidents. The new predictive method was so accurate that more firms quickly followed.
But one critical difference between manufacturing and motor vehicle insurance is that the latter is trying to predict the likelihood of one big event – an accident claim. In manufacturing, we are trying to figure out how to make tens or hundreds or thousands of processes, events, and materials work together optimally. It took us longer to develop the analytical tools we need.
The good news is that, with the new tools to capture and analyze data, we soon may be able to aggregate, benchmark, and share. A systems vendor, for example, might ask all of its customers to be part of a project that gathers data and reports anonymized, aggregate results on a range of critical measurements and outputs to all concerned. Globally. Each member of the study would gain priceless information that would not be available by any other means.
Getting back to big events, let’s talk about a very big, scary event: an airplane falling out of the sky. The latest development in preventing that horrific scenario is having in-flight planes send a constant stream of information to analytics software on the ground that can detect any kind of anomaly. Signs of wear, flaws, or weaknesses, it is hoped, can be picked up before catastrophe strikes. Rather than relying solely on time-bound maintenance schedules, repair crews can attend to electronic or mechanical failures before they occur.
For manufacturers, data is a valuable source of insight. That insight is driving transformational change in the field. Decades ago, Paul Meehl, the revered granddaddy of statistical-versus-human-expert prediction, showed that study after study, across industries, found statistical models matching or outperforming human experts in decision making.
There is no question that we still need both humans and statistical models. But increasingly, manufacturers who use only clipboards and a tangle of in-house Excel spreadsheets to optimize performance will find themselves left by the wayside.
Andrew Waycott is the Chief Operating Officer and Chief Technology Officer at Factora. For more information on Andrew and Factora, visit www.factorasolutions.com.