The Secret Sauce of Industry 4.0


I'm often asked how best to implement Industry 4.0, when really it's more a combination and application of technologies, such as cloud computing, the industrial internet of things, networks and cyber-physical systems.

Welcome to episode seven of a fourteen-part series by John Broadbent from Realise Potential.

However, there is what I call The Secret Sauce, without which your smart factory aspirations will most definitely not achieve the returns you want. 

Let's assume you've already prepared your environment to be smart-factory ready (you can check out Episode 5, which covers this), and that you have automation equipment from which you can extract information in real time. This will give you one view of information, typically production-related. 

However, when you then wish to contextualise it with further information from, say, an ERP system, you need to combine these two worlds. This is also known as 'top floor to shop floor'. 

As an example, suppose production orders were planned in an ERP system and then exported to a plant-floor MES for execution. You could mine the MES for real-time production information to see what's happening in the factory. But what if you wanted to compare and combine some quality information from a separate offline Laboratory Information Management System (also known as a LIMS), pull in additional ERP information, or capture power usage against a specific production run? 

Typically, even if you have these systems in place, they don't natively talk to each other and are clearly information silos. And it doesn't matter if they're on-premise, in the cloud or a combination of both. 

What we need here is some way to match up these different data sources by connecting to them, extracting, storing, perhaps calculating and filtering, and displaying the results in a meaningful and contextualised way. 
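To make that concrete, here is a minimal sketch in Python of that connect-extract-combine-display pattern. All the records, field names and figures below are invented for illustration; they don't reflect any particular vendor's schema:

```python
# Hypothetical extracts from two silos, joined on a shared order number.
# Every field name and value here is invented for illustration.

mes_records = [  # real-time production data, as the MES might export it
    {"order": "PO-1001", "units_made": 940, "line": "Filler 2"},
    {"order": "PO-1002", "units_made": 500, "line": "Filler 1"},
]

erp_orders = {  # planning context, as the ERP might export it
    "PO-1001": {"product": "Cola 600ml", "units_planned": 1000},
    "PO-1002": {"product": "Lemonade 1.25L", "units_planned": 500},
}

def contextualise(mes_rows, erp_index):
    """Attach ERP planning context to each MES production record
    and derive a simple plan-attainment percentage."""
    combined = []
    for row in mes_rows:
        context = erp_index.get(row["order"])
        attainment = (round(100 * row["units_made"] / context["units_planned"], 1)
                      if context else None)
        combined.append({**row, **(context or {}), "attainment_pct": attainment})
    return combined

for rec in contextualise(mes_records, erp_orders):
    print(rec["order"], rec["product"], f'{rec["attainment_pct"]}%')
```

The join key (an order number here) is the crucial piece: without some shared identifier across the silos, no amount of extraction will let you contextualise one system's data with another's.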

Before we go any further, I want you to consider this example: you subscribe to what I call a 'point solution' to fill a gap in your visibility of plant information, let's say power usage, and the system is cloud-based. The vendor installs all the necessary switchboard hardware and connects it to their cloud solution. And Bob's your uncle! 

For a monthly fee, you can now see power trends in a flashy web-based reporting system. 

Ooh-ah! 

Then you decide to monitor vibration, so you subscribe to another cloud-based solution, and the same thing happens. 

Then you choose a cloud-based ERP, an MES and an on-premise SCADA system. 

Now you have multiple on-premise and cloud-based silos. I call the cloud-based silos "Islands in the sky", by the way. 

While you've achieved the result of extracting information and getting access to it, what happens if you want to understand the causal relationship between how much power a production order consumed and what it should have consumed against a standard? 

To do this, you need information from the cloud-based power usage system, the ERP for the standards and the MES for the start and stop times of the specific operations that were performed. In other words, you need to INTEGRATE disparate systems that natively don't talk to each other. 
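As a sketch of that three-way combination, here is what the calculation might look like once each system can export its records. The order numbers, readings, standards and time values are all invented, and timestamps are simplified to hours since shift start:

```python
# Three silos combined by order number and time window.
# All values below are invented for illustration.

erp_standards = {"PO-2001": 120.0}   # standard kWh per order, from the ERP

mes_operations = {                   # operation start/stop (hours from shift
    "PO-2001": (1.0, 5.0),           # start), from the MES
}

power_readings = [                   # (hour, kWh in that hour), from the
    (0.5, 20.0), (1.5, 35.0),        # cloud-based power-monitoring system
    (2.5, 40.0), (3.5, 38.0),
    (4.5, 30.0), (5.5, 22.0),
]

def power_variance(order):
    """Sum the power readings that fall inside the order's MES time
    window, then compare the total against the ERP standard."""
    start, stop = mes_operations[order]
    actual = sum(kwh for hour, kwh in power_readings if start <= hour < stop)
    standard = erp_standards[order]
    return actual, standard, actual - standard

actual, standard, variance = power_variance("PO-2001")
print(f"actual {actual} kWh vs standard {standard} kWh (variance {variance:+.1f})")
```

Notice that no single system could answer the question alone: the MES supplies the time window, the power system the readings, and the ERP the yardstick.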

So, in my experience, INTEGRATION is The Secret Sauce, whether the silos are on-premise, in the cloud or a combination of both. 

Beware, however, if each system has to talk to every other system, you'd have so many interfaces to create, test and maintain that it would be a nightmare. 

And if you then had to upgrade or change one system in the future, you'd have to regression test and recommission all the interfaces again. Clearly neither a workable nor a maintainable solution. 

There is a much more elegant way. In a project I was involved with in 2011 for a global beverage business, just as the term Industry 4.0 was forming in Germany, the integration platform we built connected to over 20 disparate systems in a 'hub 'n' spoke' arrangement. 

The integration platform sat at the centre as the hub, with a single connector, or 'spoke', to each system. Each spoke normalised the collected data into a common language, and a Microsoft SQL database served as the central information repository. 

If a system needed to be upgraded or replaced, we had only one interface to test and commission. It also meant we could bring systems online one at a time during commissioning. 
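The shape of that architecture can be sketched as follows. This is my own illustrative rendering, not the project's actual code: the class and field names are invented, and a plain Python list stands in for the central SQL repository:

```python
# Hub-and-spoke sketch: one connector ("spoke") per source system
# normalises its native data into a common record shape; the hub
# stores everything in one repository (a list standing in for the
# central SQL database).

class Spoke:
    """Base connector; each subclass knows one system's native shape."""
    system = "unknown"
    def fetch(self):
        raise NotImplementedError
    def normalise(self, raw):
        raise NotImplementedError

class MesSpoke(Spoke):
    system = "MES"
    def fetch(self):
        return [{"po": "PO-3001", "qty": 400}]          # MES-native shape
    def normalise(self, raw):
        return {"source": self.system, "order": raw["po"], "value": raw["qty"]}

class PowerSpoke(Spoke):
    system = "PowerCloud"
    def fetch(self):
        return [{"orderRef": "PO-3001", "kWh": 57.5}]   # different native shape
    def normalise(self, raw):
        return {"source": self.system, "order": raw["orderRef"], "value": raw["kWh"]}

class Hub:
    def __init__(self):
        self.repository = []   # stands in for the central SQL database
        self.spokes = []
    def register(self, spoke):
        # Systems can be brought online one spoke at a time.
        self.spokes.append(spoke)
    def collect(self):
        for spoke in self.spokes:
            for raw in spoke.fetch():
                self.repository.append(spoke.normalise(raw))

hub = Hub()
hub.register(MesSpoke())
hub.register(PowerSpoke())
hub.collect()
print(hub.repository)
```

Because each system touches only its own spoke, replacing or upgrading a system means rewriting and testing exactly one connector, which is the point of the architecture.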

As you can see, this is not only a workable solution, it has also proven, a decade on, to be incredibly stable, reliable and scalable, and this style of architecture has been repeated in other smart factory installations with great success. 

Value is truly delivered when real-time integrated information is given to those who need it, when they need it, and where they need it, without the ridiculous manipulation of copious spreadsheets. That's the other MES, by the way: Many Excel Spreadsheets! 

If you do choose cloud-based subscription services, ask the vendor before you sign up whether they offer Application Programming Interfaces (APIs) that you can access to mine the data you need. It is, after all, your data... or is it? 
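If the vendor does offer an API, pulling your own data back out is typically an authenticated HTTP request returning JSON. A hedged sketch follows: the endpoint, token and payload shape are entirely made up, and a real integration would use the vendor's documented URL and fields. Here the request itself is shown only as a comment, and a sample payload stands in for the response:

```python
import json

# Hypothetical request to a vendor's API (endpoint and token invented):
# from urllib.request import Request, urlopen
# req = Request("https://api.example-power-vendor.com/v1/readings?order=PO-4001",
#               headers={"Authorization": "Bearer <your-token>"})
# payload = urlopen(req).read()

# Sample of what such an API might return (shape invented for illustration):
payload = b'''{
  "order": "PO-4001",
  "readings": [{"ts": "2024-05-01T06:00Z", "kWh": 31.2},
               {"ts": "2024-05-01T07:00Z", "kWh": 29.8}]
}'''

data = json.loads(payload)
total = sum(r["kWh"] for r in data["readings"])
print(f'{data["order"]}: {round(total, 1)} kWh')
```

If the vendor can't offer something like this, your data is locked in their island in the sky, which is worth knowing before you sign.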

I hope, from this short example, you can see how integration is the glue that brings together all the hard work of using IIoT, cloud, on-premise cyber-physical and other production-based systems, without which there's little to no manufacturing intelligence. 

In the next video in this series, Episode 8, I'll explain how to leverage what's been discussed here to extract even more value from your smart factory by applying what I call time compression, so keep an eye open for it. 

If you need help or clarification on integrating your disparate data sources, you can reach out to me on LinkedIn or via the Realise Potential website. Until next time, may all your production be visible.

John Broadbent

RP

 

Watch John's original LinkedIn video here.

 


Realise Potential

Realise Potential works independently and collectively with manufacturing companies and individual clients to “CREATE A BETTER TOMORROW”
