Manufacturers understand the importance of leveraging technologies to stay ahead in the market. Whether referred to as Industry 4.0 (I4.0), smart manufacturing, or the 4th industrial revolution, data is the biggest asset, says Francisco Almada Lobo, chief executive officer and co-founder at Critical Manufacturing.
With only a small percentage of companies succeeding in digital transformation, however, how can manufacturers take advantage of the massive amounts of data they are creating?
Like oil, data has a great deal of value, but only if it is refined to release the information it holds. Building a transformational data platform requires a combination of the Industrial Internet of Things (IIoT), a future-ready manufacturing execution system (MES) and advanced analytical tools.
How smart is your factory?
The ultimate smart factory has action-oriented data and artificial intelligence (AI) controlled production lines. But how do we get there? What solutions are needed? Some predict that the answer lies solely in the IIoT and that MES is no longer required, but this is simply not true. In fact, the MES needs to evolve to include data platforms.
Building a manufacturing data platform
The main elements for a successful Manufacturing Data Platform are a natural extension of the modern MES.
On the edge
A Manufacturing Data Platform combines solutions for processing, storing, and analysing data from huge numbers of resources. Edge solutions run close to the place where data is generated, with some local processing and analysis before sending to a central system. As such, they reduce latency, enable faster responses to changing process conditions and can reduce the costs of central processing and analysis.
One of the most critical functions of data ingestion is the metadata registry, which enables the platform to understand what data is being sent. The metadata registry holds schemas; a schema ID is attached to each message before it is sent on. Later, an application can look up the schema by its ID and know which data the message contains.
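To make the mechanism concrete, here is a minimal sketch of a schema registry, with all names and the registry structure invented for illustration: producers attach only a schema ID to each message, and consumers resolve that ID back to a schema to interpret the payload.

```python
# Toy metadata/schema registry (all names hypothetical): producers send
# a schema ID with each message; consumers resolve the ID to a schema.

REGISTRY = {}  # schema_id -> ordered list of field names

def register_schema(schema_id, fields):
    """Store a schema (here just an ordered field list) under an ID."""
    REGISTRY[schema_id] = fields

def wrap_message(schema_id, values):
    """Attach the schema ID to the outgoing payload."""
    return {"schema_id": schema_id, "payload": values}

def read_message(message):
    """Resolve the schema ID and map payload values to field names."""
    fields = REGISTRY[message["schema_id"]]
    return dict(zip(fields, message["payload"]))

register_schema("temp-v1", ["machine", "celsius", "timestamp"])
msg = wrap_message("temp-v1", ["etcher-07", 42.5, "2024-01-01T00:00:00Z"])
reading = read_message(msg)  # {'machine': 'etcher-07', 'celsius': 42.5, ...}
```

Because only the compact ID travels with each message, the schema itself can evolve in the registry without bloating every payload.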
The data platform needs to deliver data from numerous data sources, including equipment, process data, MES and ERP, to applications such as historians, dashboards, alarms, analytics, and data reporting. Data warehouses and subsequently data lakes were used, but, for data platforms, Apache Kafka was a gamechanger.
It decouples data streams from systems; it is distributed and fault tolerant, delivers incredibly high performance with extremely lightweight consumers, and scales horizontally simply by adding hardware. Kafka acts as a nervous system, managing streams of information from various applications, processing each piece of data and sending it to where it needs to go. It has the dual capability to process data in real time and ‘replay’ data from any given point in time.
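The decoupling and replay ideas can be illustrated with a tiny in-memory model of a topic. This is not the real Kafka API, just a toy append-only log showing how offsets let a consumer tail new records or replay history from any point.

```python
# Minimal in-memory model of a Kafka-style topic (NOT the real Kafka
# API): an append-only log that decouples producers from consumers and
# lets a consumer 'replay' records from any stored offset.

class Topic:
    def __init__(self):
        self.log = []  # append-only list of records

    def produce(self, record):
        self.log.append(record)
        return len(self.log) - 1  # offset of the new record

    def consume(self, from_offset=0):
        # Returns every record from the given offset onwards; calling
        # again with an earlier offset replays history.
        return self.log[from_offset:]

events = Topic()
events.produce({"sensor": "t1", "value": 20.1})
events.produce({"sensor": "t1", "value": 20.4})
latest = events.consume(from_offset=1)    # live tail: newest record only
replayed = events.consume(from_offset=0)  # full replay from the start
```

Producers never address consumers directly; both sides only know the topic, which is the essence of the decoupling described above.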
Data processing includes batch and stream processing. Batch processing processes large groups of transactions in a single run, involving multiple operations and handling heavy data loads. This may be used to run a report or aggregate data on a data warehouse. Stream processing deals with transformations that require extremely fast handling, usually involving less data.
Higher stream processing speeds and configurable, automatic rule-based actions (e.g. ‘if this then that’) reduce latency between an event and subsequent action, thereby adding value. For exceptional processing speed with in-memory processing, the Critical Manufacturing platform uses the powerful and feature-rich Apache Spark to handle batch and stream processing.
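A plain-Python sketch can show the ‘if this then that’ pattern; this is a stand-in for a Spark stream-processing job, not the Critical Manufacturing implementation, and the rule, machine names and thresholds are invented for illustration.

```python
# Plain-Python sketch of an 'if this then that' rule applied to a stream
# of readings (thresholds and names are illustrative only).

def make_rule(condition, action):
    """Return a handler that fires `action` whenever `condition` holds."""
    def handle(event):
        if condition(event):
            return action(event)
        return None
    return handle

overheat = make_rule(
    condition=lambda e: e["celsius"] > 80.0,
    action=lambda e: f"ALARM: {e['machine']} at {e['celsius']}C",
)

stream = [
    {"machine": "press-1", "celsius": 72.0},
    {"machine": "press-1", "celsius": 85.5},
]
alerts = [a for a in (overheat(e) for e in stream) if a]
```

Evaluating the rule as each event arrives, rather than in a later batch job, is what closes the gap between an event and the action it triggers.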
Data enrichment is invaluable for manufacturing. It merges third-party data from an external authoritative source with the existing database of first-party customer data.
Take the example of the temperature profile of a machine. Alone, there is little analysis that can be done. However, if the system understands the processes being carried out, historical temperature profiles, maintenance activities, etc., more can be understood about the readings.
The MES provides data for enrichment and contains all the necessary contextual information. An event received into the data platform has a name, value, timestamp and MES object. It is written into a raw topic, stored in a data lake and sent into stream processing. An MES data enricher then appends contextual data to the message. This new, event-enriched topic is written back into Kafka, where it can be consumed again by stream or batch processing.
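The enrichment step described above can be sketched as a simple merge; the field names and MES lookup table here are hypothetical, standing in for a real MES query.

```python
# Sketch of an MES data enricher (names hypothetical): a raw event
# carries only name, value, timestamp and an MES object reference; the
# enricher appends contextual MES data before the event is written back
# as an enriched topic.

MES_CONTEXT = {  # toy stand-in for an MES context lookup
    "oven-3": {"process": "reflow", "recipe": "R-114", "lot": "LOT-0042"},
}

def enrich(event, context=MES_CONTEXT):
    """Merge the MES context for the event's object into the event."""
    enriched = dict(event)
    enriched.update(context.get(event["mes_object"], {}))
    return enriched

raw = {"name": "temperature", "value": 212.0,
       "timestamp": "2024-01-01T00:00:00Z", "mes_object": "oven-3"}
enriched = enrich(raw)  # now carries process, recipe and lot context
```

A temperature reading tagged with its process, recipe and lot is far more useful to downstream analytics than the bare number.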
Descriptive, diagnostic, predictive and prescriptive analysis help us understand what has happened, why it happened, what will happen and what actions should be taken.
One of the most common uses of predictive analytics is machine maintenance. Data is collected over time from sensors and machine actions and merged with previous maintenance activities. Correlations between variables and results can then be made to determine causes of machine failures.
Predictive analysis then creates a data-driven model to calculate the probability of machine failure or remaining useful life, thereby anticipating maintenance needs or postponing routine maintenance if not required.
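A toy data-driven model in this spirit might squash a score over a few features into a failure probability; the features and coefficients below are invented for illustration, where a real model would be fitted to historical sensor and maintenance data.

```python
# Toy failure-probability model: a logistic function over two features.
# The weights are made up for illustration, not fitted to real data.
import math

def failure_probability(hours_since_maintenance, vibration_mm_s):
    # Linear score with invented weights, squashed to a probability.
    score = 0.002 * hours_since_maintenance + 0.8 * vibration_mm_s - 6.0
    return 1.0 / (1.0 + math.exp(-score))

p_new = failure_probability(hours_since_maintenance=100, vibration_mm_s=1.0)
p_worn = failure_probability(hours_since_maintenance=2000, vibration_mm_s=4.0)
# A maintenance policy might schedule service once the probability
# crosses a threshold, and postpone routine maintenance otherwise.
```

The point is the shape of the output: a probability that rises with wear indicators, which a scheduler can compare against a service threshold.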
Machine learning (ML)
ML is used to analyse large data sets and learn patterns to help make predictions about new data sets. Because it works on Big Data, ML requires a data platform that scales accordingly; it is one of the most promising techniques for uncovering hidden insights and value in data.
ML comprises several levels of analysis. ‘Detection’ finds anomalies to identify faulty products, predict machine maintenance needs and flag possible safety issues. ‘Classification’ organises information into categories and looks for correlations between them. ‘Probability’ functions test how changes to specific variables will affect outcomes, and ‘Optimisation’ can then be achieved by calculating the probability of various outcomes and adjusting parameters accordingly.
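The ‘Detection’ level can be illustrated with a simple statistical anomaly detector; the 3-sigma threshold is a common default used here purely for illustration, not a recommendation from the source.

```python
# Simple z-score anomaly detector illustrating the 'Detection' level:
# readings far from the historical mean are flagged as anomalous.
import statistics

def find_anomalies(history, new_readings, threshold=3.0):
    """Flag readings more than `threshold` standard deviations from
    the historical mean."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return [x for x in new_readings if abs(x - mean) > threshold * stdev]

history = [20.0, 20.2, 19.9, 20.1, 20.0, 19.8, 20.3, 20.1]
anomalies = find_anomalies(history, [20.2, 27.5, 19.9])
```

In practice, learned models replace fixed statistical thresholds, but the contract is the same: unusual readings surface automatically instead of being hunted for by hand.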
Given enough relevant data, learning algorithms can approximate almost any function. Correlation, however, does not imply causation. Initial hypotheses need to be tested for significance, and the most statistically relevant investigated further.
Serving and output applications layers
The final block of the Manufacturing Data Platform makes outputs available to applications such as third-party solutions, alarms, or visualisation tools, through a serving layer into the applications layer.
It is the combination of MES, IIoT, equipment integration and data platform elements that distinguishes a Manufacturing Data Platform from a generic one. The use of a data platform designed specifically for the manufacturing environment is a massive accelerator in providing insights into manufacturing processes, continuous improvement, and competitive advantage.
It is the combination of MES and Manufacturing Data Platform that will enable manufacturers to seize the huge advantages I4.0 has to offer.
The author is Francisco Almada Lobo, chief executive officer and co-founder, Critical Manufacturing.
About the author
Francisco Almada Lobo is recognised as a top strategic thought leader and evangelist on digital transformation, specifically Industry 4.0, manufacturing operations and the factories of the future. He holds an MBA and an Electrical Engineering Degree from the University of Porto. He started his career in a CIM R&D Institute and joined Siemens Semiconductor in 1997.
Throughout his tenures at Siemens, Infineon and Qimonda, he specialised in optimising highly complex, discrete manufacturing operations. In 2004, he led the migration of an MES system in a running high-volume facility. Francisco Almada Lobo holds various positions within the smart manufacturing and venture capital industries, including being a Member of the 200M Fund’s Investment Committee, executive committee member of SEMI Smart Manufacturing Technology, member of the Forbes Technology Council and advisor to many Industry 4.0 startups. Francisco Almada Lobo co-founded Critical Manufacturing in 2009 and has been CEO since 2010.