Why edge computing is so crucial for IIoT

Michael Schuldenfrei of OptimalPlus

The rise of the Printed Circuit Board (PCB) in the 1950s changed the world of automation. Before the PCB, electronic circuits were assembled exclusively by hand, a laborious process that greatly limited global production.

Today, says Michael Schuldenfrei, corporate technology fellow at OptimalPlus, industry is experiencing yet another revolutionary leap with the introduction of instrumentation into the manufacturing process and the use of edge computing.

Instrumentation of the manufacturing process involves numerous sensors and microcontrollers, with the microcontrollers subtly altering manufacturing conditions in response to what the sensors detect. The sensors produce far more data than the microcontrollers can analyse on their own, so the raw readings must be processed elsewhere before decisions come back to the line.

Both the sensors and microcontrollers used in manufacturing instrumentation are basically small networked computers. The sensors send their data to a central location where the data is then analysed. These small, autonomous computers are not monitored by humans in real time and are part of the Internet of Things (IoT). More specifically, in a manufacturing context, they are Industrial IoT (IIoT) devices.
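To make that concrete, here is a minimal sketch, in Python, of what one such networked sensor node might do: take a reading and post it to a central collector. The endpoint URL, field names, and reporting interval are all invented for illustration; a real node would use whatever protocol and schema its platform dictates.

```python
# A minimal sketch of an IIoT sensor node reporting to a central
# collector. The endpoint and payload fields are hypothetical.
import json
import time
import urllib.request

COLLECTOR_URL = "http://collector.factory.local/readings"  # hypothetical

def read_temperature_c() -> float:
    """Stand-in for a real sensor driver."""
    return 21.7  # a real node would query the sensor hardware here

while True:
    payload = json.dumps({
        "sensor_id": "line-3/oven-1/temp-a",  # hypothetical naming scheme
        "celsius": read_temperature_c(),
        "timestamp": time.time(),
    }).encode("utf-8")
    req = urllib.request.Request(
        COLLECTOR_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=2)  # send and forget; no retry logic
    time.sleep(1)  # report once per second
```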

IIoT use case for manufacturing instrumentation

IIoT devices are used in any number of contexts to do jobs that would be difficult — if not impossible — for humans to do reliably and/or accurately every time. Consider, for example, weld inspection. Welding is an integral part of many electronics production lines and critical to the functionality and durability of the final product.

Unfortunately, manufacturers are being asked to perform welds on ever-smaller components, under ever-tighter constraints. To protect those components, welds must be performed at the lowest possible heat and with the smallest possible electrical current.

IIoT devices that might help refine this process include heat, voltage, and pressure sensors to help determine the minimum amperage necessary to perform a weld in the current environmental conditions. IIoT cameras may also feed Machine Learning-based visual weld inspection systems to verify that welds are satisfactory, even when they are far too small for the human eye to see; and this is just for starters.
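As a toy illustration of the kind of decision those sensors feed, here is a sketch of a function that picks a minimum weld amperage from environmental readings. The formula and every coefficient in it are invented for the example; a real system would derive them from calibrated process data.

```python
# Illustrative only: a toy rule for picking the lowest safe weld current
# from environmental readings. All coefficients are invented.
def minimum_weld_amperage(ambient_c: float, contact_pressure_kpa: float,
                          supply_voltage: float) -> float:
    base = 40.0                                    # hypothetical baseline
    base += max(0.0, 25.0 - ambient_c) * 0.3       # colder parts need more current
    base -= (contact_pressure_kpa - 100.0) * 0.05  # better contact needs less
    base *= 230.0 / supply_voltage                 # compensate for supply sag
    return max(base, 30.0)                         # hypothetical lower floor

print(minimum_weld_amperage(ambient_c=18.0, contact_pressure_kpa=120.0,
                            supply_voltage=225.0))  # ~42 A in this toy model
```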

Manufacturing instrumentation can make any manufacturing, not just electronics manufacturing, more accurate, with fewer production errors and fewer people involved. Unfortunately, this instrumentation isn't easy, especially given the complexities of the modern manufacturing supply chain.

Making manufacturing instrumentation function

Information Technology (IT) teams have been making use of instrumentation for decades. It doesn’t cost as much to build sensors into software as it does to build them into hardware. As a result, operating systems, applications, and IT equipment of all kinds are absolutely littered with sensors. Because of this, IT teams have been struggling with the amount of data they produce since before the modern microcomputer existed.

So much data, so little time

In the real world, any instrumented infrastructure produces way more information than a single human can possibly process. Even large teams of humans cannot be expected to comb through all the data emitted by even a modest IT infrastructure. Entire disciplines exist within the IT field dedicated to making the data emitted by IT instrumentation understandable. Technologies and techniques range from simple filters to sophisticated Artificial Intelligence (AI) and Machine Learning (ML) techniques.
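At the simple end of that spectrum, a filter can be a few lines of code. The sketch below, with invented metric names and thresholds, forwards only readings that cross an alert limit and discards everything else.

```python
# A minimal example of the "simple filter" end of the spectrum: forward
# only readings that cross an alert threshold, discarding the rest.
# Metric names and thresholds are made up for illustration.
def filter_events(readings, thresholds):
    """Yield only readings worth a human's (or an AI's) attention."""
    for r in readings:
        limit = thresholds.get(r["metric"])
        if limit is not None and r["value"] > limit:
            yield r

readings = [
    {"metric": "cpu_percent", "value": 97.0},
    {"metric": "cpu_percent", "value": 12.0},
    {"metric": "queue_depth", "value": 3.0},
]
thresholds = {"cpu_percent": 90.0}
print(list(filter_events(readings, thresholds)))
# -> only the 97.0% CPU reading survives
```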

Until recently, this was good enough for most IT teams. Information would be collected and sent to a central location, numbers would be crunched, and only the important data was forwarded to systems administrators. If this took a few seconds or minutes, that was okay; a brief IT outage was generally acceptable.

But as organisations around the world became more and more dependent upon their IT, the acceptable time to act on instrumentation shrank significantly. For many organisations, the acceptable reaction time is now far below anything a human could achieve. Modern IT systems in the most advanced organisations therefore use powerful AI and ML suites to make the IT infrastructure react to changes reported by sensor data before human administrators are even aware there's a problem.

Modern manufacturers, as one might imagine, look for manufacturing instrumentation solutions that are capable of also reacting faster than a human. While reading sensors and telling humans a problem has developed is helpful, it’s nowhere near as helpful as responding to sensor data in real time.

IT instrumentation vs. manufacturing instrumentation

The difference between the two is that IT instrumentation is comparatively easy: one collects data about IT infrastructure and applications from devices that are already fully digital. Manufacturing instrumentation is more challenging. IIoT devices used in manufacturing instrumentation collect data about the physical world, which means capturing analogue signals and converting them into digital data, and that's a whole other ball game. Physical sensors need to be calibrated, and over time they wear out. They are also typically deployed in clusters so that quorum sensing is possible.

Quorum sensing uses multiple independent sensors in order to compensate for calibration drift or sensor malfunction. If one sensor in a cluster reports data that is divergent from its partners, it can be ignored and/or flagged for recalibration. This allows manufacturing to continue with known good sensors until the malfunctioning one can be recalibrated or replaced.
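A minimal sketch of that idea: compare each sensor in a cluster against the cluster median and flag anything that diverges beyond a tolerance. The cluster layout, sensor names, and tolerance value are assumptions for the example.

```python
# A sketch of quorum sensing: flag any sensor whose reading diverges
# from the cluster median by more than a tolerance.
from statistics import median

def quorum_check(readings: dict, tolerance: float = 0.5):
    """Split a cluster's readings into trusted and suspect sensors."""
    centre = median(readings.values())
    trusted = {s: v for s, v in readings.items() if abs(v - centre) <= tolerance}
    suspect = {s: v for s, v in readings.items() if abs(v - centre) > tolerance}
    return centre, trusted, suspect

cluster = {"temp-a": 72.1, "temp-b": 72.3, "temp-c": 68.9}  # temp-c has drifted
centre, trusted, suspect = quorum_check(cluster)
print(centre)   # 72.1 -- production continues on the quorum value
print(suspect)  # {'temp-c': 68.9} -- flagged for recalibration or replacement
```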

The complications of analogue sensing, combined with the pressing requirement for real-time responsiveness to sensor data, present real-world challenges for manufacturing instrumentation.

Can’t cloud computing fix everything?

IT teams have had to deal with many different and difficult computational requirements. One example of a solution developed by IT vendors is cloud computing.

Cloud computing & BDCA

Cloud computing allows organisations to access seemingly limitless IT infrastructure with the push of a button. While the reasons behind cloud computing are numerous and complex, perhaps the most important one is that cloud computing allows IT teams to operate IT workloads without having to manage or maintain the underlying IT infrastructure. The cloud provider handles that part for them.

Cloud computing has proven very useful for Bulk Data Computational Analysis (BDCA) workloads. There are many types of BDCA workloads, including AI, ML, Big Data, and more; anything where large quantities of data are collected and subsequently need to be analysed is a BDCA workload. In the past few years, cloud computing has been the destination for the majority of new BDCA projects.

One of the reasons that cloud computing is used for BDCA workloads is the concept of cloud bursting. Cloud workloads—such as the computation workloads used to analyse large datasets—can be spun up only as needed and to whatever scale required. This suits BDCA workloads well because most BDCA workloads only need to generate analyses on a set schedule. End-of-month reports are a popular use case here.

Unfortunately, economies of scale mean that traditional public clouds are highly centralised. This allows public cloud vendors to situate their data centres where costs are lowest and simply build really, really big facilities. While this is useful for batch-job style BDCA workloads that run on schedules, it is less than helpful for workloads that require real-time responsiveness.

In order to solve this, edge computing was developed.

Edge computing

Edge computing can be thought of as cloud computing turned inside out: the cloud provider's infrastructure, running inside the customer's own data centre. It evolved because IT teams had workloads requiring low-latency responsiveness that traditional public cloud computing couldn't provide. Those teams were perfectly capable of building such infrastructure themselves but simply didn't want the burden and hassle of dealing with it.

Meeting new data demands

To meet the needs of these customers, public cloud providers began installing servers in the data centres of the organisations concerned. This allowed the IT teams of those organisations to run workloads on what looked, to them, identical to a cloud region created just for them, but one located on the same Local Area Network (LAN) as the rest of their workloads.

These “edge computing” servers allow IoT sensor data to be processed and acted upon far faster than would be possible if that data had to traverse the internet to a public cloud data centre, be processed, and then have the results travel back across the internet. Edge computing is enabling a number of new technologies, including driverless cars.

Use case: Real-time data for driverless cars

Driverless cars are a great example of a technology where waiting for data just isn't an option. Cloud computing could help driverless cars by collecting sensor information from all cars in a given area, crunching the data, and sending those cars a map of where everyone and everything is located inside a given radius. This would allow the cars to effectively see around corners, making them even safer.

However, even at the speed of light, sending information from a car to the public cloud and back again can take up to a quarter of a second. People can die in a quarter of a second when cars are involved. So moving the processing closer to the cars—say by locating the relevant servers within a few blocks of where cars will be trying to navigate tricky urban environments—can enable technologies that otherwise wouldn’t be possible.
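The arithmetic behind that quarter of a second is easy to check. Assuming light travels at roughly 200,000 km/s in optical fibre (about two-thirds of its speed in a vacuum), pure propagation delay alone adds up quickly:

```python
# Back-of-the-envelope latency, assuming light travels at roughly
# 200,000 km/s in optical fibre (about 2/3 of c).
FIBRE_KM_PER_S = 200_000

def round_trip_ms(distance_km: float) -> float:
    """Pure propagation delay there and back, ignoring routing and queueing."""
    return 2 * distance_km / FIBRE_KM_PER_S * 1000

print(round_trip_ms(1500))  # ~15 ms to a cloud region 1,500 km away
print(round_trip_ms(1))     # ~0.01 ms to an edge server a few blocks away
# Real internet paths add routing, queueing, and processing delays on
# top of propagation, which is how round trips stretch toward 250 ms.
```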

In the same way, manufacturing can make use of edge computing to enable needed instrumentation. As is usually the case, however, manufacturing has its own twists and turns that not only make edge computing more critical to the process but also present various challenges that have to be overcome.

Why use edge computing in manufacturing?

A common pitch for the relevance of edge computing to manufacturing companies revolves around the need for real-time responsiveness. When trying to keep manufacturing defects near zero on a fast-moving production line, it helps to be able to make use of sensor clusters. A sensor cluster can use quorum sensing to detect when an individual sensor is faulty and then recalibrate it. However, recalibration must happen very quickly to avoid disrupting the production line.

If it takes 100 or 250 milliseconds to send sensor data over the internet, then products on the line could be lost, or equipment could be damaged. But if the data can be processed locally, taking approximately five milliseconds, then manufacturers can recalibrate sensors in real time and/or alter manufacturing equipment settings in response to environmental conditions.

Sensor overload

Another reason behind edge computing's usefulness, one that doesn't get discussed quite so readily, is that manufacturing instrumentation can involve unmanageably large numbers of sensors. These can not only overwhelm network capacity but also produce a huge collection of data that isn't required in its entirety. It is therefore useful to sift through the data locally and forward only what actually needs to be sent.

Data volumes are most likely to be overwhelming, or to need filtering, where sensors are used in a quorum to overcome calibration or ageing issues. Here, an individual sensor's readings may be rejected if the other nearby sensors in its quorum disagree with them. A fully instrumented factory may contain millions of individual sensors grouped into only a few tens of thousands of sensor quorums, and the raw readings are potentially quite a lot more than the local internet connection can reasonably be expected to handle.
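One way to picture the reduction: collapse each quorum's raw readings into a single agreed value at the edge, so only one number per quorum ever crosses the internet. The quorum names and the use of a median here are assumptions for the sketch.

```python
# A sketch of edge-side data reduction: collapse every quorum's raw
# readings into one agreed value before anything crosses the internet.
from statistics import median

def summarise(raw: dict) -> dict:
    """Reduce {quorum: {sensor: reading}} to {quorum: agreed_value}."""
    return {quorum: median(readings.values()) for quorum, readings in raw.items()}

raw = {
    "oven-1/temp": {"a": 182.0, "b": 181.8, "c": 179.1},
    "press-2/kpa": {"a": 101.2, "b": 101.3},
}
print(summarise(raw))
# Millions of per-sensor readings shrink to one value per quorum --
# a stream the factory's internet connection can actually carry.
```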

In other edge computing configurations for manufacturing, some sensors are only ever used locally. This could be because they serve real-time responses, or because they are only relevant locally, for example as part of a security solution.

Contract manufacturing

Edge computing is also useful in the increasingly common scenario of contract manufacturing. Contract manufacturers (CMs) have IT solutions independent of the Original Equipment Manufacturers (OEMs) that commission their work. However, many OEMs see benefits in instrumenting their entire supply chain, even the portions that have been contracted out.

In this case, OEMs may extend part of their own network into the CM's network using edge computing. The OEM's IT team might place servers in the CM's facility that connect back to the OEM's private cloud. Combined with IIoT sensors, these edge computing servers allow the CM to meet the OEM's instrumentation and supply chain integration goals without impinging on the CM's own network or requiring radical changes to its design.
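A simplified sketch of what such an edge gateway might do: forward only the instrumentation topics the CM has agreed to share with the OEM, and keep everything else inside the CM's network. The topic names and allow-list are invented for illustration.

```python
# A sketch of an edge gateway in the CM's facility: it forwards only an
# agreed subset of instrumentation data to the OEM's private cloud.
ALLOWED_PREFIXES = ("weld/", "oven/")   # hypothetical data the CM shares

def forward(records, send_to_oem):
    for rec in records:
        if rec["topic"].startswith(ALLOWED_PREFIXES):
            send_to_oem(rec)            # crosses into the OEM's private cloud
        # everything else stays inside the CM's network

records = [
    {"topic": "weld/line-3/amps", "value": 42.0},
    {"topic": "hr/badge-scans", "value": 17},  # stays local
]
forward(records, send_to_oem=print)     # a real gateway would transmit, not print
```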

Edge computing gives the OEM the ability to view their entire supply chain and manufacturing operation using a consistent interface and integrated set of applications, regardless of whether the individual components are being manufactured in the OEM’s facilities or those of a CM. This consistency makes training and supporting CMs easier, as everyone is using the same toolchain.

Summary

Cloud computing, which has been around for more than a decade now, is often marketed as the solution to all IT ills. It’s not. Cloud computing solves a great many problems, but the speed of light means that giant centralised server farms are only ever going to be so useful.

Edge computing serves two main purposes: extracting signal from noise by processing locally the large volumes of data that are not feasible to send across the internet, and handling time-critical processing on site where latency is a concern. Both are useful to manufacturing companies that are increasingly dependent on instrumentation.

Manufacturing can’t wait around for light to make it from A to B and back. There’s too much on the line and no time for errors. Edge computing solves problems clouds can’t, so it’s time to evolve or be left behind.

The author of this blog is Michael Schuldenfrei, corporate technology fellow at OptimalPlus
