Why IT and OT integration and IoT edge processing are paving the way for AI

Information technology (IT) and operational technology (OT) have been coming together in IoT for several years now. Robin Duke-Woolley, the chief executive of Beecham Research, interviewed Roberto Siagri, the chief executive of Eurotech, to explore the advantages of this convergence and assess the progress made in integrating the two technologies.

Robin Duke-Woolley: What is the main concept behind IT/OT integration, as you see it?

Roberto Siagri: The point of this integration is that, before this new software paradigm, factories and products were not connected with the IT side of the organisation. IT was mainly concerned with administrative and commercial activities, whereas data from factories and products were isolated from IT and not available in real time. In fact, factory and product data were collected by people, not by machines. Nowadays, with this new methodology we call the Internet of Things, we can digitally connect IT and OT. This is the result of the evolution from embedded computers to edge computers. Embedded computers were not necessarily connected to other computers; they did their job in isolation, and that job was automation. Machine automation was mainly about taking care of the real-time needs of the machine, not about integration with the IT infrastructure.

Now, if you think about it, before we had smartphones, phones were not data processing machines, just voice processing machines. With the smartphone, people figured out how to transfer applications and how to make use of a wide range of data. So why not do the same with machines? If you do this, you are doing IoT, a new paradigm that emerges as an evolution of embedded computers and machine-to-machine (M2M) communication. M2M was the start of it – connecting machines to servers, not necessarily through the internet. It was very specific: certain data collected for a dedicated application. With this new IT/OT integration, data become the common denominator of all the devices – machines, products, assets and others – that you have in your organisation. In the end, the main purpose is to create a common data lake that contains all the digital twins of your organisation.

RD-W: What do you then do with that data lake?

RS: First you create a data lake without any fixed final purpose, simply to collect the data coming from the different parts and devices of the organisation. Then you can create a federation of loosely coupled apps around the data lake, and these apps are no longer hierarchically interconnected as they were in the past. Now that we have entered the app economy, you can start thinking with agile methodologies: you design just what you need, when you need it. Following this approach, data producers are decoupled from data consumers. This means that whatever produces the data does not have to know what or who will use them, or how they will be used: you just need to collect as much data as you can.
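The decoupling Siagri describes is essentially a publish/subscribe pattern: machines publish data to a topic without knowing who, if anyone, will consume it, and new apps subscribe later without touching the producer. A minimal in-memory sketch in Python (the broker, topic names and payloads are illustrative assumptions, not Eurotech's actual API):

```python
from collections import defaultdict


class Broker:
    """Minimal in-memory publish/subscribe broker: producers and
    consumers know only a topic name, never each other."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, payload):
        # The producer does not know (or care) who is listening.
        for callback in self.subscribers[topic]:
            callback(payload)


broker = Broker()
readings = []

# A consumer app added later, without changing the producer.
broker.subscribe("factory/press-1/temperature", readings.append)

# The machine simply publishes its raw data.
broker.publish("factory/press-1/temperature", {"celsius": 72.4})
print(readings)  # [{'celsius': 72.4}]
```

In a real deployment this role is played by a messaging layer such as an MQTT broker, but the design point is the same: producers and consumers are coupled only through topics and data, not through each other.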

RD-W: Does that not mean you end up collecting a lot more information than you’re ever going to use?

RS: In the beginning you don’t need to know. That’s the beauty of the thing. In the past, you just collected the data that you needed, thus limiting your future capabilities, as what you know now is not what you will know in the future. The idea here is that because you have the things that are producing data, and because the cost of collecting and storing data is so low, why not store everything? The reality is that the more data you collect, the more value you will have in the future. How can you think about artificial intelligence (AI) if you don’t have a large amount of data – especially historical data? If you don’t have the examples, you can’t train your AI software.

RD-W: Do you think organisations accept that argument, that they should collect more data than they really need because in the future they’ll need it? Often, some sort of return on investment (ROI) analysis is needed before they do anything.

RS: You must enter this new digital production mindset. If you don’t switch from industrial production to digital production, you will never discover how valuable your data are. Sometimes when you do innovation or research, you don’t know if you are going to have an outcome, but you start anyway. So, if you insist on the old model of ROI, you will only look at the short-term improvements, and for those you only need a little data. If you have a long-term strategic view, then data – all data – become very valuable. For example, if you think your product could become a service, or become more connected with services, then without data you cannot work that out.

RD-W: Looking now at edge versus cloud, cloud has been at the centre of IoT, but now we have much more focus on the edge. What do you see as the right balance between processing at the edge versus the cloud?

RS: If it is real-time, such as in a factory, there is no option but to process data at the edge. If you have a very high throughput of data and you don’t want to transfer it all to the cloud, you need to pre-process it at the edge first. There are also growing security concerns, and security needs more processing power at the edge. In addition, in a factory environment there is often a need to keep the data in the factory rather than moving it all to the cloud – so some stay local, some go to the cloud. Finally, you must be resilient against wide area network (WAN) connectivity or data-centre failures, and for that you need more edge computing. What the split should be depends on the needs of the individual company. In the end, these are the reasons to design a distributed data centre, with gateways increasingly forming the local infrastructure.
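The pre-processing step Siagri mentions often amounts to summarising a high-throughput stream at the edge so that only compact aggregates cross the WAN link, while the raw data stays local. A simple sketch of windowed aggregation (the window size and min/mean/max summary are illustrative assumptions, not a specific Eurotech feature):

```python
def edge_summarise(samples, window=10):
    """Pre-process raw sensor samples at the edge: forward only
    per-window min/mean/max summaries to the cloud, keeping the
    full-resolution stream local."""
    summaries = []
    for i in range(0, len(samples), window):
        chunk = samples[i:i + window]
        summaries.append({
            "min": min(chunk),
            "max": max(chunk),
            "mean": sum(chunk) / len(chunk),
        })
    return summaries


# 100 raw temperature samples become 10 compact summaries,
# cutting the data sent over the WAN by an order of magnitude.
raw = [20 + (i % 10) * 0.1 for i in range(100)]
print(len(edge_summarise(raw)))  # 10
```

The same pattern scales up to filtering, anomaly detection or model inference at the gateway; whatever the local step is, the goal is to reduce and pre-digest data before it leaves the site.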

RD-W: Would you say the intelligent gateway at the edge is becoming more important as the edge computer?

RS: Yes, more and more edge computers are taking on all the functions and this will increasingly be the gateway. To me, the difference between edge computers and embedded computers is that the latter do not need to be involved with connectivity and cybersecurity. They are mainly used in isolation, with no connection. On the other hand, edge computers must have all the features of cybersecurity already built in, because they are always connected and must have the capability to manage connectivity. This is the main difference between the two.

https://www.eurotech.com/en

SPONSORED INTERVIEW
