IMDEA Networks researchers create an algorithm that maximises IoT sensor inference accuracy using edge computing

We are in a fascinating era in which even low-resource devices, such as Internet of Things (IoT) sensors, can run deep learning algorithms to tackle complex problems such as image classification or natural language processing (the branch of artificial intelligence that gives computers the ability to understand spoken and written language much as humans do).

However, deep learning on IoT sensors alone may not be able to meet quality of service (QoS) requirements such as inference accuracy and latency. With the exponential growth of data collected by billions of IoT devices, the need has arisen to shift to a distributed model in which some of the computing occurs at the edge of the network (edge computing), closer to where the data is created, rather than sending it all to the cloud for processing and storage.

IMDEA Networks researchers Andrea Fresa (PhD student) and Jaya Prakash Champati (research assistant professor) have conducted a study presenting AMR², an algorithm that makes use of edge computing infrastructure (processing, analysing, and storing data closer to where it is generated to enable faster, near real-time analysis and responses) to increase IoT sensor inference accuracy while observing latency constraints, and they have validated its performance both analytically and experimentally. The paper, “An Offloading Algorithm for Maximising Inference Accuracy on Edge Device in an Edge Intelligence System”, was published this week at the MSWiM conference.

To understand what inference is, we must first explain that machine learning works in two main phases. The first is training, when the developer feeds the model a set of curated data so that it can “learn” everything it needs to know about the type of data it is going to analyse. The second is inference: the trained model makes predictions on real data to produce actionable results.
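As a concrete, deliberately toy illustration of the two phases, the Python sketch below trains a model and then runs inference on held-out data; the scikit-learn dataset and model are arbitrary choices for illustration, not anything used in the study.

```python
# A toy illustration of the two phases, using scikit-learn (the dataset and
# model choice are arbitrary examples, not anything from the paper).
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Phase 1 -- training: the model "learns" from curated, labelled data.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Phase 2 -- inference: the trained model predicts on data it has never seen.
print(f"Test accuracy: {model.score(X_test, y_test):.2%}")
```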

In their publication, the researchers conclude that inference accuracy increased by up to 40% when comparing the AMR² algorithm with basic scheduling techniques. They also found that an efficient scheduling algorithm is essential to properly support machine learning algorithms at the network edge.
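To make the scheduling problem concrete, the sketch below shows the kind of per-request choice an edge-offloading scheduler faces: run a fast but less accurate model on the device, or pay extra latency to offload to a more accurate model on the edge server. This confidence-threshold heuristic is only an illustrative stand-in, not the AMR² policy itself, and the model callables, latencies, and threshold are assumptions.

```python
# Minimal sketch of the per-request decision an edge-offloading scheduler
# faces. This confidence-threshold heuristic is NOT the AMR² policy; the
# model callables, latencies, and threshold are illustrative assumptions.

def infer(sample, local_model, remote_model, deadline_s,
          remote_latency_s, confidence_threshold=0.8):
    """Return (label, confidence), choosing where to run the inference.

    local_model:  fast, less accurate model on the IoT device.
    remote_model: slower, more accurate model on the edge server.
    """
    label, confidence = local_model(sample)  # cheap on-device pass first
    if confidence >= confidence_threshold:
        return label, confidence             # good enough; skip the network
    if remote_latency_s <= deadline_s:
        return remote_model(sample)          # pay extra latency for accuracy
    return label, confidence                 # deadline forbids offloading
```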

“The results of our study could be extremely useful for Machine Learning (ML) applications that need fast and accurate inference on end devices. Think about a service like Google Photos, for instance, that categorises image elements. We can guarantee the execution delay using the AMR² algorithm, which can be very fruitful for a developer who can use it in the design to ensure that the delays are not visible to the user,” explains Andrea Fresa.

The main obstacle they encountered in conducting this study was demonstrating the theoretical performance of the AMR² algorithm and validating it on an experimental testbed consisting of a Raspberry Pi and a server connected through a LAN. “To demonstrate the performance limits of AMR², we employed fundamental ideas from linear programming and tools from operations research,” highlights Fresa.
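For a flavour of how linear programming enters such an analysis, the toy program below picks the fraction of requests to offload that maximises expected accuracy under an average-latency budget; the accuracy, latency, and budget figures are invented for illustration and do not come from the paper.

```python
# A toy linear program in the spirit of that analysis (illustrative only:
# the accuracies, latencies, and latency budget below are invented numbers).
from scipy.optimize import linprog

a_local, a_remote = 0.70, 0.95   # inference accuracy: on-device vs edge server
t_local, t_remote = 0.02, 0.15   # per-request latency in seconds
budget = 0.05                    # average latency the application can tolerate

# Let x be the fraction of requests offloaded to the edge server.
# Maximise (1-x)*a_local + x*a_remote, i.e. minimise -(a_remote - a_local)*x,
# subject to (1-x)*t_local + x*t_remote <= budget and 0 <= x <= 1.
res = linprog(c=[-(a_remote - a_local)],
              A_ub=[[t_remote - t_local]],
              b_ub=[budget - t_local],
              bounds=[(0.0, 1.0)])
x = res.x[0]
print(f"Offload {x:.0%} of requests; "
      f"expected accuracy {(1 - x) * a_local + x * a_remote:.2%}")
```

With these made-up numbers the optimiser offloads roughly 23% of requests, the most the latency budget allows.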

However, with this work, IMDEA Networks researchers have laid the foundations for future research that will help make it possible to run machine learning (ML) applications at the edge of the network quickly and accurately.

