The Internet of Things (IoT) is all about sensors and the data that sensors produce. It’s less about the physical things that have a functional purpose, and it’s more about what becomes possible with the data that is produced by the sensors in those things. This is especially true when you combine the sensor data from one thing with sensor data from other things, such as combining a truck’s location data with weather condition data 100 miles down the road.
On the outside, it is generally assumed that the IoT is about smart machines in manufacturing plants, smart fridges, smart vehicles, etc. But the piece actually driving the IoT is the small sensor embedded within the machine. This sensor quietly gathers and sends the data that informs decisions about the machine’s functionality. For instance, the sensor in a smart machine in a plant may measure multiple variables, including oil viscosity and hours of machine usage. This data improves operational efficiency through predictive maintenance, which lowers repair and maintenance costs and results in fewer machine malfunctions.
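A simple rule-based check makes the idea concrete. The field names and thresholds below are purely illustrative assumptions, not an actual maintenance model:

```python
# Hypothetical predictive-maintenance check. The thresholds and
# reading fields are illustrative only.

def needs_maintenance(reading):
    """Flag a machine for service based on simple sensor thresholds."""
    # Viscosity outside a healthy band suggests oil degradation.
    viscosity_ok = 20.0 <= reading["oil_viscosity_cst"] <= 60.0
    # Long service intervals raise wear risk regardless of oil state.
    overdue = reading["hours_since_service"] > 500
    return (not viscosity_ok) or overdue

reading = {"oil_viscosity_cst": 72.5, "hours_since_service": 310}
needs_maintenance(reading)  # True: viscosity is above the healthy band
```

A real system would replace these fixed thresholds with a model trained on historical failure data, but the shape of the decision is the same: sensor readings in, a maintenance action out.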
Right now, it is estimated that 50 billion sensor devices will be connected via the IoT by 2020. With all of these sensors out in the world continuously working, there are zettabytes of data being generated 24/7. Most of us understand the challenges associated with this, especially the fact that data is “dirty.” By dirty, we mean that raw, unfiltered data is riddled with duplicate, corrupt, and incomplete messages, and only an advanced analytics solution will be able to scrub and clean the data in order to make sense of it.
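To give a feel for what that scrubbing involves, here is a minimal sketch of a filter that drops duplicate and incomplete messages; the message schema is an assumption for illustration, and a production pipeline would do far more:

```python
def clean(messages):
    """Drop duplicate and incomplete sensor messages.

    A message is a dict; the required-field set is illustrative.
    """
    required = {"sensor_id", "timestamp", "value"}
    seen = set()
    cleaned = []
    for msg in messages:
        # Incomplete: a required field is missing or None.
        if not required <= msg.keys() or any(msg[k] is None for k in required):
            continue
        # Duplicate: this sensor already reported this timestamp.
        key = (msg["sensor_id"], msg["timestamp"])
        if key in seen:
            continue
        seen.add(key)
        cleaned.append(msg)
    return cleaned
```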
However, a sensor alone does not do anything. Computers and people do things: computers evaluate state and can trigger an action, and people can look at data and take action. In short, connecting sensors alone accomplishes nothing. Things really come together when you connect computers and applications with the sensors to create the intersection of data and action.
Cloud-based applications connected to the same networks as the sensors can consume and leverage the data. Once consumed, these applications piece together the data from the different sensors and apply analytic models to do things like alert a truck driver, connected to the network via a smartphone, to take another route because the road ahead is iced over.
Let’s break it down with the following example. At Savi, if one of our customers wanted to know whether their truck driver was staying on the approved route during a shipment, they could configure an alert to be sent via email or text message if the driver strayed outside of the corridor. In this example, the sensor on the truck sends its location data to our proprietary Hybrid Lambda Architecture (HLA) every 30 minutes (although it could be sent in real time as well). Our HLA, utilising programmed algorithms, recognises that the sensor—and the truck—is outside of the designated corridor. HLA then sends this information to our interface, Savi Insight, which triggers an alert to our customer. At the same time, HLA sends information back to the sensor requesting that it report its location every minute until the truck is back on the proper route. Most importantly, all of this needs to happen in real time, which means our HLA must process a huge amount of data very, very quickly.
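The corridor-alert flow above can be sketched in a few lines of code. Everything here is an assumption for illustration, not Savi's actual HLA logic: the corridor is modelled as a maximum distance from the planned route's waypoints, and the reporting intervals mirror the ones in the example.

```python
import math

NORMAL_INTERVAL_MIN = 30  # routine reporting cadence (minutes)
ALERT_INTERVAL_MIN = 1    # faster cadence while the truck is off-route

def distance_km(a, b):
    """Great-circle distance between two (lat, lon) points (haversine)."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def handle_fix(position, route, corridor_km=5.0):
    """Process one location report: return (alert, next_interval_minutes)."""
    # Off-route if the truck is farther than corridor_km from every waypoint
    # (a simplification of a real corridor check against route segments).
    off_route = min(distance_km(position, wp) for wp in route) > corridor_km
    if off_route:
        # Alert the customer and ask the sensor to report every minute.
        return True, ALERT_INTERVAL_MIN
    return False, NORMAL_INTERVAL_MIN
```

A report near the route returns `(False, 30)`; one outside the corridor returns `(True, 1)`, capturing the feedback loop where the platform not only alerts the customer but also reconfigures the sensor itself.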
Lastly, the biggest benefit of sensor technology is the operational intelligence it affords. As we mentioned earlier, the magic happens when sensor data combines with analytics to create actionable information. This is where our HLA is not only fast but also smart: it recognises patterns and trends through machine learning capabilities and yields better, more accurate results as more data is collected.
The author of this blog is Andy Souders, senior vice president, Products and Strategy, Savi Technology.