The first receiver concept is predicated on the ability to leverage, to the hilt, the value of IoT data. It is enabled by an event-driven, publish and subscribe deployment architecture that takes what is technologically possible and makes it practical.
We are witnessing exponentially increasing volumes of IoT data, and when those volumes are harnessed by this development the benefits look set to be enormous.
Leveraging starts with the right architecture, one that enables the abstraction of data creation from data consumption. The same principle is used in regular IoT solutions, i.e. data coming from sensors in the field is decoupled from data usage in the enterprise, says Bob Emmerson, freelance writer and telecoms industry observer.
Moreover, the need to better leverage utility data was fundamental to the creation of relational databases, so there is nothing intrinsically new about publishing event data and enabling access for authorised third parties. The difference, which is significant, comes from the way it is deployed.
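The decoupling principle can be sketched in a few lines of Python. This is a minimal in-memory broker, not any particular product: publishers emit events to a named topic without knowing anything about the consumers, and vice versa.

```python
from collections import defaultdict

class Broker:
    """Minimal in-memory publish-subscribe broker: data creation is
    abstracted from data consumption via named topics."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        # A consumer registers interest in a topic; publishers never see it.
        self._subscribers[topic].append(callback)

    def publish(self, topic, event):
        # A publisher emits an event without knowing who, if anyone, listens.
        for callback in self._subscribers[topic]:
            callback(event)

broker = Broker()
received = []
broker.subscribe("outlet-a/freezer", received.append)
broker.publish("outlet-a/freezer", {"temp_c": -18.5, "status": "ok"})
```

A production deployment would of course use an established broker protocol such as MQTT or AMQP; the sketch only illustrates the decoupling.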
As illustrated, different entities (publishers) acquire input data from various IoT sub-systems, e.g. the freezers and coffee machines in a fast food outlet such as Pret-a-Manger, and they output operational data. This is the regular stand-alone IoT model.
However, event data coming from the different entities is held in a Common Repository, e.g. a public or private cloud, and a customisable Publish and Subscribe Broker enables secure access. This indicates how the architecture can scale, taking in all the requisite entities needed to provide a holistic solution, both now and in the future.
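One way to picture the Common Repository plus broker combination is a broker that retains every published event, so that an entity which subscribes later can replay the history. This is a hedged sketch under that assumption; the class and field names are invented for the example.

```python
class RepositoryBroker:
    """Publish-subscribe broker backed by a common repository: every
    event is retained, so new entities can join later and replay history."""
    def __init__(self):
        self._repository = []     # the common repository (a list, for brevity)
        self._subscribers = []

    def publish(self, topic, event):
        self._repository.append((topic, event))
        for topic_prefix, callback in self._subscribers:
            if topic.startswith(topic_prefix):
                callback(topic, event)

    def subscribe(self, topic_prefix, callback, replay=False):
        # Optionally deliver retained events before live ones.
        if replay:
            for topic, event in self._repository:
                if topic.startswith(topic_prefix):
                    callback(topic, event)
        self._subscribers.append((topic_prefix, callback))

broker = RepositoryBroker()
broker.publish("entity-a/coffee", {"status": "fault"})
history = []
broker.subscribe("entity-a/", lambda t, e: history.append(e), replay=True)
```

Matching on a topic prefix is what lets the same repository serve one outlet, a city, or the whole network, which is how the architecture scales.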
Let's keep things simple, but not too simple. The performance of these sub-systems can be seen and controlled in the context of all the other equipment, both at the local level and across the network. Are there coffee machine issues in only one Pret-a-Manger outlet, Entity A, and nowhere else, or is the problem systemic?
This indicates the ability to enrich the data, but the concept is enhanced when the regular input data is merged with point-of-sale data, inventory system data and so on. And when it is also merged with meteorological and smart city data it takes off, which in turn leads to the ability to make better decisions on several levels.
In this hypothetical use case, local Pret-a-Manger managers get the local IoT data and use it alongside related information. A city centre operations facility would get aggregated data at whatever level of detail it requires. Corporate headquarters would get a more aggregated version.
Product vendors would typically receive, under contract, the data from the specific products they provided, but nothing else. In addition, health and safety authorities might obtain specific aggregated data for oversight and regulatory purposes.
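These per-subscriber views amount to filtering and aggregation applied before delivery. A rough sketch, with roles and fields invented purely for illustration:

```python
def vendor_view(events, vendor):
    # A product vendor receives only events from equipment it supplied.
    return [e for e in events if e["vendor"] == vendor]

def headquarters_view(events):
    # Corporate headquarters receives an aggregate,
    # e.g. fault counts per outlet rather than raw events.
    counts = {}
    for e in events:
        if e["status"] == "fault":
            counts[e["outlet"]] = counts.get(e["outlet"], 0) + 1
    return counts

events = [
    {"outlet": "A", "vendor": "AcmeCoffee", "status": "fault"},
    {"outlet": "A", "vendor": "FrostCo",    "status": "ok"},
    {"outlet": "B", "vendor": "AcmeCoffee", "status": "fault"},
]
```

The same event stream thus yields a contractually scoped feed for the vendor and an anonymised aggregate for headquarters or a regulator, without the publishers changing anything.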
It is worth noting that source data can also be enriched by employing edge computing, an established development that captures and processes data at the edge of the network, near its source. It is also worth underlining the importance of security, specifically authorisation: the function of specifying access rights to resources. Secure authentication can be achieved in many ways, but when looking for a standards-based, proven and solid approach, solutions that leverage digital certificates provide the highest level of security.
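As a concrete instance of certificate-based authentication, Python's standard ssl module can build a TLS context in which the broker must prove its identity with a certificate, and the client can present one of its own for mutual TLS. The certificate paths in the comment are hypothetical placeholders, not part of any real deployment.

```python
import ssl

def make_client_context(ca_file=None):
    """Build a TLS context that authenticates the broker with a digital
    certificate; ca_file would point at the trusted CA bundle."""
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH, cafile=ca_file)
    ctx.verify_mode = ssl.CERT_REQUIRED   # refuse unauthenticated peers
    # For mutual TLS the client also presents its own certificate
    # (hypothetical, deployment-specific paths):
    # ctx.load_cert_chain(certfile="client.crt", keyfile="client.key")
    return ctx

ctx = make_client_context()
```

Authentication establishes who the subscriber is; the broker's authorisation rules then decide which topics and aggregation levels that identity may access.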
I'm sold on the first receiver concept, but for a while I was somewhat hazy about the details of the publish and subscribe architecture. The information I gleaned from the Web was vague; it tended to be very techie, and the initial impression was that this was a work in progress.
That said, the architecture’s foundation and the enabling technologies are proven, so my conclusion, for what it’s worth, is that the first receiver concept is an evolutionary development that looks set to succeed.
The author of this blog is Bob Emmerson, freelance writer and telecoms industry observer.