Fog computing could take IoT to the next level, but it will dissipate without interoperability

Recognising its potential to transform the use of IoT in the enterprise, we at Machina Research recently published a series of three research notes on fog computing. They explored what it means as a technology concept, where its impact will be felt most, and how the supplier community is starting to address it. The fog is one of those IoT topics that we cover with increasing regularity for our clients. In this blog post, let’s sum up some of our findings and conclusions so far.

Fog and edge computing – related, but not the same

First off, it is worth underlining that while fog and edge computing are often used as synonyms, they are conceptually different. In essence, the fog is an evolutionary stage of edge computing – which, as such, is not a particularly new phenomenon. In the pre-IoT world of industrial automation – enabled by site-level systems such as SCADAs, PLCs, and data historians – intelligence has always resided predominantly within the enterprise’s field assets, largely out of necessity.

In the traditional, pre-fog model, data from endpoints is typically aggregated under a single edge device, which means that the scope of the resulting site-level data communities is limited by the reach of the networking infrastructure. The fog represents a new paradigm in which the logic of pooled resources, as used in cloud computing, is applied at a metropolitan (e.g. a city), local (e.g. a campus), or hyper-local (e.g. a building) level. Instead of being centralised in a dedicated site backend, storage and processing tasks are distributed across multiple devices installed throughout the location.

In principle, this allows both the deployed network and the available IT resources to scale up considerably: there can be more nodes feeding data into the system, and at the same time the system has more capacity to analyse and respond to the data it receives. In this way, fog computing can add both extent (the number of nodes) and depth (the level of available insight) to an IoT application.
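To make the pooled-resource idea concrete, here is a minimal sketch of the difference in scheduling logic. All names (`FogNode`, `dispatch`, the device names and capacities) are illustrative assumptions, not part of any real fog stack: instead of every task landing on one gateway, a placement routine spreads work across whatever devices in the location have spare capacity, falling back to the central backend only when the pool is full.

```python
from dataclasses import dataclass

@dataclass
class FogNode:
    """A device somewhere in the location that contributes spare capacity."""
    name: str
    capacity: int          # how many tasks this node can run concurrently
    running: int = 0

    def has_room(self) -> bool:
        return self.running < self.capacity

def dispatch(task: str, pool: list) -> str:
    """Place a task on the least-loaded fog node instead of a single gateway."""
    candidates = [n for n in pool if n.has_room()]
    if not candidates:
        return "cloud"     # no local capacity left: fall back to the backend
    node = min(candidates, key=lambda n: n.running / n.capacity)
    node.running += 1
    return node.name

# A hypothetical hyper-local (building-level) pool of heterogeneous devices.
pool = [FogNode("gateway", 2), FogNode("camera-hub", 1), FogNode("hvac-ctrl", 1)]
placements = [dispatch(f"task-{i}", pool) for i in range(5)]
print(placements)  # → ['gateway', 'camera-hub', 'hvac-ctrl', 'gateway', 'cloud']
```

The point of the sketch is the shape of the decision, not the policy itself: adding a node to `pool` increases both how many endpoints can feed in and how much processing the site can absorb before anything has to leave it.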


The fog is a long game, requiring interoperability

In the medium to long term, fog computing is likely to have a very far-reaching impact on IoT. In the shorter term (say, the next three years), its effect may, however, prove more limited than its leading advocates envision, given that many of the end markets that stand to benefit from it most are in the Industrial IoT, where long asset lifecycles, brownfield deployments and entrenched legacy systems tend to be the norm.

This means that the fog can be expected to gain traction incrementally, almost by stealth. There will be no “Year of the Fog” or inflection points to pin down – unlike in the case of cloud computing, which very visibly allowed various digital startups to leapfrog the competition by building out their technology stacks cloud-first. In comparison to the cloud, the fog will be a markedly more gradual story.

The fog has a huge, and currently unmet, prerequisite in interoperability. Meaningful interoperability will require relevant open standards to facilitate secure and reliable data sharing between devices that have been supplied by different companies. The creation of such standards will also make existing implementations accessible to new suppliers, preventing vendor lock-in and thereby de-risking the fog as a technology concept from the adopter’s standpoint.
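What “meaningful interoperability” demands can be shown with a toy example. The two vendor message formats and field names below are invented for illustration; the pattern, however, is the real issue: until devices from different suppliers either speak a common schema or can be normalised into one, their data cannot be safely pooled by other nodes in the fog.

```python
import json

# Two hypothetical vendors report the same temperature reading differently.
vendor_a_msg = '{"devId": "A-17", "tempF": 71.6}'
vendor_b_msg = '{"id": "B-03", "celsius": 22.0}'

def normalise_a(raw: str) -> dict:
    """Map vendor A's proprietary payload onto an assumed shared schema."""
    m = json.loads(raw)
    return {"device": m["devId"],
            "temperature_c": round((m["tempF"] - 32) * 5 / 9, 1)}

def normalise_b(raw: str) -> dict:
    """Map vendor B's payload onto the same shared schema."""
    m = json.loads(raw)
    return {"device": m["id"], "temperature_c": m["celsius"]}

# Once both sit behind one schema, any node in the fog can consume either.
readings = [normalise_a(vendor_a_msg), normalise_b(vendor_b_msg)]
print(readings)
```

An open standard would, in effect, make the shared schema the wire format itself, removing the need for per-vendor adapters – which is precisely what de-risks an implementation for new suppliers joining it later.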

A major initiative to drive open fog standards arrived in November 2015, when the OpenFog Consortium, a dedicated trade association, was launched to promote suitable architectures, testbeds, and interoperability frameworks for multi-vendor environments. Without such efforts, the fog will almost certainly fail to condense. (The pun, sadly, intended.)


The fog will not be a substitute for the cloud

While there are various technological drivers pushing system intelligence towards the edge, it would be a mistake to presume that enterprises with edge-based IoT architectures can treat the cloud as an afterthought. The need to unlock site-level data at the enterprise level remains as real as ever, and the emergence of machine learning and artificial intelligence as new analytics enablers will no doubt make the case for it even more pressing. More often than not, doing things well at the edge will simply be a way to do them better in the cloud.

Because of this, fog computing should not be understood as fog-only computing. Making an IoT system’s architecture overly reliant on the edge, no matter how intelligent, is something enterprises will need to avoid. The real end goal should be a framework for application agility, under which different elements of an IoT application can be deployed in different network locations. On the supplier side, this means that the vendors pioneering the more intelligent edge should engage with their cloud-focused counterparts early on – especially when it comes to the pursuit of interoperability.
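A placement framework of that kind can be sketched in a few lines. The element names and the two policy rules below are assumptions chosen purely for illustration: each application element declares its requirements, and a policy – rather than a hard-wired architecture – decides whether it runs in the fog or in the cloud.

```python
# Each application element declares what it needs; a placement policy maps
# it to a network location instead of binding everything to the edge.
ELEMENTS = [
    {"name": "anomaly-alarm",  "max_latency_ms": 10,   "needs_fleet_data": False},
    {"name": "dashboard-api",  "max_latency_ms": 500,  "needs_fleet_data": False},
    {"name": "model-training", "max_latency_ms": None, "needs_fleet_data": True},
]

def place(element: dict) -> str:
    """Illustrative policy: fleet-wide data lives centrally; tight
    control loops stay near the assets; everything else defaults up."""
    if element["needs_fleet_data"]:
        return "cloud"
    if element["max_latency_ms"] is not None and element["max_latency_ms"] <= 50:
        return "fog"
    return "cloud"

plan = {e["name"]: place(e) for e in ELEMENTS}
print(plan)
```

The value of expressing placement as policy is exactly the agility argued for above: when requirements change, or when fog and cloud suppliers converge on interoperable interfaces, elements can be re-homed without re-architecting the application.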
