Edge computing squares off IoT and cloud data needs
There is a move towards edge computing as an operational solution to address the data processing needs of the growing Internet of Things (IoT). But what is edge computing and how does it ease the data analytics demands of companies? Antony Savvas looks closely at the developing opportunities.
In an edge computing model, endpoints transmit data to an edge computing device that processes or analyses that data, instead of sending the data to the cloud or a remote data centre to do the work. Having these edge computing devices at the edge of the enterprise network reduces latency to and from the endpoint, speeds data processing and helps to bring down cloud usage bills – including running fewer servers in the cloud.
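The model described above can be sketched in a few lines. This is an illustrative example only, not any vendor's API: an edge device batches raw endpoint readings locally and forwards just a compact summary to the cloud, which is where the latency and bandwidth savings come from. All function and field names are assumptions.

```python
# Minimal sketch of the edge computing model: endpoints send raw readings
# to an edge device, which processes them locally and forwards only a
# compact summary to the cloud. Names are illustrative.

from statistics import mean

def summarise_readings(raw_readings):
    """Reduce a batch of raw sensor readings to a small summary payload."""
    return {
        "count": len(raw_readings),
        "min": min(raw_readings),
        "max": max(raw_readings),
        "mean": round(mean(raw_readings), 2),
    }

# The edge device batches 1,000 temperature readings locally...
readings = [20.0 + (i % 10) * 0.1 for i in range(1000)]
summary = summarise_readings(readings)

# ...and only this four-field summary travels to the cloud,
# instead of 1,000 individual data points.
print(summary)
```

The point of the sketch is the ratio: one small dictionary crosses the network in place of a thousand raw values, so both the round trip to the endpoint and the cloud bill shrink.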
Increasingly, the endpoints or sensors themselves are also doing more of the data processing and analytics, improving matters further.

Cloud players support edge
The major cloud service providers are already gravitating towards the potential benefits of edge computing. Microsoft this July unveiled its Azure Stack offering, which includes Azure software integrated with hardware from the likes of Dell EMC, HPE, Lenovo, Huawei and Cisco.
Microsoft says Azure Stack is an “extension of Azure”, thereby enabling a “truly consistent hybrid cloud platform”. The technology sees organisations run the hardware at their network edge – the private part of their cloud configuration – and link it with the public part of their cloud in Microsoft Azure to create that “consistent” hybrid cloud experience.
So what has this got to do with IoT and what are the other benefits? Well, Microsoft is illustrating how hardware at the network edge can be run more efficiently by not having to communicate with the public cloud every time it has to complete a task, whether that be processing transactions, conducting data analytics or controlling processes, and that includes hardware that supports IoT endpoints.
Mike Neil, corporate vice president of Azure infrastructure and management, says: “With edge you can address latency and connectivity requirements by processing data locally in Azure Stack and then aggregating in Azure [the public cloud] for further analytics, with common application logic across both. We’re seeing lots of interest in this edge scenario across different contexts, including the factory floor, cruise ships, mine shafts and other scenarios.”
Microsoft isn’t the only major cloud player moving in this direction either. This June, Amazon Web Services (AWS) announced the general availability of its Greengrass software. Greengrass allows developers to easily develop and run functions such as compute, messaging and data caching locally in devices at the edge, in a way that is highly optimised for interaction with Amazon Web Services.
Ian Hughes, an analyst for IoT at 451 Research, tells IoT Now: “Edge computing is an architectural necessity to deal with the sheer volume of IoT data and the need to act upon that in a timely fashion. Cloud still offers a lot of computational power, such as for artificial intelligence analysis and the ability to look at IoT data strategically. But different use cases require a change in the balance between computations at the edge and in the cloud.”
He cites examples. “An autonomous vehicle system needs to make a decision to apply the brakes on board at the edge – with no network latency or round trip to a cloud process. But analysing production processes over time across globally distributed manufacturing plants requires a centralised cloud approach.”
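Hughes' braking example can be made concrete with a small sketch. This is not real vehicle control code; the rule and thresholds are invented for illustration. What matters is that the decision is computed entirely from on-board sensor input, with no network round trip in the critical path.

```python
# Illustrative sketch (not a real vehicle control system) of why the
# braking decision must be made at the edge: the check runs locally in
# microseconds, while any cloud round trip adds network latency an
# autonomous vehicle cannot afford. Names and thresholds are assumptions.

BRAKE_DISTANCE_M = 25.0  # hypothetical minimum safe distance in metres

def should_brake(obstacle_distance_m: float, speed_mps: float) -> bool:
    """Decide locally, on board, whether to apply the brakes."""
    # Simple illustrative rule: brake when the obstacle is inside a
    # safety envelope that grows with current speed.
    return obstacle_distance_m < BRAKE_DISTANCE_M + speed_mps * 0.5

# The decision is made on-device from local sensor input; no data needs
# to leave the vehicle before the brakes are applied.
print(should_brake(obstacle_distance_m=20.0, speed_mps=15.0))  # obstacle close
print(should_brake(obstacle_distance_m=60.0, speed_mps=15.0))  # road clear
```

The plant-analytics half of Hughes' example is the mirror image: aggregating months of data across sites has no such latency constraint, so it belongs in the centralised cloud.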
Hughes adds: “The shift in understanding that IoT is not only simple sensors pushing data to the cloud, to one where edge processing takes place alongside cloud processing, is on a trajectory towards a fully distributed computing model where relevant computational decisions occur anywhere from endpoint IoT devices to gateways, networks, data centres and the cloud.”
Colin I’Anson, chief technologist for IoT at Hewlett Packard Enterprise, confirms the move towards the edge way of thinking in relation to the cloud and IoT. “The unfiltered transmission of sensor data leads to an overloading of the networks. If we want to capture the opportunities of IoT it is not enough to rely on today’s big central data centres and clouds. The new action is at the edge.”
I’Anson says: “Edge computing solves the latency problem by significantly shortening transmission paths. And it solves the bandwidth problem because no raw data, only the results of preliminary analysis, have to be transferred to remote computers. The public cloud brings great value in areas like correlation analysis and cross-company coordination, but it poses data risks, connectivity risks and latency issues. A private cloud platform [at the edge] avoids these issues as a firm can locate it nearby to a plant, for instance, and control local connectivity.”
Address data compliance
Those data risks mentioned can be significant, particularly with forthcoming data compliance requirements like the European Union General Data Protection Regulation (GDPR). Alex Gluhak, head of technology and lead technologist for IoT at UK government innovation agency Digital Catapult, says: “A concern is the privacy of end user generated data coming from wearable devices, connected cars, smart homes and other sources. Hoarding such data in cloud-based systems may lead to the unnecessary risk of exposure of personal information to service providers or unwanted third parties.”
Gluhak adds: “Edge computing supports the increasingly distributed nature of future IoT applications. It empowers devices located at the edge of the network to process information and extract relevant insights for decision making. These decisions can be executed much faster locally where sensing and influencing of real world processes occur. As a result, only data deemed relevant will be passed to the cloud, significantly reducing the resource demands on the communication links and cloud infrastructure [including data storage costs], while minimising privacy risks.”
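The privacy side of Gluhak's argument can also be sketched. In this hedged example, an edge gateway strips personal fields from a wearable's data and forwards only the derived aggregates the cloud service actually needs; the field names are illustrative, not any particular device's schema.

```python
# Hedged sketch of edge-side privacy filtering: personal fields are
# removed on the gateway, so they never reach the cloud provider or any
# third party. Field names are invented for illustration.

PERSONAL_FIELDS = {"user_id", "name", "location"}

def redact_for_cloud(event: dict) -> dict:
    """Drop personal fields; keep only the relevant derived measurements."""
    return {k: v for k, v in event.items() if k not in PERSONAL_FIELDS}

event = {
    "user_id": "u-1832",
    "name": "A. Example",
    "location": "51.50,-0.12",
    "avg_heart_rate": 72,
    "step_count": 9412,
}

# Only the non-personal aggregates leave the edge for the cloud.
print(redact_for_cloud(event))
```

Under a regime like GDPR, data that never leaves the edge is data the cloud operator never has to protect, disclose or justify holding.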
Open source edge
A further boon for edge computing is the prevalence of cost effective and flexible open source software to support it. 451 Research’s Hughes says: “The packaging and management of applications, and now server-less tasks too, has been driven by the open source community with Docker and operating systems such as Linux. These same techniques are applicable to edge-based, and ultimately, distributed IoT systems.”
Mike Bell, the executive vice president of devices and IoT at open source software firm Canonical, says: “IoT shouldn’t so much subtract from cloud infrastructure as converge with it in the same ecosystem – IoT should be the ‘intelligent edge’ supported by the back-end in the cloud. Software and collaboration will be the backbone of this move to the edge, such as Canonical’s open-source operating system for IoT – Ubuntu Core – matched with the collaborative nature of Amazon’s [aforementioned] Greengrass project, in enabling the efficient transmission of data between connected devices and the cloud.”
Canonical has made Greengrass available as a snap – the universal Linux packaging format – so it is available across Linux distributions, including IoT specific operating systems such as Ubuntu Core, which has carved a niche in edge devices.
Another open source initiative is TIBCO Software’s Project Flogo, a lightweight open source IoT integration engine which, like Ubuntu Core, allows application and business logic to run on edge devices, avoids technical lock-in for users and helps reduce costs.
Maurizio Canton, the CTO EMEA for TIBCO Software, says: “The benefits of moving integration applications onto IoT edge devices include real-time sense-and-respond functionality for local decision-making, and savings on communications costs since data is selectively transmitted, stored on board or discarded.”
The general move towards edge computing therefore seems natural, given that hybrid cloud computing has won out over straightforward public or private cloud configurations, and that the evolving IoT needs an efficient, cost effective way to handle and process its burgeoning data demands.