Instead of sending data to the cloud, why not send the cloud to the edge?

Posted by Anasia D'mello, October 5, 2018

In the IoT era, companies increasingly require critical decisions to be made in fractions of seconds. Yet in the most time-sensitive situations, says Adi Hirschtein, director of product at Iguazio, the latency involved in sending their data to a centralised cloud and back poses significant hurdles to efficiency and economy.

How can today’s enterprises enjoy the benefits of IoT while avoiding the nettlesome latency problem? Enter intelligent edge computing – the solution that can dramatically accelerate the delivery of data by processing it closer to its source.

Pegged at just under $80 million (€69.55 million) in 2017, the edge computing market in the U.S. alone is projected to soar to more than $1 billion (€0.87 billion) by 2025 as more and more companies across sectors seek to leverage its benefits. Business Insider forecasts that by 2020, 5.6 billion enterprise and public sector IoT devices will harness edge computing for the collection and processing of their data – a nearly nine-fold increase over 2015 figures.

That being said, the cloud isn't going anywhere yet. By working with hybrid platforms, companies can leverage the edge to make faster and smarter decisions and reduce bandwidth costs, while enjoying high connectivity to public clouds for historical data and elastic computing. When the two are combined, machine learning models are developed in the cloud and automatically deployed at the edge for maximum performance.
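The train-in-cloud, deploy-at-edge pattern can be sketched in a few lines. This is a minimal illustration only, assuming a toy statistical model in place of a real machine learning pipeline; the sensor values, threshold, and artifact format are all hypothetical stand-ins.

```python
import json
import statistics

# "Cloud" side: fit a simple anomaly threshold on historical sensor data.
# (Hypothetical readings; a real deployment would train a full ML model.)
historical = [20.1, 19.8, 20.3, 20.0, 19.9, 20.2, 20.1, 19.7]
model = {
    "mean": statistics.mean(historical),
    "std": statistics.stdev(historical),
    "z_threshold": 3.0,
}

# The trained parameters are shipped to the edge as a small artifact.
artifact = json.dumps(model)

# "Edge" side: load the artifact and score live readings locally,
# with no round trip to the cloud.
edge_model = json.loads(artifact)

def is_anomaly(reading: float) -> bool:
    z = abs(reading - edge_model["mean"]) / edge_model["std"]
    return z > edge_model["z_threshold"]

print(is_anomaly(20.0))  # a reading inside the historical range
print(is_anomaly(35.0))  # a reading far outside it
```

The key point is the asymmetry: the heavy fitting step runs where compute is elastic, while the edge device only evaluates a small, cheap artifact.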

A critical need

Consider a few illustrative use cases in which the time-consuming process of sending data back and forth to a centralised cloud could jeopardise companies’ performance and public safety.

In industrial IoT (IIoT), should localised intelligence at a plant predict or detect an equipment failure, factories must cease operations and trigger auto-healing actuators as soon as possible – both to avoid further damage to machinery and, more importantly with heavy-duty industrial equipment in motion, to avoid endangering employees' lives.
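The decision loop described above – predict locally, halt immediately, fire the actuator – might look like the following sketch. The severity score, threshold, and the `stop_line` / `trigger_actuator` callables are hypothetical placeholders for whatever PLC or actuator interface a real plant exposes.

```python
from typing import Callable

def on_failure_prediction(
    severity: float,
    stop_line: Callable[[], None],
    trigger_actuator: Callable[[], None],
    threshold: float = 0.8,
) -> bool:
    """Decide at the edge, with no cloud round trip, whether to halt."""
    if severity >= threshold:
        stop_line()          # cease operations immediately
        trigger_actuator()   # fire the auto-healing actuator
        return True
    return False

# Hypothetical stand-ins for real equipment interfaces:
events = []
halted = on_failure_prediction(
    0.93,
    stop_line=lambda: events.append("line stopped"),
    trigger_actuator=lambda: events.append("actuator fired"),
)
print(halted, events)
```

Because the check runs on the device itself, the worst-case reaction time is bounded by local compute, not by network latency to a distant data centre.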

The pressing need for edge computing solutions in industrial settings helps explain why manufacturing is slated to represent the largest slice of the U.S. edge computing market in 2025, at $306 million (€266.04 million).

In telecommunications, edge computing serves as a vital tool for monitoring and predicting network health in real time. This can only be achieved by processing high message throughput from multiple streams, correlated with historical data at the edge. Without edge computing's ability to process and act upon distinct data streams in real time, the analytics process would be far less efficient and more time-consuming, dramatically increasing the likelihood of major cell network outages.
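As a rough sketch of correlating a live stream with a historical baseline at the edge, the class below keeps a rolling window of recent throughput readings and flags a likely outage when the window average falls well below the baseline. The metric, window size, and drop ratio are illustrative assumptions, not any particular vendor's API.

```python
from collections import deque

class EdgeStreamMonitor:
    """Rolling-window monitor for one network metric stream.

    Compares live readings against a historical baseline held
    locally at the edge (names and thresholds are illustrative).
    """

    def __init__(self, baseline: float, window: int = 5, drop_ratio: float = 0.5):
        self.baseline = baseline
        self.recent = deque(maxlen=window)
        self.drop_ratio = drop_ratio

    def ingest(self, value: float) -> bool:
        """Return True if the window average signals a likely outage."""
        self.recent.append(value)
        if len(self.recent) < self.recent.maxlen:
            return False  # not enough data yet
        avg = sum(self.recent) / len(self.recent)
        return avg < self.baseline * self.drop_ratio

monitor = EdgeStreamMonitor(baseline=100.0)
print(any(monitor.ingest(v) for v in [98, 102, 99, 101, 100]))  # healthy traffic
print(any(monitor.ingest(v) for v in [40, 35, 30, 28, 25]))     # sustained drop
```

In practice one such monitor would run per cell or per stream, so thousands of checks happen locally and only alerts – not raw telemetry – travel back to a central system.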

These outages, whether caused by natural events or cyberattacks, can cripple the affected region, compounding the importance of a telco's ability to detect and resolve them rapidly. With regard to public safety, efficiency and time are key to crime solving. All the technology that law enforcement has at its disposal – intended to make case resolution faster – goes to waste when cloud processing seriously delays data analysis.

In extreme cases, it is exactly those extra few minutes that make or break law enforcement's ability to keep the public secure. Only an edge solution can quickly correlate data from multiple unstructured and structured sources, paving the way for police to detect and prevent criminal activity. As with other highly sensitive use cases in which decision makers need rapid access to actionable analytics, hyper-localised data processing is the only answer.

Rethinking data management

Amid the infusion of IoT into virtually every aspect of business, companies have generated unprecedented demand for access to and analysis of vast amounts of data. In this climate, companies have two choices: send all their data to the cloud for processing, or have their most critical, time-sensitive data processed and analysed locally while using the cloud for machine learning model training and elasticity.

The former leaves enterprises vulnerable to delays of all kinds and imposes significant costs and risks. The latter provides businesses with the edge they need to boost response times. All of this raises an important question: instead of sending all data to the cloud and back, why not move the cloud to the edge itself? Businesses appear to be asking themselves this question – and answering yes.

According to a Gartner analysis, by 2022, three quarters of enterprises’ data will be processed and analysed away from centralised data centres, compared to less than 10% in 2017. As more businesses adopt the intelligent edge approach, it will mean greater simplicity, performance, and security – bringing the benefits of IoT to more enterprises and governments, without incurring unnecessary costs in time and money.

Edge analytics not only allow companies to be more competitive in their respective industries, but also to perform their tasks effectively in real time – something that is essential in our interconnected reality.

The author of this blog is Adi Hirschtein, director of product at Iguazio


