Businesses are already exploiting real-time information from the Internet of Things (IoT). As 5G comes on stream, says Patrick Callaghan, enterprise architect, strategic business advisor, DataStax, the volume of data from billions of IoT devices will explode. Those building an architecture fit to integrate the data – flexibly and at speed – will gain a competitive advantage.
An estimated 20 billion ‘things’ will be connected to the internet by 2020, according to Gartner, all spewing out unprecedented volumes of data. At the same time, 5G roll-outs will ramp up from 2020 to 2022, leading to more data being created. The question is: what will organisations do with all that data?
To build effective modern applications, businesses need data not just from their IoT devices but from other sources both within and outside their organisation. The challenge will be to build an architecture that can integrate all these sources of data together in a way that is fit for the 5G data explosion. This architecture will need to be fast and flexible enough to adapt to new use cases as they emerge.
Challenges of growing IoT use cases
Gartner predicts 5G mobile data networks could support up to one million sensors per square kilometre. This level of connectivity will create two types of demand on the data architecture of organisations that want to improve operations, increase efficiency and better serve their customers.
First, some data will require an immediate response at the edge: deployments for robotics and automation fall into this class. Second, real-time analytics will determine any necessary short-term response when a set of conditions is met. A good supply chain example is automatically and proactively contacting a customer when their order is at risk of being delayed. At the same time, these data sets will be collated and stored for longer-term analysis.
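As a rough illustration of the second category, the supply chain check above can be sketched as a simple rule evaluated against streaming checkpoint data. The order IDs, field names and six-hour threshold below are all hypothetical:

```python
from datetime import datetime, timedelta

# Hypothetical rule: flag an order as at risk when its latest shipment
# checkpoint is older than a threshold, so the customer can be contacted
# proactively before the delay becomes a missed delivery.
DELAY_THRESHOLD = timedelta(hours=6)

def orders_at_risk(checkpoints, now):
    """checkpoints maps order_id -> timestamp of the last sensor reading."""
    return [order_id for order_id, last_seen in checkpoints.items()
            if now - last_seen > DELAY_THRESHOLD]

now = datetime(2019, 6, 1, 12, 0)
checkpoints = {
    "order-1001": datetime(2019, 6, 1, 11, 30),  # recent reading: on track
    "order-1002": datetime(2019, 6, 1, 4, 0),    # stale reading: flag it
}
```

In a production pipeline the same rule would run continuously against the event stream rather than over an in-memory dictionary.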
To cope with this deluge of data, computing models have shifted. Few organisations want to build out and manage their own data centres to the scale required; instead, they will rely on public cloud providers and use either hybrid or multi-cloud deployments.
Challenge of integrating with traditional applications
This move to multi-cloud is why companies should not consider their IoT data strategy in isolation. To benefit from IoT data, organisations will need to integrate it with other data sources, from traditional applications, such as enterprise resource planning (ERP) systems or supply chain management software, through to new cloud services and SaaS applications.
These applications can be installed and run in multiple different places. Some applications tend to remain on-premises simply because the cost of moving them – and unpicking all the layers of integration and customisation – is too high.
Alongside supporting multiple different applications, it is not only where data sits that matters: the speed at which IoT applications generate data – and require a response – is equally critical. Automated factories that need near real-time decision-making cannot afford to rely on sluggish remote data sources for a result. With the speed of 5G, this connection between services should be able to cope with more complex situations and use cases.
Challenge of building hybrid applications that make use of IoT
If an organisation cannot move or replicate data across its architecture quickly and reliably enough, it will struggle to create the hybrid application model necessary to exploit IoT data in combination with other data sources. Because applications are distributed, it may be necessary to work from multiple copies of application data; keeping all of those copies up to date instantaneously is the challenge businesses face if they are to get the most from IoT data.
The long-term benefit of creating a hybrid cloud database that replicates data in real time is that it opens up new use cases for data previously buried within ageing applications.
The aim here is to provide interoperability that would otherwise not be possible due to data silos and functions being spread across multiple cloud providers or locations. Adopting a distributed computing model – where all data is replicated to multiple locations independently – can help applications run more effectively, as the data sets can be stored and processed closer to where the workload exists.
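A toy sketch of that placement decision, assuming a hypothetical set of replica regions (in a real deployment the topology would come from the database itself):

```python
# Hypothetical placement: each listed region holds a full, independently
# replicated copy of the data set.
REPLICA_REGIONS = ["eu-west", "us-east", "ap-south"]

def route_read(workload_region, replicas=REPLICA_REGIONS):
    """Serve a workload from the replica in its own region when one
    exists, falling back to the first configured replica otherwise."""
    return workload_region if workload_region in replicas else replicas[0]
```

A workload in us-east reads from the local us-east copy, while one in a region with no replica falls back to eu-west; the point is that the routing choice, not the application logic, determines where data is processed.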
Equally, taking a multi-model approach – where the same data sets can be handled and used in different ways depending on the business requirement – can help here. For example, an operational data set may serve short-term analytics directly, while search, wider analytics or graph models can be applied to the same data for other purposes. By looking at different approaches to integrating and using this data, more use cases can be met.
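To make the multi-model idea concrete, here is a minimal sketch in which the same (hypothetical) IoT events back both a tabular analytics view and a graph-style adjacency view:

```python
# The same hypothetical IoT events, used in two ways: a tabular view for
# short-term analytics, and a graph view linking devices that interact.
events = [
    {"device": "sensor-a", "peer": "gateway-1", "reading": 21.5},
    {"device": "sensor-b", "peer": "gateway-1", "reading": 19.0},
    {"device": "gateway-1", "peer": "cloud", "reading": None},
]

# Tabular/analytics view: aggregate the sensor readings.
readings = [e["reading"] for e in events if e["reading"] is not None]
average = sum(readings) / len(readings)

# Graph view: adjacency of devices, suitable for traversal-style queries.
graph = {}
for e in events:
    graph.setdefault(e["device"], []).append(e["peer"])
```

A dedicated multi-model database would maintain both views over one stored data set; the sketch only shows that the two query styles answer different questions from identical records.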
Adopting a hybrid cloud database solution will, in the short term, prepare businesses for the massive increase in data capacity required by IoT. It will also help integrate data into applications closer to users and to customers through cloud deployments. In the longer term, its ability to duplicate data in real time will help manage migration to new cloud-based applications over time without affecting customer experience or application performance.
Although the mass roll-out of 5G is still a couple of years off, 2019 presents an opportunity to start planning the design and technologies that will make an enterprise data architecture fit for the future for a significantly more connected world.
Businesses that prepare for 5G and data growth in advance will see a competitive advantage, as they can more easily scale up to meet the demands of the organisation. With more data available, supporting scalability, availability and distributed computing will be essential to making these applications successful.