‘The Wonderful World of M2M Communications’
As an all-round freelance writer for nearly 30 years I’ve been fortunate to obtain a bird’s eye view of different communications technologies and developments, e.g. computer telephony, cellular telephony, mobile data, and VoIP. Each time I experienced an advanced view of an exciting communications future, although I
was wrong about computer telephony. Now my focus is on M2M. In early 2013 I started to think about compiling various short takes on the innovative solutions and applications that I’d either written about or read about. Basically I wanted to communicate the big picture — and as illustrated on page 9, it’s a big, multi-faceted picture — in a way that would be clear, concise and easy to take on board. I also wanted to indicate, without making any guru-type predictions, the way that the industry and the market are heading. And finally, my intention was to focus on the tangible benefits and make minimal reference to the technology.
M2M’s marketing message
It’s hard, intrinsically hard, to convey what this technology does to a third party. It’s logical to start with the term, but do that and you have to explain that the first M is a device and the second M is a computer system, so you’re not off to a good start. A better approach is to forget the term and give an example of an application to which our hypothetical party would relate. Everyone knows that populations are aging and health care is expensive so I usually talk about eHealth and use electronic pill dispensers as an example. It normally works and if the third party is really interested in M2M they will want to know more: what else does it do?
Think. There are nine main verticals; which one would resonate? Fleet management? No! The environment? Yes. Everyone knows about air and water pollution so you tell him or her that M2M’s sensors are used to monitor and measure physical parameters like carbon dioxide. This solution usually works, but you’re in trouble if you try to cover other solutions and explain how the same technology can address sectors that are so diverse. At this stage eyes tend to glaze over. So what am I saying? Simply the fact that it’s hard to encapsulate all the things that M2M does: it would seem to be impossible to create a clear, concise message for the wider world.
I’m about to stick my neck out and suggest that is the reason why the industry’s marketing communications message is relatively weak and why many M2M vendors don’t employ the term. In turn, it explains why so much media coverage is either over-the-top or inaccurate or both. So what’s the answer? I’ve thought long and hard about this issue and on a long flight to California I came up with the following:
Think of M2M as a broad set of data applications. The data is all around us: it’s everywhere. M2M can reach out and grab it and turn it into real-time, actionable information. What kind of data? It can be medical, environmental, vehicular, logistical or location data. It doesn’t matter. What does matter is the fact that M2M delivers tangible benefits to society, individuals and businesses. That’s why the industry came through the recession in good shape and why it is experiencing double-digit growth.
M2M addresses big social issues
M2M is addressing big social issues such as healthcare, transportation, energy conservation and the environment. The technology has been proven, albeit on a modest scale, because huge investments are needed in order to deploy
nationwide solutions and right now governments’ focus is on balancing the books. But the potential savings are huge, which means that equally huge benefits to society are a bonus. And M2M solutions are realizing tangible returns in the business arena via their ability to cut costs, save time, improve operational efficiency and enhance customer service.
But before we look at those solutions we should follow Voltaire, who is reputed to have said: “Before we converse we must define our terms.” The first M in M2M is normally a device that measures a parameter or monitors an event and the second is a computer server that processes the granular data and delivers real-time, actionable information. In the business-to-business (B2B) sector there is no human intervention: data is acquired and processed automatically.
The business-to-consumer (B2C) sector does have human intervention and in most cases it involves a device that connects to the Internet, hence the term “The Internet of Things”. SatNavs and e-readers are B2C devices. I’ve spelt out the distinction because there is a degree of confusion in the media.
Explosive growth is taking place in the M2M industry and in a few years as the applications become pervasive we will start to derive significant benefits that we’ll take for granted. And if we do stop and think, we may even wonder how
we did without them. That’s what happened with cell phones, smartphones and tablets. But right now it’s interesting and important to see how the industry is evolving in some of the key sectors (aka vertical markets).
The use of M2M technology in environmental monitoring is expanding: it’s a development that’s facilitated by cost-effective wireless solutions. Applications include air quality measurement; irrigation; temperature, pressure and chemical monitoring; and the collection of water samples in rivers, lakes and even coral reefs.
However, behavioural changes in nature can provide accurate long-term data. Oysters and other mollusks, for example, obtain the nutrients they need from the water. As illustrated, the shells open in order to filter out the food particles,
i.e. small algae, and at other times the bivalve (two-part) shell is closed. Scientists can use this behavior as an indicator of the quality of the water in which mollusks live. The photo was provided by Dr. Massabuau, who heads up a research team in ecotoxicology at the University of Bordeaux.
The term V2X is used to identify the Vehicle-to-Vehicle and Vehicle-to-Infrastructure components of various established road safety applications such as advanced driver assistance systems and dedicated short-range wireless
communications. What V2X does is bring them together: it enables location awareness via GPS and situation awareness via dialogues between vehicles. Autonomous Cruise Control, shown on the left, employs sensors to detect the position of the vehicle in front and in adjacent lanes. V2X creates an awareness zone right around the driver’s vehicle. This is a good example of the way M2M is creating solutions that address intrinsic issues in our society, in this case deaths and serious injuries caused by traffic accidents and time-wasting holdups that increase air pollution.
Solutions that allow people, particularly elderly people, to self-medicate in a safe, secure way at home are being used. Nobody wants to go to hospital and medical resources are being stretched to the limit as populations age. In addition, embedded sensors and health kits are detecting potential problems at an early stage. Future developments look set to include application-centric services, e.g. guaranteed delivery of ehealth data coming from pacemakers, glucometers, and blood-pressure monitors.
Idezo employs wireless sensor technology in an e-Health solution designed for Post-Surgery Recovery Optimization. The solution connects wirelessly to a smart phone application running on the user’s preferred device. While walking the patient can see how much force they are applying to their legs. This data can be used to encourage walking during the recovery stage and can also be remotely monitored by a doctor or a medical assistant. The medical staff are therefore better informed about the recovery activities before follow-up hospital visits and the patient can take more control over their recovery process.
Moves is a new way to understand how much physical exercise you get – just by keeping your phone in a pocket or bag. The app automatically recognizes activities (walking, running, cycling, transportation), routes and places. The data is visualized on a map and as a daily storyline it encourages the user to take small steps towards more healthy habits and lifestyle. There is no need to start and stop the app as it runs continuously in the background.
Numerous machines populate the world: many are deployed in production processes that run 24/7 and any downtime can be horrendously expensive. Preventive maintenance is therefore essential but providing it on a global basis is
also expensive, which means that it has to be enabled from one or more remote sites. ABB, for example, has installed more than 1,000 robots in over 35 countries around the world. The company’s remote service solution lets the
company provide support without being at a customer’s site, thereby reducing the number of on-site travels. To see this solution in action point your browser to: http://www.youtube.com/user/eWONtv#p/a/u/2/EN_XAuHViIc
Surveillance and security
M2M is changing the landscape of surveillance and security solutions. The industry is undergoing a significant change as a result of technological advancements in alarm management, vehicle security, surveillance as well as the safety and security of employees. The ability to monitor job sites, assets, employees, and environments makes it easier to stay aware and in control. And real-time video, audio, and data surveillance solutions allow constant information about activities and conditions at remote locations.
In the home, motion sensors can set off an alarm if a window is opened without authorization and send notifications to emergency services. Wireless solutions can provide two-way communications to/from a central control station, thereby
enabling remote surveillance, access to control systems, motion detectors, lighting and access points. And these stations can remotely authorize access to controlled areas with the ability to open and close locks, doors, and gates via
the wireless network within seconds.
QGate has developed an innovative, easy-to-deploy home surveillance solution whereby a smartphone with QGate functions as a personal mobile controller. Consumers make an on-line purchase of what appear to be small black adapter
plugs that are placed between the socket and the relevant electrical device. The solution connects the device to the Internet. After registering, QApps are employed and operated via smartphones, tablets or a Web browser in order to
communicate with the devices and regulate their behaviour.
With QGate apps you can switch on or off the light at your home, monitor your basement’s temperature, manage your children’s TV watching age, or keep track of your refrigerator’s energy consumption via your smartphone. The solution can also forward messages via SMS, e-mail, Facebook or Twitter.
Energy harvesting gets tiny amounts of energy from the environment. It can come from changes in temperature or light. The electrical energy that’s obtained this way is stored in capacitors. This allows low power consumption sensors to run maintenance-free without batteries. In a networked home a window sensor could indicate that a burglary was taking place and a signal would go to the room controller, which in turn could send a signal to the police, a security firm and the owner’s smartphone.
EnOcean is the originator of patented energy-harvesting wireless sensing technology and the company’s solutions have been installed in over 250,000 buildings around the world. And there are more than 250 companies in the
EnOcean Alliance, which is a worldwide non-profit initiative based in California that establishes innovative automation solutions for sustainable building projects. A lot of growth is coming from Asia, particularly Japan, where there is
a lot of interest in energy-efficient buildings following the tsunami.
In Europe bus passengers press a button when they want the driver to let them off at the next stop. Regular push buttons communicate to the driver via yards of cable. Energy harvesting technology allows this to be done using a small,
battery-free microchip. When the stop button is pushed a radio signal is sent to the driver’s receiver module, which saves over 100 yards of cabling.
M2M-centric logistics takes in the movement of goods in warehouses as well as fleet management. RFID (radio frequency
identification) tags are employed to track the movement of pallets as they move through the warehouse. In addition,
RFID tracking systems can identify unscheduled movement, so managers and security can be alerted to possible theft.
Solutions can also reduce the time and cost for counting stock as it enters the warehouse by collecting the data automatically and virtually eliminating the need for manual intervention. M2M has also enjoyed considerable success in fleet management systems, which normally show the precise location of vehicles in real time on a PC or a wall display.
M2M continues to reach out and make a positive impact on more and more aspects of our lives. But delivery of the really big benefits to society at large is only at the green shoots stage, i.e. it’s early days. The deliverables are clear and the technology is up to the task, but significant investments are needed, together with the political will to make decisions whose tangible results only appear over time frames longer than a term of political office.
The really big picture
Beecham Research’s map shows segmentation of the M2M Market, including nine key Service Sectors, key Applications Groups within Sectors, and examples of Connected Devices within each Sector. Copies can be downloaded from the site.
Sensors are the starting point for most applications. They are used to measure physical, quantifiable parameters such as moisture, pressure, speed, movement, etc. Sensors provide the raw, analog data; they’re indispensable, but it’s easy to overlook this part of the value chain. In recent years we’ve witnessed the development of wireless sensors, i.e. freestanding, low-cost, battery-powered products that can transmit at line-of-sight distances of 900 feet (300 meters). In this case data is sent to a mains-powered concentrator that in turn communicates to the applications server. Monnit is market leader in this sector.
This company has introduced a gateway that can accommodate up to 100 sensors. The gateway aggregates the data, which the application analyses. Monnit has also developed a caller ID sensor/data aggregator that detects incoming phone calls on any landline phone and sends the caller ID information to a concerned person via SMS text or email. More and more elderly people are staying at home and they can become targets for cold callers who are pushing unwanted services and products, which causes confusion. The caller ID is monitored along with other information like time and duration and this information is sent to a relative or healthcare worker.
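The aggregation step can be pictured with a short sketch. This is an illustrative model only, with hypothetical names, and it is not Monnit’s actual API: a gateway buffers readings from its sensors, up to its capacity, and periodically flushes an aggregate, such as a per-sensor average, to the application.

```python
# Illustrative sketch of a gateway aggregating sensor readings before
# forwarding them to an application server. All names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Reading:
    sensor_id: str
    value: float

@dataclass
class Gateway:
    capacity: int = 100            # e.g. up to 100 sensors per gateway
    buffer: list = field(default_factory=list)

    def collect(self, reading: Reading) -> None:
        # Refuse readings that would exceed the sensor limit.
        sensors = {r.sensor_id for r in self.buffer} | {reading.sensor_id}
        if len(sensors) > self.capacity:
            raise ValueError("gateway sensor limit exceeded")
        self.buffer.append(reading)

    def flush(self) -> dict:
        """Aggregate buffered readings per sensor and clear the buffer."""
        grouped: dict = {}
        for r in self.buffer:
            grouped.setdefault(r.sensor_id, []).append(r.value)
        self.buffer.clear()
        return {sid: sum(vals) / len(vals) for sid, vals in grouped.items()}

gw = Gateway()
gw.collect(Reading("temp-01", 21.5))
gw.collect(Reading("temp-01", 22.5))
print(gw.flush())   # {'temp-01': 22.0}
```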
Devices are the second component in the value chain. They are the beating 24/7 hearts of most M2M applications and they determine a solution’s performance and reliability. Devices measure the sensor’s analog signal at regular intervals, digitize the result and transmit it over a communications network to the application server. Signals can be processed in any way that makes business sense. A typical device comprises a wireless module (aka modem), a sensor, an antenna, a SIM card that identifies the device, and in most cases a microcontroller that manages the module using AT commands. Normally everything will be enclosed in a rugged housing. Chips that provide information on the location of the devices can be included and in line with Moore’s law, they are now small enough to be embedded in the module.
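A device’s duty cycle, sample the analog signal, digitize it, and build a small packet for transmission, can be reduced to a sketch along these lines. The function names, reference voltage and 10-bit resolution are illustrative assumptions, not any vendor’s API:

```python
# Hypothetical sketch of an M2M device's sampling cycle: read the
# sensor's analog output, digitize it, and build the small payload
# that the wireless module would transmit to the application server.
import time

def read_analog() -> float:
    """Stand-in for an ADC read of the sensor's analog output."""
    return 3.3 * 0.42   # e.g. 42% of an assumed 3.3 V reference

def digitize(voltage: float, v_ref: float = 3.3, bits: int = 10) -> int:
    """Convert a voltage to a raw ADC count (10-bit here)."""
    return round(voltage / v_ref * (2 ** bits - 1))

def transmit(device_id: str, sample: int) -> dict:
    """Build the small timestamped payload sent over the network."""
    return {"id": device_id, "sample": sample, "ts": int(time.time())}

packet = transmit("meter-0042", digitize(read_analog()))
```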
3D-SiP (System in Package) is a relatively new development. 3D refers to the fact that components are placed on the top layer of the stack (x and y direction) and passive components are also embedded in other layers (the z direction).
The objective is to minimize the size of the module, which facilitates the creation of new solutions such as GPS-enabled smart watches. Check out the video on Limmex’s site.
As far as I could find out only one vendor, Telit, is marketing a 3D module. It’s a GPS solution whose form factor is a mere 4.7 x 4.7 mm, so it could sit on the fingernail of a lady’s little finger.
Drinks that think
Time for something different. Sensors are available for a wide range of analog parameters: temperature, sound, vibration, pressure, motion, light and air/soil pollutants. That’s pretty much what you would expect. But how about sensors that measure the amount poured from bottles containing alcoholic beverages?
I think it’s safe to assume that at some time you have sat in the bar or lounge of a 4/5-star hotel and ordered a drink (maybe two). These beverages are not cheap, which means that staff might be tempted to pour short measures and pocket the cash difference. And there are other issues that can add up to a point at which 20% of all alcoholic beverages disappear, i.e. apparent consumption and sales receipts cannot be reconciled.
BEVERAGE METRICS has developed the “Drink that Thinks” solution. It enables wireless sensors to monitor all liquor, beer and wine sales from bottles, with or without pour spouts. State-of-the-art RFID sensors use sophisticated accelerometers to generate data on the volume poured as well as the beverage type and time poured. The data variables
that are measured include bottle tilt, spout size, and the temperature of the liquid. Viscosity is also included in calculating the amount poured. This parameter, which is a constant, is linked to an algorithm based on the brand.
The sensors, which are attached to the neck of bottles, match the beverage: each pour is tracked and the inventory is updated in real-time. In addition, point-of-sale reconciliation matches poured ingredients for all charged drinks. This indicates that there is seamless integration between the M2M solution and the back office inventory and charging systems. It is worth noting that processing the parameter data is done locally by the wireless modules.
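As a rough illustration of how such parameters could combine, here is a hypothetical back-of-the-envelope model. Beverage Metrics’ actual algorithm is proprietary; the coefficients and the flow relationship below are invented for the sketch:

```python
# Hypothetical pour-volume model: flow through the spout scales with
# spout cross-section and tilt angle, and inversely with the brand's
# viscosity constant. The coefficient 0.05 is invented; the real
# algorithm is proprietary to the vendor.
import math

def estimate_pour_ml(tilt_deg: float, pour_seconds: float,
                     spout_diameter_mm: float, viscosity_factor: float) -> float:
    """Estimate millilitres poured from tilt, duration, spout size and viscosity."""
    spout_area_mm2 = math.pi * (spout_diameter_mm / 2) ** 2
    # Flow rises with tilt and spout area, and falls with viscosity.
    flow_ml_per_s = 0.05 * spout_area_mm2 * math.sin(math.radians(tilt_deg)) / viscosity_factor
    return flow_ml_per_s * pour_seconds

# e.g. a 2-second pour from a fully inverted bottle
vol = estimate_pour_ml(tilt_deg=90, pour_seconds=2.0,
                       spout_diameter_mm=10, viscosity_factor=1.0)
```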
The value chain
We need to take a short look at the generic value chain since it not only influences a number of developments, but it also explains why a semi-seismic shift is taking place in the industry.
As illustrated in the following figure, it’s very simple. Capture data, transmit the data, and then analyze the data, thereby turning it into real-time information. An M2M application is therefore a data communications app. A typical solution can involve five or more vendors, which means that M2M can appear to be complicated. There is, however, nothing intrinsically complex about the process. Different vendors are needed because different technologies make up the various links.
The chain starts with sensors and ends with applications that run on a computer system. You can think of it as a middleware platform that’s used to convert raw data into real-time, actionable information. In between there is a
service delivery platform that manages the data as it flows over the network. This task is particularly critical in the case of cellular networks because the volume is high and numerous SIM cards need to be managed.
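The three links can be reduced to a minimal sketch, with each stage function standing in for what a separate vendor supplies; the threshold and sample values are invented:

```python
# Minimal sketch of the generic value chain: capture raw data,
# transmit it over a network, analyze it into actionable information.
def capture() -> list:
    """Stand-in for the sensor link: raw analog samples."""
    return [20.9, 21.4, 23.8]

def transmit(samples: list) -> list:
    """Stand-in for the network hop managed by the delivery platform."""
    return list(samples)

def analyze(samples: list) -> dict:
    """Stand-in for the application: turn raw data into information."""
    avg = sum(samples) / len(samples)
    return {"average": avg, "alert": avg > 22.0}   # invented threshold

info = analyze(transmit(capture()))
```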
The model has worked well: it underpinned M2M’s remarkable success, but success has also led to a new set of requirements from the market as well as changes within the industry. The business case has been proven in applications that address widely different vertical markets: automotive, buildings, smart energy, homes, healthcare, etc. However, these apps run in proprietary, vertical solutions (rigid silos) that stand apart from standards-based, horizontal enterprise environments. This is therefore an intrinsic issue for a market that wants to leverage its M2M investment by enabling integration with the mainstream ICT environment. This is a key development and progress in this area is covered in more detail on page 19.
Various changes are going on within the industry, but moving up the value chain is, in my opinion, the most significant. Mobile network operators, for example, want a more profitable slice of the action. There are a few notable exceptions, but most MNOs have tended to ignore M2M because the ARPUs (Average Revenue Per User) are much lower than those of voice/data subscribers. In the past their networks were mere bit pipes: now they are looking to turn them into value-added pipes.
As indicated earlier, wireless devices are a critical component and they need to run 24/7 for many years: 10, 15 or more. Think about energy meters in the home. Vendors therefore make a series of one-off sales, which are preceded
by significant investments in new models and certification: a lengthy process. Telit has retained this business model, but in 2011/12 they moved up the value chain via the development of a module management system and a revenue sharing agreement with Telefónica, which is a large, international MNO. This is another key development and it is covered in more detail on page 25.
Fifty sensor apps for a smarter world
In April 2012 I heard about Libelium, a Spanish vendor, and I downloaded their sensor application document from the site. Here are a couple of paragraphs from the introduction:
“In Libelium we believe that the new Internet of Things requires an open platform capable of dealing with different technologies, communication protocols and sensor databases. For this reason we released 2 years ago the first Wireless Sensor Network Platform to be open source, horizontal, modular and accessible to help developers design and deploy sensor applications on top, easily and within the minimum time to market. This new platform reached the market under the name ‘Waspmote’.
During the past 2 years, more than 2,000 developers worldwide have joined our sensor platform, creating a compact and incredibly reliable framework, which forms the base of the Libelium Community. They have proved Waspmote’s
versatility by creating amazing applications and new business models. In this document we show just 50 of the hundreds that were sent when we started asking our Community members about what they had been doing with Waspmote during the past two years. We want this document to be an inspirational guide that helps you to create imaginative and profitable applications in the new Internet Of Things era. Our mission is supporting you along all this way.”
Open source, horizontal and modular ticked my M2M boxes, particularly modular. Wanting to know more I did an interview with the company’s CEO, Alicia Asin Perez, for m2m apps. Here are a few extracts:
“ … whether you are talking about M2M or the Internet of Things you have to employ sensors: they are the common element. You need them to acquire data and that is a horizontal function that applies across the board. This means that the term wireless sensor networks is also horizontal, it takes in both M2M and IoT applications. That is why we employ wireless sensor networks as an umbrella term.
We see ourselves as an enabler for the community and that is why we have so many certified sensors as well as the mainstream communication protocols: ZigBee, Bluetooth, plus 2.5G and 3G cellular. Developers can pick the relevant board
and employ the relevant sensor or sensors. The main printed circuit board has sockets for the plug-in modules. The result is a customized, plug and play device that works with the relevant wireless network.
As far as we know it is the only platform on the market that is horizontal and modular. Developers can take off-the-shelf generic products, plug them together and have the data acquisition and data transmission components of a solution up and running in a matter of minutes.”
There are, in my opinion, three different perspectives on M2M’s ability to cut costs, save time, improve operational efficiency and enhance customer service:
1: Users and businesses. There are numerous consumer applications and the best of breed look set to bring tangible benefits to the personal side of our lives. For example, to dial a taxi service, get an estimated time of arrival message on
your smart phone, and be able to track the progress of the taxi in real time on the screen. Another neat automotive app is the ability to pay for parking via the phone and get a text message before the time is up so that you can make a top-up payment and avoid paying a fine.
These are the kind of apps that deliver tangible benefits and in a few years we’ll take them for granted. Putting money into a meter will, like rotary dials on telephones, be a relic of times past.
I’ve included businesses in the same perspective as users because once benefits such as reduced costs and improved productivity are realized they too will be taken for granted. These and other benefits are being realized in myriad
ways and more, some not foreseen right now, are coming.
We are also starting to see significant benefits being realized via the seamless merger of real-time information coming from mobile employees with that emanating from M2M devices. For example, enabling people, remote devices, and enterprise business systems to conduct real-time communications using a platform that provides a set of messaging and infrastructure services.
2: Solution and service providers. The business proposition for solution providers is obvious: the industry is healthy and growing. The rate is different for different sectors but it’s between 10 and 20% p.a. Innovation abounds and growth looks set to accelerate via the emergence of standards, open systems and migration to the cloud computing model. Without standards M2M will never realize its full potential and ICT management will not fully embrace the technology until they are able to manage M2M devices in the same way as PCs.
Equally obvious is the fact that while solution providers created the market, most service providers — the mobile network operators — only played a passive role. The revenue streams coming from parameter data were realized by default and of course M2M represents a relatively small sector of a gigantic global telecoms marketplace.
Recently, however, the Tier 1 operators have recognized the consumer opportunity that has opened up as a result of hardware developments like smartphones as well as applications such as smart meters and security systems, where subscriber numbers will run into tens of millions.
3: Driving forces. Solution providers created the market in various verticals, but in the medium- and long-term governments and the public sector will be the principal driving forces. M2M has the proven ability to address big issues such as healthcare, transportation, energy conservation and the environment. It has been proven, albeit on a modest scale, because huge investments are needed in order to deploy nationwide solutions and right now governments’ focus is on balancing the books. But — and it’s a very big but — the potential savings are huge, which means that the equally huge benefits to society can be regarded as a bonus.
Solutions that allow people, particularly elderly people, to self-medicate in a safe, secure way at home are being used. Nobody wants to go to hospital and medical resources are being stretched to the limit as populations age. In addition, embedded sensors and health kits are detecting potential problems at an early stage.
The need to improve the efficiency of electricity grids hit the media headlines following the announcements made in 2009 by the Obama administration. In Europe member states are required to undertake an economic assessment of the costs and benefits of smart electricity metering, after which there will be a rollout of the smart meter.
Prison reform is a very contentious subject and views vary widely from country to country. Sweden has saved around $160M via a parole program based on wireless-enabled electronic tags and the reoffending rate is a mere
15%, so both money and prison resources are saved. It’s an approach that helps get first-time offenders back into society and changing the parole conditions via the Web can be used to reward good behavior.
Electronic tags are another example (and there are more) of an M2M investment that can be justified financially, which is obviously important. But the bonus, the long-term benefit to society is arguably more significant.
M2M solutions may involve the deployment of thousands, tens of thousands or even millions of devices. The latter figure is actually quite modest when one considers nation-wide installations of smart energy meters in homes. As indicated earlier, they transmit small amounts of data at regular intervals to an application server. The application needs to identify the source, i.e. this data packet comes from device “a”, that packet from device “c” and so on.
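In code, that routing step might look like the following sketch; the field names are assumptions, not a standard:

```python
# Illustrative sketch of how an application server might file incoming
# packets by the ID of the device that sent them.
from collections import defaultdict

readings_by_device = defaultdict(list)

def ingest(packet: dict) -> None:
    """Store each packet's payload under the device that sent it."""
    readings_by_device[packet["device_id"]].append(packet["payload"])

ingest({"device_id": "a", "payload": 118})
ingest({"device_id": "c", "payload": 231})
ingest({"device_id": "a", "payload": 121})
# readings_by_device == {'a': [118, 121], 'c': [231]}
```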
SIM cards that are embedded in wireless devices provide the ID just as they do in cell phones. This means that all those devices have to be activated so that the network operator can recognize them for billing and other purposes. However in the case of smart meters the cost of employing SIMs in every meter would be prohibitive and an alternative solution has been developed. It’s covered in the section on RF communications: page 24.
Until recently large-scale activation was a tall order, which took time, and time is money, particularly when it involved technical resources. However, there are solutions that enable out-of-the-box connectivity. This means that the network
will configure the wireless devices automatically as soon as they are powered and that data streams will start to flow immediately afterwards. The first such solution I saw came from Monnit.
Monnit’s core competence is wireless sensors but they also wanted to market solutions and as indicated earlier they needed a gateway that would accommodate multiple sensors and aggregate the data. They couldn’t find a product with the right spec and they didn’t want to make M2M hardware so they had one developed. The company they chose had a contractual agreement with iMetrik Global, whose network provides instant access in over 120 countries through 170 GSM operators. This enabled the development of a gateway that could be shipped pre-activated on the network, which allowed solutions to be used right out of the box.
In an earlier section I indicated how the silo model that has worked well for over a decade is changing in order to reflect market requirements, i.e. the need to leverage investments in M2M solutions. However, if you take a deeper dive into the model it becomes clear that there are other intrinsic issues that have to be addressed.
The communications protocol
M2M traffic comprises very small data packets that are transmitted at regular intervals, e.g. every 15 minutes. They are sent using the communications protocol (language) of the Internet (IP) and everything works. But it is not an efficient way to handle low-volume traffic. In addition, network congestion becomes a significant issue when those billions of Internet “things” drive traffic up to unsustainable levels, and the Net is going to support a vast range of new Web services. In Holland, for example, sensors are being used to check the health and fertility status of cows, each one of which will transmit around 200MB of data a year!
The issue is intrinsic because a typical M2M payload may be under 30 bytes but the communications overhead can be 500 to 600 bytes: well over an order of magnitude. This comes from the fact that today’s Web services protocol stack was developed for the kind of traffic generated by computers. What’s needed is communications that’s optimized for resource-constrained environments.
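A quick back-of-the-envelope check confirms the scale of the claim: with a 30-byte payload and 500 to 600 bytes of overhead, the payload is only around 5% of what goes over the air.

```python
# Payload efficiency for a 30-byte M2M message carried with the
# 500-600 bytes of overhead typical of a computer-oriented stack.
payload = 30
for overhead in (500, 600):
    total = payload + overhead
    print(f"{overhead} B overhead -> payload is {payload / total:.1%} of traffic")
# 500 B overhead -> payload is 5.7% of traffic
# 600 B overhead -> payload is 4.8% of traffic
```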
Sensinode has been a driving force behind CoAP (Constrained Application Protocol), which is designed for M2M apps such as smart energy and building automation. According to Sensinode, utilities are one of the driving forces behind this development. In the US they foresee the deployment of 20 devices per home and there are over 100 million homes: the maths could hardly be simpler.
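The simple maths in question: 20 devices per home across 100 million homes implies around two billion constrained devices on US grids alone.

```python
# Scale of the US smart-energy opportunity Sensinode alludes to.
devices_per_home = 20
homes = 100_000_000
total_devices = devices_per_home * homes
print(f"{total_devices:,}")   # 2,000,000,000
```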
The communications network
We’re about to slaughter a sacred cow. Despite considerable success, as evidenced by the plethora of innovative solutions for consumers, companies and society, there is a fundamental issue that the industry, particularly the MNOs, chooses to ignore. Today’s wireless networks are not truly fit for purpose: they do not and cannot enable the optimum delivery of M2M and IoT traffic.
Cellular networks were designed for voice and high-volume data traffic, not intermittent payloads of under 30 bytes only requiring a throughput of 100 bps. However, protocols are only part of the problem. The real issue is intrinsic: in order to take M2M and the IoT to the next level and enable a cost-effective, long-term future we need a new kind of network.
This is a summary of the requirements of a new network, as proposed by SIGFOX:
- Low cost and low energy consumption; the latter is needed to increase battery life
- Ease of use; including device management and integration with IT systems
- Frequency independent; thereby facilitating cost-effective world-wide coverage
- Embedded subscriber identification; no SIM cards, no need for subscriber lifecycle management
SIGFOX realized this objective by employing a patented radio technology that’s based on UNB (Ultra Narrow-Band), which employs the license free ISM bands. UNB enables the cost-effective transmission of data over a very narrow
spectrum to and from connected objects. The devices have outstanding sensitivity, which in turn minimizes the number of antennas (base stations) needed for wide area coverage.
SIGFOX has deployed a nationwide network in France using 1000 antennas, which the company financed for a very modest 3 million Euros. Currently each base station can handle up to a million battery-operated devices. Energy consumption is 200 to 600 times lower than that of an equivalent cellular network, and the ratio of payload data to protocol data is normally between 20 and 50%. In cellular networks it would be 1% or less.
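Some back-of-envelope arithmetic on the SIGFOX figures quoted above shows why the economics are attractive. All inputs come from the text; the per-device cost is simple division and assumes every base station runs at full capacity.

```python
# Back-of-envelope figures from the SIGFOX deployment described above.
base_stations = 1_000
capex_eur = 3_000_000
devices_per_base_station = 1_000_000   # upper bound quoted in the text

total_device_capacity = base_stations * devices_per_base_station
infrastructure_cost_per_device = capex_eur / total_device_capacity

print(total_device_capacity)           # 1,000,000,000 devices (theoretical max)
print(infrastructure_cost_per_device)  # 0.003 EUR of infrastructure per device
```

Even if real-world capacity were a small fraction of the theoretical maximum, the infrastructure cost per connected object would remain a tiny number.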
Furthermore, each of the modems used in the network (to emit and receive data) is significantly less energy intensive than devices using competing technologies. Network performance for M2M traffic is therefore more efficient and communication costs are much lower, basically because the network is fit for its dedicated purpose. The networks of MNOs are colossally over-engineered for almost all M2M applications.
There is a lot to like about this development. Time will tell how fast and how far it will go and in the meantime, in common with other game-changing technologies, we can expect the concept to be trashed by the competition, i.e. the MNOs.
New location paradigm
It used to be that determining your location meant GPS or nothing. While GPS is a highly accurate positioning technology, it does have drawbacks with regard to coverage between buildings, indoors, and underground. Rx Networks saw a market niche for addressing customers who only require “city-block” level accuracy and want location performance in locations where GPS won’t work.
The solution, XYBRID RT™ (Real Time), is a cloud-based Cell-ID location service that is now a standard, no-cost service on all Telit modules. How does it work? When a Telit module hears or uses the cellular network it notes the “IDs” of the cellular transmitters within range and sends them over the Internet to Rx Networks’ servers. XYBRID RT then looks up those Cell-IDs in its database (containing over 40 million Cell-IDs worldwide) and returns the latitude/longitude to the device. For devices that are connected only occasionally, the Cell-IDs can be captured and then post-processed to identify the route of the device.
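The lookup flow described above can be sketched in a few lines. The database contents, cell identifiers and function name here are hypothetical illustrations; the real XYBRID RT service resolves IDs against its 40-million-cell database server-side.

```python
# A minimal sketch of a Cell-ID lookup. Keys are hypothetical
# (MCC, MNC, LAC, Cell-ID) tuples mapping to (lat, lon) pairs.
CELL_DB = {
    (204, 8, 1024, 31337): (52.3702, 4.8952),   # e.g. a cell in Amsterdam
    (208, 1, 2048, 40001): (48.8566, 2.3522),   # e.g. a cell in Paris
}

def locate(visible_cells):
    """Return the centroid of the known cells a device can hear."""
    hits = [CELL_DB[c] for c in visible_cells if c in CELL_DB]
    if not hits:
        return None  # no match: fall back to GPS or report 'unknown'
    lat = sum(p[0] for p in hits) / len(hits)
    lon = sum(p[1] for p in hits) / len(hits)
    return (lat, lon)

print(locate([(204, 8, 1024, 31337)]))  # (52.3702, 4.8952)
```

Averaging over several visible cells is what yields the “city-block” level accuracy the service targets, without any GPS hardware.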
The smart grid
When I interviewed Baard Eilertsen, CEO of Maingate, he indicated the need to add intelligence to the IoT environment: convert information into knowledge and distill out what is useful, and we would have an Internet of Knowledge (IoK) that could deliver some really great benefits. This term isn’t likely to catch on, but we should be considering the IoT in a more meaningful way, i.e. what are the end-user benefits? For example, if a utility tried to analyze all the data that is being generated in the smart grid environment it would take a long time and they wouldn’t learn anything useful. What utilities really want to know is very basic: what is going on at the transformer stations, and they want to know now.
That knowledge is needed because we’re getting close to the time when Europe-wide blackouts will take place, and it will happen because the industry is losing control of the infrastructure. The critical issue is the fact that flows are bidirectional. Energy is being fed into the distribution networks from market-generated energy sources like solar panels. This process cannot be controlled and it’s changing the grid status in terms of frequency and power loads; grids will shut down unless safety procedures are implemented.
It’s a scary scenario but Baard told me that while the solution is not simple, it is available. Firstly we have to get a common framework where regulated and unregulated businesses can co-exist with the sole goal of making the energy grid future proof. In other words, we need a Smart Grid!
The interview was refreshingly different: it was straight talk and Baard was shooting from the hip. Sometime later Maingate published three hype-free white papers: “Closing the gap between the electric utility and its customers”; “Embracing Technology in the UK power market”; and “Digitalise or Die”. I wish that more vendors took this approach.
M2M in the cloud
This is an extract from an article that I wrote in 2011.
Locating solutions in the cloud enables tangible benefits such as efficient, flexible use of computing resources, but there is a generic benefit that occurs when communication services are developed using Internet approaches and are delivered using Internet business models.
Currently the business logic of enterprises and other large organizations is normally implemented in services that speak the language of the Web. Therefore, if M2M services are implemented on top of the same infrastructure, then the same developer tools, processes and people can be used to cost-effectively integrate M2M communications into new and existing business processes. For example, data obtained from people who self-medicate in the home could be integrated with patient data held in a healthcare database.
The next step is really interesting, because the cloud model enables data to be shared with other authorized parties, e.g. it enables doctors and pharmaceutical companies to evaluate the treatment in granular detail. That won’t happen overnight because of regulatory concerns, but it does illustrate how an M2M solution developed for one process could enhance another and it would be cost-effective because both are based on the language of the Web.
If we drop down to a less ambitious example, the conservation of food in hotels and restaurants, then we can see a real-world example of one process enhancing the value of another. Refrigerators store the food and the temperature is monitored and managed using an M2M solution. Very simple. It can also monitor stock levels and issue a replenishment order to suppliers. Not rocket science.
The supplier activates the order and the supply chain management system tells the customer when the new products will arrive. At the relevant time the temperature of the fridges will be lowered in order to prevent it going over the limit when the new, relatively warm stock is added. Humans are not involved.
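The fridge scenario above is a small example of one M2M process (temperature control) reacting to another (supply-chain delivery data). The sketch below is hypothetical: the setpoints, the pre-cool offset and the reorder threshold are illustrative assumptions, not figures from any real deployment.

```python
# Hypothetical fridge logic: pre-cool before warm stock arrives, and
# reorder automatically when stock runs low. All thresholds are assumed.
TARGET_C = 4.0
PRE_COOL_OFFSET_C = 2.0   # cool harder just before warm stock is added
REORDER_LEVEL = 20        # units

def setpoint(hours_until_delivery):
    """Lower the setpoint shortly before a delivery of warm stock."""
    if hours_until_delivery is not None and hours_until_delivery <= 1:
        return TARGET_C - PRE_COOL_OFFSET_C
    return TARGET_C

def check_stock(units_in_stock):
    """Issue a replenishment order when stock falls below the threshold."""
    return "order" if units_in_stock < REORDER_LEVEL else "ok"

print(setpoint(None))   # 4.0 -> no delivery scheduled
print(setpoint(0.5))    # 2.0 -> pre-cooling before warm stock arrives
print(check_stock(12))  # 'order'
```

The point is not the code itself but the hand-off: the supply chain system feeds the delivery time into the temperature controller, with no human in the loop.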
In reality a solution would rarely be implemented for a single process. The concept is broad-based and, as illustrated, would typically include energy management, transportation, security, communication and so on, as well as inventory management. The visual comes from Viewbiquity, a vendor that employs a hybrid cloud architecture to facilitate this kind of process-to-process communication.
M2M as an integral part of the enterprise
A battle is brewing for this high-margin slice of the M2M cake. The objective is to enable easy integration of M2M data into mainstream applications, databases and enterprise service buses in order to optimize critical business processes. Think of it as the “Internet of Corporate Things”:
M2M solutions deliver significant benefits, as outlined earlier, but the great majority of wireless solutions are realized in vertical, so-called silo architectures. In addition, proprietary technology is employed because there was, and still is, a dearth of standards. This is in stark contrast to the enterprise model, which is horizontal and based on standards. M2M applications such as fleet and asset management and mainstream business apps like ERP and CRM therefore function in alien environments. Ideally integration would be based on M2M solutions that are standards-based, open and cloud-centric. But the key objective is the seamless, bidirectional transfer of information between M2M apps and business processes.
Systems integration requires specialist know-how and experience in both environments, and that is hard to acquire. The big SI players tend to see M2M as an industry that operates on the dark side of the moon, and regular M2M vendors are way out of their comfort zone when it comes to enterprise back-office systems.
Axeda is one of a handful of vendors that has the requisite set of M2M and enterprise skills, which they position as a B2B subset of the IoT. The offer is predicated on the management of the physical products that a company delivers to its customers and the management of a company’s corporate assets including facilities and plants, equipment, vehicles and goods being delivered.
The offer also includes managing internal corporate assets and operating the infrastructure, which when you think about it requires an understanding of the complex web of interconnectivity between assets, places, people and information. This indicates that making M2M an integral part of the enterprise isn’t a walk in the park, but it’s doable and the benefits are compelling and proven. Companies need to integrate M2M data into their mainstream applications and processes in order to leverage the functionality of their connected products and assets and thereby create new applications and business models that differentiate their offer.
It is clear that the integration process must not interfere with the legacy solution, i.e. it goes on running as before. This is realized by a rules engine in the cloud that decides what to do with the data. If it’s for the legacy app then it will be forwarded to the regular application server: no rule is applied. Data destined for integration can have different rules. For example, if the device data indicates an abnormal condition it will be forwarded to the relevant mainstream app, because it’s data that they may need to see immediately. According to Axeda this is a key requirement, because if a pure message broker is used then all the M2M data will go to the backend systems and they will be overwhelmed.
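The routing behaviour described above can be sketched in a few lines. This is not Axeda’s implementation; the rule, field names and destination names are hypothetical illustrations of the pattern: legacy traffic always passes through, and only rule-matching data fans out to enterprise systems.

```python
# A minimal sketch of a cloud rules engine: the legacy application keeps
# receiving everything, while data matching a rule is also routed onward.
def route(message):
    """Decide where each device message should be forwarded."""
    destinations = ["legacy_app_server"]   # legacy app always keeps working
    # Example rule: abnormal temperature readings also go to maintenance.
    if message.get("type") == "temperature" and message.get("value", 0) > 80:
        destinations.append("maintenance_system")
    return destinations

print(route({"type": "temperature", "value": 25}))   # ['legacy_app_server']
print(route({"type": "temperature", "value": 95}))   # both destinations
```

Contrast this with a pure message broker, which would forward every reading to the backend systems regardless of relevance.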
This was an extract from an article I wrote for No Jitter. The hyperlink will take you to the full article, which contains the alternative approaches of two other leading vendors.
Analyze M2M data in real time
As covered in the previous section, M2M solutions turn event and parameter data into sector-specific information, which transitions into a corporate asset when it is integrated into the enterprise environment and filtered so as to meet a specific set of end-user requirements. But the really big benefits accrue when visual analytics is employed: when live data becomes real-time, insightful intelligence on which important business decisions can be and are being made.
The ability to transition from raw device data to decision-making processes based on customized dashboards that pinpoint operational and financial trends and issues in real time is an exciting, innovative concept. Moreover, it’s a logical development that addresses a generic issue: organizations lack real-time insight into the critical aspects of their business, aspects that are getting increasingly complex in today’s highly competitive, global marketplace.
So far we’ve only shown how data can be filtered and directed to a specific system, but that leaves IT with the task of processing it in order to deliver real-time intelligence to different parties, e.g. C-level management, business analysts, engineers and scientists. That’s a challenge too far. What’s really needed are integration tools that can be used by business users to create their own data visualizations: queries and dashboards that allow them to conduct analytics in real time. In a nutshell, users at all levels in an organization want a Google-type experience, i.e. effortless ways of finding the requisite information: that is the expectation bar.
For example, customer service staff want to make on-the-spot decisions based on the profitability of customers, using data from CRM, transactional and data warehouse systems. Operations executives want to be able to prioritize production orders, taking into account scheduling and forecasting data that’s at their fingertips.
However, conducting analytics in real time is not limited to M2M. It’s a new addition to the Big Data scenario, which comprises historic and transactional data. When all these components are mashed together management and other interested parties get the full 360-degree picture.
This do-it-yourself approach to analytics reminds me of the early days of PCs, when employees downloaded mainframe data into spreadsheets like Lotus 1-2-3 to create financial and other models in a few days versus several months. In today’s data-driven business climate it’s very similar: users cannot rely on IT to unlock data sources.
Now it gets really interesting
Big data is seriously big: it runs into terabytes and petabytes and it keeps on flowing into enterprises 24/7 from numerous sources. This is where in-memory technology enters the equation. Data processing normally involves the transfer of data between the computer’s random-access memory (RAM) and disk storage, but this established technique doesn’t cut it for big data. It’s too slow and the disk would be constantly thrashing about and wearing itself out. Therefore the processing is done in RAM, which is about 1,000 times faster than doing an equivalent task in the traditional way. A massive amount of RAM is needed, which can run into terabytes, but it’s affordable. Today it costs around $1 per gigabyte and this figure continues to head south. Back in the mainframe-computing era one gigabyte would have cost $512 billion.
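The cost figures quoted above are worth running through. This is simple arithmetic on the numbers in the text: roughly $1 per gigabyte today versus $512 billion per gigabyte in the mainframe era.

```python
# Quick arithmetic on the memory-cost figures quoted above.
price_now_per_gb = 1.0            # today's approximate figure, USD
price_mainframe_per_gb = 512e9    # mainframe-era figure from the text

terabyte_of_ram = 1024 * price_now_per_gb
print(terabyte_of_ram)            # ~$1,024 for a full terabyte of RAM

ratio = price_mainframe_per_gb / price_now_per_gb
print(f"{ratio:.0e}")             # price per GB fell by a factor of ~5e+11
```

A price drop of eleven orders of magnitude is what makes keeping entire datasets resident in RAM economically plausible.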
In-memory technology deployed on computer servers therefore allows high-speed processing of M2M data that has been integrated into the enterprise environment. Moreover, it can be combined with data emanating from mainstream processes such as CRM and ERP.
The ability to process entire datasets in high-speed memory opens the way for more sophisticated market analysis, what-if analysis, data mining, and predictive analytics. And of course the results can be visualized, which is the way we remember information. In addition fast, easy-to-run analytics frees up end users’ imaginations, enabling them to pose questions they wouldn’t even have thought of asking before. This indicates that in-memory analytics is more than a new technology. It’s a disruptive game-changer.
M2M and social networks
A new study from ABI Research indicates that “… there’s an intriguing intersection developing around social networks and M2M services. At this stage the activity is nascent.” However, one company, WaveNconnect, seems to be ahead of the game: they have created a solution that has been trialled in real-world scenarios and it’s delivering innovative, versatile functionality. The basic concept is to combine the power of social media with a Web-based M2M application that employs wireless and NFC (Near Field Communication) technology. Here’s how it works in an Expo scenario:
The organizer gives attendees an NFC tag card, which allows them to employ their preferred social network, e.g. Facebook. Exhibitors rent and activate stations supplied by WaveNconnect: when visitors tap the card on the station a wireless link is established and a message is sent to the visitor’s account. The message would identify the product, post a link, add a message or simply tell the visitor’s friends that he/she likes it. The exhibitor will also be informed about the various “likes”, clicks and shares generated.
Tapping on an Email station is a neat service because it removes the need to take literature home. The exhibitor has your email address and everything you want will be waiting for you on your return. However, there may be times when you want to tell friends and/or colleagues about a particular product or service, in which case you have to configure the card for email. To do that you go to the camera station, which employs an Android tablet. Stand in front of this station, tap the card and your photo will be taken. If you like it, it will be posted along with a message that is keyboarded on the tablet.
WaveNconnect generates detailed real-time reports about the interactions generated: the number of shares and clicks, per country and even per city. For example, a time-based evaluation of the usage of each Connect Station can be used to generate additional sales leads and new marketing strategies. One of the trials was conducted at a relatively small Expo in North Africa, where the exhibitor was a major car manufacturer. The combination of Facebook shares and camera station posts resulted in marketing data being generated in distant locations as well as locally; the back-office system displays them on a map. This allowed the car manufacturer to quantify the level of interactions generated by the visitors’ friends and colleagues and assess their interest. For example, the Expo in North Africa generated additional leads in Ashburn, Virginia.
Local area processing
The majority of today’s solutions acquire data locally and process it at a central facility. However, the modules that are embedded in wireless devices have become smart microprocessors and their “smarts” have grown in line with Moore’s law, which states that the number of transistors that can be placed on an integrated circuit doubles every two years (sometimes quoted as every 18 months).
This allows certain applications to run in a local area “device domain”, as shown here. The network domain provides connectivity and management functionality. This concept allows decisions to be taken at the local level and when there is an exception it can be handled centrally.
Let’s take a simple, hypothetical example. M2M pill dispensers are being used in the homes of elderly patients. If the pill is taken at the right time it will be logged locally and, if required, a batch file of dates and times could be sent to the central facility. If the pill is not taken at the right time the device might display a flashing light; if it isn’t taken the next day an audible alarm could be added and an SMS could be sent to a neighbor or a relative. On the third day the message could go to the doctor, a local clinic or a health care worker.
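The escalation logic in that hypothetical example maps naturally onto a small decision function. The action names below are illustrative; the point is that everything routine is handled locally, and only exceptions escalate to the central facility.

```python
# A sketch of the pill-dispenser escalation described above: local
# handling first, escalating only on consecutive missed doses.
def escalate(days_missed):
    """Map consecutive missed doses to an escalating set of actions."""
    if days_missed == 0:
        return ["log_locally"]                       # normal case
    if days_missed == 1:
        return ["flash_light"]                       # gentle local reminder
    if days_missed == 2:
        return ["flash_light", "audible_alarm", "sms_relative"]
    return ["notify_doctor"]                         # third day onwards

print(escalate(0))   # ['log_locally']
print(escalate(2))   # ['flash_light', 'audible_alarm', 'sms_relative']
print(escalate(3))   # ['notify_doctor']
```

Running this on the device itself means the central system only ever sees the rare third-day case, which is the essence of the device-domain concept.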
Telit has pioneered this concept of data processing in the device domain, e.g. by enabling the application software to be embedded in its xE910 family. The company’s “AppZone” is an embedded programming system that allows these modules to perform the key tasks normally associated with an external memory and microprocessor. We’ll return to local area processing later.
Radio Frequency (RF) communications is used in order to transmit data over relatively short distances — relative that is to cellular communications. RF technology is therefore used for local area solutions. For example, as indicated earlier, the movement of goods in a warehouse.
RFID chips modulate an external magnetic field to transfer a coded identification number when queried by a reader device. RFID solutions therefore exchange data between a reader and an electronic tag attached to an object, and they can be deployed in a wide range of applications that do not need wireless modules.
When smart utility meters are deployed in homes, RF sensors can form a local area network that aggregates information and then passes it on to the cellular network. This approach is cost-effective since it eliminates the need to have M2M SIMs in every meter, i.e. the meters are identified by their RFID tags. The client concentrator shown here enables communication with the wide area cellular network, which means that only one SIM card is needed.
Wireless mesh networks
ZigBee is a wireless technology developed as an open global standard to address the needs of low-cost, low-power wireless networks. The technology targets RF apps that require a low data rate, long battery life, and secure networking. A key feature of this technology is that it is self-organizing, i.e. the devices form a robust LAN automatically. Moreover, as illustrated, it’s a mesh network. Each node captures and transmits its own data and also serves as a relay for other nodes, i.e. nodes collaborate to propagate the data around the network. And if a node goes down, data from the other nodes is rerouted automatically.
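The self-healing property can be illustrated with a toy routing function: nodes relay for each other, and if one fails traffic is rerouted around it. The topology below is hypothetical, and real ZigBee routing (which is AODV-based) is considerably more sophisticated than this breadth-first search.

```python
# A toy illustration of mesh self-healing: find a route over working
# links only, skipping any node marked as down.
from collections import deque

def shortest_path(links, start, goal, down=frozenset()):
    """Breadth-first search over working links only."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for nxt in links.get(node, []):
            if nxt not in seen and nxt not in down:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no working route exists

mesh = {"sensor": ["A", "B"], "A": ["coordinator"], "B": ["coordinator"]}
print(shortest_path(mesh, "sensor", "coordinator"))              # via relay A
print(shortest_path(mesh, "sensor", "coordinator", down={"A"}))  # reroutes via B
```

With relay A down, the same query transparently returns the route through B, which is exactly the behaviour the paragraph above describes.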
ZigBee solutions can run for years on inexpensive batteries for a host of monitoring and control applications: reading smart meters; networked homes, lighting controls, building automation systems; etc.
The maximum reach of active RFID tags is around 100 meters (300 feet), but it can be half that figure. RFID is a very versatile, cost-effective technology, so it makes sense to extend the reach in order to enable tags to be deployed in large areas such as warehouses. This is where Wi-Fi comes into the picture.
When Wi-Fi access points (APs) are added to a local area network then virtually any area can be covered. Various companies have developed Wi-Fi enabled RFID tags that can be read directly by commercial APs. This opens up the technology to a wide range of applications.
For example, the tags can be programmed with content data and assigned locations, and then be placed on containers and pallets that are stored in a warehouse. Additional information can be collected and added to the RFID tags as the pallets move through the warehouse. In addition, the RFID tracking system can identify unscheduled movement, so managers and security can be alerted to possible theft. Solutions can also reduce the time and cost of counting stock as it enters the warehouse by collecting the data automatically and virtually eliminating the need for manual intervention.
Asset tracking plays a pivotal role in today’s global, dynamic economy. Goods that cross countries and continents, transported by road, air and sea, are tracked 24/7.
Asset management is used to monitor both the location and performance of remote machinery. Quake Global states that it is a market leader in this key sector.
The leadership claim is based on global solutions that combine terrestrial and satellite communications and deliver unique functionality, e.g. network-agnostic technology that provides intelligent interfacing with any available, configurable network in order to provide least-cost data routing. Selection is based on dynamic environmental and business conditions, and this results in a unified communications protocol that enables global data coverage.
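Least-cost routing across whatever networks are reachable can be sketched as follows. The network names and tariffs are illustrative assumptions, not Quake Global’s actual logic or pricing.

```python
# A hypothetical sketch of least-cost data routing: pick the cheapest
# network currently reachable, falling back to queuing when none is.
TARIFFS = {"cellular": 0.01, "satellite": 0.50}   # assumed cost per message, USD

def select_network(available):
    """Choose the cheapest of the currently reachable networks."""
    reachable = [n for n in available if n in TARIFFS]
    if not reachable:
        return None   # queue the data until a network is in range
    return min(reachable, key=TARIFFS.get)

print(select_network(["cellular", "satellite"]))   # 'cellular'
print(select_network(["satellite"]))               # 'satellite' (mid-ocean fallback)
```

A real implementation would also weigh latency, message urgency and coverage forecasts, but cheapest-reachable-first captures the core idea.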
A market on the move
M2M is on a roll, but the move I have in mind is the value chain. From an engineering perspective it’s a clumsy way to create data communications applications, and most partnerships, however “strategic”, only paper over interlink issues such as the need to deal with complex logistics like scalable subscription lifecycle management.
In 2012 seven Tier One MNOs announced the formation of an M2M alliance that was long on promise. For example, “… members will continuously cooperate with the aim of enhancing the development of the market dynamics …”. Does that mean that they are actually going to do anything? The industry needs to bite the bullet and address an intrinsic issue: different vendors own different links. Steve Priestly, Wyless’ MD, EMEA: “We believe that a big hunk of the value chain needs to be in one place.” The reason: “The marketplace is increasingly demanding a deeper level of involvement and a more complex set of services.”
The ultimate objective is the creation of an offer that is an “Out of the Box” solution, which we highlighted earlier in the section on sensors. However, in this section we are moving on to a development that is seriously innovative and disruptive, but only for vendors, not the market.
Modules are a critical, pivotal component. They’re the beating heart of most M2M applications, but at first sight the first link would seem to be an unlikely place to create a ‘big hunk’ solution. It’s also unlikely when you take a second look, but that is exactly what Telit has done and — cliché coming up — it represents a game-changing development.
In 2011 the company acquired Global Connect, a relatively small company that provided global connectivity services for M2M service and solution providers. However, what Telit really bought was Dan Amir, the CEO and Founder. He foresaw the benefits that would accrue by fusing the first two links in the chain: the module and the connectivity service. But it would need the development of a brand-new remote module management system as well as tight, functional integration with a state-of-the-art service delivery platform.
Telit bought into that vision and it has been realized in a multi-faceted offer known as m2mAIR. The functionality is impressive, but let’s keep things simple. Telit employs its own GSM software stack and that enabled the development of a cloud-centric module management system that delivers brand-new functionality, e.g. the ability to detect the difference between a defective module and poor network coverage.
Telefónica also bought into the vision. Unlike most MNOs, this carrier has implemented a separate core network dedicated to M2M. In addition, Telefónica has integrated Jasper’s service delivery platform. Integration includes key core network components, and this allows users to access and manage a subscription deployment, down to the single-subscription level, in real time.
The module management system represents an additional link in the chain, but functionally m2mAIR effectively fuses the wireless modules and the connectivity service. This enables ‘out-of-the-box’ connectivity, which eliminates the need for users to be concerned with procurement procedures, complex logistics and integration into solutions and applications.
One stop, one shop
Telit is marketing its m2mAIR offer as “one stop, one shop”, which has a nice marketing ring about it. The message is easy to remember. The offer covers cellular, short-range wireless, positioning, services and connectivity. And it includes help in the hardware design process, device certification assistance, manufacturing process validation testing, pre-launch customization, scale-up assistance and planning for next-generation products.
That was a relatively short take on a huge topic. Individuals and businesses are realizing convenience and other benefits now, and M2M continues to reach out and make a positive impact on more and more aspects of our lives. But delivery of the really big benefits to society at large is only at the green-shoots stage, i.e. it’s early days. The deliverables are clear and the technology is up to the task, but significant investments are needed, together with the political will to make decisions whose tangible results appear over timescales longer than a term of political office.
Compiled by Bob Emmerson, Freelance Writer and Industry Observer