No, I haven’t “lost my marbles” and the title of this blog is actually not a joke. It stems from an interesting article (I can’t find it online, sorry) in a recent issue of MacLife, where Rik Myslewski suggested exactly that. While the idea does sound a bit wacky, it may have some merit.
One major hurdle to the success of M2M roll-outs relates to exactly how big Big Data will be and how the Internet and cloud services will handle it. With all the information being generated, the idea of shipping it across the Internet to a big data center, processing the data there and finally shipping the results back across the Internet is quite daunting to many.
Rik’s idea is simple and uses methods that are actually quite common: tap the available computing power of devices that are typically found in most homes. Aside from the obvious PCs and tablets, most homes have many devices with decent processing horsepower inside of them, from cable receivers to appliances to fax machines to, yes, even higher-end toasters. Of course, the processor and RAM on board your high-end toaster or coffee maker are not exactly on par with your iPhone. However, one would be surprised by how many “bits and bytes” they could actually process without the data ever having to leave your home.
Now, there are a lot of steps required to get us to this stage…
- What device is going to assign, manage and aggregate all of the data being processed? An obvious choice is a PC, but many homes are moving away from that. Another choice is your tablet, but since those are often tucked into briefcases when you leave, it might not be ideal. The most likely choice would be your router.
- How do you sync up all of these devices, across a number of manufacturers and platforms? Obviously, this would require a common standard and a common language, something the M2M industry has had trouble with. The most obvious choice would be IBM’s MQTT, but there are always politics involved.
- Finally, what about power consumption? While devices in idle mode still use some power, it is much less than when they are fully up and running. Would consumers and businesses be willing to pay a higher utility bill, and what would this incremental boost in power usage mean for utility grids that are already stretched, especially during the warm months?
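The coordinator role described above can be sketched in a few lines. This is a toy illustration only, not a real protocol or product: the device names, the round-robin assignment, and the summation "workload" are all hypothetical stand-ins for whatever a real router-based scheduler would do.

```python
# Toy sketch: a "router" coordinator splits a job into chunks, fans them
# out to whichever household devices report themselves idle, and
# aggregates the partial results. All names here are hypothetical.

class Device:
    def __init__(self, name, idle=True):
        self.name = name
        self.idle = idle

    def process(self, chunk):
        # Stand-in for real on-device computation: sum a chunk of numbers.
        return sum(chunk)


class RouterCoordinator:
    """Assigns chunks of a job to idle devices and aggregates the results."""

    def __init__(self, devices):
        self.devices = devices

    def run_job(self, data, chunk_size=4):
        idle = [d for d in self.devices if d.idle]
        if not idle:
            raise RuntimeError("no idle devices available")
        chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
        partials = []
        for i, chunk in enumerate(chunks):
            device = idle[i % len(idle)]  # simple round-robin assignment
            partials.append(device.process(chunk))
        return sum(partials)


devices = [
    Device("cable-box"),
    Device("coffee-maker", idle=False),  # busy brewing, skipped by the scheduler
    Device("smart-toaster"),
]
router = RouterCoordinator(devices)
total = router.run_job(list(range(1, 21)))  # 1 + 2 + ... + 20 = 210
print(total)
```

The interesting design questions the bullets raise all live inside `run_job`: how the coordinator learns which devices are idle, and how it speaks a common language to hardware from many manufacturers, which is exactly where a shared standard like MQTT would come in.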
Like many ideas, it does sound a little far-fetched, but so did most of the inventions of the past 100 years when they were first conceived. I always like to quote one of the smartest men ever to walk the planet, Albert Einstein, on this matter: “If at first the idea is not absurd, then there is no hope for it.” The idea of distributed computing is not a new one; just look at the great work done on the United Devices / grid.org project, where thousands of computers donated their “idle time” to enable huge computing projects, including some that helped out greatly with cancer research. If we could pull that off 10 years ago, is this really so radical?
The author is Larry Bellehumeur, EVP of Sales and Marketing at Novotech Technologies (www.novotech.com). Follow Novotech on Twitter (@NovotechM2M) or follow him personally (@LBNovotechM2M). Novotech is also active on LinkedIn, where you can follow their company page.