So you’ve generated a tsunami of data from your consumer-grade internet of things devices — now what?
One of the odder aspects of IoT is how the “things” get all the attention when we talk about it. The reality is that in order for IoT to deliver real value, the things have to coordinate their activities over distances far greater than Wi-Fi or ZigBee can cover. This implies IoT will involve millions of devices being utterly dependent on some form of controlling intelligence via the internet. So while the cute gadgetry gets the attention, the real value will reside in the back-end systems that conjure value out of bit streams in real time. This, in turn, brings up the question of latency.
Poor latency at mass scale will doom many IoT projects to failure. A consumer who turns on a lightbulb expects it to turn on when the switch is flipped — not a second later. Anyone who has tried to hold a conference call in which one of the participants is 1500 ms behind everyone else will be viscerally aware of how badly humans cope with latency issues, and the gadgetry in IoT will be no different.
To make matters worse, many IoT applications will involve optimizing the behavior of swarms of devices over very short timescales using real-world wide area networks, which are far more idiosyncratic in their behavior than a tame development environment.
Traditionally in IT we can make accurate decisions on small volumes of data quickly, or on very large volumes of data several minutes after the fact. Online transaction processing systems tend to think in terms of thousands of transactions per second. Conventional analytic technologies can deliver much higher volumes, but with much longer latencies. For IoT to actually work, we need to make millions of very detailed and accurate decisions over very short (< 5 ms) timescales — requiring real-time data management.
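To make the 5 ms budget concrete, here is a minimal sketch of an in-memory decision loop that times each decision against that window. The decision rule, thresholds and event data are illustrative stand-ins, not any particular product's logic:

```python
import random
import time

LATENCY_BUDGET_S = 0.005  # the < 5 ms window discussed above


def decide(reading: float) -> bool:
    """Toy decision rule standing in for real business logic:
    flag any sensor reading outside an assumed safe band."""
    return reading < 10.0 or reading > 90.0


def process(events):
    """Run the decision loop, counting how many decisions met the budget."""
    within_budget = 0
    for reading in events:
        start = time.perf_counter()
        decide(reading)
        if time.perf_counter() - start < LATENCY_BUDGET_S:
            within_budget += 1
    return within_budget


events = [random.uniform(0.0, 100.0) for _ in range(100_000)]
met = process(events)
print(f"{met}/{len(events)} decisions made within the 5 ms budget")
```

A trivial rule like this comfortably fits the window; the hard part at IoT scale is keeping that guarantee once the decision involves network hops, shared state and millions of concurrent devices.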
Companies such as Push Technology are focusing on improving connectivity by providing reliable “reality-aware” network links, but mass-scale, accurate decision-making represents a challenge to the IoT industry. Over the last decade we’ve seen Hadoop and related technologies revolutionize the economics of slower batch systems, but there hasn’t been a similar, widespread change in our ability to make high-volume, accurate, compromise-free decisions on the scales at which we expect IoT to perform, in real time.
When the first data warehouses were deployed by supermarket chains, there was a period of joy as the grocery industry realized it could measure things like exactly how many cans of baked beans were sold in Scotland the day before. It was followed by a period of frustration when the industry realized it had no complete ideas for turning this information into more money. We are now seeing something similar with IoT, with a new twist being the realization that the commercial usefulness of IoT sensor data arguably has a half-life of about 500 ms. Being able to act in real time — within milliseconds — will be key to making money in many IoT use cases.
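Taking the 500 ms half-life claim at face value, the decay of a reading's commercial value can be modeled as simple exponential decay. The initial value and the ages used below are hypothetical; only the half-life figure comes from the text:

```python
HALF_LIFE_MS = 500.0  # assumed half-life of sensor-data value, per the text


def remaining_value(initial_value: float, age_ms: float) -> float:
    """Exponential decay: value halves every HALF_LIFE_MS milliseconds."""
    return initial_value * 0.5 ** (age_ms / HALF_LIFE_MS)


# A reading acted on within 5 ms retains ~99% of its value;
# one that waits for a 2-second batch window retains ~6%.
print(remaining_value(1.0, 5.0))
print(remaining_value(1.0, 2000.0))
```

Under this model, even a modest batching delay erodes most of the data's worth, which is the arithmetic behind the article's insistence on millisecond-scale action.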
A further wrinkle is that the actual logic that will be implemented will be a far cry from the “Dog & Pony Show” demos now being used to promote IoT technology. With a product development lifecycle of 12-18 months and consumer expectations that physical devices like fridges will work for decades, the average IoT play will end up supporting at least 7-10 different generations of the same product — and that’s before first movers acquire their competitors.
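One common way to cope with many coexisting hardware generations is to dispatch each incoming message to generation-specific decoding logic. The sketch below assumes a hypothetical temperature sensor whose firmware generations report in different units; the field names and generations are invented for illustration:

```python
from typing import Callable, Dict


# Hypothetical per-generation decoders; real firmware generations
# would also differ in framing, precision and available fields.
def decode_gen1(payload: dict) -> float:
    return payload["temp_f"]                  # gen 1 reported Fahrenheit


def decode_gen2(payload: dict) -> float:
    return payload["temp_c"] * 9 / 5 + 32     # gen 2 switched to Celsius


DECODERS: Dict[int, Callable[[dict], float]] = {
    1: decode_gen1,
    2: decode_gen2,
}


def temperature_f(message: dict) -> float:
    """Route a device message to the decoder for its generation."""
    decoder = DECODERS.get(message["generation"])
    if decoder is None:
        raise ValueError(f"unsupported device generation: {message['generation']}")
    return decoder(message["payload"])
```

Each new product generation adds an entry to the registry rather than a branch scattered through the pipeline, which is what keeps 7-10 generations (plus an acquired competitor's catalog) maintainable.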
So what can we conclude from all this? Much of the noise and hype in IoT focuses on the part of the process the industry is comfortable with — high volumes and high latencies, with any decisions being either obvious (such as switching lights off) or made by humans after the fact. Such systems are generally useful only for optimizing existing processes. The problem with such optimizations is that the benefits are small, finite and decrease over time due to competition and technological change.
The more radical kind of change, which involves totally new use cases that involve automated systems making millions of accurate and precise decisions per second, is only becoming possible now. We’re moving away from an era where maybe 50% of all IP addresses were ultimately connected to a human, to one in which people will be outnumbered 20:1 by devices.
The real money is going to be in emergent use cases that combine high volumes, single-millisecond latency and the ability to make millions of commercially useful decisions in the trillions of tiny windows of opportunity each day.
VoltDB is ready for the real-time data management challenges of IoT. Are you?
All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.