The internet of things depends on data. It seems like something that needn’t be said any longer, but it bears repeating as it’s one of the biggest barriers to IoT use cases heading to scale deployment. The things sense and act, the cloud stores and computes, and the intelligence applies insight and logic to drive action.
We’ve had machine-to-machine communications for a long time, and much of the prevailing mindset of IoT in the early going has been a SCADA-type mentality of command and control from the central system, with plenty of data in round-trips related to the server controlling and the client obeying. Very little of the cloud and intelligence end of IoT has been fully leveraged thus far.
It’s time to revise that approach
With more computing, storage and processing horsepower at the edge of the network in today’s IoT devices, use cases have started to incorporate processing out there (with many paradigm names, inevitably: mist, fog, edge and more). It’s becoming more common, for sure. However, while that does bring benefits to certain use cases, the central tenet of IoT remains — plenty of data making its way upstream from devices, sensors and actuators will be the foundation of reaping ongoing benefits.
One big barrier to getting more data is paying the price of carrying it
However, there’s still a big barrier in the way: the cellular Opex meters running on the telecommunications core networks which, to date, still carry the vast majority of IoT data.
When the MB meter runs, the toll on data collection remains in place. But is some data in the transmission from each IoT device/sensor/actuator more meaningful than others?
Clearly, we know that IoT generates a lot of data. Just to give you a mental refresher, here’s a quick back-of-the-envelope:
- Data packet size: 1 KB
- Number of sensors: 1,000,000
- Signal frequency: 1x/minute
- Events per sensor per day: 1,440 (minutes in a day)
- Total events per day: 1.44 billion (a million sensors)
- Events per second: 16,667 (86,400 seconds in a day)
- Total data size per day: 1.44 TB per day … for the 1 KB packet.
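The arithmetic above can be checked with a short script. This is purely illustrative; the sensor count, packet size and $1/MB rate are the article's example figures, not real carrier pricing:

```python
# Back-of-the-envelope check of the numbers above (illustrative only).
SENSORS = 1_000_000            # number of sensors
PACKET_KB = 1                  # payload per event, in KB
EVENTS_PER_DAY = 24 * 60       # one signal per minute -> 1,440 events/sensor/day
SECONDS_PER_DAY = 86_400
RATE_PER_MB = 1.00             # the article's example rate: $1/MB

events_per_day_total = SENSORS * EVENTS_PER_DAY            # 1.44 billion
events_per_second = events_per_day_total / SECONDS_PER_DAY # ~16,667
data_mb_per_day = events_per_day_total * PACKET_KB / 1_000 # KB -> MB
data_tb_per_day = data_mb_per_day / 1_000_000              # MB -> TB
cost_per_day = data_mb_per_day * RATE_PER_MB

print(f"{events_per_day_total:,} events/day")
print(f"{events_per_second:,.0f} events/second")
print(f"{data_tb_per_day:.2f} TB/day at ${cost_per_day:,.0f}/day")
```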
What that means is if you’re paying $1/MB, you’re handing $1,440,000 per day to the cell networks. That’s a daunting number when you hit scale! And most IoT product lines are still in their early stages, so they haven’t really had to pencil out the economics of their solution at scale yet.
Big for the IoT developer, tiny for the carrier
Believe it or not, that 1 million SIM deployment generating $1.44 million/day is very small for the mobile network operators (MNOs), whose focus remains users of expensive smartphones with high ARPUs. As a result, they frequently treat IoT as a secondary source of traffic and revenue on their core networks.
Also, as it turns out, the security requirements of transmitting that data over the cellular network mean surrounding the “useful” data with plenty of extraneous bits that serve no purpose for the kinds of things we’d later like to do with the data in the cloud. While the sensors must report at whatever signal frequency the use case demands, it isn’t necessary to transmit the full 1 KB of data in each packet.
If the sensors were secured on a virtual private connection, for example, where the device is unreachable from the public internet, a fair portion of that 1 KB could be eliminated. Extraneous packet header and security information can be stripped from the transmission, as those duties can be handled by cloud-side adapters tasked with managing those attributes. Every bit eliminated means every dollar spent delivering data to the IoT application’s data store buys data that is valuable for further processing in analytics, machine learning, business intelligence and other upstream applications.
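To make the overhead concrete, here is a minimal sketch comparing a verbose, self-describing payload with the same values packed as fixed-width binary fields. The reading fields (device id, timestamp, temperature) are hypothetical, chosen only to illustrate the size difference:

```python
import json
import struct

# Hypothetical sensor reading: device id, timestamp, temperature in Celsius.
reading = {"device_id": 123456, "timestamp": 1700000000, "temp_c": 21.5}

# Verbose self-describing payload: field names and ASCII numbers travel
# in every single transmission.
verbose = json.dumps(reading).encode()

# Minimal binary payload: the same three values packed as fixed-width
# fields (two unsigned 32-bit ints and a 32-bit float). On a private
# network, identity and security framing can live in a cloud-side
# adapter instead of travelling in every packet.
minimal = struct.pack("<IIf", reading["device_id"],
                      reading["timestamp"], reading["temp_c"])

print(len(verbose), "bytes verbose vs", len(minimal), "bytes packed")
```

The cloud-side adapter would hold the schema (field order and types), so the device only ever sends the raw values.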
Strip out the repetitive, extraneous bits — but also leverage edge compute
Building on the point that removing communication overhead is key when collecting lots of minimized data transmissions, it’s also important to add some capability for filtering data at the edge, in the fog or in the mist before sending it out to the cloud. For some use cases, reducing the data itself at the edge matters more: a surveillance camera, for example, sends an alert when a person passes in front of it, while cats are recognized, filtered and subsequently ignored. Combining those two approaches, reducing events through edge processing and sending only the important data with minimal overhead, is the way to go.
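The camera example above can be sketched as a simple edge filter that only forwards events worth paying the cellular toll for. The detection labels and confidence fields here are hypothetical, standing in for whatever the on-device model produces:

```python
# Illustrative edge filter: forward only "person" detections upstream and
# drop everything else (e.g., cats) at the device, before any cellular
# transmission happens.
def edge_filter(events, keep_label="person"):
    """Yield only the events worth paying cellular opex to transmit."""
    for event in events:
        if event["label"] == keep_label:
            yield event

# Hypothetical output of an on-device detection model.
detections = [
    {"label": "cat", "confidence": 0.91},
    {"label": "person", "confidence": 0.88},
    {"label": "cat", "confidence": 0.97},
]

uploaded = list(edge_filter(detections))
print(f"{len(uploaded)} of {len(detections)} events sent upstream")
```

In this sketch two of three events never leave the device, so both the event count and the per-event payload size shrink before the opex meter starts running.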
In summary, let the cellular Opex meter collect its (reduced) toll on meaningful data transmissions, and wring the repetitive data out of the stream. Architecting with an eye toward virtual privacy for the network of devices, sensors and actuators that form the foundation of IoT is crucial to achieving precisely that.
All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.