
Analytics: Can IoT manage what it can't measure?

A few years ago, disruptive technologies expert and author Geoffrey Moore described the importance of data analytics in fairly dramatic terms: “Without big data analytics,” he wrote, “companies are blind and deaf, wandering out onto the web like deer on a freeway.”

The same concept applies to IoT data analytics. In most cases, organizations need insights from their connected assets in real time, because IoT data has a short shelf life. When data flow slows, real-time analytics becomes impossible, and decisions get made without the critical insights the data is meant to provide. As a result, time-sensitive functions, such as security monitoring, predictive maintenance and process optimization, suffer.

It’s important to understand the challenges that create these issues. Like IoT itself, the factors that delay data flow, and thereby degrade analytics, are complex and have grown over time. They are driven largely by the sheer volume and complexity of the data generated, by infrastructure limitations and by the latency of cloud processing.

Data deluge

As IoT grows, the data it produces increases at staggering rates. A recent IoT Analytics report estimated that IoT will comprise 19.4 billion devices this year. With 10% growth in devices expected each year, it’s further estimated that there will be more than 34 billion IoT devices by 2025. Furthermore, a report from IDC predicted that IoT devices will create 90 zettabytes of data a year by 2025.
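As a rough sanity check, and assuming “this year” refers to 2019, six years of 10% compound growth does land close to the 2025 projection:

    # Compound growth check: 19.4 billion devices growing 10% per year for 6 years
    devices_2019 = 19.4e9
    devices_2025 = devices_2019 * 1.10 ** 6
    print(f"{devices_2025 / 1e9:.1f} billion")  # prints 34.4 billion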

Moreover, the data generated in the field doesn’t necessarily arrive in nice, easy-to-process packages. It can be intermittent, unstructured and dynamic. The challenge will only grow as machine learning sees wider use in the field: models running on devices demand more memory and CPU, and where those resources fall short, processing slows further.
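To make the data-quality problem concrete, here is a minimal Python sketch of the kind of normalization step field data often requires. The payloads, field names and schema are hypothetical, not drawn from any particular platform:

    import json

    # Hypothetical raw payloads: intermittent, inconsistently structured, some malformed
    raw_messages = [
        '{"dev": "pump-7", "ts": 1714000000, "temp_c": 71.2}',
        '{"device_id": "pump-7", "timestamp": 1714000060, "temperature": 71.9}',
        'not json at all',
    ]

    def normalize(message):
        """Map inconsistent field names onto one schema; drop unparseable records."""
        try:
            record = json.loads(message)
        except json.JSONDecodeError:
            return None  # garbled or truncated readings are common in the field
        return {
            "device": record.get("dev") or record.get("device_id"),
            "ts": record.get("ts") or record.get("timestamp"),
            "temp_c": record.get("temp_c") or record.get("temperature"),
        }

    clean = [r for m in raw_messages if (r := normalize(m)) is not None]
    print(clean)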

Infrastructure and security issues

Amplifying the challenge of continuously rising data volumes are limitations in the technology used to collect, transfer, cleanse, process, store and deliver that data. Many of these limitations are rooted in the fact that the technology was never designed for those purposes. Where functionality and scalability fall short, delays in data processing and delivery become more likely.

As such, it’s increasingly critical for organizations to invest in new data management technologies and platforms. One option is to “try on” potential new IoT technologies through proofs of concept or pilot studies before committing to a full-scale launch. Whatever the approach, the technology needs to be scalable and capable of handling inevitable increases in data, storage and computing demands.

Security is another important consideration. A streaming-first architecture can be valuable here because it lets organizations analyze logs from multiple endpoint security systems together. Correlating infrastructure component logs in real time can also surface breaches that no individual security tool would detect on its own.
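As an illustration of the idea, not a production design, the sketch below correlates two hypothetical log sources, an authentication log and a host monitoring log, in a sliding time window. A failed login followed quickly by privilege escalation on the same machine is flagged, even though neither event alone looks alarming:

    from collections import deque

    WINDOW_SECONDS = 60  # correlation window; tuning is deployment-specific

    recent_failures = deque()  # (timestamp, host) pairs from the auth log

    def ingest(event):
        """Consume one event from a merged, time-ordered stream of security logs."""
        ts, host, kind = event["ts"], event["host"], event["kind"]
        # Drop failed logins that have aged out of the correlation window.
        while recent_failures and ts - recent_failures[0][0] > WINDOW_SECONDS:
            recent_failures.popleft()
        if kind == "login_failed":          # from the auth log
            recent_failures.append((ts, host))
        elif kind == "priv_escalation":     # from the host monitoring log
            # Cross-log correlation: escalation soon after a failed login
            # on the same host suggests a possible breach.
            if any(h == host for _, h in recent_failures):
                print(f"ALERT: possible breach on {host} at t={ts}")

    stream = [
        {"ts": 100, "host": "edge-gw-3", "kind": "login_failed"},
        {"ts": 130, "host": "edge-gw-3", "kind": "priv_escalation"},
    ]
    for event in stream:
        ingest(event)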

Addressing these issues, however, is only part of a long-term management solution.

The cloud, the edge and standardization

While cloud computing is integral to IoT, sending data to the cloud and waiting for results to come back can bog down data delivery. This is particularly true when large amounts of data are involved. Moving one terabyte of data over a 10 Mbps broadband connection, for example, can take roughly nine days.
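The arithmetic behind that figure is straightforward, ignoring protocol overhead and taking a terabyte as 10^12 bytes:

    # Time to move 1 TB over a 10 Mbps link
    bits = 1e12 * 8             # 1 terabyte expressed in bits
    rate = 10e6                 # 10 megabits per second
    days = bits / rate / 86400  # 86,400 seconds per day
    print(f"{days:.1f} days")   # prints 9.3 days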

This is where the benefits of IoT edge computing are most evident. Edge computing allows data processing to take place directly on IoT devices, or on gateways near the edge of the network, so information doesn’t have to make a round trip to the cloud before results are delivered. Removing the step of sending all data to a centralized repository minimizes the latency issues that come with cloud computing and eases device and bandwidth constraints.
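A minimal sketch of that pattern, with a hypothetical metric and threshold: the gateway filters and aggregates readings locally, forwarding only a compact summary, plus any anomalous samples, to the cloud:

    # Edge-side aggregation: summarize locally, ship only what the cloud needs.
    THRESHOLD_C = 85.0  # hypothetical over-temperature alert threshold

    def summarize(readings):
        """Reduce a batch of raw temperature readings to a summary plus outliers."""
        anomalies = [r for r in readings if r > THRESHOLD_C]
        return {
            "count": len(readings),
            "mean_c": round(sum(readings) / len(readings), 2),
            "max_c": max(readings),
            "anomalies": anomalies,  # only outliers travel upstream in full
        }

    batch = [70.1, 71.4, 90.2, 69.8]  # raw samples stay at the edge
    payload = summarize(batch)        # this small payload goes to the cloud
    print(payload)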

However, this is not a one-size-fits-all solution. Edge servers and devices don’t necessarily have the computing power to handle enterprise-scale data. They may have battery limitations and lack the storage necessary for analytics. This means that when greater storage and computing power are necessary, analytics workloads have to be distributed across devices, edge servers, edge gateways and central processing environments.

Another key factor is the need to standardize the communications between devices and network elements. When multiple approaches and implementations are used, devices struggle to communicate and the flow of data is slowed.

It’s encouraging to note that work is already underway to create standards for IoT. The ITU has a global standards initiative in place, and oneM2M is working to create architecture and standards that can be applied across many industries, including healthcare, industrial automation, and home and building automation.

Despite the range of challenges to be addressed, the consistent, timely delivery of data for analysis is doable. With a multipronged strategy and a willingness to invest in infrastructure when needed, organizations can realize the full potential of IoT, including its capacity to generate revenue.

