
Comprehend control loops to unlock analytics for IoT

Applying analytics to IoT quickly becomes a convoluted task. Architects must master the control loop to direct its design and deliver rapid business insights.

Data plus analytics equals decisions. The question IoT development teams must answer is how to apply analytics to IoT technology. Architects must understand that where and how they place analytics in the control loop determines how much latency their applications incur.

The control loop, sometimes called the workflow, is core to IoT analytics. The control loop starts with sensors, moves through a local controller, edge computing and, finally, into IT infrastructure, such as an analytics platform or a database.

Part of the control loop's mission is to signal actions to real-world elements, such as gates, and the other part is to feed information into a repository to apply analytics and make business decisions. Architects might find it challenging to design analytics effectively for both purposes, but they can tweak parts of the control loop to reduce latency and make real-time decisions faster.

Separate real-time processes from the control loop or use tools to reduce latency

Some applications require analytics as a direct piece of a process control step to complete the control loop. For example, if goods are arriving or departing, they may require inventory analysis, but controlling an access gate doesn't require analysis. Applications that combine process control and data entry -- in this case, the gate and the inventory, respectively -- should be broken up to prevent additional latency in the real-time portion. Architects must assign the real-time portion to a local controller or to the edge computing facility, where available.
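The split described above can be sketched in code. This is a minimal illustration, not a real product API: the gate and inventory function names are assumptions. The real-time path actuates the gate immediately and hands the event off through a queue, so the slow inventory analytics never sit in the control loop.

```python
import queue
import threading
import time

# Hypothetical sketch: gate control (real-time) is decoupled from
# inventory analytics (batch) via a queue, so analytics latency
# never delays the control action.

def open_gate(truck_id):
    """Real-time control action handled at the local controller."""
    return f"gate opened for {truck_id}"

def analyze_inventory(event):
    """Slow analytics step, safe to run off the control loop."""
    return f"inventory recorded for {event['truck']}"

def handle_arrival(truck_id, badge_ok, inventory_queue):
    """Real-time path: actuate first, then hand the event to analytics."""
    action = open_gate(truck_id) if badge_ok else "gate held closed"
    inventory_queue.put({"truck": truck_id, "ts": time.time()})
    return action

def inventory_worker(inventory_queue, results):
    """Batch path: drains events without adding latency to the gate."""
    while True:
        event = inventory_queue.get()
        if event is None:  # sentinel shuts the worker down
            break
        results.append(analyze_inventory(event))
```

In practice the worker would run at the edge or in IT infrastructure, while `handle_arrival` stays on the local controller.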

Architects can't separate some control loops from processing, which makes the overall processing latency critical to the application. When IoT analytics run inside the control loop, architects must build the application with tools designed to handle event flows or streaming.


Wherever the control loop needs real-time analytics, ask what the purpose of the analysis is and how that purpose relates to stored data. Analytics for IoT divides into two categories: correlation and projection. Correlation, sometimes called complex event processing (CEP), interprets IoT events in the context of other events, usually from a comparable point in time. Projection, or historical, analytics -- a form of predictive analytics translated into a time-critical context -- predicts outcomes based on historical trends and current events or conditions.
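Correlation can be illustrated with a small sketch, assuming a time-window definition of "comparable point in time." The sensor names and the five-second window are made up for the example; a real CEP tool would offer much richer pattern matching.

```python
from collections import deque

# Illustrative CEP-style correlation: report which other sensors
# fired within a sliding time window of each new event.

class WindowCorrelator:
    def __init__(self, window_seconds=5.0):
        self.window = window_seconds
        self.events = deque()  # (timestamp, sensor_id), oldest first

    def observe(self, timestamp, sensor_id):
        """Record an event and return sensors seen within the window."""
        # Evict events that have fallen out of the window.
        while self.events and timestamp - self.events[0][0] > self.window:
            self.events.popleft()
        correlated = {sid for _, sid in self.events if sid != sensor_id}
        self.events.append((timestamp, sensor_id))
        return correlated
```

A door event followed two seconds later by a motion event would correlate; the same pair ten seconds apart would not.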

CEP can use either custom tools or streaming event tools. Organizations tend to use streaming event tools for IoT, which vendors such as Tibco, IBM and Red Hat support. Specialized CEP tools can include analytics, such as those provided by Software AG's Apama Streaming Analytics and Evam Streaming Analytics. Architects can use Apache Kafka to develop complex and streaming event handling and analytics. Cloud providers, such as Amazon, support Kafka.

Streaming event tools offer high-speed event handling and limited analytics support across a series of events. Architects must understand which contextual events belong to a common real-world process, because tools differ in their ability to handle events from distributed sources. If event sources are widespread, running a common analytics task in every control loop can create excessive latency for some loops.
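One way to keep widespread sources from stalling each other is to partition the event stream by control loop, so each loop's analytics run only over its own substream -- the same idea Kafka applies with keyed topic partitions. The field names below are illustrative assumptions.

```python
from collections import defaultdict

# Sketch: group a mixed event stream into per-loop substreams so no
# control loop waits on analytics for another loop's events.

def partition_events(events, key="loop_id"):
    """Split a stream of event dicts into per-loop substreams."""
    partitions = defaultdict(list)
    for event in events:
        partitions[event[key]].append(event)
    return dict(partitions)
```

Each partition can then be processed independently, in parallel, close to its own edge point.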

Simplify the control loop without specialized tools

Architects must not rely on event-specific analytics tools as a guaranteed fix to event analytics latency issues. If the actual analytics process must be part of the control loop, architects might be able to reduce latency by working from a summarized historical database or creating an algorithm.

A summary database is a view of or extraction from a traditional database, designed for faster access and distribution closer to the edge. Architects can replicate summary databases if there are multiple points of IoT event generation to keep the data close to several edge points at once. They can also choose a faster database access method, such as a NoSQL store rather than a relational database management system (RDBMS), without compromising the business applications that rely on the traditional SQL RDBMS.
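The summary-database idea can be sketched as a simple rollup: collapse detailed transaction rows into a small key-value view that is cheap to replicate near each edge point. The inventory schema here is an assumption for illustration.

```python
from collections import defaultdict

# Hypothetical rollup: detailed inventory rows become a compact
# per-SKU summary view suitable for fast key-value lookup at the edge.

def summarize(rows):
    """Collapse detailed rows into per-SKU quantity totals."""
    summary = defaultdict(int)
    for row in rows:
        summary[row["sku"]] += row["qty"]
    return dict(summary)

rows = [
    {"sku": "A1", "qty": 10},
    {"sku": "B2", "qty": 4},
    {"sku": "A1", "qty": -3},  # a shipment out
]
edge_view = summarize(rows)  # O(1) lookups at the edge, e.g. edge_view["A1"]
```

The full detailed table stays in the traditional RDBMS for business applications; only the summary view travels to the edge.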

An algorithm may offer the best approach to reduce IoT analytics latency. Historical projections are, by nature, predictions. Architects use traditional analytics to establish a trend, then represent it as an algorithm and apply it to events as they occur. This approach is most valuable when the historical analysis is made over a long period of time. If the analysis period is short, then the application should be considered one of event correlation.
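The trend-as-algorithm approach amounts to doing the expensive analysis offline and leaving only a cheap formula in the loop. A minimal sketch, with made-up data: fit a linear trend by least squares once, then score each live event with a single multiply-add.

```python
# Sketch: historical analysis runs offline to establish a trend;
# the control loop applies only the resulting formula.

def fit_trend(xs, ys):
    """Ordinary least squares fit for y = a + b*x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return a, b

# Offline step: illustrative historical readings (hour, temperature).
hours = [0, 1, 2, 3, 4]
temps = [20.0, 21.0, 22.0, 23.0, 24.0]
a, b = fit_trend(hours, temps)

def predict(hour):
    """In-loop step: applying the trend is a single multiply-add."""
    return a + b * hour
```

Refitting happens periodically in the analytics platform; the control loop never waits on the historical database.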

Architects must test any application approach with a realistic test data set that represents both the expected volume of events and the timing before committing anything to production. Production IoT is no place to experiment.
