
Developing a complex event processing architecture for IoT

What are some of the best practices for making IoT data actionable with complex event processing and stream processing? Expert George Lawton explores this further.

Connected devices could become the main sources of data on the internet, and that data, in turn, can be woven into applications to detect events, improve safety or drive business. For organizations to capitalize on this, enterprise architects should consider building an infrastructure that moves complex event processing closer to devices. Here are some best practices companies can follow to make IoT data actionable with both a complex event processing architecture and stream processing.

The California Institute of Technology is one example of an organization that has woven device data into an application. The institution runs an earthquake detection application that uses StreamPy -- an open source stream analytics platform -- to process accelerometer data from computer and smartphone sensors to detect tremors. There are also numerous examples in the airline and railroad industries.

Setrag Khoshafian, chief evangelist at Pegasystems, said, "The IoT [Internet of Things] digitization trend is about the physical [and] cyber connectivity of things -- assets, devices, objects -- to operations in the enterprise. Things are continuously generating enormous amounts of data streams that reflect their status: temperature, geolocation, pollution, proximity, status and motion, to name a few." 

Basic complex event processing architecture

There are many architectures and topologies, but, typically, a physical edge device -- thing or asset -- connects to the Internet via a gateway. The gateway, in turn, communicates with the cloud, where thing data is aggregated. Then, enterprise applications leverage thing data in end-to-end value streams involving things and people within digitized processes.
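To make those tiers concrete, here is a minimal sketch, in Python, of a hypothetical event envelope as it might pass from a device through a gateway toward cloud aggregation. The field names, identifiers and JSON format are illustrative assumptions, not part of any particular platform.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class ThingEvent:
    """Hypothetical envelope for a single sensor reading from an edge device."""
    device_id: str
    metric: str          # e.g., "temperature"
    value: float
    timestamp: float     # seconds since the epoch

def gateway_forward(event: ThingEvent, gateway_id: str) -> str:
    """The gateway annotates the event and serializes it for the cloud tier."""
    payload = asdict(event)
    payload["gateway_id"] = gateway_id   # assumed enrichment added at the gateway
    return json.dumps(payload)

# A device-side reading handed to the gateway, then forwarded to cloud aggregation.
reading = ThingEvent("sensor-42", "temperature", 21.7, time.time())
print(gateway_forward(reading, "gateway-eu-1"))
```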

Khoshafian said there are four interrelated and distinct complex event processing architecture opportunities:

  • processing of event streams on the devices themselves or at the edge;
  • correlating and processing events within the gateway;
  • correlating, analyzing and processing events at the cloud aggregation layer; and
  • CEP within the context of business process management (BPM), or processes involving humans, enterprise applications and things.

Each of these tiers offers richer opportunities for correlation and action. For instance, a single device could perform simple processing and autonomous self-adjustment, assuming it has the CPU capacity to do so. The cloud aggregation layer, on the other hand, can provide opportunities for predictive analytics or real-time correlation of device status across multiple devices in the ecosystem. For instance, the status of weather events affecting wind farms could be used to predict potential hazards to the wind turbines.
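The on-device end of that spectrum can be as modest as a local control rule. The following sketch assumes a hypothetical device that smooths its own temperature readings and toggles a fan without consulting the gateway or cloud; the threshold, window size and actuator are invented for illustration.

```python
from collections import deque

class SelfAdjustingDevice:
    """Minimal edge-side rule: smooth readings locally and act without a cloud round trip."""

    def __init__(self, high_temp: float = 75.0, window: int = 10):
        self.high_temp = high_temp
        self.readings = deque(maxlen=window)   # rolling window of recent samples
        self.fan_on = False

    def on_reading(self, temp_c: float) -> None:
        self.readings.append(temp_c)
        avg = sum(self.readings) / len(self.readings)
        # Autonomous self-adjustment: toggle the (hypothetical) fan based on the local average.
        if avg > self.high_temp and not self.fan_on:
            self.fan_on = True
            print(f"avg={avg:.1f}C -> turning fan on")
        elif avg <= self.high_temp and self.fan_on:
            self.fan_on = False
            print(f"avg={avg:.1f}C -> turning fan off")

device = SelfAdjustingDevice()
for sample in [70, 74, 78, 80, 82, 76, 71, 69]:
    device.on_reading(sample)
```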

Khoshafian said, "The core business value proposition of CEP for IoT is the speed and agility of responsiveness in end-to-end value streams." For example, an incident can be correlated from a variety of sensors, such as elevated levels of hazardous pollution, an accident on a highway with multiple connected vehicles and connected highways, or temperature readings on a manufacturing shop floor. The speed with which those events can be correlated and acted on can become critical.
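At its simplest, that kind of correlation is a time-window check across independent event streams. The sketch below is a deliberately naive rule, with assumed event types and a 60-second window, that flags an incident when a pollution spike and a connected-vehicle collision alert arrive close together.

```python
from collections import defaultdict

WINDOW_SECONDS = 60  # assumed correlation window

class IncidentCorrelator:
    """Naive CEP rule: raise an incident when two related event types occur close together in time."""

    def __init__(self):
        self.last_seen = defaultdict(lambda: None)  # event type -> last timestamp

    def on_event(self, event_type: str, timestamp: float) -> bool:
        self.last_seen[event_type] = timestamp
        pollution = self.last_seen["pollution_spike"]
        collision = self.last_seen["vehicle_collision"]
        if pollution is not None and collision is not None:
            if abs(pollution - collision) <= WINDOW_SECONDS:
                print("Incident: correlated pollution spike and collision alert")
                return True
        return False

correlator = IncidentCorrelator()
correlator.on_event("pollution_spike", 1000.0)    # no incident yet
correlator.on_event("vehicle_collision", 1030.0)  # within 60 seconds -> incident
```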

Plan for complex event processing architecture pipelines

One key trend will lie in creating CEP pipelines for feeding information from IoT devices into other event processing engines. Today, enterprises and tool vendors are just starting to develop best practices and infrastructure for doing this efficiently. Bill Platt, senior vice president and chief architect at BMC, said, "Just like content caching had its day when the Web exploded, edge-based CEP capabilities will emerge in the next three to five years."
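One way to picture such a pipeline is as a chain of stages in which a local stage filters and enriches raw device events before handing them to a downstream processing engine. The generator-based sketch below is an assumption about how that chain could be wired, not a reference to any vendor's pipeline API.

```python
def raw_events():
    """Stand-in for a stream of raw device readings."""
    yield from [
        {"device": "d1", "temp": 20.5},
        {"device": "d2", "temp": 99.0},   # implausible reading to be filtered out
        {"device": "d1", "temp": 21.1},
    ]

def filter_stage(events):
    """Drop implausible readings before they reach downstream engines."""
    for e in events:
        if 0 <= e["temp"] <= 60:
            yield e

def enrich_stage(events, site="plant-7"):
    """Attach context that downstream correlation rules will need."""
    for e in events:
        e["site"] = site
        yield e

def forward_to_engine(events):
    """Placeholder for handing events to a downstream event processing engine."""
    for e in events:
        print("forwarding:", e)

# Compose the pipeline: raw stream -> filter -> enrich -> downstream engine.
forward_to_engine(enrich_stage(filter_stage(raw_events())))
```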

The balance is still leaning toward the edge servers and services doing most of the processing, while minimizing the processing logic deployed on devices. Platt said that even as devices get more CPU, more memory and more potential capability for doing CEP or BPM, the loss of flexibility from moving the hard stuff into them is far more painful than the gain. At the same time, it can be challenging to update devices in the field because of issues with users, last-mile connectivity and security. "The more capable devices will enable performance for user experience, and, yet, keeping the complexity hidden from the end user remains paramount in CEP," Platt said.

Adopt gateways as a middle tier

In the short run, many enterprises are moving processing to the gateway tier. While not as flexible as pushing intelligence into devices, this is a much simpler architecture to maintain today. Michele DiSerio, IoT outreach lead for the Red Hat JBoss Middleware team, said, "We are seeing greater interest in CEP near the devices, in the gateway tier, to drive that sophistication and intelligence closer to the edge."

"Organizations looking to take advantage of IoT opportunities should deploy decision services from the data center to the edge to improve business agility; make consistent, efficient decisions; quickly build resource optimization solutions; and shorten development cycles for faster time to market," DiSerio added.

The intelligent gateway serves as a functionally capable middle or controller tier to bridge the devices to the data center or cloud. "CEP on the gateway enables real-time analysis and decision-making at the edge, transmitting only summary data to the back end," DiSerio said. Empowering the edge with near-field processing can thus address IoT requirements for reduced transmission costs and decreased decision time horizons.
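The "summary data only" pattern DiSerio describes can be sketched as a gateway that buffers raw readings and ships a compact rollup to the back end. The batch size, summary fields and the send_to_cloud stub below are assumptions for illustration, not a specific product's API.

```python
import statistics

def send_to_cloud(summary: dict) -> None:
    """Stub for whatever transport the back end actually uses (HTTP, MQTT, etc.)."""
    print("uploading summary:", summary)

class GatewaySummarizer:
    """Buffer raw readings at the gateway and forward only a rollup upstream."""

    def __init__(self, batch_size: int = 100):
        self.batch_size = batch_size
        self.buffer = []

    def on_reading(self, value: float) -> None:
        self.buffer.append(value)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self) -> None:
        if not self.buffer:
            return
        send_to_cloud({
            "count": len(self.buffer),
            "min": min(self.buffer),
            "max": max(self.buffer),
            "mean": round(statistics.mean(self.buffer), 2),
        })
        self.buffer.clear()

gateway = GatewaySummarizer(batch_size=5)
for v in [20.1, 20.4, 21.0, 35.2, 20.8, 20.6]:
    gateway.on_reading(v)
gateway.flush()  # push whatever is left at shutdown
```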

Architect for distributed CEP

There is also a lot of interest in open source Apache projects like Flink, Samza, Spark Streaming and Storm. IoT is driving a plurality of new stream analytics projects, even if it does not yet account for a majority of them. Roy Schulte, vice president at Gartner, said, "Most CEP in IoT is not done with a general purpose stream analytics product; it is custom coded into an application. They are purpose-built, application-specific stream analytics rather than tailoring an off-the-shelf, general purpose stream platform."

But Schulte believes that enterprises could significantly improve their ability to leverage CEP capabilities for IoT by building on top of stream analytics products.

It is important for enterprise architects to develop an architecture that brings analytics as near to the devices as possible, so that most CEP is done within a few meters or a few dozen meters of where the data originates. Most stream analytics platforms can run on small devices like the Raspberry Pi, so it is practical to have large numbers of stream analytics platforms widely distributed.

"This minimizes the number of [complex] events that have to go across the Internet to the cloud. Send one event every five minutes to the cloud rather than 100,000 per second. If you are doing real-time sense-and-respond, you pretty much have to run the analytics locally because of latency and reliability issues. Enterprises can't afford to have a train or a factory machine stop because the network is running slow or goes down.

Next Steps

Determine your CEP architecture bias

The hype around CEP is real

An analytics guide to understanding Internet of Things data

This was last published in March 2016
