Most modern IoT ecosystems rely on a permanent internet connection. Devices constantly send data to and receive data from a data center somewhere in the cloud. The obvious benefit: users can monitor and control their system at any time, from anywhere. The obvious disadvantage: if the connection drops, most systems stop working. Why are we willing to take that risk?
Okay, the internet of things is called the internet of things because the internet plays a major role in the whole idea. Devices are no longer isolated in their local environment. They can interact at any time with the outside world. They use real-time data on weather or traffic, helping us make better decisions throughout our day. But with all the comfort that comes with being permanently online, we have become careless with our data. IoT devices routinely send their data hundreds of kilometers through a network to interact, even when they sit just a few meters away from each other. When we use our smartphone to turn on the light in the living room, within milliseconds the command travels halfway across the continent. We consider it normal. But would you call your kids to dinner via WhatsApp even though they are just sitting in their room next door? That would be at least a little bit weird, don’t you think?
With cheaper and smaller processors on the market, it’s time to take back at least a little bit of control. Smart devices are not really smart if they rely on a central intelligence that makes the decisions for them.
While the cloud used to be the solution for everything, this trend is slowly shifting. It has become clear that not everything needs to be sent to the cloud; routing everything through it makes IoT ecosystems prone to outages and can even slow the whole installation down.
Edge computing is becoming more and more popular, slowly shifting processing power from a central intelligence back to the edge of a network. Even giants like Microsoft (Azure IoT Edge) and Amazon (AWS Greengrass) have recently seen the writing on the wall and now offer their own edge-based solutions.
The term edge originates from the mobile world, where data has traditionally been compressed at a point as close as possible to the end-user device so it can travel through the mobile network more quickly. The goal was to ease the burden on the network and accelerate the performance of the whole system. In IoT, edge means performing as much processing as possible at the end, the edge, of a network, usually on the connected devices themselves. It worked for mobile networks, so it should serve IoT networks just as well.
Cloud or edge: What’s better for my business?
As so often in life, there is no one-size-fits-all answer; it depends on the individual use case. A mix of both can be ideal: in a hybrid system, simple tasks are performed directly between the devices, so they can do their work as quickly and independently as possible, which is especially useful in building automation or smart-industry setups. Just make sure the data needed for analysis and monitoring is still sent to the cloud.
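To make the hybrid idea concrete, here is a minimal sketch of an edge gateway that resolves simple commands locally while queuing telemetry for the cloud. All names (`EdgeGateway`, `handle_command`, `cloud_outbox`) are illustrative assumptions, not the API of any particular product:

```python
from dataclasses import dataclass, field

@dataclass
class EdgeGateway:
    """Handles simple commands locally; queues telemetry for the cloud."""
    local_state: dict = field(default_factory=dict)
    cloud_outbox: list = field(default_factory=list)

    def handle_command(self, device: str, command: str) -> str:
        # Simple control tasks (e.g. toggling a light) are resolved
        # directly on the edge -- no round trip to a data center, so
        # they keep working even if the internet connection drops.
        if command == "toggle":
            self.local_state[device] = not self.local_state.get(device, False)
        # Only the data needed for analytics and monitoring is queued
        # for a later upload; the command itself never leaves the
        # local network.
        self.cloud_outbox.append({"device": device, "event": command})
        return "on" if self.local_state.get(device) else "off"

gateway = EdgeGateway()
print(gateway.handle_command("living_room_light", "toggle"))  # prints "on"
```

The design choice is the point: the control loop stays local and fast, while the cloud only receives the event stream it actually needs for monitoring.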
With large companies shifting their business from pure cloud to hybrid offerings, we might even see the obstacles of a heterogeneous IoT landscape with its countless competing standards finally being addressed. Since there won’t be a universal IoT standard in the near future, device and gateway manufacturers will likely see the need to implement more functionality in their devices rather than leaving it all to the cloud.
When it comes to IoT ecosystems, we’ve been living on the edge for too long now. Let’s move closer to it.
All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.