There’s been an ebb and flow to where intelligence and data reside throughout the history of computing. Sometimes the dominant architecture pushes smarts (processing power and applications) and state (data) to the network edge. Other times they migrate to the core.
Once, everything sat in the glass room except for “dumb” terminals and keypunch machines. Early sensor and control networks were similarly centralized (even if some local processing was delegated to intermediate nodes in a network).
Fast forward to the PC revolution and a lot of smarts and state moved out to the client. Then it seemed as if the cloud would pull the smarts back to the data center. Web browsers on the edge would handle the presentation of applications. The logic, processing power and data would be off in a data center someplace.
But computing has not fully recentralized. Industrial IoT, for example, often uses an architecture that’s a combination of edge devices, intelligent gateways and back-end servers. All these layers can have smarts and state.
There are reasons for this. Consider, among other factors, trends in device management, networking and processing horsepower.
There can be good reasons for a centralized architecture. Perhaps you need to aggregate information before you can do something useful with it. Or perhaps you worry about the security implications of locking down distributed devices.
But if we consider the driving forces behind approaches like thin clients and software as a service, it’s often been application and device management. It’s a lot easier to just fire up a browser than it is to be a system admin for a personal computer with lots of apps installed. Running applications in a browser often isn’t better in terms of performance and functionality, but it is simpler.
Given modern networks and software technology, cloud-based apps do work well a lot of the time. However, app stores and container technology have also made it much easier to provision and manage applications locally. When it makes sense to distribute smarts and state, there’s less reason to do so merely to simplify the management of applications on remote devices.
As a result, today we’re in a better position to design architectures that optimize for performance, security and cost rather than just ease of management.
There are a variety of definitions for IoT. But pretty much everyone agrees that data is an important part of IoT. And not just some data. Big data.
And big data doesn’t like to move around much. Dave McCrory first coined the term “data gravity” in 2010 to describe how “data if large enough can be virtually impossible to move.” He’s since expanded on the concept in various ways.
The implication for IoT is that if you collect a lot of data on the edge, you may not want to ship it all back home. For one thing, network bandwidth can get expensive. Furthermore, to the degree that sensor data is used to control physical devices like switches, there may simply not be enough time to wait for a remote computer to make a decision. (Network outages can also be an issue in this case.)
Intelligent gateways are one architectural approach to providing autonomous responses to local events. They can also filter and aggregate data so that only that data useful for predictive analytics and trend analysis need be sent all the way back to a centralized repository. (There is a general trend toward saving more and more data, but the details depend in part on how easily and cheaply all the sensor data can be transmitted.)
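The gateway pattern described above can be sketched in a few lines. This is a minimal, hypothetical illustration (threshold values, field names and the actuator callback are assumptions for this example, not any real gateway API): raw readings trigger an immediate local response, while only a compact summary is forwarded upstream.

```python
import statistics

# Illustrative alarm threshold (e.g., degrees Celsius); a real
# deployment would make this configurable per sensor.
ALARM_THRESHOLD = 80.0

def handle_reading(value, actuator):
    """React locally so a control decision never waits on a round trip
    to a remote data center (or on a network that may be down)."""
    if value > ALARM_THRESHOLD:
        actuator("shutdown")  # immediate local response

def summarize(readings):
    """Aggregate a window of raw readings into the small amount of
    data worth shipping back to the central repository."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": statistics.mean(readings),
    }

if __name__ == "__main__":
    window = [72.5, 74.0, 81.2, 73.8]
    actions = []
    for r in window:
        handle_reading(r, actions.append)
    print(actions)             # the local action fired once, at the edge
    print(summarize(window))   # only this summary travels upstream
```

The design choice is the point: four raw readings become one alarm handled locally and one four-field summary sent home, rather than four full records crossing the network.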
A third reason why intelligence gets distributed throughout the network boils down to “because we can.” With small, cheap and low-power processors and sensors, there is often no good reason not to move the processing out to where the data is collected and acted upon.
Some tradeoffs remain, especially in situations where sensors can’t be connected to a power source. Furthermore, endpoint devices that are full-fledged computers running an operating system present security challenges that dumber devices may not.
Nonetheless, low-cost ARM and other microprocessors mean that we can treat the centralization of intelligence and data, should we choose to do so, as an architectural decision, not one imposed on us by economics or physical limitations.
All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.