In 2010, Dave McCrory proposed the concept of "data gravity": as data accumulates, applications, services and business logic are pulled physically closer to where that data is stored. The theory makes a lot of sense to me, and over the last several years we have seen companies embrace the cloud for their application layers so those applications can sit next to the data they were already storing there.
Learn more about data gravity from Dave McCrory in this great video.
Understanding data gravity
But the internet of things is changing this gravitational constant in our technical universe. As IoT matures, the black holes of data gravity we have been placing into clouds will be ripped apart by millions of smaller data planets. These smaller planets will be located in our factories, warehouses, buildings, homes and everywhere else IoT runs to make data actionable. The simple fact that we must keep our applications blazing fast and running 100% of the time means we cannot let them sit only next to the large data pools in cloud data centers, vulnerable to the latent nature of the internet. Additionally, connecting existing IT to these new IoT systems means more and more of the data that resides behind firewalls must be included in the overall design, and those existing systems won't simply be lifted and shifted into a cloud, for a variety of reasons.
Instead, our satellite IoT applications will require that we migrate to where data is being generated and stored. For IoT, that means our applications must be capable of running physically next to our machines, devices, users and buildings. These local environments are where gateways, industrial PCs and microcontrollers will reside, capable of processing at the outer edge of our IoT solutions. What does this mean for how we will be architecting these IoT systems? Will IoT cause us to abandon the cloud? Far from it — instead we will expand how we build our applications, interfaces and services so that they can be dynamically located and run on demand. Our new IoT systems and infrastructure will be capable of allowing our business rules to execute on any cloud, on any PC or on any gateway, under the control of our IoT developers.
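One way to picture a business rule that can "execute on any cloud, on any PC or on any gateway" is to write it as a plain, dependency-free function that any runtime can evaluate. This is a minimal sketch under my own assumptions — the `Rule` structure and `run_rule` interpreter are illustrative, not any vendor's actual API:

```python
# Sketch: a portable business rule with no environment dependencies, so the
# same object can be evaluated in a cloud service, on an industrial PC, or
# on a gateway. All names here (Rule, run_rule) are hypothetical.

from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]  # predicate over one sensor reading
    action: str                        # command to emit when triggered

def run_rule(rule: Rule, reading: dict) -> Optional[str]:
    """Evaluate the rule against one reading, wherever this code happens to run."""
    return rule.action if rule.condition(reading) else None

# The same rule can be shipped unchanged to any runtime with this interpreter.
overheat = Rule(
    name="overheat-guard",
    condition=lambda r: r["temp_c"] > 90.0,
    action="SHUTDOWN",
)

print(run_rule(overheat, {"temp_c": 95.2}))  # SHUTDOWN
print(run_rule(overheat, {"temp_c": 71.0}))  # None
```

Because the rule closes over no cloud-specific services, "dynamically locating" it is a deployment decision rather than a rewrite.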
The difference is IoT edge processing
Today, IoT software vendors are putting together the pieces with a component generally called edge processing. With edge processing, companies will look to store data, implement their applications and leverage the knowledge and analytics that are available locally. Additionally, this application capability will strategically groom, move and compile the data that then flows back and forth to the cloud for a broader understanding. Ultimately, the edge becomes the first stage of a user's and a device's interaction with IoT. The ability to leverage vendors that rapidly build advanced IoT solutions that transition easily between cloud and edge will be critical to infrastructure decisions in the coming months.
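The "groom, move and compile" step can be sketched simply: raw, high-frequency readings stay at the edge, and only a compact summary travels upstream. This is a hedged illustration — the summary fields and device naming are assumptions, not any vendor's payload format:

```python
# Sketch of edge data grooming: reduce a local window of raw sensor
# readings to a small, cloud-bound summary. Field names are illustrative.

from statistics import mean

def groom(readings: list, device_id: str) -> dict:
    """Compile one window of raw readings into a compact upstream summary."""
    return {
        "device": device_id,
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "avg": round(mean(readings), 2),
    }

raw_window = [20.1, 20.4, 21.0, 35.9, 20.2]  # raw data stays at the edge
summary = groom(raw_window, "press-07")       # only this moves to the cloud
print(summary)
```

The cloud still gets its "broader understanding" from the summaries, while the bandwidth-heavy raw stream never leaves the site.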
Simply put, IoT will require developers to move application logic to exactly where the data is located at any moment in time. If the data is in the factory, the application runs next to the machine, making sure safety rules are executed, machine learning is calculated and usage optimization is implemented. If the data is in a home, the IoT system can communicate with the appliances, security systems and mobile interfaces without pushing that communication across a provider's network.
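The factory case above is the clearest motivation: a safety interlock must decide in-process, beside the machine, rather than after an internet round trip. A minimal sketch, with invented thresholds and signal names:

```python
# Sketch: a safety rule evaluated at the machine, so an emergency stop
# never waits on network latency. The RPM limit and inputs are assumptions.

EMERGENCY_STOP_RPM = 5000

def safety_check(rpm: float, guard_open: bool) -> str:
    """Decide locally whether the machine may keep running."""
    if guard_open or rpm > EMERGENCY_STOP_RPM:
        return "STOP"  # issued immediately at the edge
    return "RUN"

print(safety_check(5200, guard_open=False))  # STOP
print(safety_check(3000, guard_open=True))   # STOP
print(safety_check(3000, guard_open=False))  # RUN
```

A cloud copy of the same rule might still exist for auditing, but the decision itself happens where the data is born.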
Cloud only a part of the infrastructure
Public and private cloud infrastructure will continue to be where the macro, larger-scale IoT activities take place. If you want to know the status of all your factories, the cloud will have the holistic view. The cloud will even be empowered to interact with those smaller IoT planet applications to manage, control and engage local devices. If you want to disengage your home security when your car pulls into the neighborhood, the communication will happen from the cloud into your home. The value of the cloud will not just continue but increase as these smaller, locally executing pieces of the solution keep everything up and running 100% of the time.
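The cloud's holistic view can be pictured as a roll-up of compact per-site reports sent by each edge system. This is a sketch under my own assumptions — the report fields and site names are invented for illustration:

```python
# Sketch: the cloud aggregates small status reports from each site's edge
# system into one fleet-wide view. All field names are hypothetical.

site_reports = [
    {"site": "factory-berlin", "machines_up": 48, "machines_total": 50},
    {"site": "factory-osaka",  "machines_up": 50, "machines_total": 50},
    {"site": "factory-austin", "machines_up": 45, "machines_total": 50},
]

def fleet_status(reports: list) -> dict:
    """Roll per-site edge reports into one holistic, cloud-side view."""
    up = sum(r["machines_up"] for r in reports)
    total = sum(r["machines_total"] for r in reports)
    degraded = [r["site"] for r in reports
                if r["machines_up"] < r["machines_total"]]
    return {"availability": round(up / total, 3), "degraded_sites": degraded}

print(fleet_status(site_reports))
```

Each site keeps running on its own if the uplink drops; the cloud view degrades gracefully instead of taking the factories down with it.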
Watch “The End of Cloud Computing” with Peter Levine.
Ultimately, the theory of data gravity will hold true, but our strategy for data storage is going to change. Data will not all pool in a single location, but instead will spread across our public and private infrastructure. The result is our ability to build apps that can flexibly move and run with that data.
Enterprise transition has begun
The beginning of this shift to the IoT edge has already started. We are seeing huge investment from industrial manufacturing and building automation companies in IoT solutions that combine local and cloud systems to ensure 100% uptime. These early IoT architectures will leverage the ability to define IoT solutions in a cloud that then pushes rules and logic into our buildings. These rules will make our buildings safer when doors are automatically opened or shut during emergencies, our factories more efficient by reducing power consumption, and our homes more responsive to our needs. This logic will allow our warehouses to internally track shipping orders in real time with indoor GPS.
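The push model described above — a rule authored in the cloud, delivered to a building's gateway, and evaluated there — can be sketched with a small serialized rule and a gateway-side interpreter. The JSON schema here is invented for illustration, not a real product format:

```python
# Sketch: a cloud-authored rule serialized as JSON and interpreted on a
# building gateway. The rule schema and action names are assumptions.

import json
import operator
from typing import Optional

OPS = {">": operator.gt, "<": operator.lt, "==": operator.eq}

# Authored in the cloud, pushed down to the building.
cloud_rule_json = json.dumps({
    "sensor": "smoke_ppm",
    "op": ">",
    "threshold": 300,
    "action": "OPEN_EXIT_DOORS",
})

def evaluate(rule_json: str, reading: dict) -> Optional[str]:
    """Gateway-side interpreter for a cloud-authored rule."""
    rule = json.loads(rule_json)
    if OPS[rule["op"]](reading[rule["sensor"]], rule["threshold"]):
        return rule["action"]
    return None

print(evaluate(cloud_rule_json, {"smoke_ppm": 450}))  # OPEN_EXIT_DOORS
```

Once the rule is on the gateway, the doors open even if the building's internet connection is down at the moment of the emergency.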
IoT edge processing done elegantly with cloud integration will be the path forward for those wishing to lead their enterprises into a complete, end-to-end IoT solution.
All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.