
Fog computing and the OpenFog Consortium

For the last 15 years, cloud computing has been a game changer. It has created efficiencies and increased scalability, giving rise to the “as a service” phenomenon in the enterprise. Cloud computing has certainly become a standard in many IT environments, but as we move into a significantly more connected world, where we want to support many new “things” and applications, we are seeing a greater need for computing capabilities closer to the users and the things. This need is driving the next wave of innovation, which Cisco has dubbed “fog computing.”

Fog computing is a system-level horizontal architecture that distributes resources and services of compute, storage, control and networking anywhere along the continuum from the cloud to the things.

This new capability will fill technology gaps to meet the requirements of emerging IoT applications, many of which may not be possible in a cloud-only environment. It can help a broad range of industries and consumer sectors support IoT and other emerging applications, including existing and future performance-critical, mission-critical and life-critical applications.

Fog computing success stories

Across vertical markets including transportation, utilities, smart cities, manufacturing, retail, energy, healthcare, agriculture, government and even the consumer space, fog computing has already demonstrated tremendous business value.

The manufacturing industry is full of prime examples of the power of fog. For example, Lordan, a global thermal-engineering heating and cooling manufacturer, used fog for its manufacturing automation system. The implementation gave the company the ability to view overall throughput and track mission-critical manufacturing information in real time directly from the production floor, rather than relying on periodic assessments. The result was more than 600 labor hours saved per month, with direct cost savings just weeks after deployment.

Mazak Corporation, a builder of machine tools and advanced manufacturing technology, collaborated with technology partners on its SmartBox technology. Mazak was looking to provide customers with an advanced, secure data collection system that would run on the network infrastructure of a customer’s factory floor. To do so, the fog application needed to support advanced security standards and real-time analytics. Enter the SmartBox, which uses fog computing to deliver real-time manufacturing data and analytics from Mazak machines, significantly improving machine efficiency for Mazak’s manufacturing customers.

So how does fog computing work?

Fog computing will provide a standards-based way to distribute compute, storage and application resources and services closer to the users along the continuum from the cloud to the things, analogous to how TCP/IP provides a universal, standard way to move packets across the internet. Additionally, fog will provide standards-based ways to manage the lifecycles of these distributed resources and services, to secure systems and applications, to pool together resources across different fog systems and the cloud to support applications, to provide APIs for the developer community to create new fog applications, and for fog operators to deploy those applications.
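To make the idea of distributing resources along that continuum concrete, here is a minimal, hypothetical sketch of one placement decision a fog platform might make: choose the tier closest to the things that still meets an application’s latency budget and has spare capacity. The tier names, numbers and the place_workload helper are illustrative assumptions, not part of any OpenFog specification.

```python
# Hypothetical sketch: deciding where along the cloud-to-thing continuum
# to place a workload. Tier names, latencies and capacities are invented
# for illustration and are not drawn from the OpenFog reference architecture.
from dataclasses import dataclass

@dataclass
class Tier:
    name: str
    latency_ms: float  # round-trip latency between this tier and the things
    free_cpu: float    # spare CPU cores currently available at this tier

def place_workload(tiers, needed_cpu, latency_budget_ms):
    """Return the tier closest to the things that meets both the latency
    budget and the capacity requirement, or None if no tier qualifies."""
    for tier in sorted(tiers, key=lambda t: t.latency_ms):
        if tier.latency_ms <= latency_budget_ms and tier.free_cpu >= needed_cpu:
            return tier
    return None

continuum = [
    Tier("machine-gateway", latency_ms=2, free_cpu=1.0),
    Tier("factory-fog-node", latency_ms=10, free_cpu=8.0),
    Tier("regional-cloud", latency_ms=80, free_cpu=256.0),
]

# A control loop needing 2 cores within 20 ms lands on the factory fog node;
# a relaxed analytics job (200 ms budget) could still fall back to the cloud.
print(place_workload(continuum, needed_cpu=2.0, latency_budget_ms=20).name)
```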

To do so, fog needs to operate on an open architecture with interoperable standards. Since the same customer often needs services from both the cloud and the fog, fog should, in many scenarios, be integrated with the cloud to form a unified end-to-end platform that delivers seamless services to the customer. Some platforms can be used to manage services in both the cloud and the fog. Applications developed for the cloud should be able to work in the fog without modification, and vice versa.

A fog system also needs to be able to communicate with all sorts of endpoints. It can serve as a proxy for these endpoints, helping connect them to the cloud and performing local processing of their data. It can likewise serve as a proxy for the cloud, providing services to the endpoints. The reality is that no one company can offer a full fog solution; fog must be supported by a large ecosystem of innovative companies.
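As a rough illustration of that proxy role, the sketch below shows a fog node ingesting raw readings from nearby endpoints, reacting locally to out-of-range values, and forwarding only a compact summary upstream. The message format, threshold and send_to_cloud stub are assumptions made for this example; they do not represent a standard fog interface.

```python
# Illustrative sketch of a fog node acting as a proxy between endpoints and
# the cloud. The reading format, threshold and send_to_cloud stub are
# assumptions for this example, not a prescribed OpenFog interface.
import statistics

ALARM_THRESHOLD_C = 85.0  # handled locally, without waiting on the cloud

def send_to_cloud(summary):
    # Stand-in for an upstream call (e.g., HTTPS or MQTT to a cloud service).
    print("to cloud:", summary)

def handle_batch(sensor_id, readings_c):
    """Process one batch of temperature readings arriving at the fog node."""
    # 1. Local, low-latency decision made on behalf of the cloud.
    if max(readings_c) > ALARM_THRESHOLD_C:
        print(f"local alarm: {sensor_id} exceeded {ALARM_THRESHOLD_C} C")

    # 2. Act as the endpoints' proxy: forward an aggregate, not every sample.
    send_to_cloud({
        "sensor": sensor_id,
        "count": len(readings_c),
        "mean_c": round(statistics.mean(readings_c), 2),
        "max_c": max(readings_c),
    })

handle_batch("press-line-3", [71.2, 73.8, 90.1, 74.0])
```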

In November 2015, innovators including Cisco, Dell, Intel, Microsoft, ARM and Princeton University launched the OpenFog Consortium to develop an open reference architecture. Another key goal of this consortium is to help the industry learn about the business value of fog computing, and therefore help accelerate market adoption. Since then, the consortium has grown to over 50 members, including not only industry leaders, but also startup technology innovators and research organizations.

