Reflecting on the past year, on whether the IoT predictions we all made came true, and looking ahead, a few things stand out to me. First and foremost, the companies realizing early wins in the IoT market are still the ones with a clear focus on use cases that have demonstrable business value. Use cases that drive efficiency are table stakes, and those that aid in maintaining regulatory compliance will be increasingly attractive, but facilitating entirely new business models and customer experiences is the Holy Grail.
Second, technology providers and end users alike (both with OT and IT backgrounds) continue to realize exactly how hard IoT is and that it takes a village. The OT/IT convergence trend will hit full stride in 2018, with OT end users getting comfortable with the risk-versus-reward equation of connecting their critical processes to broader networks for business gain, and IT organizations looking to transition from a cost center to a profit center as their traditional role is increasingly commoditized. The top two challenges in IoT will continue to have nothing to do with technology: #1 is the business case, and #2 is alignment of stakeholders across IT, OT and the line of business (aka cat herding). OT technology experts will increasingly partner with strong IT players (and vice versa) rather than trying to build these new domain-specific capabilities themselves. In general, the winners across the board will have strong partner strategies and an open philosophy.
OK, so now that I’ve already snuck in a few predictions, here are six more for what 2018 holds for IoT:
1. We will see accelerated convergence of foundational IoT technology elements
I led with this last year, and the trend is definitely snowballing based on the widespread motivation to partner, rapid adoption of open source tools like EdgeX Foundry for interoperability, and increasing collaboration between key consortia efforts like the Industrial Internet Consortium, Open Fog Consortium and Edge Computing Consortium. The proliferation of IoT platforms will hit its peak in early 2018, followed by a fairly rapid decline. Leaders among the large horizontal technology platforms will start to emerge, but companies of any size with laser focus on vertical-specific use cases and cat-herding business stakeholders will still see the most traction with customers. Investment and M&A activity will accelerate in areas like security, analytics and scalable domain-specific applications and will decline for generic IoT platforms.
2. We will emerge from the ‘AOL stage of IoT’ (i.e., simply getting things online) to an advanced class involving powerful analytics and action
All the free time developers gain from no longer reinventing foundational IoT elements can now be spent making it easier to practically apply machine learning and artificial intelligence at scale, enabling customers to move beyond simple monitoring to realizing the power of prescriptive analytics. Further, true AI leaders will emerge, separating themselves from the masses who use AI as a buzzword when they’re really just doing basic machine learning. In advanced class we will also direct our brainpower to innovation in areas such as co-processing (e.g., via GPUs and FPGAs) to accelerate analytics within lower power envelopes, network and application virtualization/containerization, and time-sensitive networking. All of the above will be increasingly used to close the gap between OT and IT workloads in mission-critical applications and to allow these workloads to be readily transportable across the compute continuum. Augmented reality will find more practical applications in IoT use cases such as remote expert and guided field maintenance, and blockchain will begin to see more widespread use in security, logistics and general transactions, getting beyond the hype and theory of 2017, although the real uptick for blockchain will be in 2019. If you’ve ever seen the movie My Big Fat Greek Wedding, you’ll know what I mean when I say that blockchain was the “Windex of 2017.”
3. Customers will feel the burn of public cloud
While consolidation of general IT infrastructure into the cloud has been a boon for organizations looking to cut costs, it can quickly become cost-prohibitive to pay every time you want to touch your own data for analytics. So, customers realizing the powerful benefits of sensor-driven analytics will increasingly move their workloads from the cloud to the core and edge, not only for the latency, security, privacy and network bandwidth reasons that industry experts widely agree on, but also to minimize the total cost of their data over its lifecycle. As part of this, more customers will appreciate the value of edge gateways for real-time action, first-pass edge analytics and applying security measures — not just as a necessity for converting data streams to IP traffic. AI and machine learning workloads will continue to shift towards the edge — even into sensors themselves — but the core (e.g., localized micro-modular server clusters to full-blown on-premises IT data centers) will be tasked with the heaviest of real-time streaming analytics due to the responsiveness and reliability benefits of being on the same local area network as the things and processes at the edge, compared to relying on a wide area network to the cloud. The bulk of deep learning will continue to be done in the cloud thanks to its effectively unlimited compute, but end users will use private cloud and increasingly the core to perform deep learning in order to keep control over their data.
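To make the data-lifecycle cost argument concrete, here is a back-of-the-envelope sketch in Python. Every rate and volume below is a hypothetical round number chosen for illustration, not any real provider's pricing; the point is only the shape of the trade-off, where repeated cloud egress for analytics dominates storage cost.

```python
# Hypothetical, illustrative rates only -- not real cloud provider pricing.
EGRESS_PER_GB = 0.09        # $ per GB pulled out of the public cloud
CLOUD_STORE_PER_GB = 0.023  # $ per GB-month of cloud storage
EDGE_STORE_PER_GB = 0.01    # $ per GB-month of amortized core/edge storage

def monthly_cost(gb_generated, analytics_passes, edge_fraction):
    """Estimated monthly cost when edge_fraction of analytics passes run
    at the core/edge and the remainder pull data back out of the cloud."""
    cloud_reads_gb = gb_generated * analytics_passes * (1 - edge_fraction)
    storage = gb_generated * (
        CLOUD_STORE_PER_GB * (1 - edge_fraction)
        + EDGE_STORE_PER_GB * edge_fraction
    )
    return cloud_reads_gb * EGRESS_PER_GB + storage

# 1 TB of sensor data per month, touched by analytics five times:
all_cloud = monthly_cost(1000, analytics_passes=5, edge_fraction=0.0)
mostly_edge = monthly_cost(1000, analytics_passes=5, edge_fraction=0.8)
print(f"all-cloud:   ${all_cloud:,.2f}/month")
print(f"mostly-edge: ${mostly_edge:,.2f}/month")
```

Under these assumed numbers the all-cloud scenario costs several times more per month, and the gap widens with every additional analytics pass over the same data.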
4. Dynamic orchestration of microservice workloads will be a foundational area for innovation
This part of advanced class is what I like to call performing “analytics of the analytics” in order to dynamically optimize where and when compute and storage should occur in the edge to cloud continuum for optimal results and lowest overall cost. As part of this, developers will increasingly realize the importance of microservices and decoupling “things” from applications (effectively OT and IT) as close to the edge as possible through the likes of the EdgeX framework. This decoupling enables API integration at any point from edge to cloud rather than only in the cloud — a practice which often conflicts with the desire to own your data throughout its lifecycle. A general benefit of decoupling southbound sensors from northbound applications is minimizing lock-in to any particular provider which is important to end customers in an inherently heterogeneous market. The EdgeX community is seeing end customers quote the framework into projects for this reason alone. Finally, decoupling workloads at the edge aids with multi-tenancy, an example being when the building owner, the tenant, an outsourced facilities/energy management provider, insurance carrier, etc. can each use aspects of the same sensing infrastructure integrated with their own applications from edge to cloud. Having the ability to integrate their respective stacks close to the point of data inception allows each provider to better control their own destiny compared to relying on another party to aggregate and potentially filter and charge for data access in their cloud. Net-net, how workloads are deployed will vary by use case and context, but investing in loosely coupled microservice architectures from edge to cloud will provide maximum flexibility compared to the plethora of monolithic cloud-centric IoT platforms we’ve seen to date.
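As a sketch of what decoupling southbound "things" from northbound applications can look like in practice (the class and function names below are my own illustration, not the EdgeX Foundry API), a device service translates raw protocol frames into a neutral event shape published on a message bus, so any number of tenants can subscribe without ever touching the sensor side:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Event:
    """Protocol-neutral reading, decoupled from any one sensor or app."""
    device: str
    resource: str
    value: float

class MessageBus:
    """Minimal pub/sub: the seam between southbound and northbound."""
    def __init__(self):
        self._subs: Dict[str, List[Callable[[Event], None]]] = {}

    def subscribe(self, resource: str, handler: Callable[[Event], None]):
        self._subs.setdefault(resource, []).append(handler)

    def publish(self, event: Event):
        for handler in self._subs.get(event.resource, []):
            handler(event)

# Southbound: a device service normalizes a raw frame into an Event.
def modbus_device_service(bus: MessageBus, raw_frame: dict):
    bus.publish(Event(device=raw_frame["id"],
                      resource="temperature",
                      value=raw_frame["reg0"] / 10.0))

# Northbound: independent tenants consume the same stream at the edge.
bus = MessageBus()
bus.subscribe("temperature", lambda e: print(f"facilities app: {e.value} C"))
bus.subscribe("temperature", lambda e: print(f"insurer app: logged {e.device}"))

modbus_device_service(bus, {"id": "boiler-7", "reg0": 215})
```

Swapping the facilities application, or adding the building owner's, changes nothing on the device side — which is exactly the lock-in-minimizing, multi-tenant property described above.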
5. LoRa will win the battle for LPWAN connectivity
LPWAN has seen a bit of a VHS versus Beta war over the past few years between the likes of LoRa, Sigfox and numerous other lesser-known players. If you’re old enough to have rented video tapes at a brick-and-mortar store (crazy thought, I know), you’ll recall that VHS was an inferior technology to Beta, but the VHS crew got the most studios onboard. I’m not saying LoRa is a completely inferior technology — it’s solid — but there are better ones out there. The LoRa Alliance has simply done a fantastic job of building out an ecosystem. Large carriers are even adopting LoRa to protect against losing connectivity business in general as they struggle with the underwhelming performance of NB-IoT, despite the attractiveness of being able to use their existing infrastructure. While some still think of LPWAN as relevant only for service providers to canvas smart cities, many end users are starting to use it to deploy private networks from fields to buildings as an alternative to wireless mesh protocols like Zigbee. Of course, given the unlicensed spectrum and super-low bandwidth of LoRa, it isn’t suited for mission-critical applications; however, it’s super attractive for reporting simple status via battery-powered sensors due to its long range and low energy consumption (energy harvesting will soon kick things up a notch, too). Private LTE/cellular is also very interesting, especially for use cases in remote areas, such as oil and gas and mining, that need the higher bandwidth for streaming data wirelessly across local site operations that otherwise sip through a satellite connection to central command. In any event, as customers migrate workloads from the cloud to core and edge to get control of their own data, many will also deploy and manage their own local wireless networks as carriers struggle to figure out pricing models that are attractive for connecting lots of things.
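To see why low energy consumption translates into multi-year battery life for simple status reporting, here is a rough estimator. All currents and timings are hypothetical round numbers in the right ballpark for a low-data-rate LPWAN node, not measured figures for any specific radio, and the model ignores battery self-discharge and MCU overhead:

```python
# Hypothetical round numbers for illustration -- not measured radio specs.
SLEEP_CURRENT_MA = 0.002   # deep-sleep current between uplinks
TX_CURRENT_MA = 40.0       # current while transmitting
TX_TIME_S = 1.5            # airtime per uplink at a long-range data rate

def battery_life_years(battery_mah: float, uplinks_per_day: int) -> float:
    """Convert average daily current draw into battery life in years."""
    tx_seconds = uplinks_per_day * TX_TIME_S
    sleep_seconds = 86400 - tx_seconds
    mah_per_day = (TX_CURRENT_MA * tx_seconds
                   + SLEEP_CURRENT_MA * sleep_seconds) / 3600.0
    return battery_mah / mah_per_day / 365.0

# A 2400 mAh cell reporting status once per hour:
print(f"{battery_life_years(2400, 24):.1f} years")
```

Even under these idealized assumptions the takeaway holds: a node that sleeps almost all the time and wakes briefly to send a tiny status payload can run for years on a small battery, which is precisely the niche where LoRa shines.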
All said, despite the emergence of winners like LoRa, there won’t be a single wireless transport standard in the end — after all, we have Wi-Fi, Bluetooth and cell in our phones for a reason.
5. New tools to simplify security at scale will emerge
This year I’ll finish up with the always hot topic of security. While I do believe security concerns are still holding some back, I do not think it continues to be a widespread barrier to adoption. It’s important to recognize that adequate, well-proven tools exist to address foundational security needs today, and the well-publicized breaches are generally the result of these tools being poorly implemented, if at all. In all cases, implementing security measures involves working with people who know what they’re doing and who practice defense in depth rather than promoting some single magical answer. We should be concerned but not paralyzed, because the latter keeps us from achieving value and risks leaving us behind. Given the aforementioned convergence on the platform basics, developers can place more focus on tools to close key security usability gaps, and in 2018 we’ll see more innovations to simplify the secure onboarding of devices and manage security certificates at scale. While gateways are the first line of defense for dumb sensors, another area ripe for innovation is extending the root of trust to smart sensors at the very edge, where data is first conceived. The resulting soup-to-nuts trust and provenance throughout the data lifecycle is paramount to graduating to super-advanced class — selling your sensor data to unknown third parties. Efforts like EdgeX are also important here because trust is built on common ground, and open source is a highly effective way to achieve it across both public and private domains. Finally, as we get out of what I call the party-of-one “PoC friend zone” and into deployment scale, end users will quickly learn to appreciate the importance of enterprise/industrial-grade hardware and remote manageability consoles. Raspberry Pis and CLIs are great tools for prototyping, but simply don’t cut it in the real world of scale.
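As one hedged illustration of what simpler onboarding at scale can mean (this is a generic challenge-response pattern built on Python's standard library, not a description of any particular product's mechanism), a factory-provisioned per-device secret lets a gateway verify a joining device automatically, with no manual credential entry, by checking an HMAC over a fresh challenge:

```python
import hmac
import hashlib
import secrets

# Factory provisioning: each device ships with a unique secret that the
# management backend also knows (illustrative in-memory registry).
REGISTRY = {"sensor-0042": bytes.fromhex("a3" * 32)}

def device_respond(device_id: str, device_secret: bytes,
                   challenge: bytes) -> bytes:
    """Device side: prove possession of the secret without revealing it."""
    return hmac.new(device_secret, challenge + device_id.encode(),
                    hashlib.sha256).digest()

def gateway_onboard(device_id: str, response: bytes,
                    challenge: bytes) -> bool:
    """Gateway side: recompute the MAC and compare in constant time."""
    secret = REGISTRY.get(device_id)
    if secret is None:
        return False
    expected = hmac.new(secret, challenge + device_id.encode(),
                        hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = secrets.token_bytes(16)  # fresh nonce defeats replay
response = device_respond("sensor-0042", REGISTRY["sensor-0042"], challenge)
print(gateway_onboard("sensor-0042", response, challenge))  # True
```

Real onboarding schemes layer certificates and hardware roots of trust on top of this idea, but the core usability win is the same: identity is established by the device proving possession of a provisioned secret, not by a technician typing credentials into thousands of units.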
Here’s to a great 2017 and an exciting 2018 for IoT!
Do you agree with these predictions? What other trends do you think will impact the market in 2018?
All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are of the writers and do not necessarily convey the thoughts of IoT Agenda.