IoT devices and use cases are exploding. Coupled with advances in artificial intelligence (AI), they are poised to transform our lives. Interfaces are moving from touchscreens to intelligent voice control. For IoT devices to become ubiquitous, they must be increasingly intelligent and, at the same time, low cost. That is the conundrum: product attributes that are in conflict with each other. The only solution is to amortize the cost of high-quality AI across many devices with centralized processing in hyperscale data centers. Designing every single light switch to have intelligence beyond Siri or Alexa would be prohibitively expensive. However, passing voice commands to data centers, where the cost of AI can be amortized across thousands of intelligent light switches, makes a high-quality AI system a thousand times cheaper per device. A light switch's voice interface is used for less than one minute of the 1,440 minutes in a day. Spread across 1,000 switches, $1,000 worth of high-quality AI costs only $1 per light switch, which is a manageable price.
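The amortization arithmetic above can be sketched in a few lines. The figures are the article's own hypothetical numbers, not measured costs:

```python
# Illustrative amortization math: a centralized AI system shared by
# many devices. All figures are the article's hypothetical numbers.
ai_system_cost = 1000.0   # cost of one high-quality AI system, in dollars
devices_sharing = 1000    # light switches sharing that centralized AI

cost_per_device = ai_system_cost / devices_sharing
print(cost_per_device)    # -> 1.0 dollar per light switch

# Each switch's voice interface is active for under a minute a day,
# so one shared system can serve many switches in turn:
minutes_per_day = 24 * 60            # 1,440 minutes
utilization = 1 / minutes_per_day    # one minute of use per day
print(f"{utilization:.2%}")          # -> 0.07% of the day
```

The low per-device utilization is what makes the sharing work: each switch needs the AI only briefly, so the data center can serve thousands of them with the same hardware.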
For example, an intelligent light switch can save energy by reducing light output when there is sufficient ambient light or the user is working in a different area. With an echolocator, it can pinpoint a person's location without the need for a camera, which many people do not like in private environments. If a person suddenly ends up on the floor, an intelligent switch can determine whether the person is doing yoga or is a grandmother who has collapsed and needs 911 to be called. Or, if somebody enters the room at 3:00 a.m. not through the door, it can decide whether to turn the lights on and/or call 911.
Another aspect is sharing knowledge. If we have a toaster with a small camera sensor to make sure our bread is toasted but not burned, such a toaster may encounter a new type of bread that behaves differently. Having our toasters networked means they can learn from each other's experience. Again, hyperscale shared data centers provide this shared knowledge. Such a toaster could even recognize the voice that is talking and make toast exactly the way that person likes it.
However, there is an additional ingredient needed for success: security. It will not be acceptable to use smart devices if a hacker can burn down somebody's house by attacking a stove or toaster. It is not acceptable that somebody can listen to your private conversations. Thus, security is a must. Security drives further demand for processing performance at the data center — not only for encryption, but for AI software to determine if something is appropriate or dangerous, if something should or should not be done, and even whether a malicious hacker requested the action in question.
From intelligent refrigerators telling us not to drink the out-of-date milk, to stoves making sure food will not be burned, to microwaves which will not overheat food, to smart garage openers, home security — everything in our lives will benefit from AI-powered IoT devices with collective shared knowledge and wisdom — if they also have security. It can’t be accomplished without amortizing the cost of intelligence by concentrating it in hyperscale data centers, where AI cost is spread across billions of “intelligent” IoT devices.
All of this drives the demand for more processing power in these data centers. With today's data centers consuming 40% more energy than Great Britain, more than the airline industry, and growing 15% annually, we will have twice as many data centers burning twice as much power every five years! Humanity can't afford for more than 10% of the planet's energy to go into data centers in 10 years, and definitely not 40% of the energy in 20 years.
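The doubling claim follows directly from compounding the article's 15% annual growth figure, which a quick back-of-the-envelope check confirms:

```python
# Back-of-the-envelope check: 15% annual growth in data center energy
# use compounds to roughly 2x over five years (growth rate per the article).
annual_growth = 0.15
years = 5

multiplier = (1 + annual_growth) ** years
print(round(multiplier, 2))  # -> 2.01, i.e. about double every five years
```

At that rate the growth is exponential: another five years means another doubling, which is why the trend cannot simply be ridden out.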
In the past, Moore's law gave us power reductions in semiconductor products through lower voltages and aggressive process shrinks. But with processor performance and power now at a plateau, we can't rely on existing technology to offset growing power consumption demands as it did in the past.
However, companies today are working on new solutions to this problem, building chips that not only reduce data center hardware, but also reduce the power consumption of public and private cloud data centers. The processing power problem cannot be ignored if we want to make billions and billions of intelligent IoT devices safe, secure and cost-effective.
Those of you in the San Francisco area are encouraged to watch my invited speaker presentation at the Storage Visions Conference, where I will address the growing data center energy consumption challenges, on Oct. 16 at 2 p.m. at the Embassy Suites in Milpitas, Calif.
All IoT Agenda network contributors are responsible for the content and accuracy of their posts. Opinions are those of the writers and do not necessarily convey the thoughts of IoT Agenda.