The problem with thinking about how we protect the data generated by the Internet of Things (IoT), and how we determine ownership of IoT data, is that we are trying to apply the same old rules and methodologies to a completely different context, one for which those rules and methodologies were never conceived or intended.
We are modern-day cavemen and women struggling with the problem of who owns the light, heat, and sounds and smells generated by a bonfire. We need much more than new laws and regulations to address IoT technology; we need a different way of thinking and ultimately managing a society that finds itself in the throes of evolving into a newly advanced civilization, likely even a new species. We need to advance our culture, our thinking, our beliefs and our value systems, in concert with the evolution of our technologies, including the evolution of IoT.
Evolution of IoT and the fallacy of the final frontier
As humans, we resist change. The status quo provides comfort in the known. Structure and predictability afford us a sense of security. As comfortable (or uncomfortable) as the present seems to be, we tend to accept present reality as what we have for now and for the foreseeable future. As a species, we work hard to protect the status quo. In its extreme form, we believe that everything that can be invented already has been, and that there are no new ideas save an occasional incremental improvement.
Our notion of improved communication, for example, is based upon a better smartphone with a bigger (or smaller), brighter screen. What we often fail to recognize is that the "foreseeable future" is being invented -- is evolving -- faster than we can possibly imagine, and that the pace of this evolution of IoT is increasing geometrically.
We quickly dismiss as science fiction the idea of communication as, for example, direct mind-to-mind telepathy, that is, without the need for smartphones at all (think Vulcan mind meld). In our wildest dreams, this form of smarter communication is something our children or grandchildren might be concerned with, but not us, not now. Like it or not, however, our present reality is not fixed, and what we believe is our final frontier is actually changing at an ever-increasing, seemingly limitless pace.
Technology evolution defined in human terms
This pace of change changes everything.
Historically, when we have looked at the impacts of technology advances, we typically considered the conveniences and efficiencies that those new technologies provided.
Fire gave us light, heat, comfort, security, and enhanced communication as smoke signals could be seen across greater distances than voices or drums could be heard. Cooked food undoubtedly enhanced health and human well-being, and extended lifespan. Throughout the ages, fire has been an important enabler of military weaponry.
The wheel extended the mobility of people and things, significantly increasing the physical distances that we could travel and move goods for trade and economic development. Steam power and fossil fuel-based engine technologies took transportation and logistics to revolutionary new levels of efficiency and economic productivity during the Industrial Age. These newer technologies similarly contributed to the advancement of military weaponry that helped turn towns and villages into colonies, colonies into nation states and nation states into vast empires, ultimately on a global scale.
The computer age brought us information technology that automated repetitive back office functions, then greatly expanded information reach from the data center to the connected cubicle, and from the connected cubicle to the disconnected worker who became geographically independent from the physical location of computation and data. Once physical barriers to computer and data access were overcome (mobile computing), new customer markets were economically more accessible, new sources of raw materials became available in parallel with advances in logistics and transportation, and new sources of labor became available. Applications rapidly moved from back office efficiencies to front office marketing, sales, customer relationships and, more recently, experience management. Digital age technologies -- social, mobile, big data/analytics and cloud -- have enabled more and more focus on people as individuals and their (real and perceived) needs, evolving well beyond relating to groups of people as demographic or psychographic segments.
Yet with all of this technological development, the boundaries between machines and the people who create and use them have remained relatively clear.
IoT evolution and the blurring of machine and human
The Internet of Things has significantly increased our ability to sense, collect, analyze and act upon previously unthinkable amounts of data. IoT data enables use cases for multi-disciplinary technologies that go well beyond conveniences and efficiencies. Evolving IoT technologies have spawned developments that blur the lines between people and machines in ways that Darwin likely never imagined.
Advances in IoT are enabling unprecedented levels of progress within the disciplines of science, technology, engineering and mathematics, both as individual disciplines as well as across all fields of study and invention. For example:
• Robotics, where increasingly numerous and effective artificial sensors, combined with ever more sophisticated mechanical actuators and the networks that connect them, have produced machines that can fully or semi-autonomously control their actions within our physical world. Autonomous vehicles are a prime example.
• Mechanical engineering, where sensors and actuators enable weaponry, robotics, and remote medical examination, diagnosis and surgery; applications that make use of haptic sensor technologies are prime examples.
• Healthcare, where embedded or wearable sensors collect information about patient well-being, wirelessly communicate with computational and analytical facilities run by medical researchers and practitioners, and receive instructions that control the release of appropriate medication dosages based on patient need in real time.
• Neuroscience, where embedded man-made sensors and actuators are connected directly to the human nervous system and other critical organs, and can be controlled either remotely or by the patient's own thoughts and brain activity. Prosthetic arms and legs are being connected to people who can control them much as we control the limbs we were born with. Combined with other technologies, it is not difficult to imagine a real-life $6 million man or woman.
• Environmental and ecological science, where extensive networks of sensors and actuators control smart grids to better manage air and water quality, and where weather and climate sensing and prediction are used to improve agricultural production.
• Manufacturing process control, one of the earlier and more mature uses of IoT, has evolved through advances in robotics, computer-aided design (CAD) and computer-aided manufacturing (CAM) applications. Just as many data centers have become "lights out" operations, such is the future of much of what we currently think of as manufacturing and supply chain control.
• Artificial intelligence and deep analytics, which for their first 40 years had moderate impact, primarily at the edges of these fields, but have within the last 10 to 15 years become a driving force enabling each of these fields to flourish within its own domain and, most critically, across all of them.
Evolution of IoT: Four questions
When looking at our current state and assuming that current trends will continue at a similar or even quicker pace, a few observations seem undeniable: Machines are becoming more like people, people are becoming more like machines and, at least for now, we can still pull the plug on things that seem uncomfortable or outright unacceptable.
I suggest that the questions we should be asking ourselves are much broader and have a more far-reaching impact than "What policies are needed to define and control who owns the data, and who has the right to access and update personal information?" Sure, these are important issues that we need to address, but to ensure the future success of our children, and of their children's children, we should now also be asking questions such as:
1. How much are we willing to let machines become more and more like people?
2. How much are we willing to let people become more and more like machines?
3. What makes us human? Is it our heart? Likely not, since we have already accepted artificial (mechanical) hearts. We openly embrace major organ transplants (soon to be mechanical as well), and we are thankful for mechanical limbs that repair accidents of nature, accidents of people, or worse. Is an artificial brain next?
4. Will we reach a point where augmented humans will be indistinguishable from sophisticated machines? If so, is that OK? If not, what should we be doing about that now, before it is too late? Or is it already too late?
Perhaps as IT executives, what we should be thinking about and collaborating around is a different sort of policy manual. Paying homage to the amazing work done by brilliant scientists, biologists, inventors, philosophers, sociologists, anthropologists, psychologists, et al., we should call our new manual On the Origin of Species 2.0.
Let me know what you think. Post a comment or drop me a note at [email protected]. Discuss, debate or even argue -- let's continue the conversation.
About the author:
Harvey R. Koeppel is the president of Pictographics Inc., a management and technology advisory and consulting services firm. He is also vice chairman of the World BPO/ITO Forum. From May 2004 through June 2007, Koeppel served as the CIO and senior vice president of Citigroup's Global Consumer Group.