
A big looming risk for IoT is privacy

The security risks of the internet of things are well documented and ever growing. We’ve seen Mirai and other botnet attacks shut down sites around the web using the firepower of millions of co-opted webcams and other devices. There have been remote attacks and controlled demonstration hacks that have shut down heating in buildings during winter, disabled a car and taken control of traffic lights. Those are scary, but they aren’t the only IoT concerns we face. In the coming years, we’ll see another big threat arise from IoT: privacy issues that will affect all of us. And they will have significant moral and ethical implications.

For example, this past year saw reports of the police being called to a Texas home over a violent domestic altercation caught by an Amazon Alexa, in which a man was possibly about to harm a woman and her child. The situation, harrowing but fortunately a tragedy averted, raised some interesting ethical questions. Early in 2018, it was revealed that this wasn’t an isolated case: Amazon hands over a lot of data to the police, but doesn’t identify whether any of it comes from its Echo devices and Alexa voice assistant. This domestic and personal use of digital assistant data for law enforcement raises thorny questions of privacy law; after all, these devices sit in your home or personal space, where you have an established expectation of privacy. I expect lawyers to push back on some of these cases on exactly those grounds.

But IoT isn’t just in your home; it’s all over our cities as well. Some cities already have “shot spotter” technologies, in which an array of microphones is used to triangulate the sound of gunshots. Recent smart city deployments plan to extend this sensor network with additional types of sensors. The city of San Diego, California, is working with industry to wire the city with over 3,000 sensors on streetlights. These sensors ostensibly will help conserve power, improve health and safety, and even provide conveniences such as identifying open parking spaces.
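To make the acoustic triangulation idea concrete, here is a minimal Python sketch of how arrival-time differences across a small microphone array can localize a sound source. The sensor layout, speed of sound, synthetic readings and brute-force grid search are all illustrative assumptions (the readings were generated for a source near the middle of the array), not any vendor’s actual algorithm or data.

    import itertools
    import numpy as np

    SPEED_OF_SOUND = 343.0  # meters per second, roughly at 20 degrees C

    # Hypothetical microphone positions (x, y) in meters and measured arrival times in seconds
    mics = np.array([[0.0, 0.0], [400.0, 0.0], [0.0, 400.0], [400.0, 400.0]])
    arrival_times = np.array([0.618, 0.850, 0.850, 1.031])  # synthetic example readings

    def tdoa_error(source, mics, times):
        # Sum of squared mismatches between observed and predicted pairwise
        # time differences of arrival for a candidate source location
        predicted = np.linalg.norm(mics - source, axis=1) / SPEED_OF_SOUND
        return sum(((times[i] - times[j]) - (predicted[i] - predicted[j])) ** 2
                   for i, j in itertools.combinations(range(len(mics)), 2))

    # Coarse grid search over the area covered by the array (5 m resolution)
    xs = ys = np.linspace(-100.0, 500.0, 121)
    best = min(((tdoa_error(np.array([x, y]), mics, arrival_times), x, y)
                for x in xs for y in ys), key=lambda t: t[0])
    print(f"Estimated source near x={best[1]:.0f} m, y={best[2]:.0f} m")

Real systems refine this with signal filtering, sensor clock synchronization and better solvers, but the principle is the same: enough cheap microphones, accurately timed, can place a sound on a map.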

However, the privacy risks to individuals are often underestimated. These risks include spying and stalking, and burglars identifying when residents leave a house they want to break into. Smart city cybersecurity risks include hijacking and abuse of devices and networks, akin to the Mirai and Reaper botnets; theft of services such as parking or public transit; and open proxy abuse like we see with standard PCs. The internet at large is likely to pay the price in spam and cybercrime, and city taxpayers will have to pay for continual cybersecurity cleanup of these devices and possibly their early replacement.

What’s more, this privacy risk enters a thorny legal area that is ripe for debate. The smart city exists in public spaces, meaning our expectation of privacy is greatly reduced. A judge in Florida has already ruled that a brake recording device in a car was using public data because brake lights are visible, setting a precedent for privacy expectations in the age of ubiquitous recording and data collection.

For IoT, emerging questions, which I expect will be debated in the next few years, include: Who derives the benefits from this aggregate data: private companies, or local governments and their constituents? What harm is done to individuals whose data is mixed with others’ without regard for the identifiable information that remains? What duty does the government have to protect that privacy and minimize harm? Lawyers and privacy advocates have asked these questions and more, but we’re only at the very beginning of this important conversation, which has real impacts on our lives.
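To illustrate that second question, consider how little outside knowledge it takes to pull one person’s record back out of aggregated data. The following Python sketch uses entirely synthetic traces (no real deployment, dataset or dimensions are implied) to show that a few externally known place-and-hour observations will often match exactly one trace in a crowd of a thousand.

    import random

    random.seed(0)
    NUM_PEOPLE, NUM_PLACES, HOURS, POINTS_PER_PERSON = 1000, 50, 24, 40

    # Synthetic "anonymized" traces: person id -> set of (place, hour) sightings
    traces = {p: {(random.randrange(NUM_PLACES), random.randrange(HOURS))
                  for _ in range(POINTS_PER_PERSON)}
              for p in range(NUM_PEOPLE)}

    def matching_traces(known_points):
        # People whose trace contains every point the adversary already knows
        return [p for p, trace in traces.items() if known_points <= trace]

    # An adversary learns just three (place, hour) points about one target,
    # for example from social media posts or a parking receipt
    target = 42
    known = set(random.sample(sorted(traces[target]), 3))
    matches = matching_traces(known)
    print(f"{len(matches)} of {NUM_PEOPLE} traces match the 3 known points; "
          f"target {'is re-identified' if matches == [target] else 'stays hidden'}")

Stripping names from the data does little here; the pattern of movement itself is the identifier, which is exactly why “mixed” sensor data deserves the same scrutiny as data collected about a named individual.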

American privacy law, heavily influenced by an 1890 Harvard Law Review article by Samuel Warren and Louis Brandeis, was further updated by tort expert William L. Prosser in a 1960 California Law Review article. He argued that privacy laws should ensure individuals are protected against intrusion into their solitude, protected from embarrassment or unwanted publicity, and protected against the appropriation of their likeness against their will. These scholars helped inform legislation that we take for granted in American society. However, these privacy laws were written in an era when an individual or even a news organization had to expend great effort to collect information on a single person. Today, data on millions of people can be collected in seconds. These laws will certainly be tested in the coming years in the age of smart cities and IoT, where we have a mixture of mass data collection, security control gaps and blurred ownership between governments and private companies.

The law is not created out of thin air; most often it draws on precedent, principle and existing laws. While there is a growing mismatch between current law and technology, I don’t expect a wholly new set of privacy laws to emerge. Instead, the collision between IoT and privacy will be worked out in the courtrooms through incremental extensions of existing legal frameworks.

In recent years, especially with the advent of big data and social media, the lack of consideration for ethics in technology has come into focus. With the rise of ever-widening IoT measurement and smart cities, the risk due to ethical gaps grows. These risks include immediate dangers to citizens and property (including critical infrastructure!) due to lax cybersecurity protections, harm to the long-term safety of individuals and society from the theft of aggregated data, and more. If you’re a technologist in this space, make friends with ethicists and legal scholars who have studied this rich body of work, and recognize the coming catastrophe. By working across disciplines — technologists, city planners, ethicists, legal and privacy experts, vendors and citizen panels — we can avert this crisis and realize the promise of smart cities. But this work must be done in all phases, including concept formulation, design, development, implementation and operation. Without it, we expose everyone to undue risks and a stripping away of American privacy in our communities. Pay attention to the European project VirtUE, which seeks to evaluate these ethics questions in IoT, and fold its findings into your own practice as it matures.

