When the University of Central Florida discovered a data breach in January and determined that the names and Social Security numbers of as many as 63,000 current and former students and staff could have been accessed, it notified no one -- until February.
Incensed by the school's response, students and employees filed a class-action lawsuit alleging the university failed to safeguard personal information and then failed to promptly notify the people who were affected.
That's unacceptable today, said Mark Foley, an attorney with Milwaukee law firm von Briesen & Roper, who was at the recent Fusion 2016 CEO-CIO Symposium to talk about IoT legal issues -- the litigious side of the Internet of Things, the vaunted network of devices that exchange information on the Web. As business and pleasure increasingly are done online, people are starting to expect the companies they work for, the schools they go to and the stores they shop at to do more to protect their interests. Foley predicts IoT ventures will heighten consumer awareness as the privacy and security implications of how companies use data become clearer, leading in turn to lawsuits.
"We have an evolving standard of conduct," Foley said to the Madison, Wis., audience of IT and business executives. "And what is the law about if not suing somebody for breaching your duty of conduct? That is what all of tort law is built on, what all of contract law is built on."
Coming clean after a data breach
No one would envy the University of Central Florida in having to tell its constituents that their records were probably accessed -- but every organization should be prepared to respond quickly to a data breach, especially if it is putting devices and data on the Internet of Things. To do that, it needs to build privacy and security into every project from the ground up. "The days of bolting something on at the end," Foley said, are over.
U.S. and European regulators are now talking about making built-in data privacy and security features in devices and services a requirement for fair trade, Foley said. There's also talk of labeling the sale of devices and services without such features an unfair trading practice, subject to fines by the Federal Trade Commission and its European counterparts.
"There may be a whole brave new world coming in terms of enforcement of those kinds of concepts in the development of data-related products," he said.
Devil in the code
But companies can't be content merely to guard their customers' personal information; they also have to protect their customers from their products -- defective ones, that is. Foley cited an example from the not-so-distant future: If a self-driving car crashes because of a problem in the underlying computing code, its manufacturer should be ready for product liability law.
That's the set of rules assigning responsibility for defective products -- and a digital-age version may soon make it to the courts. It could also apply if any kind of device that connects to the Internet has a vulnerability that lets someone hack into a home network. What does it all mean?
"Tort damages for negligence when you violate that societal standard as to how well it should work, how safe it should be, how secure it should be, what you should do in response to a failure," Foley said, referring to wrongful acts, or torts, that can lead to court action.
"Those are all duties and standards that are emerging and could be redefined in a way that requires much more of people who manufacture and sell products."
Another thing organizations moving forward with Internet of Things initiatives have to think about is what Foley calls employment and regulatory law. It covers how much surveillance employers can conduct on their employees -- whether they can have cameras and where, what they can record and whether they can monitor private phone conversations or email exchanges.
This legal area is bound to get thornier as employers learn to do more with the ever-expanding volumes of data they're gathering, Foley said.
"As the Internet of Things extends into the workplace, the data you may be collecting or may be capable of collecting or someone may be capable of collecting about people as employees will be a tempting thing to look at for perhaps the wrong reasons."
Often, Foley said, when police officers wanted to investigate whether someone was growing marijuana illegally at home, they would try to get hold of the household's utility bills. If water consumption far outstripped the neighbors', that could say something. But that's also where Fourth Amendment privacy protections kicked in.
That all changes now that, with smart meters, utility data is knowingly and intentionally sent outside the house. People can no longer have the same expectations of privacy, Foley said.
The debate in coming years will likely center on IoT legal issues such as whether a company discriminated against its employee by analyzing data taken from a Fitbit activity tracker as he or she walks to the copier or the restroom. It extends beyond the workplace, too. If sensors are set up along the street tracking someone's location and detecting, say, the remnants of drug paraphernalia, what happens then? Was the evidence legally obtained?
These and other questions will need to be explored as the Internet of Things reaches more and more into people's lives.
"You need to keep these in mind as you develop your products, roll out products or you begin to analyze the data," Foley said. "Big data requires big judgment."
Attorney Mark Foley discusses more IoT legal issues -- data ownership and intellectual-property rights -- in the first part of this two-part tip.