Pervasive computing, also called ubiquitous computing, is the growing trend of embedding computational capability (generally in the form of microprocessors) into everyday objects to make them effectively communicate and perform useful tasks in a way that minimizes the end user's need to interact with computers as computers. Pervasive computing devices are network-connected and constantly available.
Unlike desktop computing, pervasive computing can occur with any device, at any time, in any place and in any data format across any network, and can hand tasks from one computer to another as, for example, a user moves from his car to his office. Thus, pervasive computing devices have evolved to include not only laptops, notebooks and smartphones, but also tablets, wearable devices, fleet management and pipeline components, lighting systems, appliances and sensors, and so on.
The goal of pervasive computing is to make devices "smart," thus creating a sensor network capable of collecting, processing and sending data and, ultimately, adapting to the data's context and activity; in essence, a network that can understand its surroundings and improve the human experience and quality of life.
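The collect-process-adapt loop described above can be sketched in a few lines. This is a minimal, hypothetical illustration (the names and the thermostat scenario are assumptions, not part of any real product): a handful of room sensors report temperatures, and the network adapts by choosing an action per room.

```python
# Hypothetical sketch of a pervasive-computing loop: collect sensor data,
# process it, and adapt to its context (here, an adaptive thermostat).
from dataclasses import dataclass

@dataclass
class Reading:
    """One collected sensor sample: a room and its temperature in Celsius."""
    room: str
    temperature_c: float

def decide_action(reading: Reading, target_c: float = 21.0) -> str:
    """Process a reading and adapt: heat, cool, or stay idle."""
    if reading.temperature_c < target_c - 1.0:
        return "heat"
    if reading.temperature_c > target_c + 1.0:
        return "cool"
    return "idle"

def run_cycle(readings: list[Reading]) -> dict[str, str]:
    """Collect readings from each room and send back one action per room."""
    return {r.room: decide_action(r) for r in readings}

if __name__ == "__main__":
    samples = [
        Reading("office", 18.5),
        Reading("kitchen", 23.0),
        Reading("hall", 21.2),
    ]
    # The office is heated, the kitchen cooled, the hall left idle.
    print(run_cycle(samples))
```

A real deployment would replace the in-memory list with networked sensors and an actuator protocol, but the loop structure — sense, decide, act without user intervention — is the same.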
Ubiquitous computing and, subsequently, pervasive computing are often considered the successor to mobile computing. They generally involve wireless communication and networking technologies, mobile devices, embedded systems, wearable computers, RFID tags, middleware and software agents. Internet capabilities, voice recognition and artificial intelligence are often also included.
Pervasive computing applications span the energy, military, safety, consumer, healthcare, production and logistics sectors.
An example of pervasive computing is an Apple Watch notifying a user of an incoming phone call and letting him answer it through the watch. Another is Amazon's Echo: when a registered user of Amazon's streaming music service asks the device to play a song, the song plays without any further user intervention.
History of ubiquitous/pervasive computing
Ubiquitous computing was pioneered at the Olivetti Research Laboratory in Cambridge, England, where researchers created the Active Badge, a "clip-on computer" the size of an employee ID card that enabled the company to track the location of people in a building, as well as the objects to which the badges were attached.
Largely considered the father of ubiquitous computing, Mark Weiser and colleagues at Xerox PARC soon thereafter began building early incarnations of ubiquitous computing devices in the form of "tabs," "pads" and "boards."
Weiser described ubiquitous computing:
Inspired by the social scientists, philosophers and anthropologists at PARC, we have been trying to take a radical look at what computing and networking ought to be like. We believe that people live through their practices and tacit knowledge, so that the most powerful things are those that are effectively invisible in use. This is a challenge that affects all of computer science. Our preliminary approach: Activate the world. Provide hundreds of wireless computing devices per person per office of all scales (from 1" displays to wall-sized). This has required new work in operating systems, user interfaces, networks, wireless, displays and many other areas. We call our work "ubiquitous computing." This is different from PDAs [personal digital assistants], Dynabooks or information at your fingertips. It is invisible, everywhere computing that does not live on a personal device of any sort, but is in the woodwork everywhere.
He later wrote:
For 30 years, most interface design, and most computer design, has been headed down the path of the "dramatic" machine. Its highest ideal is to make a computer so exciting, so wonderful, so interesting, that we never want to be without it. A less-traveled path I call the "invisible": its highest ideal is to make a computer so imbedded, so fitting, so natural, that we use it without even thinking about it. (I have also called this notion "ubiquitous computing," and have placed its origins in postmodernism.) I believe that, in the next 20 years, the second path will come to dominate. But this will not be easy; very little of our current system's infrastructure will survive. We have been building versions of the infrastructure-to-come at PARC for the past four years in the form of inch-, foot- and yard-sized computers we call tabs, pads and boards. Our prototypes have sometimes succeeded, but more often failed to be invisible. From what we have learned, we are now exploring some new directions for ubicomp, including the famous "dangling string" display.
The term pervasive computing followed in the late 1990s, largely popularized by the creation of IBM's pervasive computing division. Though the two terms are largely synonymous today, Professor Friedemann Mattern of the Swiss Federal Institute of Technology in Zurich noted in a 2004 paper that:
Weiser saw the term "ubiquitous computing" in a more academic and idealistic sense as an unobtrusive, human-centric technology vision that will not be realized for many years, yet [the] industry has coined the term "pervasive computing" with a slightly different slant. Though this also relates to pervasive and omnipresent information processing, its primary goal is to use this information processing in the near future in the fields of electronic commerce and web-based business processes. In this pragmatic variation -- where wireless communication plays an important role alongside various mobile devices such as smartphones and PDAs -- ubiquitous computing is already gaining a foothold in practice.
Pervasive computing and the internet of things
The internet of things (IoT) has largely evolved out of pervasive computing. Though some argue there is little or no difference, IoT is likely more in line with pervasive computing than with Weiser's original view of ubiquitous computing.
Like pervasive computing, IoT-connected devices communicate and provide notifications about usage. The vision of pervasive computing is computing power widely dispersed throughout daily life in everyday objects. The internet of things is on its way to delivering this vision, turning common objects into connected devices, yet, as of now, it still requires a great deal of configuration and human interaction -- something Weiser's vision of ubiquitous computing would not.