The “most profound technologies are those that disappear,” wrote the computer scientist Mark Weiser some 20 years ago. They “weave themselves into the fabric of everyday life until they are indistinguishable from it.” However, some technologies require a substantial public discussion before they take their place as part of the fabric of everyday life. The conversation goes back and forth until we have adjusted the technology to solve real problems and redefined our problems in a way that can be solved by the new technology. At the moment, we are in the midst of such a dialogue with the Internet of Things.
The Internet of Things is a fairly simple concept that consists of several well-tested technologies. At base, it is the idea of connecting sensors and actuators to the Internet to collect data from a large area (often a large geographic area) and to control a large number of devices (again, often over a large geographic area). We have already seen some applications that seem to embody this idea. Companies that manage large supply chains, for example, have used the basic concepts of the Internet of Things for almost 15 years. However, in many fields, the Internet of Things does not yet match the problems that need to be solved.
The Internet of Things is a fairly new idea, though it is based on older technologies. Kevin Ashton, the British businessman and tech entrepreneur, claims to have invented the term in 1999, when he was making a presentation to executives at Procter &amp; Gamble on systems that would use radio-frequency identification (RFID) to manage supply chains. He acknowledged that he had no special “right to control how others use the phrase,” but he argued that it referred to an environment in which computers “knew everything there was to know about things—using data they gathered without any help from us.” Such a definition is not specific enough to serve as a foundation for new technology, and so the engineering community started to build the basic concepts for the new technology, a technology that businesses began developing a few years after Ashton defined the term.
The IEEE began publishing articles on the “Internet of Things” in 2003 and very quickly began to discuss a key issue: security. The security weaknesses of the Internet of Things were hidden by the way that most people talked about the technology. In describing the Internet of Things, many individuals described it as a collection of sensors connected to the Internet. In fact, few of these sensors were directly connected to the network. Most were connected through computer processors, processors that could be hacked or spoofed like any other computer connected to the Internet. Because of these security concerns, the National Institute of Standards and Technology (NIST) decided to develop a basic framework for the Internet of Things so that security researchers could begin to collaborate and share their work.
In spite of its name, NIST does not establish all the standards for the US. It is one of many standards organizations in the US. The American National Standards Institute, or ANSI, is probably the most prominent of these organizations. It is a private organization, located in New York, that creates standards for many different fields. The IEEE is also a standards organization, but it produces standards primarily for electrical and computer technologies. NIST is a government agency that develops measurement standards, standards for the US government, and standards deemed important for the country as a whole. It is also the standards organization that has been dealing with issues of cybersecurity.
Over the past summer, NIST released its preliminary framework for the Internet of Things. This framework defines the basic elements of the Internet of Things and how these elements interact with each other. It goes well beyond the conventional idea that the Internet of Things is a collection of sensors attached to a network and defines it as consisting of five building blocks, or primitives.
The first of these building blocks consists of the elements that some people believe to be the whole of the Internet of Things: the sensors. The second element is the communications channel, the medium that transmits information from the sensors. The third element is the aggregator, the system that collects data from the sensors and assembles it into a database.
The fourth is still a vague concept. NIST defines it as the eUtility, a software system that takes data from the aggregator and other sources to produce a stream of information. NIST feels that this definition needs to be vague “to allow for unforeseen future services and products that will be incorporated in types of [the Internet of Things] yet to be defined.” However, to be prepared for all possible security problems, NIST needed to identify this element as a potential point for attack. It is easier to spoof aggregated data than to spoof all the sensors coming into the aggregator, and it is easier to spoof data analyses than to spoof the aggregated data.
The last element of the NIST framework is called a “decision trigger.” It is the algorithm that takes an action for the Internet of Things. As a security vulnerability, it may be the weakest link. It would be possible to substitute code that could make decisions without ever looking at the results from eUtilities or data aggregators.
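The five primitives above can be sketched as a simple data pipeline. The following is a minimal illustration in Python, using a toy heart-rate scenario; all of the function names, the sample values, and the alert threshold are hypothetical, since the NIST framework defines roles, not APIs.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Reading:
    """A single measurement, tagged with the sensor that produced it."""
    sensor_id: str
    value: float

def sensor(sensor_id: str, value: float) -> Reading:
    """Primitive 1: a sensor produces a raw measurement."""
    return Reading(sensor_id, value)

def channel(reading: Reading) -> Reading:
    """Primitive 2: the communications channel carries the reading.
    (In a real system, transport security would live here.)"""
    return reading

def aggregator(readings: list[Reading]) -> list[float]:
    """Primitive 3: the aggregator assembles readings into a data set."""
    return [r.value for r in readings]

def e_utility(values: list[float]) -> float:
    """Primitive 4: an eUtility turns aggregated data into information,
    here just an average."""
    return mean(values)

def decision_trigger(summary: float, threshold: float = 100.0) -> bool:
    """Primitive 5: the decision trigger acts on the information."""
    return summary > threshold

# Three hypothetical heart-rate readings flow through the pipeline.
readings = [channel(sensor("hr-monitor", v)) for v in (90, 105, 120)]
summary = e_utility(aggregator(readings))
alert = decision_trigger(summary)  # True: the average of 105 exceeds 100
```

The sketch also makes the security argument of the framework concrete: an attacker who can replace `e_utility` or `decision_trigger` never needs to touch the sensors at all.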
The NIST framework is not perfect, nor is it yet universally accepted. It has drawn a modest amount of criticism for its definitions and terms. However, it seems to be gaining enough support to make it the common model for thinking about the Internet of Things.
At the moment, most commentators think that two large classes of applications will dominate the Internet of Things: medical applications and factory applications. Both will progress slowly until the major security problems are addressed. The factory applications, often called the “Manufacturing Internet,” need security to protect corporate secrets and maintain control over manufacturing processes. These applications can be protected by isolating factory components, but as we have already seen with the Stuxnet worm, these components remain vulnerable even when they are disconnected from the commercial Internet.
The health applications of the Internet of Things promise to be the largest use of this technology. The McKinsey consulting group projects that they will produce almost 50 percent of the revenue from Internet of Things technology. Currently, most of the applications in this field are similar to the Manufacturing Internet. Hospitals, for example, are using the Internet of Things to manage their supply chains as if they were manufacturers or retailers. Yet the health care field is in a discussion over how to use the Internet of Things to identify problems and monitor treatments. This is a discussion that touches deeply on issues of security. It began in about 2011, when a number of firms began selling health monitors.
Health applications have the potential of being a truly invisible technology, something that is woven into the everyday fabric of both patients and doctors. It might allow doctors to monitor the activities of patients, and patients to monitor the responses of doctors. To date, the discussion of these portable monitors has centered on a narrow topic: the ability of a single doctor to view the health data of a single patient. However, the NIST framework indicates that the conversation will have to become much broader. It suggests that we will not have doctors tracking patients on a daily basis but rather aggregators and eUtilities that collect and organize data. It also suggests that health care decisions might be initiated by automatic triggers rather than the judgment of a physician.
Conversely, the framework suggests that there will be another side to a healthcare Internet of Things. Such technology will produce data about doctors, including information about the treatment they recommend, the cost of that treatment, and the effectiveness of that treatment. Doctors are already monitored by their peers. The Internet of Things could create eUtilities and decision triggers that radically expand how they are monitored.
Over the past year, the sales of portable health monitors have slowed. Both patients and doctors seem to be evaluating the benefits of this technology and deciding whether it is working in a way that benefits them. Like much of the Internet of Things, it is a technology that has slowly become visible to a large number of people. These technologies will remain visible as we discuss how they need to be adjusted to fit into our lives and how our lives need to be adjusted to accommodate them. Eventually, we will be able to answer both sets of questions to our satisfaction and can let the Internet of Things slip into invisibility.
About David Alan Grier
David Alan Grier is a writer and scholar on computing technologies and was President of the IEEE Computer Society in 2013. He writes for Computer magazine. You can find videos of his writings at video.dagrier.net. He has served as editor in chief of IEEE Annals of the History of Computing, as chair of the Magazine Operations Committee and as an editorial board member of Computer. Grier formerly wrote the monthly column “The Known World.” He is an associate professor of science and technology policy at George Washington University in Washington, DC, with a particular interest in policy regarding digital technology and professional societies. He can be reached at firstname.lastname@example.org.