Arthur C. Clarke once wrote, "Any sufficiently advanced technology is indistinguishable from magic."1
It is with this sense of wonder that those of us in the pervasive or ubiquitous computing field perform our work. We pursue a vision that seemed the stuff of science fiction just a few short years ago.
So it was with great interest that I watched the IEEE Computer Society launch IEEE Pervasive Computing magazine this year. Preparing a new magazine usually takes considerable time, yet the society decided to publish this one remarkably quickly. I take that to mean it recognizes pervasive computing as an important—and increasingly tangible—branch of computer technology.
In September 1991, Mark Weiser's seminal article, "The Computer for the 21st Century," appeared in Scientific American. Considered the first mainstream article on ubiquitous computing, Weiser's piece has ignited interest in the field ever since. The following statement appears at its beginning: "The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it."
As M. Satyanarayanan points out, this statement has come to exemplify the ubiquitous-computing vision.3 The vision of technology that disappears from untrained eyes is a worthy target that we have pursued for a long time, as the "Early ideas on computing everywhere" sidebar recounts, and one we will continue to chase.
To work, ubiquitous computing requires placing many computers in people's living environments. It demands a paradigm shift: computers must melt into our surroundings rather than stand apart as edifices, like the PC. Ubiquitous computing also represents a complex computer architecture that borrows ideas from many existing technologies. It relies on
• computers that come in small sizes and consume very little power;
• various sensors, including positioning sensors;
• effector/actuator and display devices to emit information and act on outside objects;
• wireless communication and network technology to connect many computers;
• distributed processing and fault tolerance; and
• polished user interfaces that make computers transparent to users.
On top of these requirements, ubiquitous computers must be inexpensive enough that deploying huge numbers of units in a given area is practical, and they must have a long life cycle after deployment. These characteristics differ greatly from those of conventional systems.
The computing field that has most closely tackled similar requirements is embedded computing. Initially, only industrial machines used computers for control, but microprocessors have made it possible to embed computers into many everyday things. The idea of ubiquitous computing would have been pie in the sky if the embedded-systems technologies discussed in IEEE Micro had not been available.
Toward Practical Ubiquitous Computing
The November-December 2001 issue of IEEE Micro featured special theme articles on smart cards and radio frequency identification. These technologies connect the virtual and the real worlds and play an important role as part of a ubiquitous-computing foundation. In this issue, I present more articles on the technologies that add to ubiquitous computing from viewpoints familiar to IEEE Micro readers.
The basic model of ubiquitous computing requires the melding of the virtual world inside computer systems and the real world into a coherent, single-world model. So it is important to let computers recognize the goings-on in the real world: a person's location, storage locations, ambient temperature, wind direction, and so forth.
Many basic technologies serve ubiquitous computing. One of them, the sensor network, is already entering practical use.4 Such networks, often wireless, connect many sensors, and operators use them to monitor environmental parameters such as temperature, sound, and images of the surrounding area. Sensor networks are useful for military surveillance of battlefields and for unmanned surveillance of disaster areas after fires, mudslides, and the like. This technology's applications will be wide and diverse.
A sensor network is a distributed system without centralized control. It requires miniaturized sensors, low-power components, and distributed radio network technology. The low power budgets and minimal hardware resources are a departure from conventional wireless and network technology.
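The departure from conventional designs can be illustrated with a small sketch. The following is a hypothetical, simplified simulation of the duty-cycled reporting a low-power sensor node might use: the radio stays off unless a reading changes enough to justify the energy cost of transmitting. The class, threshold, and readings are invented for illustration and do not describe any particular platform.

```python
# Hypothetical sketch of a duty-cycled sensor node: the radio stays off
# most of the time, and the node transmits only when a reading changes
# enough to be worth the energy cost. All names and numbers here are
# illustrative, not taken from any real platform.

class SensorNode:
    def __init__(self, threshold=0.5):
        self.threshold = threshold   # minimum change worth reporting
        self.last_sent = None        # last value actually transmitted
        self.tx_count = 0            # radio wake-ups (the costly events)

    def sample_and_report(self, reading):
        """Wake, sample, and transmit only on a significant change."""
        if self.last_sent is None or abs(reading - self.last_sent) >= self.threshold:
            self.last_sent = reading
            self.tx_count += 1       # radio on: expensive
            return True
        return False                 # radio stays off: cheap

node = SensorNode(threshold=0.5)
readings = [20.0, 20.1, 20.2, 21.0, 21.1, 22.5]
sent = [r for r in readings if node.sample_and_report(r)]
print(sent)           # readings that were actually transmitted
print(node.tx_count)  # number of radio wake-ups
```

Of six samples, only three trigger the radio; the rest cost almost nothing. Scaled to hundreds of nodes, this kind of event-driven economy is what lets a node survive for years on one battery.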
With this in mind, I invited the group led by David Culler at the University of California, Berkeley, to submit an article on Mica, a platform for deeply embedded sensor networks with very low power consumption. In August 2001, the group demonstrated a dynamically configured sensor network of 800 wireless photosensor nodes, each the size of a coin. That coin-sized area contains computing, communication, and sensor devices built from commercial off-the-shelf components. Power consumption is so low that the Mica platform can operate for a few years on a single battery. To minimize power usage, Mica's designers devised cross-layer optimization (working across network layers) and customized protocols for each application. Today, the group is working on a single-chip implementation of Mica and is considering new directions for future sensor networks.
Positioning is a fundamental technology for making computers understand the real world.5 Positioning systems include the Global Positioning System (GPS), which uses radio signals, and methods based on infrared, ultrasonic waves, geomagnetic signals, or inertial guidance. Each method has its merits and demerits, so each application should use a positioning method suited to the problem at hand. Designers should consider
• whether the system is applicable globally or only locally;
• whether it is usable indoors, outdoors, or both places;
• the desired accuracy of positioning;
• whether the application requires a position as an absolute coordinate with respect to a global origin or as a position relative to a local origin; and
• whether it is scalable to a wider area or not.
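As a small illustration of the absolute-versus-relative distinction above, the following sketch converts an absolute GPS coordinate into a position relative to a local origin, using an equirectangular approximation that is adequate over walking distances. The coordinates and function names are hypothetical examples, not taken from any system described in this issue.

```python
# Hedged sketch: converting an absolute GPS fix (latitude/longitude)
# into (east, north) meters relative to a local origin, via an
# equirectangular approximation. Good enough over short distances;
# the origin and test point below are made-up values.
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def to_local_xy(lat, lon, origin_lat, origin_lon):
    """Return (east, north) offsets in meters from a local origin."""
    dlat = math.radians(lat - origin_lat)
    dlon = math.radians(lon - origin_lon)
    # Longitude degrees shrink with latitude, hence the cosine factor.
    east = EARTH_RADIUS_M * dlon * math.cos(math.radians(origin_lat))
    north = EARTH_RADIUS_M * dlat
    return east, north

# A point roughly 100 m north-east of a made-up origin in Tokyo:
east, north = to_local_xy(35.6596, 139.7011, 35.6587, 139.7000)
print(round(east), round(north))
```

The approximation breaks down over large areas, which is exactly the scalability question in the list above: a method that works in one shopping mall may not extend city-wide without a different coordinate treatment.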
In this issue, I have invited Shigeru Shimada, Masaaki Tanizaki, and Kishiko Maruyama from Hitachi's Central Research Laboratory to describe the various problems their group encountered in building a service system around positioning, and how they solved them. The Shimada group has built a very advanced geographical information system using cell phones and has completed extensive field experiments. One of the two experiments is a walking assistant that shows you where you are and the route to a desired target; it uses GPS and a geomagnetic sensor. The other is a walking navigation system for underground subway stations and shopping malls, where GPS signals can't reach; IC tags inside raised tiles on the floor communicate with sensors embedded in a user's shoes. Both experiments show the importance of accurate positioning and the difficulty of achieving it.
Research in ubiquitous computing is also active in Europe. A cursory glance at European research gives the impression that it focuses on the introduction of computers into our living space to perform various services, and on much-needed improvements to the human-machine interface.
The Disappearing Computer Initiative is a large European project in this field (http://www.disappearing-computer.net). As the name suggests, it aims to incorporate computers into everyday objects and make them disappear. A Future and Emerging Technologies project within the Information Society Technologies Programme funded by the European Union, it began in 2001 and aims to invisibly incorporate computers into social infrastructure to accomplish the following:
• Create information artifacts based on new software and hardware architectures integrated into everyday objects.
• Look at how collections of artifacts can act together to produce new behavior and functionality.
• Investigate new approaches for designing collections of artifacts in everyday settings, and ensure that a person's experience in these new environments is coherent and engaging.
UbiComp, an international conference on ubiquitous computing, took place in Sweden this year, and the Disappearing Computer Initiative took center stage.
In this issue, I invited Peter Tandler and his Ambiente Group at the Fraunhofer Integrated Publication and Information Systems Institute to write about workspaces of the future, part of the initiative's research. Tandler's group embeds computers into the desks, chairs, and walls of rooms, then combines such computer-augmented objects to create a ubiquitous-computing environment in which the human inhabitants don't notice the computers' presence. For example, computers inside a wall display and chairs interact with each other, and desks with embedded displays can be lined up to form a large, combined display for group meetings.
The authors seem to be more interested in software architecture than the hardware alone. Their designs are aesthetically pleasing, reflecting a traditional European approach.
Open Development Platform
Today's computers come in many sizes and have many functions. Handheld information appliances and various home appliances will connect to computer networks in the future. In my view, rather than requiring new remote controls, handheld phones will evolve into devices that communicate with every intelligent object in a house or office. A short-term development will be the addition of wireless local-area network connections (based on IEEE 802.11) and noncontact smart card communication (based on ISO/IEC 14443). In these ways, handheld phones will support multimodal communication and become ubiquitous communicator devices.
From the viewpoint of developing such varied computer-controlled devices, efficient development and short time to market will be major concerns for manufacturers. Unlike PCs, these systems must deal with many different CPUs with, for example, different power-consumption characteristics, and each CPU platform requires its own real-time OS and middleware. Today's embedded-computing industry faces long development cycles, rising development costs, and system bugs arising from complex system requirements. These factors make improving the efficiency of software development a major goal.
With this background in mind, my colleagues and I have initiated the T-Engine project. T-Engine is an open, real-time system development platform for developing embedded devices with networking capability and other advanced features. The T-Engine approach standardizes hardware, a real-time operating system, object format, and other key components of embedded systems. This standardization makes it possible to build and distribute middleware products for use by developers. The availability of such middleware products should decrease the development time and cost of new embedded products.
Potential embedded-systems applications include controlling home appliances, placing orders based on sensitive private information, and conveying information that has monetary value, such as an electronic ticket or coupon. To support such applications, a security infrastructure that prevents eavesdropping, counterfeiting, and impersonation is essential. T-Engine takes network security seriously and uses the eTRON security architecture6 as the core of its security infrastructure.
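To illustrate one of these properties in isolation (this is a generic sketch, not the eTRON architecture), the following shows how a message authentication code lets a verifier reject counterfeited or altered data, such as a forged electronic coupon. The key and coupon format are invented examples.

```python
# Generic illustration of counterfeit detection with a keyed MAC.
# This is NOT the eTRON architecture; the key and coupon payload
# are made-up example values.
import hmac
import hashlib

SECRET_KEY = b"issuer-private-key"  # hypothetical shared secret

def issue_coupon(payload: bytes):
    """Issuer attaches an authentication tag to the coupon data."""
    tag = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return payload, tag

def verify_coupon(payload: bytes, tag: str) -> bool:
    """Verifier recomputes the tag; any alteration makes it mismatch."""
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)  # constant-time compare

coupon, tag = issue_coupon(b"coffee:1:2003-01-31")
print(verify_coupon(coupon, tag))                   # genuine coupon
print(verify_coupon(b"coffee:99:2003-01-31", tag))  # altered coupon is rejected
```

A real infrastructure must of course also address key distribution, eavesdropping (encryption), and impersonation (authentication of the parties), which is why an integrated architecture rather than a single primitive is needed.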
As my four-year tenure as editor in chief of IEEE Micro reaches its end, I believe that the impact of ubiquitous computing on the research community and on industry in general will be huge: similar to the PC's impact on microelectronics, and as large as the Internet's impact on society at large. I hope the articles in this special theme issue prompt you to pursue this research for the future.
I wish everybody a happy New Year.
Ken Sakamura, IEEE Micro's editor in chief, is a professor in the Interfaculty Initiative in Information Studies at the University of Tokyo. His primary interests lie in computer architecture, real-time processing, and computer-augmented environments. He initiated the TRON project in 1984 to help build the computers of the 1990s and beyond; under his leadership, more than 100 manufacturers participate in the project. He now guides the T-Engine project, which aims to provide an open-standard, real-time system development environment, and leads the YRP Ubiquitous Network Laboratory, which carries out research on ubiquitous-computing environments. Since he is also interested in how computer use will change society in the 21st century, his design activities extend to electronic appliances, furniture, houses, buildings, digital museums, and urban planning. Sakamura has a BS, ME, and PhD in electrical engineering from Keio University in Yokohama, Japan. He is an IEEE Fellow and a member of the ACM.