IEEE Internet Computing, vol. 10, no. 2, March/April 2006
Published by the IEEE Computer Society
Johannes Gehrke , Cornell University
Ling Liu , Georgia Institute of Technology
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/MIC.2006.31
Wireless sensor network technologies let us view and access large parts of the physical world from cyberspace via sensor monitoring. The three theme features in this issue address some of the challenges inherent to developing and fielding wireless sensor networks.
Moore's law holds that the number of transistors in an integrated circuit doubles roughly every 18 months. In a 1997 speech, Gordon Moore illustrated this growth with a real-world comparison: E.O. Wilson, an expert on ants, estimated the global ant population at roughly 10^17, approximately the same as the number of transistors produced that year alone (see www.intel.com/pressroom/archive/speeches/gem93097.htm). The analogy captures the growing pervasiveness of small-scale computing devices throughout the physical world, a trend for which the wireless sensor network community is a driving force.
Novel instruments often motivate new scientific discoveries. Using the Arecibo Telescope, the world's largest single-aperture telescope (its dish is 1,000 feet in diameter), Russell Hulse and Joseph Taylor discovered the first pulsar in a binary system; the gradual decay of its orbit later provided the first evidence of gravitational radiation, as predicted by Einstein's general theory of relativity. Similarly, wireless sensor networks enable a paradigm shift in the science of monitoring, whether of habitats and ecologies, buildings, soil moisture, equipment vibration, patients' health, or myriad other subjects. Because we can deploy them in large numbers directly where experiments are taking place, they can significantly improve both the accuracy and the density of scientific measurements of physical phenomena.
Let's briefly review some of the technology behind wireless sensor nodes, taking as an example the TelosB mote platform developed at the University of California, Berkeley. Like most wireless sensor network nodes, the TelosB requires a processor, storage, sensors, a radio, and an energy source. It has a Texas Instruments MSP430 processor running at 8 MHz with 10 Kbytes of RAM; the processor draws only 2 milliamperes of current in active mode and 1 microampere in sleep mode. The node also has 1 Mbyte of external flash memory for data storage, is powered by two AA batteries, and uses the CC2420 IEEE 802.15.4 (ZigBee) wideband radio, which has a maximum data rate of 250 kbits per second. Wideband radios have replaced the narrowband radios of first-generation motes (such as the RFM TR1000) because they're more resilient to noise; the trade-off is that they expose only a packet interface to the processor, which limits fine-grained control over the radio. Additionally, the TelosB platform has two optional integrated sensors, for temperature and humidity, as well as expansion connectors that can control other sensors and peripherals.
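To get a feel for these numbers, a back-of-the-envelope calculation (a sketch using only the nominal figures above; real goodput is far lower once protocol overhead and retransmissions are accounted for) shows how long the radio would need to drain the mote's entire flash:

```python
# Rough sketch: ideal time to transmit the TelosB's 1-Mbyte external
# flash contents over the CC2420 radio at its nominal 250-kbit/s rate.
# Real-world throughput is much lower after MAC and protocol overhead.

FLASH_BYTES = 1 * 1024 * 1024   # 1 Mbyte of external flash
RADIO_BPS = 250_000             # 250 kbits per second, nominal

seconds = FLASH_BYTES * 8 / RADIO_BPS
print(f"Ideal transfer time: {seconds:.1f} s")  # roughly half a minute
```

Even under these idealized assumptions the transfer takes tens of seconds, which hints at why moving bulk data off a mote is expensive.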
In addition to hardware, wireless sensor nodes rely on specialized software infrastructure. The TelosB platform, for example, runs TinyOS, an event-driven operating system also developed at UC Berkeley. A TinyOS application consists of software components that communicate through events; developers build applications by assembling and wiring together a suitable subset of components. Typical sensor networks comprise tens, if not hundreds, of nodes, including special gateway nodes that connect them to external networks such as the Internet. Communication regularly occurs over multiple hops, and because link quality is frequently poor, reliably collecting data at the gateway node is a significant problem.
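TinyOS components are actually written in the nesC dialect of C; the following Python sketch is only an illustration of the wiring idea, with hypothetical component names: components expose handlers, signal events, and an application is assembled by connecting one component's event to another's handler.

```python
# Illustrative sketch (not TinyOS/nesC): an event-driven component
# model where an application is built by wiring components together
# and control flows through signalled events.

class Timer:
    """Signals a 'fired' event on each tick; a real timer would be
    driven by a hardware interrupt."""
    def __init__(self):
        self.fired = None                 # handler wired in by the app

    def tick(self):
        if self.fired:
            self.fired()

class SenseAndSend:
    """Toy application component: on each timer event, read a sensor
    and hand the sample to a send function."""
    def __init__(self, read_sensor, send):
        self.read_sensor = read_sensor
        self.send = send

    def on_timer(self):
        self.send(self.read_sensor())

# "Wiring": connect the timer's event to the application's handler.
outbox = []
app = SenseAndSend(read_sensor=lambda: 21.5, send=outbox.append)
timer = Timer()
timer.fired = app.on_timer

timer.tick()       # simulate one timer interrupt
print(outbox)      # → [21.5]
```

The appeal of this style on a mote is that no component blocks waiting for anything; the processor can sleep between events, which matters given the power budget discussed above.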
Researchers have done much work on data-dissemination algorithms that cope with communication-channel volatility under strict resource constraints (see the first two articles in this special issue). For a sensor network to achieve a lifetime of several years, for example, the radio's duty cycle (the fraction of total time the radio is powered on) must typically be less than 1 percent.
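A rough lifetime estimate shows why such low duty cycles matter. The sketch below uses the TelosB current draws quoted earlier plus an assumed 2,200-mAh capacity for two AA cells (an assumption, not a figure from the text); radio current and battery self-discharge, ignored here, substantially reduce lifetime in real deployments.

```python
# Back-of-the-envelope node lifetime at a 1 percent duty cycle, using
# the processor currents from the text (2 mA active, 1 uA asleep) and
# an ASSUMED 2200 mAh capacity for two AA cells. Radio current and
# battery self-discharge are ignored, so this is an upper bound.

ACTIVE_MA = 2.0      # active-mode current draw, mA
SLEEP_MA = 0.001     # sleep-mode current draw, mA (1 microampere)
DUTY = 0.01          # node is active 1 percent of the time
CAPACITY_MAH = 2200  # assumed capacity of two AA batteries

avg_ma = DUTY * ACTIVE_MA + (1 - DUTY) * SLEEP_MA
years = CAPACITY_MAH / avg_ma / (24 * 365)
print(f"Average draw: {avg_ma:.4f} mA, lifetime ~{years:.1f} years")
```

The average draw works out to about 0.021 mA, giving a lifetime on the order of a decade under these idealized assumptions; raise the duty cycle to 10 percent and the bound drops by nearly an order of magnitude.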
Wireless sensor networks are a continuously emerging frontier in Internet computing. Designing applications for them is challenging because of their large scale, their communication volatility, and the power constraints at each node. Consequently, existing systems that collect and aggregate data in sensor networks follow very different architectures than traditional distributed data-centric applications.
In this Issue
The three theme features in this issue of IEEE Internet Computing address some of these challenges.
In "Deploying a Wireless Sensor Network on an Active Volcano," Geoffrey Werner-Allen and his colleagues describe their experience fielding a sensor network on the Reventador volcano in Ecuador, on the western edge of the Amazon basin. To cope with the network's resource constraints, including the radios' low data rate and the nodes' small RAM, they use a two-stage approach: during seismic events, each node writes its data into flash memory; a laptop attached to a base station then collects the buffered data. Because the wireless channel is so constrained, one minute of collected event data can take several minutes to transmit; the authors' local buffering of event data addresses this mismatch.
"Monitoring Civil Structures with a Wireless Sensor Network," by Krishna Chintalapudi and colleagues, describes an application that monitors the structural health of large structures such as office buildings. The article lays out both the advantages of wireless sensor networks (their deployment took roughly half an hour to set up per experiment, versus several days for the legacy wired equipment) and the resulting challenges (such as transmission links with a 37.6 percent packet-reception rate). Given the growing interest in data-driven monitoring applications, Chintalapudi and colleagues predict that the rapidly advancing technology of wireless sensor networks will completely replace wired deployments for structural testing in the future.
Finally, in "Kansei: A High-Fidelity Sensing Testbed," Anish Arora and his colleagues describe a sensor-network testbed for the research community that pushes the scalability envelope with 210 nodes. The article highlights the Kansei architecture, the software infrastructure's support for experiments, and the testbed's environment for data generation. The authors also recount their experience deploying a network of more than 1,000 nodes.
Connecting the digital and physical worlds is one of today's most important challenges. Sensor networks are set to play a pivotal role in making this connection, and we're convinced that we will see much innovation with high commercial impact from this community over the next decade.
Johannes Gehrke is an associate professor in the Department of Computer Science at Cornell University and the technical director of data-intensive computing at the Cornell Theory Center. He has a PhD in computer science from the University of Wisconsin. Gehrke is a member of the ACM SIGKDD Curriculum Committee. Contact him at email@example.com.
Ling Liu is an associate professor at the College of Computing at the Georgia Institute of Technology. She has a PhD in computer science from Tilburg University in the Netherlands. Liu is a co-general chair of ICDE 2007, co-PC chair of ICDE 2006, and vice chair of the ICDCS 2006 Internet Computing Systems track. Contact her at firstname.lastname@example.org.