The 10th annual IEEE Computer Society International Symposium on Wearable Computers took place on the shores of Lake Geneva, Switzerland, in October 2006. ISWC is the premier conference in wearable computing, featuring the latest in technical advances and fashions. Attendees came from both academia and industry, representing a broad spectrum of nationalities and technical interests, with more than 150 participants from 19 countries.
The papers, posters, and demonstrations at the conference focused on several themes. Three stood out:
• User interfaces and wearability are still open problems, even after more than a decade of wearable computing research.
• Evaluation continues to be emphasized—testing wearable computing ideas under realistic conditions and solving problems both theoretically and practically.
• Wearable computing isn't just about the form factor and computational capabilities of the devices we wear but also about sensing where we are and what we're doing when we wear them.
The conference had sessions on activity recognition, location systems, interface evaluation, input devices and sensors, and wearability.
Activity Recognition
A major part of context awareness is activity recognition—knowing what the user is doing. By recognizing the user's activity, a wearable computing device can adapt its interface and the information it presents to the user's needs. For example, if the user is jogging, audio might be a better way to provide notifications, but if the user is sitting down, text might be more appropriate.
One problem in activity recognition is manually labeling training data so that we can use machine learning algorithms to determine when someone is performing an activity. To implement the algorithm, we must decide which sensor characteristics are important for distinguishing activities. In "Discovering Characteristic Actions from On-Body Sensor Data," one of four nominees for best paper, David Minnen and his colleagues get around this problem by automatically identifying the training-data components that characterize an activity. Their technique searches for motifs, sets of similar but statistically unlikely subsequences in the stream of raw sensor data. They use these motifs to discern the important actions that make up an activity to be recognized. The authors used a glove-mounted accelerometer and gyroscope to discover the characteristic actions for six dumbbell exercises.
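To make the idea concrete, here is a toy sketch of motif discovery (the actual algorithm of Minnen and his colleagues is more sophisticated and statistically grounded): a brute-force search for the pair of non-overlapping, z-normalized subsequences that are most similar, applied to a synthetic one-axis accelerometer stream with the same "exercise" bump embedded twice. All names, lengths, and parameters below are illustrative assumptions.

```python
import math
import random

def znorm(window):
    """Z-normalize a subsequence so similar shapes match
    regardless of offset and amplitude."""
    m = sum(window) / len(window)
    s = math.sqrt(sum((x - m) ** 2 for x in window) / len(window)) + 1e-9
    return [(x - m) / s for x in window]

def find_motif(signal, w):
    """Brute-force motif search: return the pair of non-overlapping
    length-w subsequences with the smallest Euclidean distance."""
    n = len(signal) - w + 1
    wins = [znorm(signal[i:i + w]) for i in range(n)]
    best = (float("inf"), -1, -1)
    for i in range(n):
        for j in range(i + w, n):          # enforce non-overlap
            dist = math.dist(wins[i], wins[j])
            if dist < best[0]:
                best = (dist, i, j)
    return best

# Synthetic sensor stream: noise with the same "curl" bump at t=30 and t=140.
random.seed(0)
stream = [random.gauss(0.0, 0.1) for _ in range(200)]
bump = [math.sin(math.pi * k / 19) for k in range(20)]
for start in (30, 140):
    for k, v in enumerate(bump):
        stream[start + k] += v

d, i, j = find_motif(stream, 20)
print(f"motif at {i} and {j} (distance {d:.2f})")  # should recover the two bumps
```

The quadratic search is fine for a sketch but not for real sensor streams; the point is only that repeated, similar subsequences stand out against background motion without any manual labels.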
Another paper in this session, "Towards Less Supervision in Activity Recognition from Wearable Sensors" by Tam Huynh and Bernt Schiele, reported on an attempt to improve the trade-off between training effort and recognition rate by combining multiple eigenspaces with support vector machines. The resulting hybrid reduces the problem of incorrectly recognizing similar activities (which multiple eigenspaces typically experience), using only a small amount of manual labeling of the training data.
Thomas Stiefmeier and his colleagues presented a system for recognizing activities while a user works on a bicycle. The system consisted of a set of inertial sensors and ultrasonic transmitters worn by the user and a set of ultrasonic detectors placed in the room. The system recognized 21 activities associated with working on a bicycle, such as turning the pedals while oiling the chain, turning the pedals while changing gears, and pumping the tires. The overall recognition rates were low because recognizing similar motions in a realistic setting is a difficult problem. But one significant result is that the method provided nearly the same accuracy for users who weren't in the training set as it did for those who were. This is important if activity recognition is to work off the shelf, without requiring individuals to go through training before using the system.
Location Systems
A second aspect of context awareness is the user's location. Outdoors, we can use GPS to determine our location, but GPS doesn't work indoors or in cluttered environments, so other approaches are needed. One alternative is dead reckoning: keeping track of movement distances and directions. In "User Localization Using Wearable Electromagnetic Tracker and Orientation Sensor," Akihiro Hamaguchi and his colleagues described a dead-reckoning system that uses an inertial sensor and a body-worn electromagnetic tracker to estimate the user's orientation and step length. One problem with dead reckoning is that errors in individual distance or direction measurements accumulate over time into large position errors. The paper presented a correction based on the height of the user's rear heel as it lifts off the ground while walking. Overall errors were less than 10 percent in experiments in which the user walked a couple of hundred meters and climbed seven flights of stairs.
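The core of any dead-reckoning tracker is a simple position update that integrates per-step heading and length estimates, which is also why per-step errors compound. A minimal sketch (the sensor fusion that produces the heading and step length in the authors' system is not modeled here; headings and step lengths are simply given):

```python
import math

def dead_reckon(start_xy, steps):
    """Integrate (heading_deg, step_length_m) pairs into a 2-D track.
    Heading is measured counterclockwise from the x axis."""
    x, y = start_xy
    track = [(x, y)]
    for heading_deg, length_m in steps:
        h = math.radians(heading_deg)
        x += length_m * math.cos(h)
        y += length_m * math.sin(h)
        track.append((x, y))
    return track

# Four 0.5 m steps east, then four north: the walker should end near (2, 2).
steps = [(0.0, 0.5)] * 4 + [(90.0, 0.5)] * 4
print(dead_reckon((0.0, 0.0), steps)[-1])

# The same walk with a constant 3-degree heading bias: the endpoint drifts,
# illustrating how small per-step sensor errors accumulate into position error.
biased = [(h + 3.0, l) for h, l in steps]
print(dead_reckon((0.0, 0.0), biased)[-1])
```

Even a small constant heading bias shifts the endpoint noticeably after a few meters, which is why corrections such as the heel-height adjustment matter over walks of hundreds of meters.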
Interface Evaluation
Evaluating user interfaces for wearable systems is of course a major area of interest. Typical desktop interfaces such as keyboards and mice aren't suitable. Ideally, wearable input devices will collect data accurately but require little effort while the user is on the go. In "Evaluation of an Eyes-Free Cursorless Numeric Entry System for Wearable Computers," another best-paper nominee, Gabor Blasko and Steven Feiner presented the results of a user study of cursorless user interfaces for wrist-mounted computers. The interfaces were passive touch-sensitive surfaces that provided tactile landmarks to the user's fingers and didn't require the user to look at them. Users could keep their visual attention on a task, such as maintaining eye contact during a conversation or keeping an item in focus while working on it. The user trials evaluated several variants of the touch interface as well as whether visual feedback affected usability. The results showed that users had about the same speed and accuracy with the interface whether or not they had visual feedback.
In "Evaluation of Four Wearable Computer Pointing Devices for Drag and Drop Tasks When Stationary and Walking," Joanne Zucco and her colleagues presented the results of two experiments involving four commercially available input devices: a trackball, a touchpad, a gyroscopic mouse, and a Twiddler2 mouse. The experiments had users perform drag-and-drop tasks while wearing a head-mounted display and evaluated the time to complete a task, the task error rate, and the users' impressions about ease of use. In the first experiment, the users were stationary; in the second, they were walking. Results showed that the effectiveness of the input device depends on whether the user will be stationary or moving; the gyroscopic mouse was the most effective device for stationary tasks but the least effective for walking tasks.
Input Devices and Sensors
This session examined devices and sensors for monitoring the user's environment and body. Alvaro Cassinelli presented "Augmenting Spatial Awareness with Haptic Radar," which described a head-mounted device that provides vibrotactile stimuli about nearby obstacles. Haptic Radar consists of several infrared proximity sensors mounted in a headband along with mobile-phone vibrator motors; the motors provide stronger vibrations for closer objects. In the user study, a blindfolded subject wore the Haptic Radar either turned on or turned off. The experimenter told each subject to avoid an object that would approach from behind, then swung a small foam ball on a wand toward the subject, as shown in figure 1. Subjects moved in response to the object in 26 out of 30 trials, but they were able to avoid it completely in only 18 out of 30 trials.
Figure 1. Subject avoiding an unseen object while wearing the Haptic Radar device.
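The device's core mapping—the closer the obstacle, the stronger the vibration—can be sketched as a linear proximity-to-drive function. The sensing range and PWM scale below are illustrative assumptions, not figures from the paper:

```python
def vibration_level(distance_cm, max_range_cm=80.0, pwm_max=255):
    """Map an IR proximity reading to a vibrator-motor drive value:
    full strength at contact, fading linearly to zero at max range."""
    if distance_cm >= max_range_cm:
        return 0                      # obstacle beyond sensing range
    closeness = 1.0 - distance_cm / max_range_cm
    return int(round(pwm_max * closeness))

for d in (80, 40, 10, 0):
    print(d, "cm ->", vibration_level(d))
```

A real headband would run one such mapping per sensor, so the direction of the approaching obstacle is conveyed by which motor vibrates as well as how strongly.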
Wearability
For wearable computing to be successful, devices must not interfere with the motions required for the user's normal daily routine and should be invisible to those around the user. Wearability denotes how well the human body can support a device. In "Assessing the Wearability of Wearable Computers," the third nominee for best paper, James Knight proposed a methodology for assessing a wearable computer's impact on users in terms of energy expenditure, biomechanical effects on posture and movement, and comfort.
Electronic textiles, an emerging technology that combines fabric with electronics to create "smart" cloth, will enable wearable computers to be truly wearable. Users will perceive them as clothing rather than computers, leading to greater compliance in medical and industrial applications and greater acceptability in consumer applications. The best-paper winner, "A Construction Kit for Electronic Textiles" by Leah Buechley, presents a set of devices and techniques intended to introduce novices to e-textiles; the devices included a microcontroller, a temperature sensor, a vibrator motor, and LEDs. The devices are built using an iron-on technique to construct printed circuit boards out of fabric, which are then stitched together using conductive thread. The paper also presents several e-textile applications, including shirts that communicate via infrared, a temperature-sensing hat, and wearable LED displays (a bracelet and a shirt, shown in figure 2). Buechley demonstrated several of these devices during her presentation.
Figure 2. An e-textile display shirt with LED sequins.
ISWC has always been known as a venue for hands-on, experimental computing research. The program this year included a demonstration session that gave participants a chance to experience firsthand many of the artifacts from the papers and posters. As always, the conference closed with the Gadget Show, where attendees with a cool gadget got one minute each in front of the whole audience for show-and-tell.
This overview covers only a small portion of the 17 papers and 18 posters presented at the conference. The full set of papers will eventually be available from the IEEE Xplore digital library (http://ieeexplore.ieee.org). We encourage you to participate in ISWC 2007, which will be held in Boston in October. Details will be available at www.iswc.net.
is an associate professor in the Bradley Department of Electrical and Computer Engineering at Virginia Tech. Contact him at email@example.com.
is a member of the Cambridge Advanced Technology Group in Digital Health at Intel. Contact her at firstname.lastname@example.org.