3.2.1 Affective Computing

As mentioned above, the idea that human rational thinking depends on emotional processing was picked up by the artificial intelligence field. Picard wrote a groundbreaking book named Affective Computing that has had a major effect on both the AI and HCI fields. Her idea, in short, was that it should be possible to create machines that relate to, arise from, or deliberately influence emotion or other affective phenomena. The roots of affective computing lie in neurology, medicine, and psychology. It implements a biologistic perspective on emotion processes in the brain, in the body, and in interaction with others and with machines.
The most discussed and widespread approach in the design of affective computing applications is to construct an individual cognitive model of affect from first principles and implement it in a system that attempts to recognize users' emotional states by measuring the signs and signals we emit in face, body, voice, skin, or in what we say, related to the emotional processes going on inside. Emotions, or affect, are seen as identifiable states. Based on the recognized emotional state of the user, the aim is to achieve an interaction that is as lifelike or human-like as possible, seamlessly adapting to the user's emotional state and influencing it through the use of various affective expressions. This model has its limitations, both in its requirement to simplify human emotion in order to model it, and in the difficulty of inferring end-users' emotional states by interpreting their signs and signals. That said, it still provides a very interesting way of exploring intelligence, both in machines and in people.
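The "emotions as identifiable states" assumption can be caricatured in a few lines of code. The following is a minimal, hypothetical sketch, not any actual affective computing system: it assumes affect collapses into a few discrete labels recoverable from measured signals, and all feature names and prototype values are invented for illustration.

```python
# Hypothetical sketch of the "emotions as identifiable states" model:
# classify a user's affective state from measured bodily signals by
# nearest-centroid matching. All names and numbers are invented.
import math

# Invented prototypes: (heart rate in bpm, skin conductance in microsiemens)
STATE_PROTOTYPES = {
    "calm":       (65.0, 2.0),
    "frustrated": (85.0, 8.0),
    "excited":    (95.0, 6.0),
}

def classify_affect(heart_rate: float, skin_conductance: float) -> str:
    """Return the predefined state whose prototype is closest to the signals."""
    def distance(proto):
        hr, sc = proto
        return math.hypot(heart_rate - hr, skin_conductance - sc)
    return min(STATE_PROTOTYPES, key=lambda s: distance(STATE_PROTOTYPES[s]))

print(classify_affect(88.0, 7.5))  # closest to the "frustrated" prototype
```

The simplification the text criticizes is visible directly in the sketch: whatever the user actually feels must be forced into one of a handful of predefined labels.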
Examples of affective computing systems directed at the learning field include Kort et al.'s work on affective learning. It is well known that students' results can be improved with the right encouragement and support [29]. Kort et al. propose an emotion model, built on Russell's circumplex model of affect, that relates phases of learning to emotions [42]. The idea is to build a learning companion that keeps track of what emotional state the student is in and, from that, decides what help she needs.
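The spirit of such a learning companion can be sketched as a simple quadrant lookup. This is a hypothetical illustration loosely inspired by the circumplex idea of crossing two affect-related axes, not Kort et al.'s actual model: the axis names, quadrant labels, and interventions are all invented.

```python
# Hypothetical sketch of a learning companion: place the student in a
# quadrant of a two-axis affect space and pick a tutoring response.
# Axes, labels, and interventions are invented for illustration.

def choose_support(valence: float, constructive: float) -> str:
    """valence: -1 (negative affect) .. 1 (positive affect);
    constructive: -1 (confused/un-learning) .. 1 (constructive learning)."""
    if valence >= 0 and constructive >= 0:
        return "encourage exploration"    # curious and making progress
    if valence < 0 and constructive >= 0:
        return "acknowledge effort"       # puzzled but working through it
    if valence < 0 and constructive < 0:
        return "offer a worked example"   # frustrated and stuck
    return "prompt reflection"            # pleased, consolidating knowledge

print(choose_support(-0.6, -0.3))  # frustrated and stuck
```

Even this toy version shows the design stance of the approach: the system, not the student, decides which state she is in and what help follows from it.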
Another application in the learning area from Picard's group is a Leap chair with pressure sensors [39]. The chair classifies nine postures a student can adopt. The postures are related to affective states associated with a student's interest level. Similar to the other learning system, this system also proactively decides what the learner needs.
3.2.2 Hedonistic Usability

Hassenzahl has picked up on the usability tradition and aims to add what he names "hedonistic usability" criteria to usability criteria, methods, and design requirements [17]. His position is that apart from pragmatic qualities of interaction, such as being able to make a phone call, write a paper, or set up a Web page, users also look for hedonic qualities:
"hedonic quality refers to the product's perceived ability to support the achievement of 'be-goals,' such as 'being competent,' 'being related to others,' 'being special'" [17].
By formulating and making such goals explicit in a design process, the system may address user needs other than those related only to the system's functionality. However, the main bulk of work in this strand is directed at usability evaluation methods for already designed systems, including evaluation of such hedonic qualities. Experiences of interaction are typically broken down into a set of Likert-scale questions where users grade software along dimensions such as competence, autonomy, or relatedness. The ultimate goal is always to design for a positive product experience, not for expressive power in both negative and positive dimensions.
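The evaluation style described above reduces experience to numbers in a straightforward way. A minimal sketch, with invented data and assuming 1-7 Likert items grouped under the dimensions the text mentions (not any particular published questionnaire):

```python
# Hypothetical sketch of a hedonic-usability evaluation: each participant
# rates the product on Likert items grouped into dimensions, and each
# dimension is summarized by its mean. Ratings here are invented.
from statistics import mean

responses = [  # one dict per participant, 1-7 Likert ratings
    {"competence": 6, "autonomy": 5, "relatedness": 4},
    {"competence": 5, "autonomy": 6, "relatedness": 3},
    {"competence": 7, "autonomy": 4, "relatedness": 5},
]

def dimension_scores(data):
    """Average each dimension across all participants."""
    dims = data[0].keys()
    return {d: mean(p[d] for p in data) for d in dims}

print(dimension_scores(responses))
```

The limitation the text points to follows from the arithmetic: averaging graded scales can only rank experiences as more or less positive; it has no vocabulary for the negative or ambiguous experiences a design might deliberately evoke.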
3.2.3 The Interactional Approach

An interactional view sees emotions as constructed in interaction, where the system supports people in understanding and experiencing their own emotions [5], [23]. An interactional perspective on design will not aim to detect a singular account of the "right" or "true" emotion of the user and tell them about it, but rather make emotional experiences available for reflection. That is, it aims to create a representation that incorporates people's everyday experiences and that they can later reflect on. Users' own, richer interpretations guarantee a more "true" account of what they are experiencing. In short, an interactional approach:
1. recognizes affect as a social, bodily, and cultural product,
2. relies on and supports interpretive flexibility,
3. is nonreductionist,
4. supports an expanded range of communication acts,
5. focuses on people using systems to experience and understand emotions, and
6. focuses on designing systems that stimulate reflection on and awareness of affect.
Affector, eMoto, and Affective Diary, discussed in previous sections, are all examples of design from an interactional perspective on emotion. An interactional approach to design tries to avoid reducing human experience to a set of measurements or inferences made by the system to interpret users' emotional states. While interaction with the system should not be awkward, the experiences sought might not only be positive ones. eMoto may allow you to express negative feelings about others. Affector may communicate your negative mood. Affective Diary might make negative patterns in your own behavior painfully visible to you. An interactional approach is interested in the full range of human experience possible in the world [36].
4.2.1 Ergonomics

In ergonomics (preceding HCI [14]), the actual physical body is the core focus. The body has been measured and designed for in spaces such as airplane cockpits, cars, or nuclear plant control rooms. As pointed out by Harper et al. [16], the perspective taken is one where humans are seen as part of a machine. The pilots, car drivers, and factory workers are part of a larger machinery. They must be trained to follow certain routines automatically, as if they were one part of the machine. The machinery must be fine-tuned so that human error is minimized, and this can only be done by designing the machinery to fit meticulous measurements of our physical capacity. In those situations, we actually do want to see our bodies as machines, able to follow routines and act in error-free ways on the spur of the moment [16]. But just as we can take different perspectives on emotion processes, we can take different perspectives on the purpose and experience of using our bodies. It may be that when we drive a car, we want to be part of the car's machinery, but on another level, beyond the mechanistic routine tasks we can make our bodies perform, driving a car is also, on and off, a corporeal experience: sometimes dull, sometimes pleasurable, or even exhilarating. In those situations, we may want to see ourselves as something other than machines built in wetware.
In ergonomics, and when we address usability in HCI, we mostly assume the body to be passive: the interface sends signals to the human body, and the passive body receives them. But as Merleau-Ponty [37] argued so successfully, the body actively gives form and sense to its own component parts and to its relations with objects in the world; the body is not passive.
4.2.2 Cyborgs

Another position sometimes taken in HCI is that of cyborgs. A cyborg consists of both artificial and natural systems or, to phrase it differently, of both a human body and designed tools that extend our capacity. In its simplest form, the extension can be the stick that a blind man uses to find his way. The stick becomes a part of how he feels the world, an embodied part of his own body. But framing tools as part of our cyborg existence goes beyond this one-way extension of our bodies. The cyborg concept comes with various ethical and moral implications when we consider how the technical tools we extend our bodies with in turn speak back to us. One positive side of being a cyborg is, in some sense, that we can free ourselves from our bodies, as discussed by the feminist Donna Haraway in her cyborg feminist writings [15]. In a sense, the focus in this movement is on extending the mind, freeing us from our corporeal reality.
4.2.3 Reuniting Virtual and Real

While this bodyless cyborg existence on the Internet was much discussed at the beginning of the virtual reality era, the pendulum has now swung back, and most people regard it as bad behavior not to connect your real identity to your virtual identity. In addition, more and more technologies are tying reality and virtuality more strongly together, bringing our physical selves into the virtual spaces. For example, in the computer games area, we have new interaction devices, such as the Wii, fake guitars in Guitar Hero, or mobiles, connecting more strongly with our physical selves. A new games field is that of pervasive games: games that are played in town, using technologies such as RFID tags, mobiles, GPS, or Bluetooth to exploit the real world and bystanders as part of the game world [26]. The currently best-known virtual world, Second Life, is playfully connected to the real world in various ways, for example, mirroring various real-world institutions in virtual ones.
But this drive to unite the real and the virtual world concerns not only games and virtual worlds, but also, for example, communication tools. There are mobile communication tools that add contextual information on position or on who else is around [18].
4.2.4 Third Wave

As mentioned above, in the "third wave" of HCI, we try to figure out how to design for experiences beyond those of task completion, efficiency, and tool-based perspectives. This includes designing for bodily experiences. So far, when it comes to involving bodies and designing for bodily experiences, the focus has mainly been on sports and games (e.g., from early work [25] to current [44]). The aim is to design for experiential qualities such as flow, immersion, or "game play." But there is also a growing body of designs aimed at other experiences. One example is Moen's Body Bug: a wire that you wrap around your body, along which a "bug" registers your actions and climbs up and down [38]. The bug is a simple robot moving along the wire. When you strap the wire around your body and start making movements, the bug moves along the wire, in a sense mirroring your movements; see Fig. 6. The bug makes you want to "dance." The sought experiential quality is that of enjoying your own body movement, as we do when we dance.
Using movement and the body in interaction can lead to a whole range of experiential qualities, such as affective loops [22] or supple interaction [24]. The system eMoto exemplifies an affective loop experience: by performing motions that resonate with aspects of those involved in an emotional experience, users are themselves affected by the interaction. But we can also imagine qualities such as mindfulness or the simple joy of movement, as in Moen's work. To reach designs in which such qualities arise, designers and researchers have repeatedly reported that we, as designers, need to experience our own bodies in the design process [19]. This, in turn, requires new methods in the design process.
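The core of an affective loop can be sketched as a mapping from expressive qualities of a gesture to the expression the system shows back, which in turn colors how the user moves next. The following is a hypothetical illustration only in the spirit of eMoto, not its actual design: the thresholds, axis interpretations, and expression names are invented.

```python
# Hypothetical sketch of an affective-loop mapping: expressive qualities
# of a gesture (how fast, how hard) steer the expression returned to the
# user. Thresholds and expression names are invented for illustration.

def expression_for_gesture(speed: float, pressure: float) -> str:
    """speed and pressure are normalized to 0..1."""
    energetic = speed > 0.5      # arousal-like quality of the movement
    intense = pressure > 0.5     # tension-like quality of the movement
    if energetic and intense:
        return "sharp red animation"       # e.g., anger-like expression
    if energetic:
        return "bouncy yellow animation"   # e.g., excited expression
    if intense:
        return "slow deep-blue animation"  # e.g., gloomy expression
    return "calm green animation"          # e.g., content expression

print(expression_for_gesture(0.8, 0.2))
```

Note what the sketch does not do: it never labels the user's emotion. It only reflects qualities of the movement back in expressive form, leaving the interpretation to the user, which is the point of the loop.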
• The author is with the Mobile Life Centre, Department of Computer and Systems Sciences, Stockholm University, Forum 1000, SE-164 40 Kista, Sweden. E-mail: firstname.lastname@example.org.
Manuscript received 28 Nov. 2008; accepted 29 Dec. 2008; published online 12 Jan. 2009.
For information on obtaining reprints of this article, please send e-mail to: email@example.com, and reference IEEECS Log Number TLT-2008-11-0102.
Digital Object Identifier no. 10.1109/TLT.2009.3.
1. While some might claim that using a keyboard and mouse will also involve muscular movement, all of these systems relate to noninstrumental, nonsymbolic gestures and movements, related to emotional expressions [ 46], which is different from the instrumental, goal-oriented movements we perform when typing on a keyboard.
3. See, e.g., workshops on affect and learning at ITS 2008 and AIED 2007.
Kristina Höök received the PhD degree in 1996. She became chair of human-computer interaction at Stockholm University in 2003. She has been employed at the Swedish Institute of Computer Science since 1990. Currently, she is working as a professor at Stockholm University and as head of the Mobile Life Centre. She has published more than 50 articles in well-renowned journals and conferences. She is known for her work on social navigation, mobile services, and affective interaction.