Issue No. 04, Fourth Quarter 2012 (vol. 3), pp. 410-423
Roddy Cowie , Queen's University Belfast, Belfast
This paper tries to achieve a balanced view of the ethical issues raised by emotion-oriented technology as it is, rather than as it might be imagined. A high proportion of applications seem ethically neutral. Uses in entertainment and allied areas do no great harm or good. Empowering professions may do either, but regulatory systems already exist. Ethically positive aspirations involve mitigating problems that already exist: supporting humans in emotion-related judgments, replacing technology that treats people in dehumanized and/or demeaning ways, and improving access for groups who struggle with existing interfaces. Emotion-oriented computing may also contribute to revaluing human faculties other than pure intellect. Many potential negatives apply to technology as a whole. Concerns specifically related to emotion involve creating a lie, by simulating emotions that the systems do not have, or promoting mechanistic conceptions of emotion. Intermediate issues arise where more general problems could be exacerbated: helping systems to sway human choices, or encouraging humans to choose virtual worlds rather than reality. "SIIF" systems (semi-intelligent information filters) are particularly problematic: they use simplified rules to make judgments about people that are complex and have potentially serious consequences. The picture is one of balances to recognize and negotiate, not uniform good or evil.
Human factors, ethics, emotion recognition, behavioral science, entertainment, affective computing, emotion
Roddy Cowie, "The Good Our Field Can Hope to Do, the Harm It Should Avoid," IEEE Transactions on Affective Computing, vol. 3, no. 4, pp. 410-423, Fourth Quarter 2012, doi:10.1109/T-AFFC.2012.40