Issue No. 04 - Fourth Quarter (2012, vol. 3)
pp. 388-393
Mark Coeckelbergh , University of Twente, Enschede
ABSTRACT
A common objection to the use and development of "emotional" robots is that they are deceptive. This intuitive response assumes 1) that these robots intend to deceive, 2) that their emotions are not real, and 3) that they pretend to be a kind of entity they are not. We use these criteria to judge whether an entity is deceptive in emotional communication (good intention, emotional authenticity, and ontological authenticity). They can also be regarded as conditions of "ideal emotional communication" that operate saliently as presuppositions in our communications with other entities. While the good-intention presupposition might be a bias or illusion we really need for sustaining social life, in the future we may want to dispense with the other conditions in order to facilitate cross-entity communication. What we need instead are not "authentic" but appropriate emotional responses—appropriate to relevant social contexts. Criteria for this cannot be given a priori but must be learned—by humans and by robots. In the future, we may learn to live with "emotional" robots, especially if our values change. However, contemporary robot designers who want their robots to receive trust from humans had better take into account current concerns about deception and create robots that do not evoke the threefold deception response.
INDEX TERMS
Robots, speech recognition, ethics, human factors, senior citizens, emotion recognition, context awareness, authenticity, ethics of robotics, emotions, deception, ideal speech conditions
CITATION
Mark Coeckelbergh, "Are Emotional Robots Deceptive?", IEEE Transactions on Affective Computing, vol. 3, no. 4, pp. 388-393, Fourth Quarter 2012, doi:10.1109/T-AFFC.2011.29
REFERENCES
[1] M. Coeckelbergh, "Moral Appearances: Emotions, Robots, and Human Morality," Ethics and Information Technology, vol. 12, no. 3, pp. 235-241, http://www.springerlink.com/content103461/, 2010.
[2] J. Habermas, Moral Consciousness and Communicative Action, T.C. Lenhardt and S.W. Nicholsen, eds. MIT Press, 1983.
[3] R.W. Picard, "Affective Computing," MIT Technical Report No. 321, 1995.
[4] R.W. Picard, Affective Computing. MIT Press, 1997.
[5] R. Sparrow and L. Sparrow, "In the Hands of Machines? The Future of Aged Care," Minds & Machines, vol. 16, pp. 141-161, 2006.
[6] F.J. Varela, Ethical Know-How: Action, Wisdom, and Cognition. Stanford Univ. Press, 1999.
[7] A.R. Wagner and R.C. Arkin, "Acting Deceptively: Providing Robots with the Capacity for Deception," Int'l J. Social Robotics, vol. 3, no. 1, pp. 5-26, 2011.
[8] W. Wallach and C. Allen, Moral Machines: Teaching Robots Right from Wrong. Oxford Univ. Press, 2008.