Issue No. 2, April-June 2012 (vol. 3), pp. 165-183
M. Wöllmer, Tech. Univ. München, München, Germany
ABSTRACT
This paper describes a substantial effort to build a real-time interactive multimodal dialogue system with a focus on emotional and nonverbal interaction capabilities. The work is motivated by the aim of providing technology with the competences to perceive and produce the emotional and nonverbal behaviors required to sustain a conversational dialogue. We present the Sensitive Artificial Listener (SAL) scenario as a setting that seems particularly well suited to the study of emotional and nonverbal behavior, since it requires only very limited verbal understanding on the part of the machine. This scenario allows us to concentrate on nonverbal capabilities without simultaneously having to address the challenges of spoken language understanding, task modeling, and the like. We first report on three prototype versions of the SAL scenario in which the behavior of the Sensitive Artificial Listener characters was determined by a human operator. These prototypes served to verify the effectiveness of the SAL scenario and allowed us to collect the data required for building system components that analyze and synthesize the respective behaviors. We then describe the fully autonomous integrated real-time system we created, which combines incremental analysis of user behavior, dialogue management, and synthesis of speaker and listener behavior of a SAL character displayed as a virtual agent. We discuss principles that should underlie the evaluation of SAL-type systems. Because the system is designed for modularity and reuse, and because it is publicly available, the SAL system has potential as a joint research tool for the affective computing research community.
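
As a rough illustration of the processing loop summarized in the abstract, the following minimal Python sketch mirrors the three stages it names: incremental analysis of user behavior, dialogue management, and synthesis of speaker/listener behavior. All class and method names here are hypothetical illustrations, not the actual SAL system's API.

# Hypothetical sketch of the analysis -> dialogue management -> synthesis loop
# described in the abstract. Class and method names are illustrative
# assumptions; they are not the real SAL system components.
from dataclasses import dataclass
import random


@dataclass
class UserState:
    """Coarse estimate of the user's current affective/interaction state."""
    arousal: float      # -1.0 (calm) .. +1.0 (excited)
    valence: float      # -1.0 (negative) .. +1.0 (positive)
    is_speaking: bool   # simple voice-activity flag


class UserStateAnalyzer:
    """Stands in for incremental audio/video analysis of user behavior."""
    def analyze(self) -> UserState:
        # A real analyzer would fuse acoustic, linguistic, and facial cues;
        # here we fabricate a plausible state purely for illustration.
        return UserState(arousal=random.uniform(-1, 1),
                         valence=random.uniform(-1, 1),
                         is_speaking=random.random() < 0.6)


class DialogueManager:
    """Chooses between listener feedback and speaker turns for the agent."""
    def decide(self, state: UserState) -> str:
        if state.is_speaking:
            # While the user holds the floor, produce backchannel feedback.
            return "nod" if state.valence >= 0 else "concerned_frown"
        # When the user yields the turn, take the floor with an utterance
        # matching the character's emotional stance.
        return "utterance:cheer_up" if state.valence < 0 else "utterance:agree"


class BehaviorSynthesizer:
    """Stands in for speech and gesture synthesis rendered by the virtual agent."""
    def render(self, action: str) -> None:
        print(f"agent performs: {action}")


def interaction_loop(steps: int = 5) -> None:
    analyzer, manager, synthesizer = UserStateAnalyzer(), DialogueManager(), BehaviorSynthesizer()
    for _ in range(steps):
        state = analyzer.analyze()          # incremental user-state analysis
        action = manager.decide(state)      # dialogue management
        synthesizer.render(action)          # speaker/listener behavior synthesis


if __name__ == "__main__":
    interaction_loop()
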
INDEX TERMS
interactive systems, behavioural sciences computing, emotion recognition, autonomous sensitive artificial listeners, real-time interactive multimodal dialogue system, nonverbal interaction capabilities, emotional capabilities, spoken language understanding, task modeling, autonomous integrated real-time system, user behavior, dialogue management, listener behavior, speaker behavior, SAL character, humans, computers, speech, prototypes, speech recognition, real-time systems, turn-taking, embodied conversational agents, rapport agents, emotion synthesis, real-time dialogue
CITATION
M. Wöllmer, "Building Autonomous Sensitive Artificial Listeners", IEEE Transactions on Affective Computing, vol. 3, no. 2, pp. 165-183, April-June 2012, doi:10.1109/T-AFFC.2011.34