Displaying 1-3 out of 3 total
Building Autonomous Sensitive Artificial Listeners
Found in: IEEE Transactions on Affective Computing
By M. Schröder, E. Bevacqua, R. Cowie, F. Eyben, H. Gunes, D. Heylen, M. ter Maat, G. McKeown, S. Pammi, M. Pantic, C. Pelachaud, B. Schuller, E. de Sevin, M. Valstar, M. Wöllmer
Issue Date: April 2012
pp. 165-183
This paper describes a substantial effort to build a real-time interactive multimodal dialogue system with a focus on emotional and nonverbal interaction capabilities. The work is motivated by the aim to provide technology with competences in perceiving an...
 
The SEMAINE Database: Annotated Multimodal Records of Emotionally Colored Conversations between a Person and a Limited Agent
Found in: IEEE Transactions on Affective Computing
By G. McKeown, M. Valstar, R. Cowie, M. Pantic, M. Schröder
Issue Date: January 2012
pp. 5-17
SEMAINE has created a large audiovisual database as a part of an iterative approach to building Sensitive Artificial Listener (SAL) agents that can engage a person in a sustained, emotionally colored conversation. Data used to build the agents came from in...
 
Web-based database for facial expression analysis
Found in: Multimedia and Expo, IEEE International Conference on
By M. Pantic, M. Valstar, R. Rademaker, L. Maat
Issue Date: July 2005
5 pp.
In the last decade, the research topic of automatic analysis of facial expressions has become a central topic in machine vision research. Nonetheless, there is a glaring lack of a comprehensive, readily accessible reference set of face images that could be...
 