Working IEEE/IFIP Conference on Software Architecture (WICSA 2011)
Boulder, Colorado, USA
June 20–24, 2011
ISBN: 978-0-7695-4351-2
pp. 187-193
ABSTRACT
The computer's ability to recognize human emotional states from physiological signals is gaining popularity as a means of creating empathetic systems such as learning environments, health-care systems, and videogames. Despite this, there are few frameworks, libraries, architectures, or software tools that allow system developers to easily integrate emotion recognition into their software projects. The work reported here offers a first step toward filling this gap, addressing: (a) the modeling of an agent-driven, component-based architecture for multimodal emotion recognition, called ABE, and (b) the use of ABE to implement a multimodal emotion recognition framework that supports third-party systems in becoming empathetic systems.
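To make the agent-driven, component-based decomposition concrete, the minimal sketch below shows one way such an architecture could be wired. It is an illustrative assumption, not the paper's implementation: all class and method names (SensorAgent, FusionAgent, EmotionRecognitionFramework, and the simple score-averaging fusion) are hypothetical. The idea it demonstrates is one agent per physiological channel, a fusion agent that combines per-channel estimates, and a subscription API through which a third-party system receives the fused emotional state.

```python
# Hypothetical sketch of an agent-based multimodal emotion recognition
# pipeline. Names and the averaging fusion strategy are illustrative
# assumptions, not taken from the ABE paper.

from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class ChannelReading:
    """One reading from a physiological or behavioral channel."""
    channel: str                       # e.g. "eeg", "skin_conductance", "facial"
    emotion_scores: Dict[str, float]   # emotion label -> confidence in [0, 1]


class SensorAgent:
    """Wraps one modality; a real agent would poll a device driver and classify raw signals."""
    def __init__(self, channel: str):
        self.channel = channel

    def read(self) -> ChannelReading:
        # Placeholder output; real classification logic would go here.
        return ChannelReading(self.channel, {"engaged": 0.5, "frustrated": 0.5})


class FusionAgent:
    """Combines per-channel scores into one multimodal estimate (simple averaging here)."""
    def fuse(self, readings: List[ChannelReading]) -> Dict[str, float]:
        totals: Dict[str, float] = {}
        for reading in readings:
            for emotion, score in reading.emotion_scores.items():
                totals[emotion] = totals.get(emotion, 0.0) + score
        return {emotion: total / len(readings) for emotion, total in totals.items()}


class EmotionRecognitionFramework:
    """Facade a third-party (empathetic) system would integrate against."""
    def __init__(self, agents: List[SensorAgent]):
        self.agents = agents
        self.fusion = FusionAgent()
        self.listeners: List[Callable[[Dict[str, float]], None]] = []

    def subscribe(self, listener: Callable[[Dict[str, float]], None]) -> None:
        self.listeners.append(listener)

    def step(self) -> None:
        # Poll every sensor agent, fuse their outputs, notify subscribers.
        estimate = self.fusion.fuse([agent.read() for agent in self.agents])
        for listener in self.listeners:
            listener(estimate)


if __name__ == "__main__":
    framework = EmotionRecognitionFramework(
        [SensorAgent("eeg"), SensorAgent("skin_conductance"), SensorAgent("facial")]
    )
    framework.subscribe(lambda estimate: print("emotional state:", estimate))
    framework.step()
```

In this sketch, the third-party system never touches individual sensors; it only subscribes to the fused estimate, which reflects the abstract's goal of letting existing systems become empathetic with minimal integration effort.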
INDEX TERMS
affective computing, architecture, framework, agent-based, multimodal, emotion recognition, empathetic systems
CITATION

M. E. Chavez-Echeagaray, R. Atkinson, W. Burleson and J. Gonzalez-Sanchez, "ABE: An Agent-Based Software Architecture for a Multimodal Emotion Recognition Framework," Working IEEE/IFIP Conference on Software Architecture (WICSA), Boulder, Colorado, USA, 2011, pp. 187-193.
doi:10.1109/WICSA.2011.32