2013 Humaine Association Conference on Affective Computing and Intelligent Interaction (ACII 2013)
Geneva, Switzerland
Sept. 2, 2013 to Sept. 5, 2013
ISSN: 2156-8103
pp: 135-140
Eduardo Velloso , Lancaster Univ., Lancaster, UK
Andreas Bulling , Max Planck Inst. for Inf., Saarbrücken, Germany
Hans Gellersen , Lancaster Univ., Lancaster, UK
ABSTRACT
Manual annotation of human body movement is an integral part of research on non-verbal communication and computational behaviour analysis, but it is also a very time-consuming and tedious task. In this paper, we present AutoBAP, a system that automates the coding of bodily expressions according to the Body Action and Posture (BAP) coding scheme. Our system takes continuous body motion and gaze behaviour data as its input. The data is recorded using a full-body motion tracking suit and a wearable eye tracker. From the data, our system automatically generates a labelled XML file that can be visualised and edited with off-the-shelf video annotation tools. We evaluate our system in a laboratory-based user study with six participants performing scripted sequences of 184 actions. Results from the user study show that our prototype system is able to annotate 172 out of the 274 labels of the full BAP coding scheme with good agreement with a manual annotator (Cohen's kappa > 0.6).
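The agreement figure refers to Cohen's kappa, which corrects raw agreement for agreement expected by chance. As a minimal, hypothetical sketch (the frame-level binary framing and the label vectors below are assumptions for illustration, not the paper's evaluation code), per-label agreement between the automatic and manual annotations could be computed like this:

```python
# Hypothetical sketch: agreement between AutoBAP and a human coder for one
# BAP label, coded per video frame (1 = label active, 0 = inactive).
# Values of kappa above 0.6 are conventionally read as "good" agreement.
from sklearn.metrics import cohen_kappa_score

auto_coder  = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]  # made-up automatic coding
human_coder = [1, 1, 0, 1, 1, 0, 1, 0, 0, 0]  # made-up manual coding

kappa = cohen_kappa_score(auto_coder, human_coder)
print(f"Cohen's kappa: {kappa:.2f}")
```

In the study, a computation of this kind would be repeated for each of the BAP labels, with a label counted as successfully automated when its kappa exceeds 0.6.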
INDEX TERMS
Encoding, Magnetic heads, Tracking, Sensors, Three-dimensional displays, Training, Decision trees
CITATION
E. Velloso, A. Bulling and H. Gellersen, "AutoBAP: Automatic Coding of Body Action and Posture Units from Wearable Sensors," 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction (ACII), Geneva, Switzerland, 2013, pp. 135-140.
doi:10.1109/ACII.2013.29