Issue No. 02 - April-June (2014 vol. 5)
ISSN: 1949-3045
Andrea Stevenson Won, Department of Communication, 450 Serra Mall, Stanford University, Stanford
Jeremy N. Bailenson, Department of Communication, 450 Serra Mall, Stanford University, Stanford
Joris H. Janssen, Sense Observation Systems, Lloydstraat 5, 3024EA Rotterdam, The Netherlands
ABSTRACT
Nonverbal behavior can reveal the psychological states of those engaged in interpersonal interaction. Previous research has highlighted the relationship between gesture and learning during instruction. In the current study, we applied readily available computer vision hardware and machine learning algorithms to the gestures of teacher/student dyads (N = 106) during a learning session to automatically distinguish between high- and low-success learning interactions, operationalized by recall of information presented during that session. Models predicted the learning performance of a dyad with accuracies as high as 85.7 percent when tested on dyads not included in the training set. In addition, correlations were found between summed measures of body movement and learning score. We discuss theoretical and applied implications for learning.
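The evaluation setup the abstract describes, training a classifier on movement features and testing on dyads held out of the training set, can be sketched as follows. This is an illustrative toy, not the authors' pipeline: the features, the nearest-centroid classifier, and the synthetic data generator are all assumptions chosen to keep the example self-contained.

```python
# Illustrative sketch (not the paper's actual method): leave-one-dyad-out
# classification of high- vs. low-success learning from summed body-movement
# features. All data here is synthetic.
import random

random.seed(0)

def make_dyad(label):
    # Hypothetical feature vector per dyad: e.g., summed head, hand, and
    # torso displacement over the session. High-learning dyads (label 1)
    # are simulated with a different movement baseline.
    base = 2.0 if label == 1 else 1.0
    return [base + random.gauss(0, 0.3) for _ in range(3)], label

dyads = [make_dyad(i % 2) for i in range(20)]

def centroid(rows):
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

# Leave-one-dyad-out: train on every other dyad, test on the held-out one,
# mirroring evaluation on dyads absent from the training set.
correct = 0
for i, (x, y) in enumerate(dyads):
    train = dyads[:i] + dyads[i + 1:]
    cents = {c: centroid([f for f, lab in train if lab == c]) for c in (0, 1)}
    pred = min(cents, key=lambda c: dist2(x, cents[c]))
    correct += (pred == y)

accuracy = correct / len(dyads)
print(f"leave-one-dyad-out accuracy: {accuracy:.2f}")
```

In practice one would replace the synthetic features with measurements from the tracking hardware and the nearest-centroid rule with a stronger learner; the held-out-dyad loop is the part that corresponds to the evaluation described above.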
INDEX TERMS
Education, Tracking, Observers, Joints, Materials, Feature extraction, Computer vision
CITATION

A. S. Won, J. N. Bailenson and J. H. Janssen, "Automatic Detection of Nonverbal Behavior Predicts Learning in Dyadic Interactions," in IEEE Transactions on Affective Computing, vol. 5, no. 2, 2014.
doi:10.1109/TAFFC.2014.2329304