Issue No. 1 - January-March 2011 (vol. 2)
ISSN: 1949-3045
pp: 50-63
Christopher T. Lovelace , University of Missouri-Kansas City, Kansas City
Reza R. Derakhshani , University of Missouri-Kansas City, Kansas City
Psychophysiological measurements of the startle eyeblink can provide information about an individual's sensory, attentional, cognitive, and affective processing, and thus reveal valences of interest for affective computing. However, eyeblink is usually measured with intrusive contact electromyographic (EMG) electrodes, followed by a laborious manual feature-extraction process. We introduce a new noninvasive automatic system that uses high-speed video recording of startle blinks in conjunction with data-driven feature selection and support vector machine (SVM) ensembles to classify startle eyeblinks. Using a prestimulus (prepulse) to produce robust modulation of acoustically elicited startle eyeblinks, we tracked the blinks with 250-frames-per-second video and extracted different features from the eyelid displacement and velocity signals. The SVMs were able to determine whether a trial had contained startle or prepulse+startle stimuli with an accuracy of up to 73 percent (fivefold cross-validation). By fusing the decisions made on different feature sets, an ensemble of seven SVMs increased this rate to almost 79 percent. Since startle eyeblinks are robustly modulated not only by sensory events (such as the prepulse used in this study) but also by affective and cognitive states, eyelid tracking using high-speed video, in conjunction with the introduced classification method, is an effective and user-friendly alternative to EMG for classifying startle blinks to infer users' affective-cognitive states.
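The ensemble idea described in the abstract (training one SVM per feature set and fusing their decisions) can be sketched roughly as follows. This is a minimal illustration using scikit-learn on synthetic data; the number of features, the feature-set split, the SVM kernel, and the majority-vote fusion rule are all assumptions for demonstration, not the paper's actual configuration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Synthetic stand-in for per-trial features derived from eyelid
# displacement and velocity signals (assumed dimensions, not the paper's).
X, y = make_classification(n_samples=200, n_features=21,
                           n_informative=12, random_state=0)
X_train, y_train = X[:150], y[:150]
X_test, y_test = X[150:], y[150:]

# Split the 21 features into seven disjoint "feature sets",
# mirroring the paper's ensemble of seven SVMs.
feature_sets = np.array_split(np.arange(X.shape[1]), 7)

# Train one SVM per feature set.
svms = [(SVC(kernel="rbf").fit(X_train[:, idx], y_train), idx)
        for idx in feature_sets]

# Fuse the seven binary decisions by majority vote.
votes = np.stack([clf.predict(X_test[:, idx]) for clf, idx in svms])
fused = (votes.mean(axis=0) >= 0.5).astype(int)

accuracy = (fused == y_test).mean()
```

In this sketch each classifier sees only its own slice of the feature space, so the vote combines partially independent views of the trial; the paper reports that this kind of decision fusion raised accuracy from about 73 to almost 79 percent.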
Affective computing, image processing, pattern recognition, user interfaces.
Christopher T. Lovelace, Reza R. Derakhshani, "An Ensemble Method for Classifying Startle Eyeblink Modulation from High-Speed Video Records", IEEE Transactions on Affective Computing, vol. 2, no. 1, pp. 50-63, January-March 2011, doi:10.1109/T-AFFC.2010.15