Displaying 1-16 out of 16 total
Crowdsourcing Facial Responses to Online Videos
Found in: IEEE Transactions on Affective Computing
By Daniel McDuff, Rana El Kaliouby, Rosalind W. Picard
Issue Date: September 2012
pp. 456-468
We present results validating a novel framework for collecting and analyzing facial responses to media content over the Internet. This system allowed 3,268 trackable face videos to be collected and analyzed in under two months. We characterize the data and...
 
Measuring Voter's Candidate Preference Based on Affective Responses to Election Debates
Found in: 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction (ACII)
By Daniel McDuff, Rana El Kaliouby, Evan Kodra, Rosalind Picard
Issue Date: September 2013
pp. 369-374
In this paper we present the first analysis of facial responses to electoral debates measured automatically over the Internet. We show that significantly different responses can be detected from viewers with different political preferences and that similar...
 
Real-Time Inference of Complex Mental States from Facial Expressions and Head Gestures
Found in: Computer Vision and Pattern Recognition Workshop
By Rana El Kaliouby, Peter Robinson
Issue Date: July 2004
pp. 154
This paper presents a system for inferring complex mental states from video of facial expressions and head gestures in real-time. The system is based on a multi-level dynamic Bayesian network classifier which models complex mental states as a number of int...
 
Self-Cam: feedback from what would be your social partner
Found in: ACM SIGGRAPH 2006 Research posters (SIGGRAPH '06)
By Alea Teeters, Rana El Kaliouby, Rosalind Picard
Issue Date: July 2006
pp. 138-es
     
Invited Talk: An Exploratory Social-Emotional Prosthetic for Autism Spectrum Disorders
Found in: Wearable and Implantable Body Sensor Networks, International Workshop on
By Rana el Kaliouby, Alea Teeters, Rosalind W. Picard
Issue Date: April 2006
pp. 3-4
We describe a novel wearable device that perceives and reports on social-emotional information in real-time human interaction. Using a wearable camera and other sensors, combined with machine perception algorithms, the system records and analyzes the facial...
 
Predicting online media effectiveness based on smile responses gathered over the Internet
Found in: 2013 10th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2013)
By Daniel McDuff, Rana el Kaliouby, David Demirdjian, Rosalind Picard
Issue Date: April 2013
pp. 1-7
We present an automated method for classifying “liking” and “desire to view again” based on over 1,500 facial responses to media collected over the Internet. This is a very challenging pattern recognition problem that involves robust detection of smile int...
   
From dials to facial coding: Automated detection of spontaneous facial expressions for media research
Found in: 2013 10th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2013)
By Evan Kodra, Thibaud Senechal, Daniel McDuff, Rana el Kaliouby
Issue Date: April 2013
pp. 1-6
Typical consumer media research requires the recruitment and coordination of hundreds of panelists and the use of relatively expensive equipment. In this work, we compare results from a legacy hardware dial mechanism for measuring media preference to those...
   
Smile or smirk? Automatic detection of spontaneous asymmetric smiles to understand viewer experience
Found in: 2013 10th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2013)
By Thibaud Senechal, Jay Turcot, Rana el Kaliouby
Issue Date: April 2013
pp. 1-8
Asymmetric facial expressions, such as a smirk, are strong emotional signals indicating valence as well as discrete emotion states such as contempt, doubt and defiance. Yet, the automated detection of asymmetric facial action units has been largely ignored...
   
Crowdsourced data collection of facial responses
Found in: Proceedings of the 13th international conference on multimodal interfaces (ICMI '11)
By Daniel McDuff, Rana el Kaliouby, Rosalind Picard
Issue Date: November 2011
pp. 11-18
In the past, collecting data to train facial expression and affect recognition systems has been time consuming and often led to data that do not include spontaneous expressions. We present the first crowdsourced data collection of dynamic, natural and spon...
     
Lessons from participatory design with adolescents on the autism spectrum
Found in: Proceedings of the 27th international conference extended abstracts on Human factors in computing systems (CHI EA '09)
By Matthew S. Goodwin, Micah Eckhardt, Miriam Madsen, Mohammed E. Hoque, Rana el Kaliouby, Rosalind Picard
Issue Date: April 2009
pp. 1-4
Participatory user interface design with adolescent users on the autism spectrum presents a number of unique challenges and opportunities. Through our work developing a system to help autistic adolescents learn to recognize facial expressions, we have lear...
     
Automated sip detection in naturally-evoked video
Found in: Proceedings of the 10th international conference on Multimodal interfaces (ICMI '08)
By Mina Mikhail, Rana el Kaliouby
Issue Date: October 2008
pp. 203-204
Quantifying consumer experiences is an emerging application area for event detection in video. This paper presents a hierarchical model for robust sip detection that combines bottom-up processing of face videos, namely real-time head action unit analysis a...
     
Technology for just-in-time in-situ learning of facial affect for persons diagnosed with an autism spectrum disorder
Found in: Proceedings of the 10th international ACM SIGACCESS conference on Computers and accessibility (Assets '08)
By Matthew Goodwin, Miriam Madsen, Rana el Kaliouby, Rosalind Picard
Issue Date: October 2008
pp. N/A
Many first-hand accounts from individuals diagnosed with autism spectrum disorders (ASD) highlight the challenges inherent in processing high-speed, complex, and unpredictable social information such as facial expressions in real-time. In this paper, we de...
     
iSET: interactive social-emotional toolkit for autism spectrum disorder
Found in: Proceedings of the 7th international conference on Interaction design and children (IDC '08)
By Matthew S. Goodwin, Rana el Kaliouby
Issue Date: June 2008
pp. 703-718
Many first-hand accounts from individuals on the autism spectrum highlight the challenges inherent in processing high-speed, complex, and unpredictable social information such as nonverbal cues in real-time. In this paper, we describe iSET (Interactive Soc...
     
Facial expression affective state recognition for air traffic control automation concept exploration
Found in: ACM SIGGRAPH 2007 posters (SIGGRAPH '07)
By Rana el Kaliouby, Ronald Reisman
Issue Date: August 2007
pp. 167-es
Current methods for evaluating workload and usability of air traffic control automation concepts are often heavily reliant on subjective data. Typically subjects (often off-duty air traffic controllers) assess new tools or technologies in controlled experi...
     
Interactive technologies for autism
Found in: CHI '07 extended abstracts on Human factors in computing systems (CHI '07)
By Daniel R. Gillette, Dorothy Strickland, Gillian R. Hayes, Gregory D. Abowd, Justine Cassell, Patrice (Tamar) Weiss, Rana el Kaliouby
Issue Date: April 2007
pp. 2109-2112
In meeting health, education, and lifestyle goals, technology can both assist individuals with autism, and support those who live and work with them, such as family, caregivers, coworkers, and friends. The uniqueness of each individual with autism and the ...
     