Computer Science and Information Engineering, World Congress on (2009)
Los Angeles, California USA
Mar. 31, 2009 to Apr. 2, 2009
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/CSIE.2009.982
The vast number of videos now uploaded to and viewed on the internet has created a need for a tool that searches, selects, and recommends videos of interest to the user. Specifically, the video playlist should be tailored to the user’s emotional mood. In this paper, we propose XV-Pod, an emotion-aware mobile video player that aims to tailor video selection to a user’s emotional mood. To select the videos best suited to the user, the emotional impact of videos on the user was studied through physiological signals collected by a BodyMedia SenseWear device. We investigated two approaches to recognizing the user’s emotional response. In the first, we conducted an empirical study to measure the change in the user’s emotional intensity before and after viewing a video. In the second, we attempted to identify the user’s emotions using a decision-tree machine-learning algorithm. We discuss the results of both approaches. Finally, we discuss some implications of our findings and future work.
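The abstract's second approach, a decision tree over physiological signals, might be sketched as follows. This is purely illustrative: the paper's actual features, thresholds, and emotion labels are not given in the abstract, so the feature names (skin conductance, skin temperature, heat flux, which are typical SenseWear channels) and the cut points below are invented for demonstration, and a real system would learn the tree from labeled data rather than hand-code it.

```python
def classify_emotion(skin_conductance, skin_temp, heat_flux):
    """Toy hand-built decision tree mapping physiological readings to a
    coarse emotion label. All thresholds are hypothetical placeholders,
    not values from the paper."""
    if skin_conductance > 0.6:        # high conductance: elevated arousal
        if heat_flux > 50.0:          # high energy expenditure as well
            return "excited"
        return "stressed"
    if skin_temp < 32.0:              # cool skin, low conductance
        return "calm"
    return "neutral"

# Example readings (hypothetical units and values):
print(classify_emotion(0.8, 33.5, 60.0))  # "excited"
print(classify_emotion(0.2, 31.0, 40.0))  # "calm"
```

In practice such a tree would be induced automatically (e.g. with an ID3/C4.5-style learner) from sensor recordings labeled with self-reported emotion, which is what a decision-tree machine-learning algorithm refers to here.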
Computer Applications, Mobile Computing, Emotional Rating, Affective Search, Human-Computer Interaction (HCI), Machine Learning
Z. Segall and X. Y. Chen, "XV-Pod: An Emotion Aware, Affective Mobile Video Player," 2009 WRI World Congress on Computer Science and Information Engineering (CSIE), Los Angeles, CA, USA, 2009, pp. 277-281.