Issue No. 3, July-September 2011 (vol. 2), pp. 119-133
Dimitris Giakoumis , Centre for Research and Technology Hellas and Aristotle University of Thessaloniki, Thessaloniki
Dimitrios Tzovaras , Centre for Research and Technology Hellas, Thessaloniki
Konstantinos Moustakas , Centre for Research and Technology Hellas, Thessaloniki
George Hassapis , Aristotle University of Thessaloniki, Thessaloniki
ABSTRACT
This paper presents work on the biosignal-based automatic recognition of boredom induced during video game playing. For this purpose, common biosignal feature extraction methods were applied and their capability to identify boredom was assessed. Moreover, for the first time, Legendre and Krawtchouk moments, as well as novel moment variations, were extracted as biosignal features and their potential for automatic affect recognition was examined in this application scenario. The analysis was conducted on ECG and GSR data collected from 19 subjects, with boredom naturally induced through the repetitive playing of a 3D video game. Both conventional and moment-based biosignal features proved effective for the automatic recognition of boredom, achieving classification accuracies of around 85 percent. Furthermore, the joint use of moments and moment variations together with conventional features significantly improved classification accuracy, yielding a maximum correct classification rate of 94.17 percent.
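To give a concrete picture of how orthogonal moments can act as features of a one-dimensional biosignal, the sketch below computes Legendre moments of a preprocessed GSR segment mapped onto the interval [-1, 1]. This is an illustration only, not the authors' implementation: the function name, the toy signal, and the choice of maximum order are assumptions made for the example.

```python
# Minimal sketch (not the paper's code) of 1D Legendre-moment feature
# extraction from a biosignal segment, assuming the signal has already
# been filtered and baseline-corrected.
import numpy as np
from numpy.polynomial import legendre as L


def legendre_moment_features(signal, max_order=10):
    """Return Legendre moments of orders 0..max_order for a 1D signal.

    The samples are mapped onto [-1, 1]; the moment of order n is
    (2n + 1)/2 * integral of P_n(x) * f(x) over [-1, 1], approximated
    here with a trapezoidal rule.
    """
    f = np.asarray(signal, dtype=float)
    x = np.linspace(-1.0, 1.0, f.size)  # sample grid on the Legendre domain
    moments = []
    for n in range(max_order + 1):
        coeffs = np.zeros(n + 1)
        coeffs[n] = 1.0                 # select the single basis polynomial P_n
        p_n = L.legval(x, coeffs)       # P_n evaluated on the grid
        moments.append((2 * n + 1) / 2.0 * np.trapz(p_n * f, x))
    return np.array(moments)


if __name__ == "__main__":
    # Toy GSR-like segment: slow tonic drift plus one phasic response bump.
    t = np.linspace(0, 60, 6000)
    gsr = 0.5 + 0.01 * t + 0.3 * np.exp(-((t - 20) ** 2) / 8.0)
    print(legendre_moment_features(gsr, max_order=6))
```

In a setting like the paper's, such moment features would be computed per signal segment and combined with conventional ECG/GSR features before feature selection and classification.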
INDEX TERMS
Biosignals, boredom, ECG, emotion recognition, GSR, moments, video games.
CITATION
Dimitris Giakoumis, Dimitrios Tzovaras, Konstantinos Moustakas, George Hassapis, "Automatic Recognition of Boredom in Video Games Using Novel Biosignal Moment-Based Features," IEEE Transactions on Affective Computing, vol. 2, no. 3, pp. 119-133, July-September 2011, doi:10.1109/T-AFFC.2011.4