CAO: A Fully Automatic Emoticon Analysis System Based on Theory of Kinesics
January-June 2010 (vol. 1, no. 1)
pp. 46-59
Michal Ptaszynski, Hokkaido University, Sapporo
Jacek Maciejewski, Hokkaido University, Sapporo
Pawel Dybala, Hokkaido University, Sapporo
Rafal Rzepka, Hokkaido University, Sapporo
Kenji Araki, Hokkaido University, Sapporo
This paper presents CAO, a system for affect analysis of emoticons in Japanese online communication. Emoticons are strings of symbols widely used in text-based online communication to convey user emotions. The presented system extracts emoticons from input and determines the specific emotion types they express with a three-step procedure. First, it matches the extracted emoticons against a predetermined raw emoticon database containing over 10,000 emoticon samples extracted from the Web and annotated automatically. Emoticons whose emotion types cannot be determined using this database alone are automatically divided into semantic areas representing “mouths” or “eyes,” based on the idea of kinemes from the theory of kinesics. The areas are automatically annotated according to their co-occurrence in the database. The annotation is first based on the eye-mouth-eye triplet; if no such triplet is found, all semantic areas are estimated separately. This provides hints about potential groups of expressed emotions, giving the system coverage exceeding 3 million possibilities. The evaluation, performed on both training and test sets, confirmed the system's capability to detect and extract any emoticon, analyze its semantic structure, and estimate the potential emotion types expressed. The system achieved nearly ideal scores, outperforming existing emoticon analysis systems.
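The three-step cascade described in the abstract can be sketched as follows. This is a minimal illustration only: the database names, entries, and the bracketed eye-mouth-eye pattern are hypothetical stand-ins, not CAO's actual resources (the real system uses over 10,000 annotated emoticon samples and a far richer extraction grammar).

```python
import re

# Toy stand-ins for CAO's resources; entries are illustrative only.
RAW_EMOTICON_DB = {
    "(^_^)": ["joy"],
    "(T_T)": ["sadness"],
}

# Co-occurrence-based annotations for semantic areas ("kinemes").
TRIPLET_DB = {("^", "o", "^"): ["excitement", "joy"]}
EYE_DB = {"^": ["joy"], "T": ["sadness"], ";": ["fear", "sadness"]}
MOUTH_DB = {"_": ["joy", "sadness"], "o": ["joy", "surprise"]}

# Simplified pattern: a bracketed eye-mouth-eye sequence.
EMOTICON_PATTERN = re.compile(r"\((.)(.)(.)\)")

def analyze_emoticon(text):
    """Return candidate emotion types for the first emoticon found in
    `text`, following the three-step cascade: raw-database match, then
    eye-mouth-eye triplet match, then per-area estimation."""
    m = EMOTICON_PATTERN.search(text)
    if not m:
        return None
    emoticon = m.group(0)
    # Step 1: exact match against the raw emoticon database.
    if emoticon in RAW_EMOTICON_DB:
        return RAW_EMOTICON_DB[emoticon]
    left_eye, mouth, right_eye = m.group(1), m.group(2), m.group(3)
    # Step 2: eye-mouth-eye triplet co-occurrence.
    if (left_eye, mouth, right_eye) in TRIPLET_DB:
        return TRIPLET_DB[(left_eye, mouth, right_eye)]
    # Step 3: estimate each semantic area separately and merge the
    # hints into one candidate set of emotion types.
    hints = []
    for db, key in ((EYE_DB, left_eye), (MOUTH_DB, mouth), (EYE_DB, right_eye)):
        hints.extend(db.get(key, []))
    return sorted(set(hints)) or None
```

Combining per-area annotations in step 3 is what lets a small database cover millions of eye-mouth combinations that never occur verbatim in the raw samples.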

[1] N. Suzuki and K. Tsuda, "Express Emoticons Choice Method for Smooth Communication of e-Business," Proc. 10th Int'l Conf. Knowledge-Based and Intelligent Information and Eng. Systems, pp. 296-302, 2006.
[2] D. Derks, A.E.R. Bos, and J. von Grumbkow, "Emoticons and Social Interaction on the Internet: The Importance of Social Context," Computers in Human Behavior, vol. 23, pp. 842-849, 2007.
[3] K.C. Chiu, "Explorations in the Effect of Emoticon on Negotiation Process from the Aspect of Communication," master's thesis, Dept. of Information Management, Nat'l Sun Yat-sen Univ., 2007.
[4] Y. Tanaka, H. Takamura, and M. Okumura, "Extraction and Classification of Facemarks with Kernel Methods," Proc. 10th Int'l Conf. Intelligent User Interfaces, Jan. 2005.
[5] T. Yamada, S. Tsuchiya, S. Kuroiwa, and F. Ren, "Classification of Facemarks Using N-Gram," Proc. Int'l Conf. Natural Language Processing and Knowledge Eng., pp. 322-327, 2007.
[6] M. Kawakami, "The Database of 31 Japanese Emoticon with Their Emotions and Emphases," The Human Science Research Bull. Osaka Shoin Women's Univ., vol. 7, pp. 67-82, 2008.
[7] A. Ip, "The Impact of Emoticons on Affect Interpretation in Instant Messaging," 2002.
[8] A. Wolf, "Emotional Expression Online: Gender Differences in Emoticon Use," CyberPsychology and Behavior, vol. 3, no. 5, pp. 827-833, 2000.
[9] J.M. Maness, "A Linguistic Analysis of Chat Reference Conversations with 18-24 Year-Old College Students," The J. Academic Librarianship, vol. 34, no. 1, pp. 31-38, Jan. 2008.
[10] J. Nakamura, T. Ikeda, N. Inui, and Y. Kotani, "Learning Face Mark for Natural Language Dialogue System," Proc. IEEE Int'l Conf. Natural Language Processing and Knowledge Eng., pp. 180-185, 2003.
[11] N. Suzuki and K. Tsuda, "Automatic Emoticon Generation Method for Web Community," Proc. IADIS Int'l Conf. Web Based Communities, pp. 331-334, 2006.
[12] K. Takami, R. Yamashita, K. Tani, Y. Honma, and S. Goto, "Deducing a User's State of Mind from Analysis of the Pictographic Characters and Emoticons Used in Mobile Phone Emails for Personal Content Delivery Services," Int'l J. Advances in Telecomm., vol. 2, no. 1, pp. 37-46, 2009.
[13] J. Read, "Using Emoticons to Reduce Dependency in Machine Learning Techniques for Sentiment Classification," Proc. ACL Student Research Workshop, pp. 43-48, 2005.
[14] C. Yang, K.H.-Y. Lin, and H.-H. Chen, "Building Emotion Lexicon from Weblog Corpora," Proc. ACL Demo and Poster Sessions, pp. 133-136, 2007.
[15] Handbook of Emotions, M. Lewis, J.M. Haviland-Jones, L. Feldman Barrett, eds. Guilford Press, 2008.
[16] A. Nakamura, Kanjo Hyogen Jiten, [Dictionary of Emotive Expressions] (in Japanese). Tokyodo Publishing, 1993.
[17] M.F. Vargas, Louder than Words: An Introduction to Nonverbal Communication. Iowa State Press, 1986.
[18] R.L. Birdwhistell, Introduction to Kinesics: An Annotation System for Analysis of Body Motion and Gesture. Univ. of Kentucky Press, 1952.
[19] R.L. Birdwhistell, Kinesics and Context. Univ. of Pennsylvania Press, 1970.
[20] M. Ptaszynski, P. Dybala, R. Rzepka, and K. Araki, "Affecting Corpora: Experiments with Automatic Affect Annotation System—A Case Study of the 2channel Forum," Proc. Pacific Assoc. for Computational Linguistics, pp. 223-228, 2009.
[21] R.C. Solomon, The Passions: Emotions and the Meaning of Life. Hackett Publishing, 1993.
[22] J.A. Russell, "A Circumplex Model of Affect," J. Personality and Social Psychology, vol. 39, no. 6, pp. 1161-1178, 1980.
[23] M. Ptaszynski, P. Dybala, W. Shi, R. Rzepka, and K. Araki, "Towards Context Aware Emotional Intelligence in Machines: Computing Contextual Appropriateness of Affective States," Proc. 21st Int'l Joint Conf. Artificial Intelligence, pp. 1469-1474, 2009.
[24] H. Kubota, K. Yamashita, T. Fukuhara, and T. Nishida, "POC Caster: Broadcasting Agent Using Conversational Representation for Internet Community," Trans. JSAI (in Japanese), AI-17, pp. 313-321, 2002.
[25] Y. Matsumoto, A. Kitauchi, T. Yamashita, Y. Hirano, H. Matsuda, K. Takaoka, and M. Asahara, "Japanese Morphological Analysis System ChaSen Version 2.2.1," 2000.
[26] T. Kudo and Y. Matsumoto, "Chunking with Support Vector Machines," Proc. Second Meeting North Am. Chapter Assoc. for Computational Linguistics, pp. 192-199, 2001.
[27] M. Ptaszynski, P. Dybala, R. Rzepka, and K. Araki, "Towards Fully Automatic Emoticon Analysis System (^o^)," Proc. 16th Ann. Meeting Assoc. for Natural Language Processing, pp. 583-586, 2010.
[28] A. Abbasi and H. Chen, "Affect Intensity Analysis of Dark Web Forums," Intelligence and Security Informatics, pp. 282-288, 2007.
[29] F. Radulovic and N. Milikic, "Smiley Ontology," Proc. First Int'l Workshop Social Networks Interoperability, 2009.

Index Terms:
Affect analysis, text processing, emotion in human-computer interaction, affect sensing and analysis, emoticon.
Michal Ptaszynski, Jacek Maciejewski, Pawel Dybala, Rafal Rzepka, Kenji Araki, "CAO: A Fully Automatic Emoticon Analysis System Based on Theory of Kinesics," IEEE Transactions on Affective Computing, vol. 1, no. 1, pp. 46-59, Jan.-June 2010, doi:10.1109/T-AFFC.2010.3