2015 International Conference on Big Data and Smart Computing (BigComp) (2015)
Jeju, South Korea
Feb. 9, 2015 to Feb. 11, 2015
ISBN: 978-1-4799-7303-3
pp: 238-243
Jeesoo Bang , Pohang University of Science and Technology
Hyungjong Noh , Pohang University of Science and Technology
Yonghee Kim , Pohang University of Science and Technology
Gary Geunbae Lee , Pohang University of Science and Technology
ABSTRACT
This study introduces an example-based chat-oriented dialogue system with a personalization framework that uses long-term memory. Previous representative chat-bots rely on simple keyword and pattern-matching methods. Maintaining the quality of such systems inevitably requires generating numerous heuristic rules by hand, and expert linguistic knowledge is needed to build those rules and matching patterns. To avoid this high annotation cost, we adopt example-based dialogue management to build a chat-oriented dialogue system. We also propose three features: POS-tagged tokens for sentence matching, named-entity (NE) types and values for searching for proper responses, and back-off responses for unmatched user utterances. In addition, our system automatically collects user-related facts from user input sentences and stores them in a long-term memory; system responses can then be modified by applying these stored facts. A relevance score for system responses is proposed to select responses that include user-related facts or that are frequently used. In several experiments, we found that the proposed features improve performance and that our system performs competitively with the ALICE system trained on the same corpus.
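The sketch below is a minimal, illustrative rendering of the pipeline the abstract describes (example matching, long-term memory of user facts, relevance scoring, and a back-off response); it is not the authors' implementation. Plain token overlap stands in for the POS-tagged matching and NE-based search, and the fact-extraction rule, scoring weights, and the `<user>` placeholder are assumptions made for the example.

```python
# Illustrative sketch only: example-based response selection with a long-term
# memory of user facts and a back-off response for unmatched inputs.
from dataclasses import dataclass, field

@dataclass
class Example:
    user_utterance: str
    system_response: str
    frequency: int = 1  # how often this response appears in the example corpus

@dataclass
class ChatBot:
    examples: list
    long_term_memory: dict = field(default_factory=dict)  # e.g. {"name": "Kim"}
    back_off = "I see. Tell me more."  # returned when no example matches well

    def _similarity(self, a, b):
        # Simplified matching: token overlap instead of POS-tagged token matching.
        ta, tb = set(a.lower().split()), set(b.lower().split())
        return len(ta & tb) / max(len(ta | tb), 1)

    def _relevance(self, example):
        # Prefer responses mentioning a stored user fact, then frequent responses.
        mentions_fact = any(v.lower() in example.system_response.lower()
                            for v in self.long_term_memory.values())
        return (1.0 if mentions_fact else 0.0) + 0.1 * example.frequency

    def respond(self, utterance, threshold=0.3):
        # Very naive fact extraction ("my name is X") into long-term memory.
        tokens = utterance.lower().split()
        if {"my", "name", "is"} <= set(tokens):
            self.long_term_memory["name"] = utterance.split()[-1]

        scored = [(self._similarity(utterance, ex.user_utterance), ex)
                  for ex in self.examples]
        sim, best = max(scored, key=lambda p: (p[0], self._relevance(p[1])))
        if sim < threshold:
            return self.back_off  # no sufficiently similar example found
        response = best.system_response
        # Personalize the response with a stored fact when a slot is present.
        if "name" in self.long_term_memory:
            response = response.replace("<user>", self.long_term_memory["name"])
        return response

bot = ChatBot(examples=[
    Example("how are you today", "I'm doing well, thanks for asking, <user>!", 3),
    Example("what is your favorite city", "I like Jeju very much.", 1),
])
print(bot.respond("My name is Kim"))   # back-off, but the fact is remembered
print(bot.respond("How are you"))      # matched example, personalized with "Kim"
```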
INDEX TERMS
Training, Pattern matching, Cities and towns, Databases, Data mining, Context, Labeling
CITATION

J. Bang, H. Noh, Y. Kim and G. G. Lee, "Example-based chat-oriented dialogue system with personalized long-term memory," 2015 International Conference on Big Data and Smart Computing (BigComp), Jeju, South Korea, 2015, pp. 238-243.
doi:10.1109/35021BIGCOMP.2015.7072837