Issue No. 9, September 2008 (vol. 20)
pp. 1254-1263
Tomoharu Iwata , NTT, Soraku-gun
Kazumi Saito , NTT, Kyoto
Takeshi Yamada , NTT, Kyoto
ABSTRACT
It is important for online stores to improve Customer Lifetime Value (LTV) if they are to increase their profits. Conventional recommendation methods suggest items that best coincide with a user's interests to maximize the purchase probability, but this does not necessarily help improve LTV. We present a novel recommendation method that maximizes the probability of improving the LTV and that is applicable to both measured and subscription services. Our method finds frequent purchase patterns among high-LTV users and recommends items to a new user so that the user's purchases simulate those patterns. Using survival analysis techniques, we efficiently extract the information needed to find these patterns from log data. Furthermore, we infer a user's interests from purchase histories based on maximum entropy models, and we use these interests to improve the recommendations. Since a higher LTV results from greater user satisfaction, our method benefits users as well as online stores. We evaluate our method using two sets of real log data, one from a measured service and one from a subscription service.
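The survival-analysis step mentioned in the abstract can be illustrated with a minimal Kaplan-Meier estimator of a subscription's survival function. This is a generic sketch of the standard technique, not the authors' exact formulation; the log format (duration, observed) pairs, where observed=False marks a user who was still subscribed when the log ended (right-censored), is an assumption for illustration.

```python
# Minimal Kaplan-Meier estimator of the survival function S(t):
# the probability that a subscription lasts beyond time t.
# Each observation is (duration, observed); observed=False means the
# user was still subscribed at the end of the log (right-censored).

def kaplan_meier(observations):
    """Return a list of (time, S(t)) points from (duration, observed) pairs."""
    # Distinct churn (event) times, in increasing order
    event_times = sorted({d for d, obs in observations if obs})
    survival = 1.0
    curve = []
    for t in event_times:
        # Users still subscribed just before time t
        at_risk = sum(1 for d, _ in observations if d >= t)
        # Users who churned exactly at time t
        events = sum(1 for d, obs in observations if d == t and obs)
        survival *= 1.0 - events / at_risk
        curve.append((t, survival))
    return curve

# Example: durations in months; False = still active when the log ended
log = [(3, True), (5, True), (5, False), (8, True), (12, False)]
for t, s in kaplan_meier(log):
    print(t, round(s, 3))  # 3 0.8, then 5 0.6, then 8 0.3
```

Censored users contribute to the at-risk counts without being treated as churners, which is what makes this estimator suitable for log data in which many subscriptions are still ongoing.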
INDEX TERMS
Data mining, Information filtering, Machine learning
CITATION
Tomoharu Iwata, Kazumi Saito, Takeshi Yamada, "Recommendation Method for Improving Customer Lifetime Value", IEEE Transactions on Knowledge & Data Engineering, vol.20, no. 9, pp. 1254-1263, September 2008, doi:10.1109/TKDE.2008.55