Efficient Learning of Hierarchical Latent Class Models
ICTAI 2004: 16th IEEE International Conference on Tools with Artificial Intelligence
Boca Raton, Florida
Nov. 15, 2004 to Nov. 17, 2004
Nevin L. Zhang , Hong Kong University of Science & Technology
Tomáš Kočka , Prague University of Economics
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/ICTAI.2004.55
Hierarchical latent class (HLC) models are tree-structured Bayesian networks whose leaf nodes are observed and whose internal nodes are hidden. In earlier work, we demonstrated in principle the possibility of reconstructing HLC models from data. In this paper, we address the scalability issue and develop a search-based algorithm that can efficiently learn high-quality HLC models for realistic domains. There are three technical contributions: (1) the identification of a set of search operators; (2) the use of improvement in BIC score per unit of increase in model complexity, rather than the BIC score itself, for model selection; and (3) the adaptation of structural EM to situations where candidate models contain different variables from the current model. The algorithm was tested on the COIL Challenge 2000 data set and an interesting model was found.
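The selection criterion in contribution (2) can be illustrated with a minimal sketch. The code below is not from the paper; it assumes each model is summarized by a hypothetical (log-likelihood, number of free parameters) pair and shows how ranking candidates by BIC gain per extra parameter differs from ranking by raw BIC.

```python
import math

def bic(loglik, dim, n):
    """BIC score: log-likelihood penalized by model complexity.
    dim = number of free parameters, n = sample size."""
    return loglik - dim / 2.0 * math.log(n)

def improvement_ratio(cand, cur, n):
    """Improvement in BIC per unit of added complexity.
    cand and cur are (loglik, dim) pairs; assumes cand has more
    free parameters than cur."""
    gain = bic(cand[0], cand[1], n) - bic(cur[0], cur[1], n)
    return gain / (cand[1] - cur[1])

# Hypothetical numbers: among candidates produced by the search
# operators, pick the one with the best unit-cost improvement
# rather than the best raw BIC score.
current = (-5000.0, 40)                      # (log-likelihood, #parameters)
candidates = [(-4950.0, 48), (-4900.0, 80)]
best = max(candidates, key=lambda c: improvement_ratio(c, current, n=1000))
```

Here the cheaper candidate wins: the larger model buys its extra likelihood with many more parameters, so its BIC gain per parameter is lower.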
Nevin L. Zhang and Tomáš Kočka, "Efficient Learning of Hierarchical Latent Class Models", Proceedings of the 16th IEEE International Conference on Tools with Artificial Intelligence (ICTAI 2004), pp. 585-593, doi:10.1109/ICTAI.2004.55.