A Hierarchical Latent Variable Model for Data Visualization
March 1998 (vol. 20 no. 3)
pp. 281-293

Abstract—Visualization has proven to be a powerful and widely applicable tool for the analysis and interpretation of multivariate data. Most visualization algorithms aim to find a projection from the data space down to a two-dimensional visualization space. However, for complex data sets living in a high-dimensional space, it is unlikely that a single two-dimensional projection can reveal all of the interesting structure. We therefore introduce a hierarchical visualization algorithm which allows the complete data set to be visualized at the top level, with clusters and subclusters of data points visualized at deeper levels. The algorithm is based on a hierarchical mixture of latent variable models, whose parameters are estimated using the expectation-maximization algorithm. We demonstrate the principle of the approach on a toy data set, and we then apply the algorithm to the visualization of a synthetic data set in 12 dimensions obtained from a simulation of multiphase flows in oil pipelines, and to data in 36 dimensions derived from satellite images. A Matlab software implementation of the algorithm is publicly available from the World Wide Web.
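The building block of the hierarchy described in the abstract is a linear-Gaussian latent variable model (probabilistic PCA) with a two-dimensional latent space, fitted by EM; the posterior mean of the latent variables gives each point's visualization coordinates. The sketch below, in Python rather than the authors' Matlab, shows EM for a single such model using the closed-form covariance updates of Tipping and Bishop; the function names and the fixed iteration count are illustrative choices, not part of the paper.

```python
import numpy as np

def ppca_em(X, q=2, n_iter=50, seed=0):
    """EM for a single probabilistic PCA model (the latent variable model
    that the paper mixes hierarchically). X: (N, d) data; q: latent dim."""
    rng = np.random.default_rng(seed)
    N, d = X.shape
    mu = X.mean(axis=0)
    S = np.cov(X - mu, rowvar=False, bias=True)   # ML sample covariance
    W = rng.standard_normal((d, q))               # random factor loadings
    sigma2 = 1.0                                  # isotropic noise variance
    for _ in range(n_iter):
        M = W.T @ W + sigma2 * np.eye(q)          # (q, q) posterior precision-like matrix
        Minv = np.linalg.inv(M)
        SW = S @ W
        # Closed-form EM updates (Tipping & Bishop, refs [4], [5]):
        W_new = SW @ np.linalg.inv(sigma2 * np.eye(q) + Minv @ W.T @ SW)
        sigma2 = np.trace(S - SW @ Minv @ W_new.T) / d
        W = W_new
    return W, mu, sigma2

def latent_coords(X, W, mu, sigma2):
    """Posterior mean E[z|x] = M^{-1} W^T (x - mu): the 2-D plot coordinates."""
    M = W.T @ W + sigma2 * np.eye(W.shape[1])
    return (X - mu) @ W @ np.linalg.inv(M)
```

In the full algorithm, a mixture of such models is fitted at each level of the hierarchy, with responsibilities from the parent level weighting the EM updates of its children, so each submodel yields its own 2-D plot of the data it claims.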

[1] M.I. Jordan and R.A. Jacobs, "Hierarchical Mixtures of Experts and the EM Algorithm," Neural Computation, vol. 6, no. 2, pp. 181-214, 1994.
[2] B.S. Everitt, An Introduction to Latent Variable Models. London: Chapman and Hall, 1984.
[3] W.J. Krzanowski and F.H.C. Marriott, Multivariate Analysis Part 2: Classification, Covariance Structures and Repeated Measurements. London: Edward Arnold, 1994.
[4] M.E. Tipping and C.M. Bishop, "Mixtures of Principal Component Analysers," Proc. IEE Fifth Int'l Conf. Artificial Neural Networks, pp. 13-18, Cambridge, U.K., July 1997.
[5] M.E. Tipping and C.M. Bishop, "Mixtures of Probabilistic Principal Component Analysers," Tech. Rep. NCRG/97/003, Neural Computing Research Group, Aston University, Birmingham, U.K., 1997.
[6] A.P. Dempster, N.M. Laird, and D.B. Rubin, "Maximum Likelihood From Incomplete Data via the EM Algorithm," J. Royal Statistical Soc., B, vol. 39, no. 1, pp. 1-38, 1977.
[7] D.B. Rubin and D.T. Thayer, "EM Algorithms for ML Factor Analysis," Psychometrika, vol. 47, no. 1, pp. 69-76, 1982.
[8] C.M. Bishop, Neural Networks for Pattern Recognition. Oxford Univ. Press, 1995.
[9] C.M. Bishop and G.D. James, "Analysis of Multiphase Flows Using Dual-Energy Gamma Densitometry and Neural Networks," Nuclear Instruments and Methods in Physics Research, vol. A327, pp. 580-593, 1993.
[10] D. Michie, D.J. Spiegelhalter, and C.C. Taylor, Machine Learning, Neural and Statistical Classification. New York: Ellis Horwood, 1994.
[11] R.L. Maltson and J.E. Dammann, "A Technique for Determining and Coding Subclasses in Pattern Recognition Problems," IBM J., vol. 9, pp. 294-302, 1965.
[12] J.H. Friedman and J.W. Tukey, "A Projection Pursuit Algorithm for Exploratory Data Analysis," IEEE Trans. Computers, vol. 23, pp. 881-889, 1974.
[13] A. Buja, D. Cook, and D.F. Swayne, "Interactive High-Dimensional Data Visualization," J. Computational and Graphical Statistics, vol. 5, no. 1, pp. 78-99, 1996.
[14] R. Miikkulainen, "Script Recognition With Hierarchical Feature Maps," Connection Science, vol. 2, pp. 83-101, 1990.
[15] C. Versino and L.M. Gambardella, "Learning Fine Motion by Using the Hierarchical Extended Kohonen Map," Artificial Neural Networks—ICANN 96, C. von der Malsburg, W. von Seelen, J.C. Vorbrüggen, and B. Sendhoff, eds., Lecture Notes in Computer Science, vol. 1112, pp. 221-226. Berlin: Springer-Verlag, 1996.
[16] T. Kohonen, Self-Organizing Maps. Berlin: Springer-Verlag, 1995.
[17] C.M. Bishop, M. Svensén, and C.K.I. Williams, "GTM: The Generative Topographic Mapping," Neural Computation, vol. 10, no. 1, pp. 215-234, 1998.
[18] P. McCullagh and J.A. Nelder, Generalized Linear Models, 2nd ed. Chapman and Hall, 1989.

Index Terms:
Latent variables, data visualization, EM algorithm, hierarchical mixture model, density estimation, principal component analysis, factor analysis, maximum likelihood, clustering, statistics.
Christopher M. Bishop, Michael E. Tipping, "A Hierarchical Latent Variable Model for Data Visualization," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 20, no. 3, pp. 281-293, March 1998, doi:10.1109/34.667885