Nested Monte Carlo EM Algorithm for Switching State-Space Models
December 2005 (vol. 17 no. 12)
pp. 1653-1663
Switching state-space models have been widely used in many applications arising from science, engineering, economics, and medical research. In this paper, we present a Monte Carlo Expectation Maximization (MCEM) algorithm for learning the parameters and classifying the states of a state-space model with Markov switching. A stochastic implementation based on the Gibbs sampler is introduced in the expectation step of the MCEM algorithm. We study the asymptotic properties of the proposed algorithm, and we also describe how a nesting approach and Rao-Blackwellized forms can be employed to accelerate its rate of convergence. Finally, the performance and effectiveness of the proposed method are demonstrated by applications to simulated and physiological experimental data.
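To illustrate the MCEM idea described in the abstract, the sketch below applies it to a deliberately simplified toy model (a two-regime Gaussian mixture with unknown regime means), not the paper's full switching state-space model: the E-step expectation is approximated by sampling the latent regime labels from their conditional posterior, and the M-step maximizes the Monte Carlo average of the complete-data log-likelihood. All function and variable names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two regimes with true means -2 and +3, unit noise variance
# (illustrative only; the paper treats full switching state-space models).
true_means = np.array([-2.0, 3.0])
z_true = rng.integers(0, 2, size=500)
y = true_means[z_true] + rng.normal(0.0, 1.0, size=500)

def mcem_two_means(y, n_iter=50, n_mc=200, sigma=1.0):
    """Monte Carlo EM for a two-component Gaussian mixture with
    unknown means, known variance, and equal mixing weights."""
    mu = np.array([-1.0, 1.0])  # initial parameter guess
    for _ in range(n_iter):
        # E-step (Monte Carlo): sample latent labels from their
        # conditional posterior given the current parameters.
        logp = -0.5 * (y[:, None] - mu[None, :]) ** 2 / sigma**2
        p1 = 1.0 / (1.0 + np.exp(logp[:, 0] - logp[:, 1]))  # P(z=1 | y, mu)
        draws = rng.random((n_mc, y.size)) < p1              # n_mc label samples
        # M-step: maximize the Monte Carlo average of the
        # complete-data log-likelihood (per-regime weighted means).
        w1 = draws.mean(axis=0)  # MC estimate of P(z=1 | y, mu)
        mu = np.array([np.sum((1.0 - w1) * y) / np.sum(1.0 - w1),
                       np.sum(w1 * y) / np.sum(w1)])
    return mu

mu_hat = mcem_two_means(y)
print(mu_hat)  # estimates close to the true regime means (-2, 3)
```

In the paper's setting, the latent variables are a Markov switching chain plus continuous states, so the E-step sampler is a Gibbs sampler over both, and the nesting and Rao-Blackwellization discussed in the abstract serve to reduce the variance of exactly this kind of Monte Carlo E-step.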

[1] E. Keogh and P. Smyth, “A Probabilistic Approach to Fast Pattern Matching in Time Series Databases,” Proc. Third Int'l Conf. Knowledge Discovery and Data Mining, 1997.
[2] D. Berndt and J. Clifford, “Finding Patterns in Time Series: A Dynamic Programming Approach,” Advances in Knowledge Discovery and Data Mining, U. Fayyad et al., eds., pp. 229-248, 1996.
[3] R. Povinelli and X. Feng, “A New Temporal Pattern Identification Method for Characterization and Prediction of Complex Time Series Events,” IEEE Trans. Knowledge and Data Eng., vol. 15, no. 2, pp. 339-352, Mar./Apr. 2003.
[4] C.A. Popescu and Y.S. Wong, “Monte Carlo Approach for Switching State-Space Models,” Proc. 17th Int'l Conf. Industrial and Eng. Applications of Artificial Intelligence and Expert Systems, pp. 945-955, 2004.
[5] Z. Ghahramani and G. Hinton, “Variational Learning for Switching State-Space Models,” Neural Computation, vol. 12, no. 4, pp. 963-996, 2000.
[6] R. Shumway and D. Stoffer, “Dynamic Linear Models with Switching,” J. Am. Statistical Assoc., vol. 86, pp. 763-769, 1991.
[7] Z. Ghahramani, “An Introduction to Hidden Markov Models and Bayesian Networks,” Int'l J. Pattern Recognition and Artificial Intelligence, vol. 15, no. 1, pp. 9-42, 2001.
[8] P. Smyth, D. Heckerman, and M. Jordan, “Probabilistic Independence Networks for Hidden Markov Probability Models,” Neural Computation, vol. 9, pp. 227-269, 1997.
[9] R. Neal and G. Hinton, “A View of the EM Algorithm that Justifies Incremental, Sparse, and Other Variants,” Learning in Graphical Models, M. Jordan, ed., 1998.
[10] G. Wei and M. Tanner, “A Monte Carlo Implementation of the EM Algorithm and the Poor Man's Data Augmentation Algorithms,” J. Am. Statistical Assoc., vol. 85, pp. 699-704, 1990.
[11] C. Carter and R. Kohn, “On Gibbs Sampling for State Space Models,” Biometrika, vol. 81, pp. 541-553, 1994.
[12] R. Shumway and D. Stoffer, “An Approach to Time Series Smoothing and Forecasting Using the EM Algorithm,” J. Time Series Analysis, vol. 3, no. 4, pp. 253-264, 1982.
[13] D. Van Dyk, “Nesting EM Algorithms for Computational Efficiency,” Statistica Sinica, vol. 10, pp. 203-225, 2000.
[14] C. Kim, “Dynamic Linear Models with Markov-Switching,” J. Econometrics, vol. 60, pp. 1-22, 1994.
[15] G. McLachlan and D. Peel, Finite Mixture Models. John Wiley, 2000.
[16] R. Jacobs, M. Jordan, S. Nowlan, and G. Hinton, “Adaptive Mixtures of Local Experts,” Neural Computation, vol. 3, pp. 79-87, 1991.
[17] S. Goldfeld and R. Quandt, “A Markov Model for Switching Regression,” J. Econometrics, vol. 1, pp. 3-16, 1973.
[18] J. Hamilton, “Analysis of Time Series Subject to Changes in Regime,” J. Econometrics, vol. 45, pp. 37-70, 1990.
[19] C. Kim and C. Nelson, State Space Models with Regime Switching. The MIT Press, 1999.
[20] Y. Bar-Shalom and X.-R. Li, Estimation and Tracking. Boston, Mass.: Artech House, 1993.
[21] N. Ueda and R. Nakano, “Deterministic Annealing Variant of the EM Algorithm,” Advances in Neural Information Processing Systems, G. Tesauro et al., eds., pp. 545-552, 1995.
[22] X. Meng and D. Van Dyk, “The EM Algorithm— An Old Folk-Song Sung to a Fast New Tune,” J. Royal Statistical Soc. B, vol. 59, no. 3, pp. 511-567, 1997.
[23] C.P. Robert, G. Celeux, and J. Diebolt, “Bayesian Estimation of Hidden Markov Chains: A Stochastic Implementation,” Statistics and Probability Letters, vol. 16, pp. 77-83, 1993.
[24] L. Tierney, “Markov Chains for Exploring Posterior Distributions,” Ann. Statistics, vol. 22, no. 4, pp. 1701-1728, 1994.
[25] F. Le Gland and L. Mevel, “Basic Properties of the Projective Product, with Application to Products of Column-Allowable Nonnegative Matrices,” Math. Control Signals Systems, vol. 13, pp. 41-62, 2000.
[26] J. Durbin and S. Koopman, “A Simple and Efficient Simulation Smoother for State-Space Time Series Analysis,” Biometrika, vol. 89, no. 3, pp. 603-615, 2002.
[27] J. Diebolt and C. Robert, “Estimation of Finite Mixture Distributions through Bayesian Sampling,” J. Royal Statistical Soc. B, vol. 56, no. 2, pp. 363-375, 1994.
[28] M. Newton and A. Raftery, “Approximate Bayesian Inference with the Weighted Likelihood Bootstrap,” J. Royal Statistical Soc. B, vol. 56, no. 1, pp. 3-48, 1994.
[29] S. Elaydi, An Introduction to Difference Equations. Springer Verlag, 1996.
[30] H. Tong, Nonlinear Time Series. New York: The Clarendon Press Oxford Univ. Press, 1990.
[31] C. Biernacki, G. Celeux, and G. Govaert, “Choosing Starting Values for the EM Algorithm for Getting the Highest Likelihood in Multivariate Gaussian Mixture Models,” Computational Statistics and Data Analysis, vol. 41, pp. 561-575, 2003.
[32] F. Le Gland and L. Mevel, “Exponential Forgetting and Geometric Ergodicity in Hidden Markov Models,” Math. Control Signals Systems, vol. 13, pp. 63-93, 2000.

Index Terms:
Time series analysis, machine learning, Markov processes, Kalman filtering, probabilistic algorithms, parameter learning, Monte Carlo simulation.
Citation:
Cristina Adela Popescu, Yau Shu Wong, "Nested Monte Carlo EM Algorithm for Switching State-Space Models," IEEE Transactions on Knowledge and Data Engineering, vol. 17, no. 12, pp. 1653-1663, Dec. 2005, doi:10.1109/TKDE.2005.202