
ASCII Text
Cristina Adela Popescu, Yau Shu Wong, "Nested Monte Carlo EM Algorithm for Switching State-Space Models," IEEE Transactions on Knowledge and Data Engineering, vol. 17, no. 12, pp. 1653-1663, December 2005.
BibTeX
@article{10.1109/TKDE.2005.202,
  author    = {Cristina Adela Popescu and Yau Shu Wong},
  title     = {Nested Monte Carlo EM Algorithm for Switching State-Space Models},
  journal   = {IEEE Transactions on Knowledge and Data Engineering},
  volume    = {17},
  number    = {12},
  issn      = {1041-4347},
  year      = {2005},
  pages     = {1653-1663},
  doi       = {http://doi.ieeecomputersociety.org/10.1109/TKDE.2005.202},
  publisher = {IEEE Computer Society},
  address   = {Los Alamitos, CA, USA},
}
RefWorks / ProCite / RefMan / EndNote
TY  - JOUR
JO  - IEEE Transactions on Knowledge and Data Engineering
TI  - Nested Monte Carlo EM Algorithm for Switching State-Space Models
IS  - 12
SN  - 1041-4347
SP  - 1653
EP  - 1663
A1  - Cristina Adela Popescu
A1  - Yau Shu Wong
PY  - 2005
KW  - Time series analysis
KW  - machine learning
KW  - Markov processes
KW  - Kalman filtering
KW  - probabilistic algorithms
KW  - parameter learning
KW  - Monte Carlo simulation
VL  - 17
JA  - IEEE Transactions on Knowledge and Data Engineering
ER  -
[1] E. Keogh and P. Smyth, “A Probabilistic Approach to Fast Pattern Matching in Time Series Databases,” Proc. Third Int'l Conf. Knowledge Discovery and Data Mining, 1997.
[2] D. Berndt and J. Clifford, “Finding Patterns in Time Series: A Dynamic Programming Approach,” Advances in Knowledge Discovery and Data Mining, U. Fayyad et al., eds., pp. 229-248, 1996.
[3] R. Povinelli and X. Feng, “A New Temporal Pattern Identification Method for Characterization and Prediction of Complex Time Series Events,” IEEE Trans. Knowledge and Data Eng., vol. 15, no. 2, pp. 339-352, Mar./Apr. 2003.
[4] C.A. Popescu and Y.S. Wong, “Monte Carlo Approach for Switching State-Space Models,” Proc. 17th Int'l Conf. Industrial and Eng. Applications of Artificial Intelligence and Expert Systems, pp. 945-955, 2004.
[5] Z. Ghahramani and G. Hinton, “Variational Learning for Switching State-Space Models,” Neural Computation, vol. 12, no. 4, pp. 963-996, 2000.
[6] R. Shumway and D. Stoffer, “Dynamic Linear Models with Switching,” J. Am. Statistical Assoc., vol. 86, pp. 763-769, 1991.
[7] Z. Ghahramani, “An Introduction to Hidden Markov Models and Bayesian Networks,” Int'l J. Pattern Recognition and Artificial Intelligence, vol. 15, no. 1, pp. 9-42, 2001.
[8] P. Smyth, D. Heckerman, and M. Jordan, “Probabilistic Independence Networks for Hidden Markov Probability Models,” Neural Computation, vol. 9, pp. 227-269, 1997.
[9] R. Neal and G. Hinton, “A View of the EM Algorithm that Justifies Incremental, Sparse, and Other Variants,” Learning in Graphical Models, M. Jordan, ed., 1998.
[10] G. Wei and M. Tanner, “A Monte Carlo Implementation of the EM Algorithm and the Poor Man's Data Augmentation Algorithm,” J. Am. Statistical Assoc., vol. 85, pp. 699-704, 1990.
[11] C. Carter and R. Kohn, “On Gibbs Sampling for State Space Models,” Biometrika, vol. 81, pp. 541-553, 1994.
[12] R. Shumway and D. Stoffer, “An Approach to Time Series Smoothing and Forecasting Using the EM Algorithm,” J. Time Series Analysis, vol. 3, no. 4, pp. 253-264, 1982.
[13] D. Van Dyk, “Nesting EM Algorithms for Computational Efficiency,” Statistica Sinica, vol. 10, pp. 203-225, 2000.
[14] C. Kim, “Dynamic Linear Models with Markov-Switching,” J. Econometrics, vol. 60, pp. 1-22, 1994.
[15] G. McLachlan and D. Peel, Finite Mixture Models. John Wiley, 2000.
[16] R. Jacobs, M. Jordan, S. Nowlan, and G. Hinton, “Adaptive Mixtures of Local Experts,” Neural Computation, vol. 3, pp. 79-87, 1991.
[17] S. Goldfeld and R. Quandt, “A Markov Model for Switching Regression,” J. Econometrics, vol. 1, pp. 3-16, 1973.
[18] J. Hamilton, “Analysis of Time Series Subject to Changes in Regime,” J. Econometrics, vol. 45, pp. 37-70, 1990.
[19] C. Kim and C. Nelson, State Space Models with Regime Switching. The MIT Press, 1999.
[20] Y. Bar-Shalom and X.R. Li, Estimation and Tracking. Boston, Mass.: Artech House, 1993.
[21] N. Ueda and R. Nakano, “Deterministic Annealing Variant of the EM Algorithm,” Advances in Neural Information Processing Systems, G. Tesauro et al., eds., pp. 545-552, 1995.
[22] X. Meng and D. Van Dyk, “The EM Algorithm— An Old Folk-Song Sung to a Fast New Tune,” J. Royal Statistical Soc. B, vol. 59, no. 3, pp. 511-567, 1997.
[23] C.P. Robert, G. Celeux, and J. Diebolt, “Bayesian Estimation of Hidden Markov Chains: A Stochastic Implementation,” Statistics and Probability Letters, vol. 16, pp. 77-83, 1993.
[24] L. Tierney, “Markov Chains for Exploring Posterior Distributions,” Ann. Statistics, vol. 22, no. 4, pp. 1701-1728, 1994.
[25] F. Le Gland and L. Mevel, “Basic Properties of the Projective Product, with Application to Products of Column-Allowable Nonnegative Matrices,” Math. Control Signals Systems, vol. 13, pp. 41-62, 2000.
[26] J. Durbin and S. Koopman, “A Simple and Efficient Simulation Smoother for State-Space Time Series Analysis,” Biometrika, vol. 89, no. 3, pp. 603-615, 2002.
[27] J. Diebolt and C. Robert, “Estimation of Finite Mixture Distributions through Bayesian Sampling,” J. Royal Statistical Soc. B, vol. 56, no. 2, pp. 363-375, 1994.
[28] M. Newton and A. Raftery, “Approximate Bayesian Inference with the Weighted Likelihood Bootstrap,” J. Royal Statistical Soc. B, vol. 56, no. 1, pp. 3-48, 1994.
[29] S. Elaydi, An Introduction to Difference Equations. Springer-Verlag, 1996.
[30] H. Tong, Nonlinear Time Series: A Dynamical System Approach. Oxford: Clarendon Press, 1990.
[31] C. Biernacki, G. Celeux, and G. Govaert, “Choosing Starting Values for the EM Algorithm for Getting the Highest Likelihood in Multivariate Gaussian Mixture Models,” Computational Statistics and Data Analysis, vol. 41, pp. 561-575, 2003.
[32] F. Le Gland and L. Mevel, “Exponential Forgetting and Geometric Ergodicity in Hidden Markov Models,” Math. Control Signals Systems, vol. 13, pp. 63-93, 2000.