Learning in Gibbsian Fields: How Accurate and How Fast Can It Be?
July 2002 (vol. 24 no. 7)
pp. 1001-1006

Gibbsian fields, or Markov random fields, are widely used in Bayesian image analysis, but learning Gibbs models is computationally expensive. The computational burden is made more pronounced by the recent minimax entropy (FRAME) models, which use large neighborhoods and hundreds of parameters. In this paper, we present a common framework for learning Gibbs models. We identify two key factors that determine the accuracy and speed of learning Gibbs models: the efficiency of the likelihood functions and the variance in approximating partition functions by Monte Carlo integration. We propose three new learning algorithms. In particular, we are interested in a maximum satellite likelihood estimator, which makes use of a set of precomputed Gibbs models, called "satellites," to approximate likelihood functions. This algorithm can approximately estimate the minimax entropy model for textures in seconds on an HP workstation. The performance of the various learning algorithms is compared in our experiments.
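The satellite idea rests on importance sampling: samples drawn once from a nearby reference ("satellite") model can be reused to approximate the intractable partition function, and hence the likelihood, of the model being learned. The sketch below illustrates the underlying Monte Carlo identity on a toy single-parameter Ising-type field; the model, parameter values, and sample sizes are illustrative assumptions, not the paper's multi-parameter FRAME setup.

    import numpy as np

    rng = np.random.default_rng(0)

    def neighbor_sum(x, i, j):
        # Sum of the 4-connected neighbors, with periodic boundary conditions.
        n = x.shape[0]
        return (x[(i - 1) % n, j] + x[(i + 1) % n, j]
                + x[i, (j - 1) % n] + x[i, (j + 1) % n])

    def gibbs_sample(beta, n=12, sweeps=100):
        # Draw one sample from p(x) proportional to exp(beta * H(x))
        # by single-site Gibbs sampling.
        x = rng.choice(np.array([-1, 1]), size=(n, n))
        for _ in range(sweeps):
            for i in range(n):
                for j in range(n):
                    field = beta * neighbor_sum(x, i, j)
                    p_plus = 1.0 / (1.0 + np.exp(-2.0 * field))
                    x[i, j] = 1 if rng.random() < p_plus else -1
        return x

    def H(x):
        # Sufficient statistic: each nearest-neighbor product counted once.
        return np.sum(x * np.roll(x, 1, axis=0)) + np.sum(x * np.roll(x, 1, axis=1))

    # Precompute samples from a "satellite" model at a reference parameter beta0.
    beta0, beta = 0.2, 0.25
    satellites = [gibbs_sample(beta0) for _ in range(30)]

    # Monte Carlo (importance sampling) estimate of the log partition-function ratio:
    #   log Z(beta) - log Z(beta0) = log E_{x ~ p(.; beta0)}[exp((beta - beta0) H(x))],
    # approximated by the sample mean over the satellite samples.
    w = np.array([(beta - beta0) * H(x) for x in satellites])
    log_ratio = w.max() + np.log(np.mean(np.exp(w - w.max())))  # stable log-mean-exp
    print(f"estimated log Z({beta}) - log Z({beta0}) = {log_ratio:.3f}")

Note that the variance of this estimator grows as the target parameter moves away from the satellite's, since the importance weights degenerate; this is what motivates maintaining a set of satellites spread over the parameter space rather than a single reference model.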

Index Terms:
Markov random fields, minimax entropy learning, texture modeling, Markov chain Monte Carlo, maximum-likelihood estimate, importance sampling.
Citation:
Song Chun Zhu, Xiuwen Liu, "Learning in Gibbsian Fields: How Accurate and How Fast Can It Be?," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, no. 7, pp. 1001-1006, July 2002, doi:10.1109/TPAMI.2002.1017626