Statistical Hough Transform
August 2009 (vol. 31 no. 8)
pp. 1502-1509
Rozenn Dahyot, Trinity College Dublin, Dublin
The Standard Hough Transform is a popular method in image processing and is traditionally estimated using histograms. Densities modeled with histograms in high-dimensional spaces, or from few observations, can be very sparse and demanding in memory. In this paper, we first propose to extend the formulation to continuous kernel estimates. Second, when dependencies between variables are properly taken into account, the estimated density is also robust to noise and insensitive to the choice of the origin of the spatial coordinates. Finally, our new statistical framework is unsupervised (all needed parameters are automatically estimated) and flexible (priors can easily be attached to the observations). We show experimentally that our new modeling better encodes the alignment content of images.
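The kernel-based idea in the abstract can be illustrated with a minimal sketch: each edge point $(x, y)$ traces the sinusoid $\rho = x\cos\theta + y\sin\theta$ in $(\theta, \rho)$ space, and instead of incrementing discrete histogram bins (the standard HT), each point spreads a Gaussian kernel of bandwidth `h` along its sinusoid. This is an illustrative approximation only, not the paper's exact estimator (which also models dependencies between $\theta$ and $\rho$ and estimates the bandwidth automatically); the function name and parameters are hypothetical.

```python
import numpy as np

def kernel_hough_lines(points, n_theta=180, n_rho=200, h=2.0):
    """Kernel-smoothed Hough transform for line detection (sketch).

    Each edge point (x, y) contributes a Gaussian kernel of bandwidth h,
    centred on its sinusoid rho(theta) = x cos(theta) + y sin(theta),
    yielding a continuous density over (theta, rho) instead of a histogram.
    Returns the normalized accumulator and the theta/rho grids.
    """
    pts = np.asarray(points, dtype=float)
    # Rho range covering all points, padded by a few bandwidths.
    rho_max = np.hypot(pts[:, 0], pts[:, 1]).max() + 3.0 * h
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    rhos = np.linspace(-rho_max, rho_max, n_rho)
    # Rho value of each point's sinusoid at each theta: shape (n_pts, n_theta).
    rho_of = pts[:, 0][:, None] * np.cos(thetas) + pts[:, 1][:, None] * np.sin(thetas)
    # Gaussian kernel in rho, summed over all points: shape (n_theta, n_rho).
    diff = rhos[None, None, :] - rho_of[:, :, None]
    acc = np.exp(-0.5 * (diff / h) ** 2).sum(axis=0)
    return acc / acc.sum(), thetas, rhos

# Example: collinear points on y = x peak at theta = 3*pi/4, rho = 0.
acc, thetas, rhos = kernel_hough_lines([(t, t) for t in range(1, 20)])
i, j = np.unravel_index(acc.argmax(), acc.shape)
```

A line of this sketch's argmax gives the dominant line's normal angle and distance to the origin; the paper's contribution lies in making such an estimate robust, origin-insensitive, and free of hand-tuned bin sizes or bandwidths.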

[1] P. Hough, “Method and Means for Recognizing Complex Patterns,” US Patent 3 069 654, 1962.
[2] R.O. Duda and P.E. Hart, “Use of the Hough Transformation to Detect Lines and Curves in Pictures,” Comm. ACM, vol. 15, pp. 11-15, Jan. 1972.
[3] J.-Y. Goulermas and P. Liatsis, “Incorporating Gradient Estimations in a Circle-Finding Probabilistic Hough Transform,” Pattern Analysis and Applications, vol. 2, pp. 239-250, 1999.
[4] A. Goldenshluger and A. Zeevi, “The Hough Transform Estimator,” The Annals of Statistics, vol. 32, no. 5, pp. 1908-1932, Oct. 2004.
[5] A.S. Aguado, E. Montiel, and M.S. Nixon, “Bias Error Analysis of the Generalized Hough Transform,” J. Math. Imaging and Vision, vol. 12, pp. 25-42, 2000.
[6] M. Bober and J. Kittler, “Estimation of Complex Multimodal Motion: An Approach Based on Robust Statistics and Hough Transform,” Image and Vision Computing J., vol. 12, no. 10, pp. 661-668, Dec. 1994.
[7] P. Ballester, “Hough Transform and Astronomical Data Analysis,” Vistas in Astronomy, vol. 40, no. 4, pp. 479-485, 1996.
[8] G.R.J. Cooper and D.R. Cowan, “The Detection of Circular Features in Irregularly Spaced Data,” Computers & Geosciences, vol. 30, no. 1, pp. 101-105, Feb. 2004.
[9] C. Schmid, R. Mohr, and C. Bauckhage, “Evaluation of Interest Point Detectors,” Int'l J. Computer Vision, vol. 37, no. 2, pp. 151-172, 2000.
[10] B. Schiele and J.L. Crowley, “Recognition without Correspondence Using Multidimensional Receptive Field Histograms,” Int'l J. Computer Vision, vol. 36, no. 1, pp. 31-50, Jan. 2000.
[11] B.W. Silverman, Density Estimation for Statistics and Data Analysis. Chapman and Hall, 1986.
[12] D. Comaniciu and P. Meer, “Mean Shift: A Robust Approach Toward Feature Space Analysis,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 24, no. 5, pp. 603-619, May 2002.
[13] R. Dahyot, P. Charbonnier, and F. Heitz, “Unsupervised Statistical Change Detection in Camera-in-Motion Video,” Proc. IEEE Int'l Conf. Image Processing, Oct. 2001.
[14] Q. Ji and R.M. Haralick, “Error Propagation for the Hough Transform,” Pattern Recognition Letters, vol. 22, pp. 813-823, 2001.
[15] A. Bonci, T. Leo, and S. Longhi, “A Bayesian Approach to the Hough Transform for Line Detection,” IEEE Trans. Systems, Man, and Cybernetics, vol. 35, no. 6, pp. 945-955, Nov. 2005.
[16] G. Lai and R.D. Figueiredo, “A Novel Algorithm for Edge Detection from Direction-Derived Statistics,” Proc. IEEE Int'l Symp. Circuits and Systems, vol. 5, pp. 37-40, May 2000.
[17] R. Dahyot, N. Rea, A. Kokaram, and N. Kingsbury, “Inlier Modeling for Multimedia Data Analysis,” Proc. IEEE Int'l Workshop Multimedia Signal Processing, pp. 482-485, Sept. 2004.
[18] R. Dahyot and S. Wilson, “Robust Scale Estimation for the Generalized Gaussian Probability Density Function,” Advances in Methodology and Statistics (Metodološki zvezki), vol. 3, no. 1, pp. 21-37, 2006.
[19] P. Meer and B. Georgescu, “Edge Detection with Embedded Confidence,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 23, no. 12, pp. 1351-1365, Dec. 2001.
[20] N. Aggarwal and W.C. Karl, “Line Detection in Image through Regularized Hough Transform,” IEEE Trans. Image Processing, vol. 15, no. 3, pp. 582-591, Mar. 2006.
[21] M.A. Fischler and R.C. Bolles, “Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography,” Comm. ACM, vol. 24, no. 6, pp. 381-395, 1981.
[22] D. Walsh and A.E. Raftery, “Accurate and Efficient Curve Detection in Images: The Importance Sampling Hough Transform,” Pattern Recognition, vol. 35, pp. 1421-1431, 2002.
[23] A. Bandera, J.M. Pérez-Lorenzo, J.P. Bandera, and F. Sandoval, “Mean Shift Based Clustering of Hough Domain for Fast Line Segment Detection,” Pattern Recognition Letters, vol. 27, pp. 578-586, 2006.
[24] R.S. Stephens, “Probabilistic Approach to the Hough Transform,” Image and Vision Computing J., vol. 9, no. 1, pp. 66-71, Feb. 1991.
[25] P. Huber, Robust Statistics. John Wiley and Sons, 1981.
[26] R.M. Steele and C. Jaynes, “Feature Uncertainty Arising from Covariant Image Noise,” Proc. IEEE Conf. Computer Vision and Pattern Recognition, pp. 1063-1069, 2005.
[27] J. Princen, J. Illingworth, and J. Kittler, “A Formal Definition of the Hough Transform: Properties and Relationships,” J. Math. Imaging and Vision, vol. 1, no. 2, pp. 153-168, 1992.
[28] S.J. Sheather, “Density Estimation,” Statistical Science, vol. 19, no. 4, pp. 588-597, 2004.
[29] W.T. Freeman, “Steerable Filters and Local Analysis of Image Structure,” PhD dissertation, Massachusetts Inst. of Technology, 1992.
[30] R. Dahyot, “Bayesian Classification for the Statistical Hough Transform,” Proc. IEEE Int'l Conf. Pattern Recognition, Dec. 2008.

Index Terms:
Hough transform, Radon transform, kernel probability density function, uncertainty, line detection.
Citation:
Rozenn Dahyot, "Statistical Hough Transform," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 31, no. 8, pp. 1502-1509, Aug. 2009, doi:10.1109/TPAMI.2008.288