Hough Forest Random Field for Object Recognition and Segmentation
May 2013 (vol. 35 no. 5)
pp. 1066-1079
N. Payet, Sch. of Electr. Eng. & Comput. Sci., Oregon State Univ., Corvallis, OR, USA
S. Todorovic, Sch. of Electr. Eng. & Comput. Sci., Oregon State Univ., Corvallis, OR, USA
This paper presents a new computational framework for detecting and segmenting object occurrences in images. We combine a Hough forest (HF) and a conditional random field (CRF) into the Hough forest random field (HFRF), which assigns object-class labels to image regions. The HF captures intrinsic and contextual properties of objects; the CRF then fuses the labeling hypotheses generated by the HF to identify every object occurrence. The HF and CRF interact during HFRF inference, which uses the Metropolis-Hastings algorithm. The Metropolis-Hastings reversible jumps depend on two ratios of proposal and posterior distributions. Instead of estimating four distributions, we directly compute the two ratios using the HF. In its leaf nodes, the HF records class histograms of training examples and information about their configurations. This evidence is used during inference for nonparametric estimation of the two distribution ratios. Our empirical evaluation on benchmark datasets demonstrates higher average precision of object detection, smaller object segmentation error, and faster convergence of our inference, relative to the state of the art. The paper also presents theoretical error bounds of HF and HFRF applied to two-class object detection and segmentation.
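To make the acceptance test concrete, the following Python sketch shows a generic Metropolis-Hastings loop whose decision depends only on the two ratios named in the abstract. It is a minimal illustration, not the authors' implementation: the names propose, posterior_ratio, and proposal_ratio are hypothetical placeholders for quantities that, in HFRF, would be estimated nonparametrically from Hough forest leaf statistics rather than from four separate distributions.

    import random

    def metropolis_hastings(init_state, propose, posterior_ratio,
                            proposal_ratio, n_iters=1000):
        # x holds the current labeling hypothesis (e.g., region labels).
        x = init_state
        for _ in range(n_iters):
            x_new = propose(x)  # reversible jump to a new hypothesis
            # Acceptance uses only two ratios, as in the abstract:
            #   posterior_ratio(x_new, x) ~ p(x_new) / p(x)
            #   proposal_ratio(x_new, x)  ~ q(x | x_new) / q(x_new | x)
            alpha = min(1.0, posterior_ratio(x_new, x) * proposal_ratio(x_new, x))
            if random.random() < alpha:
                x = x_new
        return x

    # Toy usage with a hypothetical two-class target p = [0.3, 0.7]:
    p = [0.3, 0.7]
    sample = metropolis_hastings(
        init_state=0,
        propose=lambda x: 1 - x,                       # flip the binary label
        posterior_ratio=lambda xn, xo: p[xn] / p[xo],  # target ratio
        proposal_ratio=lambda xn, xo: 1.0,             # symmetric proposal
    )

Because only ratios enter the acceptance probability, the normalizing constants of the posterior and proposal never need to be computed, which is what makes the HF-based nonparametric estimate of the two ratios sufficient for inference.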
Index Terms:
Training, Image segmentation, Object recognition, Proposals, Image edge detection, Metropolis-Hastings algorithm, Object recognition and segmentation, Conditional random field, Hough forest
Citation:
N. Payet, S. Todorovic, "Hough Forest Random Field for Object Recognition and Segmentation," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 35, no. 5, pp. 1066-1079, May 2013, doi:10.1109/TPAMI.2012.194