2008 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops
Open boundary capable edge grouping with feature maps
Anchorage, AK, USA, June 23-28, 2008
ISBN: 978-1-4244-2339-2
Joachim S. Stahl, Department of Computer Science and Engineering, University of South Carolina, Columbia, SC 29208, USA
Kenton Oliver, Department of Computer Science and Engineering, University of South Carolina, Columbia, SC 29208, USA
Song Wang, Department of Computer Science and Engineering, University of South Carolina, Columbia, SC 29208, USA
Abstract:
Edge grouping methods aim to detect the complete boundaries of salient structures in noisy images. In this paper, we develop a new edge grouping method with several useful properties. First, it combines boundary and region information by defining a unified grouping cost; the region information of the desired structures is supplied as a binary feature map of the same size as the input image. Second, it finds the globally optimal solution of this grouping cost, for which we extend a prior graph-based edge grouping algorithm. Third, it can detect both closed boundaries, where the structure of interest lies completely within the image perimeter, and open boundaries, where the structure of interest is cropped by the image perimeter. Given this capability to detect both open and closed boundaries, the proposed method can be extended to segment an image into disjoint regions hierarchically. Experimental results on real images are reported, with a comparison against a prior edge grouping method that can only detect closed boundaries.
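The abstract does not spell out the unified grouping cost, so the sketch below only illustrates the general idea: a ratio-style cost that trades a boundary term (the total length of edge gaps that must be bridged) against a region term (how well the enclosed region agrees with the binary feature map). The function name `grouping_cost` and its parameters are hypothetical, not the authors' formulation or code.

```python
import numpy as np

def grouping_cost(region_mask, gap_length, feature_map):
    """Illustrative ratio-style grouping cost (assumed form, not the paper's).

    region_mask : bool array, True inside the candidate structure's region
    gap_length  : total length of the gap (non-edge) segments along the
                  candidate boundary
    feature_map : bool array of the same size as the image, True where the
                  region cue for the desired structure fires
    """
    if region_mask.shape != feature_map.shape:
        raise ValueError("feature map must match the image size")
    # Region term: area of the candidate region supported by the feature map.
    support = np.logical_and(region_mask, feature_map).sum()
    if support == 0:
        return np.inf  # degenerate candidate with no region support
    # Boundary term (gap length) divided by region support: lower cost means
    # fewer gaps to fill relative to the area the feature map endorses.
    return gap_length / support

# Toy usage on a 6x6 grid: a 3x3 candidate region fully backed by the map.
mask = np.zeros((6, 6), dtype=bool); mask[1:4, 1:4] = True
fmap = np.zeros((6, 6), dtype=bool); fmap[1:4, 1:4] = True
print(grouping_cost(mask, gap_length=2.0, feature_map=fmap))  # 2/9 ≈ 0.222
```

A ratio form like this is one common way to make such a cost scale-invariant and amenable to globally optimal graph search; the paper's actual cost and optimization may differ.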
Citation:
Joachim S. Stahl, Kenton Oliver, and Song Wang, "Open boundary capable edge grouping with feature maps," in Proc. 2008 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), pp. 1-8, 2008.