2012 IEEE Conference on Computer Vision and Pattern Recognition (2012)
Providence, RI USA
June 16, 2012 to June 21, 2012
ISSN: 1063-6919
ISBN: 978-1-4673-1226-4
pp: 3162-3169
V. Ferrari , Univ. of Edinburgh, Edinburgh, UK
J. M. Buhmann , ETH Zurich, Zurich, Switzerland
A. Vezhnevets , ETH Zurich, Zurich, Switzerland
ABSTRACT
We address the problem of semantic segmentation: classifying each pixel in an image according to the semantic class it belongs to (e.g. dog, road, car). Most existing methods train from fully supervised images, where each pixel is annotated with a class label. To reduce the annotation effort, a few weakly supervised approaches have recently emerged. These require only image labels indicating which classes are present. Although their performance reaches a satisfactory level, there is still a substantial gap between the accuracy of fully and weakly supervised methods. We address this gap with a novel active learning method specifically suited for this setting. We model the problem as a pairwise CRF and cast active learning as finding its most informative nodes. These are the nodes that induce the largest expected change in the overall CRF state after their true label is revealed. Our criterion is equivalent to maximizing an upper bound on accuracy gain. Experiments on two datasets show that our method achieves 97% of the accuracy of the corresponding fully supervised model, while querying less than 17% of the (super-)pixel labels.
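To make the expected-change criterion concrete, here is a small Python sketch, not the authors' implementation: it scores each node by the expected number of label flips across the whole labeling after revealing that node's true label, averaged over the model's current belief. A crude one-pass neighbour propagation on a chain of nodes stands in for real pairwise CRF inference; all function names, the 0.6 confidence threshold, and the toy unaries are illustrative assumptions.

```python
import numpy as np

def infer_labels(unary, fixed=None):
    """Stand-in for CRF MAP inference on a chain of nodes (a deliberate
    simplification): each node takes its most probable label, then one
    propagation pass lets a fixed (queried) label pull weakly-believed
    neighbours along."""
    labels = unary.argmax(axis=1)
    if fixed:
        for i, y in fixed.items():
            labels[i] = y
        for i, y in fixed.items():
            for j in (i - 1, i + 1):
                if 0 <= j < len(labels) and j not in fixed:
                    # adopt the revealed label where the node's own belief is weak
                    if unary[j].max() < 0.6:
                        labels[j] = y
    return labels

def expected_change(unary, node):
    """Expected number of nodes whose label flips after revealing `node`,
    weighted by the model's current belief p(y | node)."""
    base = infer_labels(unary)
    score = 0.0
    for y, p in enumerate(unary[node]):
        new = infer_labels(unary, fixed={node: y})
        score += p * np.sum(new != base)
    return score

# Toy problem: 5 superpixels, 2 classes; node 2 is maximally uncertain
# and sits between two other uncertain nodes.
unary = np.array([[0.90, 0.10],
                  [0.55, 0.45],
                  [0.50, 0.50],
                  [0.55, 0.45],
                  [0.90, 0.10]])
scores = [expected_change(unary, i) for i in range(len(unary))]
best = int(np.argmax(scores))  # the active learner queries this node first
```

In this toy setting the middle node wins: revealing its label can flip its two uncertain neighbours as well, so it induces the largest expected change, whereas the confident border nodes induce almost none. This mirrors the paper's intuition that the most informative queries are those whose answers reshape the largest part of the CRF state.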
INDEX TERMS
learning (artificial intelligence), image segmentation, pairwise CRF, active learning, semantic segmentation, expected change, fully supervised images, class label, image label, weakly supervised method, semantics, labeling, training, roads, accuracy, computational modeling
CITATION

V. Ferrari, J. M. Buhmann and A. Vezhnevets, "Active learning for semantic segmentation with expected change," 2012 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Providence, RI, USA, 2012, pp. 3162-3169.
doi:10.1109/CVPR.2012.6248050