We develop a novel method for class-based feature matching across large changes in viewing conditions. The method is based on the property that when objects share a similar part, the similarity is preserved across viewing conditions. Given a feature and a training set of object images, we first identify the subset of objects that share this feature. The transformation of the feature's appearance across viewing conditions is determined mainly by properties of the feature, rather than of the object in which it is embedded. Therefore, the transformed feature will be shared by approximately the same set of objects. Based on this consistency requirement, corresponding features can be reliably identified from a set of candidate matches. Unlike previous approaches, the proposed scheme compares feature appearances only in similar viewing conditions, rather than across different viewing conditions. As a result, the scheme is not restricted to locally planar objects or affine transformations. The approach also does not require examples of correct matches. We show that by using the proposed method, a dense set of accurate correspondences can be obtained. Experimental comparisons demonstrate that matching accuracy is significantly improved over previous schemes. Finally, we show that the scheme can be successfully used for invariant object recognition.
Feature matching, invariant recognition, parts.
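The consistency requirement described in the abstract can be illustrated with a small sketch. This is not the authors' implementation; it only assumes, for illustration, that each feature is represented by the set of training objects that share it, and that candidate matches in the new viewing condition are scored by set overlap (here, the Jaccard index). The feature names and object identifiers are hypothetical.

```python
def jaccard(a, b):
    """Jaccard overlap between two sets of object identifiers."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def best_match(feature_objects, candidate_objects):
    """Select the candidate feature whose sharing set is most consistent
    with the query feature's sharing set.

    feature_objects: set of objects that share the feature in view A.
    candidate_objects: dict mapping candidate feature id -> set of objects
        that share that candidate in view B.
    Returns (candidate_id, overlap_score).
    """
    return max(
        ((cid, jaccard(feature_objects, objs))
         for cid, objs in candidate_objects.items()),
        key=lambda t: t[1],
    )

# Toy example: the feature is shared by cars 1-3 in the frontal view.
# The correct match should be shared by roughly the same objects in
# the transformed view; an incorrect candidate is shared by others.
frontal = {"car1", "car2", "car3"}
candidates = {
    "f_b1": {"car1", "car2", "car3"},   # consistent candidate
    "f_b2": {"car4", "car5"},           # inconsistent candidate
}
match, score = best_match(frontal, candidates)
print(match, score)  # f_b1 1.0
```

Because only the sharing sets are compared, feature appearances are never compared across viewing conditions, which is the property the abstract highlights as removing the locally-planar and affine-transformation restrictions.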
Evgeniy Bart, Shimon Ullman, "Class-Based Feature Matching Across Unrestricted Transformations", IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 30, pp. 1618-1631, September 2008, doi:10.1109/TPAMI.2007.70818