Issue No. 09 - September (2006 vol. 28)
ISSN: 0162-8828
pp: 1465-1479
V. Lepetit , Comput. Vision Lab., Ecole Polytech. Fed. de Lausanne
P. Fua , Comput. Vision Lab., Ecole Polytech. Fed. de Lausanne
In many 3D object-detection and pose-estimation problems, runtime performance is of critical importance. However, there is usually time to train the system, which, as we show, can be put to good use. Assuming that several registered images of the target object are available, we develop a keypoint-based approach that is effective in this context by formulating wide-baseline matching of keypoints extracted from the input images to those found in the model images as a classification problem. This shifts much of the computational burden to a training phase, without sacrificing recognition performance. The resulting algorithm is robust, accurate, and fast enough for frame-rate performance. This reduction in runtime computational complexity is our first contribution. Our second contribution is to show that, in this context, a simple and fast keypoint detector suffices to support detection and tracking even under large perspective and scale variations. Whereas earlier methods require a detector that produces highly repeatable results in general, which is usually very time-consuming, we simply find the most repeatable keypoints on the specific target object during the training phase. We have incorporated these ideas into a real-time system that detects planar, nonplanar, and deformable objects. It then estimates the pose of the rigid ones and the deformations of the others.
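The core idea of the abstract, treating keypoint matching as a classification problem where each keypoint on the target object is a class and the classifier is trained offline on registered views, can be illustrated with a simplified sketch. The code below is not the paper's implementation: it uses fern-like joint binary intensity comparisons as an illustrative stand-in for the randomized trees, synthetic patches instead of warped real views, and hypothetical names (`train`, `classify`, `PATCH`, `DEPTH`) chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
PATCH = 16    # side length of a keypoint patch
DEPTH = 8     # number of binary tests per tree
N_TREES = 20  # trees combined naive-Bayes style

def make_tests(n):
    # each test compares patch intensities at two random pixel locations
    return rng.integers(0, PATCH, size=(n, 4))

def leaf_index(patch, tests):
    # concatenate the binary test outcomes into a single leaf index
    idx = 0
    for r1, c1, r2, c2 in tests:
        idx = (idx << 1) | int(patch[r1, c1] > patch[r2, c2])
    return idx

def train(class_views):
    # class_views: list over keypoint classes, each a list of example patches
    # (in the paper's setting these would be synthesized warped views)
    n_classes = len(class_views)
    trees = []
    for _ in range(N_TREES):
        tests = make_tests(DEPTH)
        counts = np.ones((2 ** DEPTH, n_classes))  # +1 smoothing prior
        for c, views in enumerate(class_views):
            for patch in views:
                counts[leaf_index(patch, tests), c] += 1
        posteriors = counts / counts.sum(axis=1, keepdims=True)
        trees.append((tests, np.log(posteriors)))
    return trees

def classify(patch, trees):
    # sum log-posteriors stored at the reached leaves; argmax is the
    # predicted keypoint class -- all the runtime cost is a few comparisons
    n_classes = trees[0][1].shape[1]
    score = np.zeros(n_classes)
    for tests, log_post in trees:
        score += log_post[leaf_index(patch, tests)]
    return int(np.argmax(score))
```

The point of the design is visible in `classify`: matching a patch costs only `N_TREES * DEPTH` pixel comparisons and table lookups, while all the expensive work (accumulating class statistics at the leaves) happens in `train`, offline.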
Robustness, Phase estimation, Classification tree analysis, Computer vision, Pattern recognition, Deformable models, Processor scheduling, Phase detection, Statistical learning, Runtime

V. Lepetit and P. Fua, "Keypoint recognition using randomized trees," in IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 28, no. 9, pp. 1465-1479, 2006.