2008 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops
Scale-invariant range features for time-of-flight camera applications
Anchorage, AK, USA
June 23-28, 2008
ISBN: 978-1-4244-2339-2
Martin Haker, Institute for Neuro- and Bioinformatics, University of Lübeck, Ratzeburger Allee 160, 23538 Lübeck, Germany
Martin Böhme, Institute for Neuro- and Bioinformatics, University of Lübeck, Ratzeburger Allee 160, 23538 Lübeck, Germany
Thomas Martinetz, Institute for Neuro- and Bioinformatics, University of Lübeck, Ratzeburger Allee 160, 23538 Lübeck, Germany
Erhardt Barth, Institute for Neuro- and Bioinformatics, University of Lübeck, Ratzeburger Allee 160, 23538 Lübeck, Germany
We describe a technique for computing scale-invariant features on range maps produced by a range sensor, such as a time-of-flight camera. Scale invariance is achieved by computing the features on the reconstructed three-dimensional surface of the object. The technique is general and can be applied to a wide range of operators. Features are computed in the frequency domain; the transform from the irregularly sampled mesh to the frequency domain is performed with the Nonequispaced Fast Fourier Transform (NFFT). We demonstrate the technique on a facial feature detection task. On a dataset containing faces at various distances from the camera, the equal error rate (EER) of the scale-invariant features is half that of features computed conventionally on the range map. When the scale-invariant range features are combined with intensity features, the error rate on the test set drops to zero.
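The following is a minimal sketch (not the authors' code) of the core idea described in the abstract: the range map is back-projected to a metric 3D surface using a pinhole camera model with hypothetical intrinsics (fx, fy, cx, cy), and Fourier coefficients of the surface are then evaluated on the irregular samples at fixed metric frequencies. A direct nonequispaced DFT stands in for the NFFT for brevity; because the resulting features depend on metric geometry rather than on the pixel grid, they remain comparable across object distances.

import numpy as np

def backproject(range_map, fx, fy, cx, cy):
    """Reconstruct metric 3D surface points from a radial range map (pinhole model)."""
    h, w = range_map.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    # Unit ray direction per pixel; fx, fy, cx, cy are hypothetical intrinsics.
    dirs = np.stack([(u - cx) / fx, (v - cy) / fy, np.ones_like(range_map)], axis=-1)
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)
    # Time-of-flight cameras measure radial distance, so scale the unit rays by it.
    return dirs * range_map[..., None]            # shape (h, w, 3)

def ndft_coefficients(xy, z, freqs):
    """Fourier coefficients of the surface height z, sampled at irregular metric
    positions xy, evaluated at the 2D frequencies in freqs.
    (The paper uses the NFFT for speed; this direct O(N*M) version is illustrative.)"""
    pts = xy.reshape(-1, 2)
    val = z.reshape(-1)
    phase = -2j * np.pi * (freqs @ pts.T)         # (M, N) phase matrix
    return np.exp(phase) @ val                    # (M,) complex coefficients

# Hypothetical 176x144 time-of-flight frame and a small set of 2D frequencies.
range_map = np.random.uniform(0.5, 2.0, (144, 176))
surface = backproject(range_map, fx=200.0, fy=200.0, cx=88.0, cy=72.0)
freqs = np.random.randn(16, 2)
features = np.abs(ndft_coefficients(surface[..., :2], surface[..., 2], freqs))

Any feature operator expressible in the frequency domain can then be applied to these coefficients; the choice of frequency set and operator here is purely illustrative.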
Citation:
Martin Haker, Martin Böhme, Thomas Martinetz, and Erhardt Barth, "Scale-invariant range features for time-of-flight camera applications," in Proc. 2008 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), pp. 1-6, 2008.