Vol. 31, No. 9, September 2009
ISSN: 0162-8828
pp: 1567-1581
Tal Arbel , McGill University, Montreal
Matthew Toews , Harvard Medical School, Boston
ABSTRACT
This paper presents a novel framework for detecting, localizing, and classifying faces in terms of visual traits, e.g., sex or age, from arbitrary viewpoints and in the presence of occlusion. All three tasks are embedded in a general viewpoint-invariant model of object class appearance derived from local scale-invariant features, where features are probabilistically quantified in terms of their occurrence, appearance, geometry, and association with the visual traits of interest. An appearance model is first learned for the object class, after which a Bayesian classifier is trained to identify the model features indicative of visual traits. The framework can be applied in realistic scenarios involving viewpoint change and partial occlusion, unlike other techniques that assume single-viewpoint, upright, prealigned data cropped free of background clutter. Experimentation establishes the first result for sex classification from arbitrary viewpoints, an equal error rate of 16.3 percent, based on the color FERET database. The method is also shown to work robustly on faces in cluttered imagery from the CMU profile database. A comparison with the geometry-free bag-of-words model shows that the geometrical information provided by our framework improves classification. A comparison with support vector machines demonstrates that Bayesian classification results in superior performance.
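The abstract describes learning per-feature associations with a visual trait and then applying a Bayesian classifier over which model features occur in an image. A minimal illustrative sketch of that idea, as a naive Bayes decision over binary feature occurrences, is shown below; the occurrence probabilities and class priors are hypothetical toy values, not the paper's learned model.

```python
import math

# Hypothetical per-feature occurrence probabilities for each trait class,
# illustrating the feature-trait association idea (toy values only).
p_occur = {
    "male":   [0.80, 0.10, 0.55],
    "female": [0.20, 0.70, 0.50],
}
prior = {"male": 0.5, "female": 0.5}

def log_posterior(trait, observed):
    """Log class prior plus log-likelihood of each model feature being
    observed (1) or absent (0), assuming conditional independence."""
    lp = math.log(prior[trait])
    for p, obs in zip(p_occur[trait], observed):
        lp += math.log(p if obs else 1.0 - p)
    return lp

def classify(observed):
    """Return the trait class with the highest posterior."""
    return max(prior, key=lambda t: log_posterior(t, observed))

print(classify([1, 0, 1]))  # pattern more consistent with the "male" toy model
```

The full framework additionally weights each feature by its appearance and geometry likelihoods under the viewpoint-invariant object model; this sketch keeps only the occurrence term.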
INDEX TERMS
Scale-invariant feature, viewpoint invariance, probabilistic modeling, visual trait, sex classification, faces, occlusion.
CITATION
Tal Arbel and Matthew Toews, "Detection, Localization, and Sex Classification of Faces from Arbitrary Viewpoints and under Occlusion," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 31, no. 9, pp. 1567-1581, September 2009, doi:10.1109/TPAMI.2008.233