Issue No. 05 - May 2013 (vol. 35)
ISSN: 0162-8828
pp: 1193-1205
Shengcai Liao , Center for Biometrics & Security Res., Inst. of Autom., Beijing, China
A. K. Jain , Dept. of Comput. Sci. & Eng., Michigan State Univ., East Lansing, MI, USA
S. Z. Li , Center for Biometrics & Security Res., Inst. of Autom., Beijing, China
Numerous methods have been developed for holistic face recognition with impressive performance. However, few studies have tackled how to recognize an arbitrary patch of a face image. Partial faces frequently appear in unconstrained scenarios, particularly in images captured by surveillance cameras or handheld devices (e.g., mobile phones). In this paper, we propose a general partial face recognition approach that does not require face alignment by eye coordinates or any other fiducial points. We develop an alignment-free face representation method based on Multi-Keypoint Descriptors (MKDs), where the descriptor size of a face is determined by the actual content of the image. In this way, any probe face image, holistic or partial, can be sparsely represented by a large dictionary of gallery descriptors. A new keypoint descriptor called the Gabor Ternary Pattern (GTP) is also developed for robust and discriminative face recognition. Experimental results are reported on four public-domain face databases (FRGCv2.0, AR, LFW, and PubFig) under both open-set identification and verification scenarios. Comparisons with two leading commercial face recognition SDKs (PittPatt and FaceVACS) and two baseline algorithms (PCA+LDA and LBP) show that the proposed method is, overall, superior in recognizing both holistic and partial faces without requiring alignment.
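The sparse-representation idea in the abstract can be sketched as follows. This is an illustrative sketch only, not the paper's method: it assumes generic keypoint descriptors stacked as unit-norm columns of a gallery dictionary, uses a simple greedy orthogonal matching pursuit solver in place of whatever l1 solver the authors employ, and plain identity voting instead of their matching scheme; the GTP descriptor itself is not reproduced here.

```python
import numpy as np

def omp(D, y, k):
    """Greedy orthogonal matching pursuit: approximate y ~ D @ x with at most k nonzeros.
    D: (d, n) dictionary with unit-norm columns; y: (d,) probe descriptor."""
    residual = y.copy()
    support = []
    x = np.zeros(D.shape[1])
    for _ in range(k):
        # Pick the gallery descriptor most correlated with the current residual.
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # Re-fit coefficients over the selected columns and update the residual.
        coeffs, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coeffs
    x[support] = coeffs
    return x

def identify(probe_descs, gallery_D, gallery_ids, k=5):
    """Sparsely code each probe keypoint descriptor over the gallery dictionary,
    then accumulate |coefficient| votes per gallery identity."""
    votes = {}
    for y in probe_descs:
        x = omp(gallery_D, y, k)
        for j in np.flatnonzero(x):
            votes[gallery_ids[j]] = votes.get(gallery_ids[j], 0.0) + abs(x[j])
    return max(votes, key=votes.get)
```

Because the dictionary holds per-keypoint descriptors rather than whole-face features, a probe contributes however many descriptors its visible region yields, which is what makes the representation usable for partial faces without alignment.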
Index Terms: Face, Face recognition, Detectors, Image edge detection, Robustness, Lighting, Histograms

Shengcai Liao, A. K. Jain and S. Z. Li, "Partial Face Recognition: Alignment-Free Approach," in IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 35, no. 5, pp. 1193-1205, 2013.