2018 13th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2018) (2018)
Xi'an, China
May 15, 2018 to May 19, 2018
ISBN: 978-1-5386-2335-0
pp: 453-457
Face recognition is a challenging task that involves determining the identity of facial images. With the availability of massive amounts of labeled facial images gathered from the Internet, deep convolutional neural networks (DCNNs) have achieved great success in face recognition. These images are collected in unconstrained environments and contain people of different ethnicities, ages, genders, and so on. However, in actual application scenarios, the target face database may be gathered under conditions that differ from those of the source training dataset, e.g., a different ethnicity, a different age distribution, or a disparate shooting environment. These factors increase the domain discrepancy between the source training database and the target application database, which degrades the learned model's performance on the target database. Meanwhile, when labeled data for the target database are scarce or unavailable, directly fine-tuning the pre-trained model on target data becomes intractable and impractical. In this paper, we adopt unsupervised transfer learning to address this issue. To alleviate the discrepancy between the source and target face databases while preserving the generalization ability of the model, we constrain the maximum mean discrepancy (MMD) between the source and target databases and simultaneously train the deep neural network on the massive amount of labeled facial images of the source database. We evaluate our method on two face recognition benchmarks and significantly improve performance without using target labels.
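The abstract's core idea is penalizing the maximum mean discrepancy (MMD) between source and target feature distributions during training. As a rough illustration of what that quantity measures, here is a minimal pure-Python sketch of the standard biased empirical MMD estimate with a Gaussian kernel on scalar features; the paper would compute this on batches of deep features and add it to the supervised loss, and the function names and 1-D inputs here are illustrative assumptions, not the authors' implementation.

```python
import math

def rbf(x, y, sigma=1.0):
    # Gaussian (RBF) kernel: k(x, y) = exp(-|x - y|^2 / (2 * sigma^2))
    return math.exp(-((x - y) ** 2) / (2 * sigma ** 2))

def mmd2(source, target, sigma=1.0):
    # Biased empirical estimate of squared MMD between two samples:
    # mean k(s, s') + mean k(t, t') - 2 * mean k(s, t)
    m, n = len(source), len(target)
    k_ss = sum(rbf(a, b, sigma) for a in source for b in source) / (m * m)
    k_tt = sum(rbf(a, b, sigma) for a in target for b in target) / (n * n)
    k_st = sum(rbf(a, b, sigma) for a in source for b in target) / (m * n)
    return k_ss + k_tt - 2 * k_st

# Identical samples give (near-)zero discrepancy; shifted samples do not.
same = mmd2([0.0, 0.5, 1.0], [0.0, 0.5, 1.0])
shifted = mmd2([0.0, 0.5, 1.0], [5.0, 5.5, 6.0])
```

Minimizing such a term alongside the source classification loss pulls the source and target feature distributions together, which is why no target labels are needed.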
face recognition, feedforward neural nets, unsupervised learning, visual databases

Z. Luo, J. Hu, W. Deng and H. Shen, "Deep Unsupervised Domain Adaptation for Face Recognition," 2018 13th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2018)(FG), Xi'an, China, 2018, pp. 453-457.