2018 24th International Conference on Pattern Recognition (ICPR) (2018)
Aug. 20, 2018 to Aug. 24, 2018
Nikolaos Passalis , Department of Informatics, Aristotle University of Thessaloniki, Thessaloniki, 54124, Greece
Anastasios Tefas , Department of Informatics, Aristotle University of Thessaloniki, Thessaloniki, 54124, Greece
Transferring the knowledge from a large and complex neural network to a smaller and faster one makes it possible to deploy lightweight yet accurate models. In this paper, we propose a novel method that can transfer knowledge between any two layers of two neural networks by matching the similarity between the extracted representations. The proposed method is model-agnostic and overcomes several limitations of existing knowledge transfer techniques: the layers involved can have different architectures, and no information about the complex model is required apart from the outputs of the layers employed for the knowledge transfer. Three image datasets are used to demonstrate the effectiveness of the proposed approach, including a large-scale dataset for learning a lightweight facial pose estimation model that can be deployed directly on devices with limited computational resources, such as embedded systems for drones.
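The core idea of similarity-based knowledge transfer can be illustrated with a minimal numpy sketch. This is a generic illustration under assumed details, not the authors' exact formulation: it compares pairwise cosine-similarity matrices computed over a batch of samples from a teacher layer and a student layer, which may have different dimensionalities, and penalizes their mismatch. The function names and the mean-squared-error objective are assumptions for illustration.

```python
import numpy as np

def similarity_matrix(feats):
    # Pairwise cosine similarity between all samples in the batch.
    # feats: (n_samples, n_dims); the result is (n_samples, n_samples),
    # so matrices from layers of different widths remain comparable.
    norms = np.linalg.norm(feats, axis=1, keepdims=True)
    normed = feats / np.clip(norms, 1e-12, None)
    return normed @ normed.T

def similarity_matching_loss(teacher_feats, student_feats):
    # Hypothetical transfer objective: mean squared difference between
    # the teacher's and student's similarity matrices. Only the layer
    # outputs are needed, not the teacher's internals.
    s_teacher = similarity_matrix(teacher_feats)
    s_student = similarity_matrix(student_feats)
    return np.mean((s_teacher - s_student) ** 2)

# Example with mismatched layer widths: 64-d teacher, 16-d student.
rng = np.random.default_rng(0)
teacher = rng.standard_normal((8, 64))
student = rng.standard_normal((8, 16))
loss = similarity_matching_loss(teacher, student)
```

In an actual training loop this loss would be minimized with respect to the student's parameters (e.g. via automatic differentiation in a deep learning framework), driving the student's representation geometry toward the teacher's.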
Receivers, Knowledge transfer, Training, Knowledge engineering, Neural networks, Computational modeling, Geometry
N. Passalis and A. Tefas, "Neural Network Knowledge Transfer using Unsupervised Similarity Matching," 2018 24th International Conference on Pattern Recognition (ICPR), Beijing, China, 2018, pp. 716-721.