Louise Gertz, MPBME, and Albin Lindmark, MPSYS

Knowledge distillation for face recognition on lightweight neural networks

Password: 214931

Examiner: Marija Furdek, Dept of Electrical Engineering
Supervisors: Kenneth Jonsson, Smart Eye, Ahmet Oguz Kislal, Dept of Electrical Engineering
Opponent: Jacob Larsson

Abstract


Face recognition is a common biometric used in everyday commercial products and is also widely used in safety and surveillance. Accuracy is critical when face recognition is used for authentication. Accurate face recognition using CNN models is limited to deployment in high-end systems due to its computational complexity; viable implementation on mobile devices demands less computationally expensive methods, such as smaller models. This thesis investigates the potential of knowledge distillation (KD), a machine learning technique that improves a small model's performance by transferring knowledge from a large model to the smaller one. KD was implemented on CNNs trained for face identification and verification using low-resolution near-infrared images. Both identification and verification models trained with KD achieved higher accuracy than reference models trained with standard procedures. Methods using a staged training procedure, or hints that compare intermediate features of the models, were shown to further improve KD and are useful when there is a large discrepancy between model sizes. Training with KD was shown to increase learning, making it possible to increase the accuracy of small face recognition networks.
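The knowledge transfer described in the abstract is commonly realized as a combined loss: hard-label cross-entropy plus a divergence between temperature-softened teacher and student outputs. The sketch below is a minimal, dependency-free illustration of that idea (Hinton-style distillation); it is not the thesis's implementation, and the temperature and weighting values are illustrative assumptions.

```python
import math

def softmax(logits, temperature=1.0):
    # Softened softmax: a higher temperature yields a smoother distribution.
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label,
                      temperature=4.0, alpha=0.5):
    """Weighted sum of hard-label cross-entropy and the KL divergence
    between softened teacher and student distributions (illustrative)."""
    # Hard term: ordinary cross-entropy against the ground-truth label.
    p_student = softmax(student_logits)
    hard_loss = -math.log(p_student[true_label])
    # Soft term: KL(teacher || student) at the distillation temperature.
    ps_t = softmax(student_logits, temperature)
    pt_t = softmax(teacher_logits, temperature)
    soft_loss = sum(pt * math.log(pt / ps) for pt, ps in zip(pt_t, ps_t))
    # The soft term's gradients scale as 1/T^2, so it is rescaled by T^2.
    return alpha * hard_loss + (1 - alpha) * (temperature ** 2) * soft_loss
```

When the student already matches the teacher, the soft term vanishes and only the hard-label loss remains; a mismatched teacher increases the total loss, which is what drives the transfer.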
Category: Student project presentation
Location: Web seminar
Starts: 22 January, 2021, 13:15
Ends: 22 January, 2021, 14:15

Published: Wed 13 Jan 2021.