Idiap Research Institute
SynthDistill: Face Recognition with Knowledge Distillation from Synthetic Data
Type of publication: Conference paper
Citation: OtroshiShahreza_IJCB2023_2023
Publication status: Accepted
Booktitle: IEEE International Joint Conference on Biometrics (IJCB 2023)
Year: 2023
Abstract: State-of-the-art face recognition networks are often computationally expensive and cannot be used for mobile applications. Training lightweight face recognition models also requires large identity-labeled datasets. Meanwhile, there are privacy and ethical concerns with collecting and using large face recognition datasets. While generating synthetic datasets for training face recognition models is an alternative option, it is challenging to generate synthetic data with sufficient intra-class variations. In addition, there is still a considerable gap between the performance of models trained on real and synthetic data. In this paper, we propose a new framework (named SynthDistill) to train lightweight face recognition models by distilling the knowledge of a pretrained teacher face recognition model using synthetic data. We use a pretrained face generator network to generate synthetic face images and use the synthesized images to learn a lightweight student network. We use synthetic face images without identity labels, mitigating the problems in the intra-class variation generation of synthetic datasets. Instead, we propose a novel dynamic sampling strategy from the intermediate latent space of the face generator network to include new variations of the challenging images while further exploring new face images in the training batch. The results on five different face recognition datasets demonstrate the superiority of our lightweight model compared to models trained on previous synthetic datasets, achieving a verification accuracy of 99.52% on the LFW dataset with a lightweight network. The results also show that our proposed framework significantly reduces the gap between training with real and synthetic data. The source code for replicating the experiments will be publicly released.
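The abstract describes a training loop in which a frozen generator produces unlabeled synthetic faces, a frozen teacher embeds them, a lightweight student is trained to match the teacher's embeddings, and the hardest latents are perturbed and re-sampled into the next batch. The toy sketch below illustrates that loop with linear stand-ins for the three networks; all names, shapes, losses, and the specific re-sampling rule are illustrative assumptions, not the paper's actual architecture or hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for the real networks (shapes and weights are illustrative only):
# a generator G(w) -> image, a frozen teacher T(img) -> embedding,
# and a small student S(img; W) -> embedding that we train.
LATENT_DIM, IMG_DIM, EMB_DIM = 8, 16, 4

G_weights = rng.normal(size=(LATENT_DIM, IMG_DIM))   # frozen "generator"
T_weights = rng.normal(size=(IMG_DIM, EMB_DIM))      # frozen "teacher"

def generate(w):                     # synthetic "face images" from latents
    return np.tanh(w @ G_weights)

def teacher(x):
    return x @ T_weights

def student(x, W):
    return x @ W

def distill_step(W, w_batch, lr=0.05):
    """One step: match student embeddings to teacher embeddings (MSE)."""
    x = generate(w_batch)
    err = student(x, W) - teacher(x)                 # (batch, EMB_DIM)
    per_sample_loss = (err ** 2).mean(axis=1)
    grad = x.T @ err * (2.0 / err.size)              # gradient of mean MSE
    return W - lr * grad, per_sample_loss

# Dynamic sampling in the spirit of the paper (exact rule is hypothetical):
# perturb the hardest latents to create new variations of challenging images,
# and refill the rest of the batch with fresh random latents for exploration.
W = np.zeros((IMG_DIM, EMB_DIM))                     # untrained "student"
w = rng.normal(size=(32, LATENT_DIM))
for _ in range(200):
    W, losses = distill_step(W, w)
    hard = w[np.argsort(losses)[-16:]]               # hardest half of batch
    w = np.concatenate([hard + 0.1 * rng.normal(size=hard.shape),
                        rng.normal(size=(16, LATENT_DIM))])

x_eval = generate(w)
final_loss = float(((student(x_eval, W) - teacher(x_eval)) ** 2).mean())
```

Because both stand-in networks are linear, the student can closely match the teacher, so the distillation loss drops well below the untrained baseline; the point of the sketch is only the loop structure (generate, embed, match, re-sample hard latents), not the models themselves.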
Keywords:
Projects: Idiap
Authors: Otroshi Shahreza, Hatef
George, Anjith
Marcel, Sébastien
Attachments
  • OtroshiShahreza_IJCB2023_2023.pdf