Publications

QUD: Unsupervised Knowledge Distillation for Deep Face Recognition

Author: Kolf, Jan Niklas; Damer, Naser; Boutros, Fadi
Date: 2024
Type: Conference Paper
Abstract: We present in this paper an unsupervised knowledge distillation (KD) approach, namely QUD, for face recognition. The proposed QUD approach utilizes a queue of features within a contrastive learning setup to guide the student model to learn a feature representation similar to its counterpart obtained from the teacher and dissimilar from the ones stored in the queue. In each training iteration, the queue is updated by pushing a batch of feature representations obtained from the teacher into the queue and dequeuing the oldest ones. We additionally incorporate a temperature into the contrastive loss to control how sensitive contrastive learning is to the samples in the queue that are considered negative. The proposed unsupervised QUD approach does not require access to the dataset used to train the teacher model, nor does it require identity labels for the training data. The effectiveness of the proposed approach is demonstrated through several sensitivity studies on different teacher architectures and using different datasets for student training in the KD framework. Additionally, the results achieved by our unsupervised QUD on mainstream benchmarks are compared to the state of the art (SOTA), achieving very competitive performance and even outperforming SOTA on several benchmarks. Code and pre-trained models are available at https://github.com/jankolf/QUD.
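The queue-based contrastive objective described in the abstract resembles an InfoNCE-style loss with a FIFO memory of teacher features. A minimal NumPy sketch under that reading follows; the function names, the cosine-similarity logits, and the exact placement of the temperature are illustrative assumptions, not the paper's implementation (see the linked repository for the authors' code):

```python
import numpy as np

def l2_normalize(x, axis=-1):
    """Normalize feature vectors to unit length for cosine similarity."""
    return x / np.linalg.norm(x, axis=axis, keepdims=True)

def qud_contrastive_loss(student_feats, teacher_feats, queue, temperature=0.07):
    """Contrastive loss sketch: pull each student feature toward its teacher
    counterpart (positive) and away from queued teacher features (negatives).
    The temperature scales the logits, controlling sensitivity to negatives."""
    s = l2_normalize(student_feats)   # (B, D) student embeddings
    t = l2_normalize(teacher_feats)   # (B, D) matching teacher embeddings
    q = l2_normalize(queue)           # (K, D) queued teacher embeddings
    pos = np.sum(s * t, axis=1, keepdims=True) / temperature  # (B, 1)
    neg = (s @ q.T) / temperature                              # (B, K)
    logits = np.concatenate([pos, neg], axis=1)
    # Cross-entropy with the positive at index 0 of each row.
    log_prob = pos.squeeze(1) - np.log(np.exp(logits).sum(axis=1))
    return -log_prob.mean()

def update_queue(queue, teacher_feats):
    """FIFO update: enqueue the new teacher batch, dequeue the oldest entries."""
    batch = teacher_feats.shape[0]
    return np.concatenate([queue[batch:], teacher_feats], axis=0)
```

In this reading, only teacher outputs ever enter the queue, so no identity labels are needed: the pairing of a student feature with the teacher feature of the same image is the sole supervisory signal.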
Conference: British Machine Vision Conference 2024
Project: Next Generation Biometric Systems
URL: https://doi.org/10.24406/publica-4120