ARTICLE
ElShafey_TPAMI_2013/IDIAP
A Scalable Formulation of Probabilistic Linear Discriminant Analysis: Applied to Face Recognition
El Shafey, Laurent
McCool, Chris
Wallace, Roy
Marcel, Sébastien
Expectation maximization
face verification
PLDA
Probabilistic model
EXTERNAL
https://publications.idiap.ch/attachments/papers/2013/ElShafey_TPAMI_2013.pdf
PUBLIC
https://publications.idiap.ch/index.php/publications/showcite/ElShafey_Idiap-RR-07-2013
Related documents
IEEE Transactions on Pattern Analysis and Machine Intelligence
35
7
1788-1794
2013
https://pypi.python.org/pypi/xbob.paper.tpami2013
URL
10.1109/TPAMI.2013.38
doi
In this paper we present a scalable and exact solution for probabilistic linear discriminant analysis (PLDA). PLDA is a probabilistic model that has been shown to provide state-of-the-art performance for both face and speaker recognition. However, it has one major drawback: at training time, estimating the latent variables requires the inversion and storage of a matrix whose size grows quadratically with the number of samples per identity (class). To date, two approaches have been taken to deal with this problem: i) use an exact solution, which calculates this large matrix and is therefore not scalable with the number of samples, or ii) derive a variational approximation to the problem. We present a scalable derivation which is theoretically equivalent to the previous non-scalable solution and thus obviates the need for a variational approximation. Experimentally, we demonstrate the efficacy of our approach in two ways. First, on the Labelled Faces in the Wild database, we illustrate the equivalence of our scalable implementation with previously published work. Second, on the large Multi-PIE database, we illustrate the gain in performance obtained by using more training samples per identity (class), which is made possible by the proposed scalable formulation of PLDA.
REPORT
ElShafey_Idiap-RR-07-2013/IDIAP
A Scalable Formulation of Probabilistic Linear Discriminant Analysis: Applied to Face Recognition
El Shafey, Laurent
McCool, Chris
Wallace, Roy
Marcel, Sébastien
EXTERNAL
https://publications.idiap.ch/attachments/reports/2013/ElShafey_Idiap-RR-07-2013.pdf
PUBLIC
Idiap-RR-07-2013
2013
Idiap
March 2013
Accepted for publication
In this paper we present a scalable and exact solution for probabilistic linear discriminant analysis (PLDA). PLDA is a probabilistic model that has been shown to provide state-of-the-art performance for both face and speaker recognition. However, it has one major drawback: at training time, estimating the latent variables requires the inversion and storage of a matrix whose size grows quadratically with the number of samples per identity (class). To date, two approaches have been taken to deal with this problem: i) use an exact solution, which calculates this large matrix and is therefore not scalable with the number of samples, or ii) derive a variational approximation to the problem.
We present a scalable derivation which is theoretically equivalent to the previous non-scalable solution and thus obviates the need for a variational approximation. Experimentally, we demonstrate the efficacy of our approach in two ways. First, on the Labelled Faces in the Wild database, we illustrate the equivalence of our scalable implementation with previously published work. Second, on the large Multi-PIE database, we illustrate the gain in performance obtained by using more training samples per identity (class), which is made possible by the proposed scalable formulation of PLDA.
https://pypi.python.org/pypi/xbob.paper.tpami2013
URL