Idiap Research Institute
A Scalable Formulation of Probabilistic Linear Discriminant Analysis: Applied to Face Recognition
Type of publication: Journal paper
Citation: ElShafey_TPAMI_2013
Publication status: Published
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 35
Number: 7
Year: 2013
Month: July
Pages: 1788-1794
Crossref: ElShafey_Idiap-RR-07-2013
URL: https://pypi.python.org/pypi/x...
DOI: 10.1109/TPAMI.2013.38
Abstract: In this paper, we present a scalable and exact solution for probabilistic linear discriminant analysis (PLDA). PLDA is a probabilistic model that has been shown to provide state-of-the-art performance for both face and speaker recognition. However, it has one major drawback: at training time, estimating the latent variables requires the inversion and storage of a matrix whose size grows quadratically with the number of samples per identity (class). To date, two approaches have been taken to deal with this problem: (i) use an exact solution, which calculates this large matrix and is therefore not scalable with the number of samples, or (ii) derive a variational approximation to the problem. We present a scalable derivation which is theoretically equivalent to the previous non-scalable solution and so obviates the need for a variational approximation. Experimentally, we demonstrate the efficacy of our approach in two ways. First, on Labelled Faces in the Wild, we illustrate the equivalence of our scalable implementation with previously published work. Second, on the large Multi-PIE database, we illustrate the gain in performance when using more training samples per identity (class), which is made possible by the proposed scalable formulation of PLDA.
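The scalability issue the abstract describes can be made concrete with a short sketch. In the standard PLDA model (x_ij = mu + F h_i + G w_ij + noise), the naive exact E-step forms the joint posterior precision over one identity's latent variables (h_i, w_i1, ..., w_in); its side length grows linearly with the number of samples n for that identity, so its storage and inversion cost grow quadratically (and worse) in n. The subspace dimensions below are hypothetical placeholders, not values from the paper:

```python
# Illustrative sketch only -- not the paper's implementation.
# d_F, d_G are hypothetical dimensions of the identity (F) and
# within-class (G) subspaces in the PLDA model.
def joint_precision_dim(n_samples, d_F=64, d_G=64):
    """Side length of the naive joint posterior precision matrix
    over (h_i, w_i1, ..., w_in) for one identity with n samples."""
    return d_F + n_samples * d_G

# The number of matrix entries (storage) grows quadratically with
# the number of samples per identity:
for n in (10, 100, 1000):
    dim = joint_precision_dim(n)
    print(f"n={n:5d}  matrix side={dim:6d}  entries={dim * dim:12d}")
```

This is the cost that the paper's scalable derivation avoids while remaining exactly equivalent to the naive solution.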
Keywords: Expectation maximization, face verification, PLDA, Probabilistic Model
Projects: BBfor2
TABULA RASA
Authors: El Shafey, Laurent
McCool, Chris
Wallace, Roy
Marcel, Sébastien
Attachments
  • ElShafey_TPAMI_2013.pdf