A Scalable Formulation of Probabilistic Linear Discriminant Analysis: Applied to Face Recognition
2013
In this paper, we present a scalable and exact solution for probabilistic linear discriminant analysis (PLDA). PLDA is a probabilistic model that has been shown to provide state-of-the-art performance for both face and speaker recognition. However, it has one major drawback: at training time, estimating the latent variables requires the inversion and storage of a matrix whose size grows quadratically with the number of samples per identity (class). To date, two approaches have been taken to deal with this problem: 1) use an exact solution that calculates this large matrix, which is obviously not scalable with the number of samples, or 2) derive a variational approximation to the problem. We present a scalable derivation that is theoretically equivalent to the previous nonscalable solution and thus obviates the need for a variational approximation. Experimentally, we demonstrate the efficacy of our approach in two ways. First, on Labeled Faces in the Wild, we illustrate the equivalence of our scalable implementation with previously published work. Second, on the large Multi-PIE database, we illustrate the gain in performance when using more training samples per identity (class), which is made possible by the proposed scalable formulation of PLDA.
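The scaling bottleneck described above can be made concrete with a small sketch. This is not the paper's code; it only illustrates, under the standard PLDA generative model x_ij = mu + F h_i + G w_ij + eps_ij (with latent identity variable h_i of dimension d_h and per-sample latent w_ij of dimension d_w, both hypothetical sizes here), why the naive exact E-step is not scalable: the posterior over the stacked latents of one identity with n samples involves a matrix whose side is d_h + n * d_w, so its storage grows quadratically with n.

```python
# Hedged sketch: cost of the naive exact PLDA E-step for one identity.
# In the model x_ij = mu + F h_i + G w_ij + eps_ij, exact inference
# stacks the latents [h_i, w_i1, ..., w_in] for an identity with n
# samples, and must invert/store a square matrix of the stacked size.
# d_h and d_w below are illustrative latent dimensions, not the paper's.

def naive_posterior_matrix_side(n_samples, d_h=64, d_w=64):
    """Side length of the matrix the naive exact E-step inverts."""
    return d_h + n_samples * d_w

for n in (10, 100, 1000):
    side = naive_posterior_matrix_side(n)
    # storage of the matrix is side**2 entries: quadratic in n
    print(f"n={n:5d}  matrix side={side:6d}  entries={side * side:,}")
```

The scalable formulation proposed in the paper avoids ever forming this stacked matrix while remaining exactly equivalent to the naive solution.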
Keywords:
- Expectation–maximization algorithm
- Computer vision
- Artificial intelligence
- Computer science
- Speaker recognition
- Statistical model
- Linear discriminant analysis
- Scalability
- Matrix (mathematics)
- Pattern recognition
- Probabilistic logic
- Latent variable
- Theoretical computer science
- Equivalence (measure theory)
- Facial recognition system