A study on three linear discriminant analysis based methods in small sample size problem

Authors:


Abstract

In this paper, we study three linear discriminant analysis (LDA) based methods in the small sample size (SSS) problem: regularized discriminant analysis (RDA), discriminant common vectors (DCV), and maximal margin criterion (MMC). Our contributions are as follows: (1) we reveal that DCV obtains the same projection subspace as both RDA and wMMC (weighted MMC, a general form of MMC) when RDA's regularization parameter tends to zero and wMMC's weight parameter approaches +∞, which establishes close relationships among these three LDA based methods; (2) we offer efficient algorithms to perform RDA and wMMC in the principal component analysis (PCA) transformed space, which makes them feasible and efficient for applications such as face recognition; (3) we formulate the eigenvalue distribution of wMMC. On the one hand, the formulated eigenvalue distribution can guide practitioners in choosing wMMC's projection vectors; on the other hand, the underlying methodology can be employed to analyze the eigenvalue distribution of matrices of the form AA^T − BB^T, where A and B have far more rows than columns; and (4) we compare the classification performance of the three methods on several benchmarks and find that, when the mean standard variance (MSV) criterion is small, DCV achieves classification performance competitive with both RDA and wMMC under their optimal parameters, whereas when MSV is large, DCV generally yields lower classification accuracy than RDA and wMMC under their optimal parameters.
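To make the wMMC projection step concrete, below is a minimal NumPy sketch assuming the standard MMC objective generalized with a weight parameter, J(W) = tr(W^T (S_b − w·S_w) W) over orthonormal W, whose maximizers are the leading eigenvectors of S_b − w·S_w. The function name, the parameter name `w`, and the data layout are illustrative assumptions, not the paper's exact algorithm; in particular, the paper's efficient PCA-transformed-space variant is not reproduced here.

```python
# Minimal sketch of a weighted-MMC projection (assumed form: maximize
# tr(W^T (S_b - w * S_w) W)); not the paper's exact algorithm.
import numpy as np

def wmmc_projection(X, y, w=1.0, n_components=2):
    """X: (n_samples, n_features) data matrix; y: class labels.
    Returns the top `n_components` eigenvectors of S_b - w * S_w
    together with all eigenvalues in descending order."""
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    S_b = np.zeros((d, d))  # between-class scatter
    S_w = np.zeros((d, d))  # within-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        diff = (mc - mean_all)[:, None]
        S_b += Xc.shape[0] * (diff @ diff.T)
        centered = Xc - mc
        S_w += centered.T @ centered
    # S_b - w * S_w is symmetric but generally indefinite, so some
    # eigenvalues may be negative; per the abstract, the eigenvalue
    # distribution is what guides the choice of projection vectors.
    vals, vecs = np.linalg.eigh(S_b - w * S_w)
    order = np.argsort(vals)[::-1]  # largest eigenvalues first
    return vecs[:, order[:n_components]], vals[order]
```

In the SSS regime the feature dimension far exceeds the sample count, so forming d × d scatter matrices directly is impractical; the standard remedy, which the paper's PCA-transformed-space algorithms exploit, is that the nonzero eigenvalues of a product such as AA^T can be recovered from the much smaller A^T A.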

Keywords: Regularized discriminant analysis (RDA), Discriminant common vectors (DCV), Maximal margin criterion (MMC), Small sample size (SSS), Eigenvalue distribution

Article history: Received 19 September 2006, Revised 30 May 2007, Accepted 6 June 2007, Available online 15 June 2007.

DOI: https://doi.org/10.1016/j.patcog.2007.06.001