A comparison of generalized linear discriminant analysis algorithms
Authors:
Abstract:
Linear discriminant analysis (LDA) is a dimension reduction method that finds an optimal linear transformation maximizing class separability. However, in undersampled problems, where the number of data samples is smaller than the dimension of the data space, it is difficult to apply LDA due to the singularity of the scatter matrices caused by high dimensionality. In order to make LDA applicable, several generalizations of LDA have been proposed recently. In this paper, we present theoretical and algorithmic relationships among several generalized LDA algorithms and compare their computational complexities and performances in text classification and face recognition. Toward a practical dimension reduction method for high-dimensional data, an efficient algorithm is proposed, which greatly reduces the computational complexity while achieving competitive prediction accuracies. We also present nonlinear extensions of these LDA algorithms based on kernel methods. It is shown that a generalized eigenvalue problem can be formulated in the kernel-based feature space, and generalized LDA algorithms are applied to solve it, resulting in nonlinear discriminant analysis. The performances of these linear and nonlinear discriminant analysis algorithms are compared extensively.
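To make the setup concrete: in its standard textbook form (not a result specific to this paper), LDA seeks a transformation W that maximizes between-class scatter relative to within-class scatter, which leads to a generalized eigenvalue problem:

```latex
J(W) = \operatorname{tr}\!\big( (W^{\top} S_w W)^{-1}\, W^{\top} S_b W \big),
\qquad
S_b\, w = \lambda\, S_w\, w ,
```

where $S_b$ and $S_w$ are the between-class and within-class scatter matrices. In undersampled problems ($n < d$), $S_w$ has rank at most $n - c$ for $c$ classes and is therefore singular, which is exactly the difficulty the generalized LDA algorithms address. The sketch below is a minimal illustration, not any of the paper's specific algorithms: it builds the scatter matrices and sidesteps the singularity with a small ridge term, one common regularization workaround. The function name `lda_directions` and the parameter `reg` are our own labels for this sketch.

```python
# Minimal sketch of LDA as a generalized eigenvalue problem, with a ridge
# term on S_w so it stays solvable in the undersampled case (n < d).
# This is a simple regularized variant, not the paper's proposed algorithm.
import numpy as np
from scipy.linalg import eigh

def lda_directions(X, y, reg=1e-3):
    """Return discriminant directions for X (n_samples x n_features)."""
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    S_w = np.zeros((d, d))  # within-class scatter
    S_b = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mean_c = Xc.mean(axis=0)
        S_w += (Xc - mean_c).T @ (Xc - mean_c)
        diff = (mean_c - mean_all).reshape(-1, 1)
        S_b += len(Xc) * (diff @ diff.T)
    # Regularize S_w so the generalized eigenproblem S_b w = lambda S_w w
    # is well posed even when S_w is singular (n_samples < n_features).
    evals, evecs = eigh(S_b, S_w + reg * np.eye(d))
    order = np.argsort(evals)[::-1]
    # At most (number of classes - 1) directions are discriminative.
    return evecs[:, order[: len(classes) - 1]]

# Usage: undersampled toy data (10 samples, 50 features).
rng = np.random.default_rng(0)
X = rng.standard_normal((10, 50))
y = np.array([0] * 5 + [1] * 5)
W = lda_directions(X, y)
print(W.shape)  # (50, 1): one direction for two classes
```

The ridge term is only one of the generalizations surveyed in the paper; other approaches instead restrict the problem to subspaces where the scatter matrices are nonsingular, and the kernel extensions pose the same generalized eigenvalue problem in a kernel-induced feature space.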
Keywords: Dimension reduction, Feature extraction, Generalized linear discriminant analysis, Kernel methods, Nonlinear discriminant analysis, Undersampled problems
Article history: Received 26 April 2006, Revised 27 June 2007, Accepted 30 July 2007, Available online 8 August 2007.
DOI: https://doi.org/10.1016/j.patcog.2007.07.022