A Gradient Linear Discriminant Analysis for Small Sample Sized Problem

Authors: Alok Sharma, Kuldip K. Paliwal

Abstract

The purpose of conventional linear discriminant analysis (LDA) is to find an orientation that projects high-dimensional feature vectors of different classes onto a more manageable low-dimensional space in the most discriminative way for classification. The LDA technique uses an eigenvalue decomposition (EVD) to find such an orientation, and this computation is usually adversely affected by the small sample size problem. In this paper we present a new direct LDA method (called gradient LDA) for computing the orientation, designed especially for the small sample size problem. A gradient descent based method is used for this purpose. It also avoids discarding the null space of the within-class scatter matrix and the between-class scatter matrix, which may contain discriminative information useful for classification.
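The abstract describes the idea rather than the algorithm itself. As a rough illustration only, the sketch below maximizes Fisher's criterion J(w) = (wᵀS_b w)/(wᵀS_w w) by gradient ascent instead of solving the generalized eigenvalue problem, so it never forms S_w⁻¹ and can still run when S_w is singular (the small sample size case). The function names, learning rate, and iteration count are illustrative assumptions and not the authors' actual gradient LDA procedure.

```python
import numpy as np

def scatter_matrices(X, y):
    """Compute between-class (Sb) and within-class (Sw) scatter matrices."""
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sb = np.zeros((d, d))
    Sw = np.zeros((d, d))
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        diff = (mc - mean_all)[:, None]
        Sb += Xc.shape[0] * diff @ diff.T       # weighted outer product of class-mean offsets
        Sw += (Xc - mc).T @ (Xc - mc)           # pooled within-class scatter
    return Sb, Sw

def gradient_fisher_direction(Sb, Sw, lr=1e-3, n_iter=5000, seed=0):
    """Illustrative gradient ascent on Fisher's criterion
    J(w) = (w' Sb w) / (w' Sw w); avoids inverting Sw."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(Sb.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        num = w @ Sb @ w                        # w' Sb w
        den = w @ Sw @ w + 1e-12                # w' Sw w, small term guards against 0
        # Gradient of the ratio of two quadratic forms
        grad = 2.0 * ((Sb @ w) * den - (Sw @ w) * num) / den**2
        w += lr * grad
        w /= np.linalg.norm(w)                  # keep the direction on the unit sphere
    return w

# Toy usage: two classes in 5 dimensions with only 3 samples each,
# so Sw is singular and the classical Sw^{-1} Sb eigen-decomposition breaks down.
X = np.vstack([np.random.default_rng(1).normal(0, 1, (3, 5)),
               np.random.default_rng(2).normal(3, 1, (3, 5))])
y = np.array([0, 0, 0, 1, 1, 1])
Sb, Sw = scatter_matrices(X, y)
w = gradient_fisher_direction(Sb, Sw)
print("Projected samples, class 0:", X[y == 0] @ w)
print("Projected samples, class 1:", X[y == 1] @ w)
```

Under these assumptions the learned direction separates the projected samples of the two classes, which is the behaviour a gradient-based LDA aims for when the within-class scatter matrix cannot be inverted.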

Keywords: Gradient linear discriminant analysis, Small sample size problem, Fisher's criterion function, Dimensionality reduction


Paper URL: https://doi.org/10.1007/s11063-007-9056-7