Linear discriminant analysis with worst between-class separation and average within-class compactness

Authors: Leilei Yang, Songcan Chen

Abstract

Linear discriminant analysis (LDA) is one of the most popular supervised dimensionality reduction (DR) techniques; it obtains discriminant projections by maximizing the ratio of the average-case between-class scatter to the average-case within-class scatter. Two recent discriminant analysis (DA) algorithms, minimal distance maximization (MDM) and worst-case LDA (WLDA), instead obtain projections by optimizing worst-case scatters. In this paper, we develop a new LDA framework, called LDA with worst between-class separation and average within-class compactness (WSAC), which maximizes the ratio of the worst-case between-class scatter to the average-case within-class scatter. This is achieved by relaxing the trace ratio optimization into a distance metric learning problem. Comparative experiments demonstrate its effectiveness. In addition, DA counterparts that use the local geometry of the data and the kernel trick can likewise be embedded into our framework and solved in the same way.
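Based on the description above, the WSAC objective can be sketched as maximizing the smallest pairwise between-class scatter over the average within-class scatter in the projected space. The notation below is assumed for illustration and may differ from the paper's exact formulation: $S_b^{(i,j)}$ denotes the between-class scatter matrix of the class pair $(i,j)$, $S_w$ the average (pooled) within-class scatter matrix, and $W$ the projection matrix.

$$
\max_{W}\;\frac{\min_{i \neq j}\,\operatorname{tr}\!\left(W^{\top} S_b^{(i,j)} W\right)}{\operatorname{tr}\!\left(W^{\top} S_w W\right)}
$$

According to the abstract, this trace ratio problem is not solved directly but is relaxed into a distance metric learning problem.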

Keywords: dimensionality reduction, linear discriminant analysis, the worst separation, the average compactness


DOI: https://doi.org/10.1007/s11704-014-3337-x