Structural max-margin discriminant analysis for feature extraction

Authors:

Highlights:

Abstract

Subclass discriminant analysis (SDA) is a recently developed dimensionality reduction technique that takes into account the intrinsic structural information in data by approximating the unknown distribution of each class with multiple Gaussian distributions, namely subclasses. In SDA, however, the separability between heterogeneous subclasses, i.e., those from different classes, is measured by the between-subclass scatter, computed as the average distance between the means of these subclasses. In this paper, in view of the maximum margin principle, we propose a novel feature extraction method, coined structural max-margin discriminant analysis (SMDA), to enhance the performance of SDA. Specifically, SMDA aims to find an orthogonal linear embedded subspace in which the margin, defined as the minimum pairwise distance between heterogeneous subclasses, is maximized while the within-subclass scatter is simultaneously minimized. The resulting model boils down to a nonconvex optimization problem that can be solved by combining the constrained concave–convex procedure with the column generation technique. We evaluate the proposed SMDA on several benchmark datasets, and the experimental results confirm the effectiveness of the proposed method.
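The quantity SMDA optimizes can be illustrated with a small sketch. This is not the paper's CCCP/column-generation solver; it only evaluates, for a given candidate orthogonal projection `W`, the two terms the abstract describes: the margin (minimum pairwise distance between projected means of heterogeneous subclasses) and the within-subclass scatter. Splitting each class into subclasses via a tiny k-means is an assumption made here for illustration, as one common way to instantiate SDA-style subclasses.

```python
import numpy as np

def smda_objective(X, y, W, n_sub=2, seed=0):
    """Illustrative only (not the paper's algorithm): for a fixed projection W,
    return (margin, within-subclass scatter), where the margin is the minimum
    pairwise distance between projected means of subclasses from DIFFERENT
    classes. SMDA seeks a W maximizing the margin while minimizing the scatter."""
    rng = np.random.default_rng(seed)
    Z = X @ W                      # project data into the embedded subspace
    means, labels = [], []
    scatter = 0.0
    for c in np.unique(y):
        Zc = Z[y == c]
        # Assumed subclass partition: a tiny k-means within class c.
        centers = Zc[rng.choice(len(Zc), n_sub, replace=False)]
        for _ in range(20):
            assign = np.argmin(((Zc[:, None] - centers[None]) ** 2).sum(-1), axis=1)
            centers = np.array([Zc[assign == k].mean(0) if np.any(assign == k)
                                else centers[k] for k in range(n_sub)])
        for k in range(n_sub):
            means.append(centers[k])
            labels.append(c)
            # accumulate within-subclass scatter in the projected space
            scatter += ((Zc[assign == k] - centers[k]) ** 2).sum()
    # margin: minimum distance between means of heterogeneous subclasses
    margin = min(np.linalg.norm(m1 - m2)
                 for i, (m1, l1) in enumerate(zip(means, labels))
                 for m2, l2 in zip(means[i + 1:], labels[i + 1:]) if l1 != l2)
    return margin, scatter
```

On two well-separated classes, a projection that keeps the discriminative directions yields a large margin and small scatter; SMDA's solver searches over orthogonal `W` to maximize exactly this trade-off.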

Keywords: Discriminant analysis, Max-margin principle, Quadratic programming, Constrained concave–convex procedure, Column generation

Article history: Received 13 August 2013, Revised 30 May 2014, Accepted 14 June 2014, Available online 23 June 2014.

DOI: https://doi.org/10.1016/j.knosys.2014.06.020