Global and local metric learning via eigenvectors
Authors:
Abstract
The distance metric plays a significant role in machine learning methods (classification, clustering, etc.), especially in k-nearest neighbor (kNN) classification, where Euclidean distances are computed to decide the labels of unknown points. However, the Euclidean distance ignores statistical structure that may better measure the similarity of different inputs. In this paper, we construct a unified framework, comprising two eigenvalue-based methods, to learn a data-dependent metric. Both methods aim to maximize the gap between inter-class and intra-class distances, but the optimization is considered from a global view and a local view, respectively. Unlike previous work in metric learning, our methods directly seek an equilibrium between inter-class and intra-class distances, and the linear transformation decomposed from the metric, rather than the metric itself, is optimized directly. We can thus effectively adjust the data distribution in the transformed space and construct regions favorable for kNN classification. The resulting problems can be solved simply by eigenvalue decomposition, which is much faster than semi-definite programming. By selecting the top eigenvalues, the original data can be projected into a low-dimensional space, mitigating or eliminating insignificant information and making classification more efficient. Our methods can therefore perform metric learning and dimension reduction simultaneously. Numerical experiments from different points of view verify that our methods improve the accuracy of kNN classification and achieve dimension reduction with competitive performance.
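The abstract does not state the formal objective, but a minimal sketch of the global variant is possible, assuming it reduces to eigendecomposition of the difference between the between-class scatter S_b and the within-class scatter S_w, with the metric factored as M = LᵀL. All names below are illustrative, not the authors' implementation:

```python
import numpy as np

def global_metric_transform(X, y, dim):
    """Hypothetical sketch: learn a linear transform L whose rows are the
    top eigenvectors of S_b - S_w, so that projected inter-class distances
    grow relative to intra-class distances.
    X: (n, d) data matrix, y: (n,) labels, dim: target dimension."""
    d = X.shape[1]
    S_w = np.zeros((d, d))  # within-class (intra-class) scatter
    S_b = np.zeros((d, d))  # between-class (inter-class) scatter
    mu = X.mean(axis=0)
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        S_w += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mu).reshape(-1, 1)
        S_b += len(Xc) * (diff @ diff.T)
    # Maximizing tr(L (S_b - S_w) L^T) over orthonormal rows of L is solved
    # by the eigenvectors of the symmetric matrix S_b - S_w with the largest
    # eigenvalues; no semi-definite programming is needed.
    eigvals, eigvecs = np.linalg.eigh(S_b - S_w)
    top = np.argsort(eigvals)[::-1][:dim]
    L = eigvecs[:, top].T          # shape (dim, d)
    return L                       # implied metric: M = L.T @ L
```

Under these assumptions, kNN would then run with plain Euclidean distance on the projected data `X @ L.T`, which simultaneously applies the learned metric and reduces the dimension from d to dim.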
Keywords: Metric learning, Global and local, Dimension reduction, Classification
Article history: Received 14 April 2016, Revised 1 November 2016, Accepted 4 November 2016, Available online 5 November 2016, Version of Record 14 December 2016.
DOI: https://doi.org/10.1016/j.knosys.2016.11.004