Low-rank supervised and semi-supervised multi-metric learning for classification
Abstract
Multi-metric learning is an important technique for improving classification performance, since learning a single metric is usually insufficient for complex data. However, most existing multi-metric learning approaches have high computational complexity. In this work, two multi-metric learning frameworks are proposed for supervised and semi-supervised classification, respectively. Based on these frameworks, we first design a low-rank multi-metric learning model (LSMML) for supervised classification, in which multiple local class metrics and one global metric are jointly trained. A joint regularization scheme, combining a LogDet divergence with a low-rank term, is designed to incorporate prior knowledge and improve generalization. By learning appropriate metrics, LSMML not only captures the local nonlinear discriminative information of each class to reduce the probability of misclassification, but also enhances stability, alleviates the computational burden, and avoids the risk of overfitting. We then extend LSMML to the semi-supervised setting and propose a low-rank semi-supervised multi-metric learning approach (LSeMML) for data with scarce labels. Alternating iterative algorithms are designed to optimize both LSMML and LSeMML. Each iteration only requires solving geodesically convex subproblems, which admit closed-form solutions at low computational cost. Numerical simulations on different databases, in both supervised and semi-supervised settings, show that the proposed LSMML and LSeMML have a simple form, fast training speed, and good classification performance.
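The abstract's joint regularizer pairs a LogDet divergence (which keeps a learned Mahalanobis metric close to a prior metric) with a low-rank term. As a hedged illustration of these standard building blocks — not the paper's actual LSMML objective — the sketch below computes a Mahalanobis distance under a metric M, the LogDet divergence D_ld(M, M0) = tr(M M0⁻¹) − log det(M M0⁻¹) − d, and a nuclear-norm surrogate commonly used for low-rank regularization; all function names are hypothetical.

```python
import numpy as np

def mahalanobis_dist(x, y, M):
    """Squared Mahalanobis distance (x - y)^T M (x - y) under metric M."""
    diff = x - y
    return float(diff @ M @ diff)

def logdet_divergence(M, M0):
    """LogDet (Burg) divergence D_ld(M, M0) = tr(M M0^-1) - log det(M M0^-1) - d.

    Zero iff M == M0; commonly used to keep a learned metric close to a prior.
    """
    d = M.shape[0]
    P = M @ np.linalg.inv(M0)
    _, logdet = np.linalg.slogdet(P)
    return float(np.trace(P) - logdet - d)

def nuclear_norm(M):
    """Sum of singular values: a convex surrogate for rank(M)."""
    return float(np.sum(np.linalg.svd(M, compute_uv=False)))
```

With M0 = I the LogDet term penalizes deviation from the Euclidean metric, and the nuclear norm encourages the learned metrics to share a low-dimensional structure, which is what reduces both the computational burden and the overfitting risk mentioned above.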
Keywords: Multi-metric learning, Supervised metric learning, Semi-supervised metric learning, Low-rank
Article history: Received 27 April 2021, Revised 16 November 2021, Accepted 17 November 2021, Available online 1 December 2021, Version of Record 11 December 2021.
DOI: https://doi.org/10.1016/j.knosys.2021.107787