A generalized least-squares approach regularized with graph embedding for dimensionality reduction

Authors:

Highlights:

• The method derives a generalized orthogonality constraint from the PCA idea of minimizing the least-squares reconstruction error; this constraint enforces orthogonality on the projected data while introducing a penalty factor that scales the influence of each data point.

• The proposed generalized least-squares approach combines the advantages of graph-embedding-based Dimensionality Reduction (DR) and least-squares reconstruction. It strikes a balance between preserving global structure, through the data-reconstruction term, and preserving local structure, through the graph-embedding term.

• The proposed framework can easily be extended to supervised and semi-supervised scenarios within existing DR frameworks.
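The highlights do not give the paper's exact objective, but the stated balance between a PCA-style reconstruction term (global structure) and a graph-embedding penalty (local structure) can be illustrated with a generic sketch. A common formulation of this combination takes the top eigenvectors of X Xᵀ − λ X L Xᵀ, where L is a k-nearest-neighbor graph Laplacian. The function name, the kNN graph construction, and the parameters below are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def gls_graph_dr(X, d, k=5, lam=0.1):
    """Illustrative sketch (not the paper's method): project data to d
    dimensions by trading off a PCA reconstruction term against a
    graph-Laplacian locality penalty.

    X   : (n_features, n_samples) data matrix, assumed centered
    d   : target dimensionality
    k   : neighbors in the (hypothetical) kNN similarity graph
    lam : weight of the local graph-embedding penalty
    """
    n = X.shape[1]
    # Pairwise squared Euclidean distances between samples.
    sq = np.sum(X ** 2, axis=0)
    D2 = sq[:, None] + sq[None, :] - 2.0 * (X.T @ X)
    # Binary kNN adjacency (excluding each point itself), symmetrized.
    A = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(D2[i])[1:k + 1]
        A[i, idx] = 1.0
    A = np.maximum(A, A.T)
    # Unnormalized graph Laplacian L = D - A.
    L = np.diag(A.sum(axis=1)) - A
    # Global reconstruction term minus weighted local penalty.
    M = X @ X.T - lam * (X @ L @ X.T)
    # Symmetric eigendecomposition; keep the top-d eigenvectors.
    vals, vecs = np.linalg.eigh(M)
    P = vecs[:, -d:]          # (n_features, d), columns orthonormal
    return P, P.T @ X         # projection matrix and embedded data
```

With lam = 0 this reduces to ordinary PCA on centered data; increasing lam shifts the projection toward directions along which graph neighbors stay close, which is one simple way to read the "global vs. local" trade-off described above.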

Keywords: Dimensionality reduction, Graph embedding, Subspace learning, Least-squares

Article history: Received 19 April 2017; Revised 9 August 2019; Accepted 26 August 2019; Available online 5 September 2019; Version of Record 11 September 2019.

DOI: https://doi.org/10.1016/j.patcog.2019.107023