Analysis of regularized least-squares in reproducing kernel Kreĭn spaces

Authors: Fanghui Liu, Lei Shi, Xiaolin Huang, Jie Yang, Johan A. K. Suykens

Abstract

In this paper, we study the asymptotic properties of regularized least squares with indefinite kernels in reproducing kernel Kreĭn spaces (RKKS). By introducing a bounded hyper-sphere constraint on this non-convex regularized risk minimization problem, we theoretically demonstrate that the problem admits a globally optimal solution with a closed form on the sphere, which makes approximation analysis feasible in RKKS. For the original regularizer induced by the indefinite inner product, we modify traditional error decomposition techniques, prove convergence results for the introduced hypothesis error based on matrix perturbation theory, and derive learning rates for such regularized regression problems in RKKS. Under some conditions, the derived learning rates in RKKS are the same as those in reproducing kernel Hilbert spaces (RKHS). To the best of our knowledge, this is the first work on the approximation analysis of regularized learning algorithms in RKKS.
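The following toy sketch (not the paper's algorithm, and only loosely related to its sphere-constrained formulation) illustrates the setting: an indefinite kernel, here assumed to be a difference of two Gaussian kernels, whose Gram matrix generally has negative eigenvalues, so the induced function space is a Kreĭn rather than a Hilbert space. The solver below computes a stationary point of the regularized least-squares objective; with an indefinite kernel this point is not guaranteed to be a global minimum, which is the difficulty the paper's hyper-sphere constraint addresses.

```python
import numpy as np

def gaussian_kernel(X, Z, gamma):
    """Gaussian (RBF) kernel matrix between row-sets X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def indefinite_kernel(X, Z):
    # Difference of two Gaussian kernels: a standard example of an
    # indefinite kernel (its Gram matrix typically has negative eigenvalues).
    return gaussian_kernel(X, Z, 1.0) - 0.5 * gaussian_kernel(X, Z, 5.0)

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(40, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(40)

K = indefinite_kernel(X, X)
# Inspect the spectrum: negative eigenvalues signal indefiniteness.
print("smallest Gram eigenvalue:", np.linalg.eigvalsh(K).min())

lam = 0.1
# Stationary point of the (possibly non-convex) regularized risk.
# The linear system coincides algebraically with kernel ridge regression,
# but with indefinite K it need not be a global minimizer.
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
y_hat = K @ alpha  # fitted values on the training inputs
```

The kernel choice and hyperparameters here are illustrative assumptions; the paper's analysis instead works with the indefinite inner-product regularizer and a sphere constraint under which a closed-form global optimum exists.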

Keywords: Approximation analysis, Regularized least squares, Indefinite kernel, Matrix perturbation theory


Paper URL: https://doi.org/10.1007/s10994-021-05955-2