Learning performance of regularized moving least square regression
Abstract:
Moving least square regression is an important local learning algorithm. In this paper, we consider a regularized moving least square regression algorithm in a reproducing kernel Hilbert space. Its localized representer theorem differs from the classical representer theorems for regularized kernel machines: it shows that regularization not only ensures computational stability but is also necessary for the algorithm to preserve the localization property. We also study the learning performance of the regularized moving least square algorithm and conduct a rigorous error analysis. Compared with the unregularized method, the convergence analysis of regularized moving least square regression requires more natural and much simpler conditions and achieves fast learning rates.
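To make the idea concrete, here is a minimal illustrative sketch of moving least square regression with ridge-style regularization. This is a generic local linear variant with a Gaussian locality weight, not the paper's RKHS formulation; the bandwidth `h` and regularization parameter `lam` are hypothetical choices for the example.

```python
import numpy as np

def gaussian_weight(x, xi, h):
    # Locality weight: points near the query x get large weight
    return np.exp(-((x - xi) ** 2) / (2 * h ** 2))

def regularized_mls(x_query, X, y, h=0.1, lam=1e-3):
    """Local linear fit at x_query with a ridge penalty.
    Basis: p(t) = c0 + c1 * (t - x_query), so c0 is the fitted value."""
    w = gaussian_weight(x_query, X, h)                    # locality weights
    B = np.column_stack([np.ones_like(X), X - x_query])   # local polynomial basis
    W = np.diag(w)
    # Regularized weighted normal equations:
    # (B^T W B + lam * I) c = B^T W y
    # The lam * I term keeps the system well conditioned even when
    # few samples fall near x_query (the computational-stability role
    # of regularization mentioned in the abstract).
    A = B.T @ W @ B + lam * np.eye(B.shape[1])
    c = np.linalg.solve(A, B.T @ W @ y)
    return c[0]

# Usage: recover a smooth function from noisy samples
rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * X) + 0.1 * rng.standard_normal(X.size)
est = regularized_mls(0.5, X, y)   # true value sin(pi) = 0
```

Because the fit is recomputed at every query point, the estimator is local: samples far from `x_query` are effectively ignored, which is the localization property the regularized algorithm is shown to preserve.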
Keywords: 68T05, 62J02, Moving least square regression, Regularization, Reproducing kernel Hilbert space, Error bounds, Learning rate
Article history: Received 23 December 2015, Revised 29 December 2016, Available online 4 May 2017, Version of Record 16 May 2017.
DOI: https://doi.org/10.1016/j.cam.2017.04.046