Robust regression using biased objectives

Authors: Matthew J. Holland, Kazushi Ikeda

Abstract

For the regression problem in a non-parametric setting, designing the objective function to be minimized by the learner is a critical task. In this paper we propose a principled method for constructing and minimizing robust losses, which are resilient to errant observations even under small samples. Existing proposals typically utilize very strong estimates of the true risk, but in doing so require a priori information that is not available in practice. By abandoning direct approximation of the risk, we enjoy substantial gains in stability at a tolerable price in terms of bias, all while circumventing the computational issues of existing procedures. We analyze existence and convergence conditions, provide practical computational routines, and also show empirically that the proposed method realizes superior robustness over wide data classes with no prior knowledge assumptions.
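To make the general idea concrete, below is a minimal sketch (not the authors' exact procedure) of regression with a biased but robust objective: instead of averaging the squared residuals directly, the learner averages a slowly growing, Catoni-style transform of them, so that a few errant observations cannot dominate the fit. The transform rho, the fixed scale s, and the optimizer choice are illustrative assumptions.

```python
# Minimal sketch of robust regression via a biased objective.
# Assumptions (not from the paper): Catoni-style log transform, fixed scale s,
# linear model, BFGS with numerical gradients.
import numpy as np
from scipy.optimize import minimize

def rho(u):
    # Slowly growing transform for nonnegative losses: behaves like u for small u,
    # but only logarithmically for large u, limiting the pull of outliers.
    return np.log1p(u + 0.5 * u**2)

def robust_objective(w, X, y, s=1.0):
    residuals = y - X @ w
    losses = residuals**2              # ordinary squared-error losses
    return np.mean(rho(losses / s))    # biased but stable surrogate for the risk

rng = np.random.default_rng(0)
n, d = 100, 3
X = rng.normal(size=(n, d))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + rng.standard_t(df=2, size=n)  # heavy-tailed noise

w0 = np.zeros(d)
result = minimize(robust_objective, w0, args=(X, y, 1.0), method="BFGS")
print("estimated coefficients:", result.x)
```

The transform introduces bias relative to the plain empirical mean of the losses, but keeps the objective stable when the noise is heavy-tailed, which is the trade-off the abstract describes.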

Keywords: Robust loss, Heavy-tailed noise, Risk minimization

Paper URL: https://doi.org/10.1007/s10994-017-5653-5