Semi-supervised domain adaptation via Fredholm integral based kernel methods
Authors:
Abstract
With the emergence of domain adaptation in the semi-supervised setting, handling noisy and complex data during classifier adaptation has become increasingly important. We argue that the large amount of unlabeled data in the target domain, which is typically used only for distribution alignment, is in fact a rich source of information for this challenge. In this paper, we propose a novel Transfer Fredholm Multiple Kernel Learning (TFMKL) framework to suppress noise under complex data distributions. First, by exploiting unlabeled target data, TFMKL learns a cross-domain predictive model through a Fredholm integral based kernel prediction framework that is shown to be effective for noise suppression. Second, TFMKL explicitly extends the use of unlabeled target samples from distribution alignment to adaptive classifier construction. Third, multiple kernels are combined to induce an optimal learning space. Accordingly, TFMKL simultaneously provides noise resiliency, facilitates knowledge transfer, and captures complex data characteristics. Furthermore, an efficient optimization procedure based on the reduced gradient is presented, guaranteeing rapid convergence. We emphasize the adaptability of TFMKL to different domain adaptation tasks, since it can be instantiated with different predictive models; in particular, two models based on the square loss and the hinge loss, respectively, are proposed within the TFMKL framework. Comprehensive empirical studies on benchmark data sets verify the effectiveness and noise resiliency of the proposed methods.
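The abstract does not give the authors' exact formulation, but the minimal NumPy sketch below illustrates the ingredients it names: a Fredholm-style kernel built by routing a base kernel through unlabeled target samples, a convex combination of several base kernels (the multiple-kernel ingredient), an MMD term (the Hilbert space embedding criterion) for measuring distribution discrepancy, and a square-loss (kernel ridge style) predictive model. All function names, kernel widths, the toy data, and the specific Fredholm-kernel variant are illustrative assumptions rather than TFMKL's actual equations; in the paper the kernel weights are learned by a reduced-gradient procedure, whereas here they are simply fixed.

```python
import numpy as np

def rbf(X, Z, gamma):
    """Gaussian RBF kernel matrix between the rows of X and Z."""
    sq = (np.sum(X ** 2, axis=1)[:, None]
          + np.sum(Z ** 2, axis=1)[None, :]
          - 2.0 * X @ Z.T)
    return np.exp(-gamma * sq)

def fredholm_kernel(X, Z, U, gamma):
    """One common Fredholm-style kernel (an assumption, not the paper's exact form):
    K_F(x, z) = (1/u^2) * sum_{i,j} k(x, u_i) k(u_i, u_j) k(u_j, z),
    i.e. the base kernel is composed through the unlabeled target points U,
    which tends to damp directions not supported by the unlabeled data."""
    u = U.shape[0]
    return rbf(X, U, gamma) @ rbf(U, U, gamma) @ rbf(U, Z, gamma) / (u ** 2)

def combined_kernel(X, Z, U, gammas, weights):
    """Convex combination of Fredholm kernels built from several base widths;
    the weights are normalized onto the simplex instead of being learned."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return sum(wi * fredholm_kernel(X, Z, U, g) for wi, g in zip(w, gammas))

def mmd2(Kss, Ktt, Kst):
    """Squared MMD between source and target samples under a given kernel
    (the Hilbert-space-embedding criterion used for distribution alignment)."""
    return Kss.mean() + Ktt.mean() - 2.0 * Kst.mean()

# Toy usage: a square-loss (kernel ridge style) cross-domain predictor.
rng = np.random.default_rng(0)
Xs = rng.normal(0.0, 1.0, (60, 5))                   # labeled source samples
ys = np.sign(Xs[:, 0] + 0.1 * rng.normal(size=60))   # noisy source labels in {-1, +1}
Xt = rng.normal(0.3, 1.0, (80, 5))                   # unlabeled target samples

gammas, weights = [0.1, 0.5, 1.0], [1.0, 1.0, 1.0]
Kss = combined_kernel(Xs, Xs, Xt, gammas, weights)
alpha = np.linalg.solve(Kss + 1e-2 * np.eye(len(ys)), ys)     # ridge solution

scores = combined_kernel(Xt, Xs, Xt, gammas, weights) @ alpha  # target predictions
print("target predictions (first 5):", np.round(scores[:5], 3))
print("MMD^2 between domains:",
      mmd2(Kss,
           combined_kernel(Xt, Xt, Xt, gammas, weights),
           combined_kernel(Xs, Xt, Xt, gammas, weights)))
```

A hinge-loss instantiation would replace the ridge solve with an SVM trained on the same combined Fredholm kernel; the kernel construction and the MMD alignment term stay unchanged.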
Keywords: Domain adaptation, Semi-supervised learning, Multiple kernel learning, Hilbert space embedding of distributions
Article history: Received 21 November 2017, Revised 1 June 2018, Accepted 31 July 2018, Available online 8 August 2018, Version of Record 23 August 2018.
DOI: https://doi.org/10.1016/j.patcog.2018.07.035