Hierarchical Gaussian Processes model for multi-task learning
Authors:
Highlights:
• A Hierarchical Gaussian Process Multi-task Learning (HGPMT) method.
• Effectively utilizes explicit prior information about correlations among tasks.
• Much lower computational complexity than cross-covariance-based methods.
• A multi-kernel learning method for learning non-stationary functions.
• Experiments on both toy and real-world datasets demonstrate its superiority.
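The highlights describe coupling related regression tasks through a shared hierarchical prior rather than a full cross-covariance matrix, which is what gives the claimed complexity advantage: each task's GP is solved on its own data alone. The paper's exact HGPMT formulation is not reproduced in this record, so the following is only a minimal illustrative sketch of that general idea — several tasks sharing one set of kernel hyperparameters — not the authors' method; all function names and parameter values here are assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale, variance):
    # Squared-exponential (RBF) kernel between two sets of inputs.
    d2 = np.sum((X1[:, None, :] - X2[None, :, :]) ** 2, axis=-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def gp_predict(X, y, Xs, lengthscale, variance, noise):
    # Standard GP regression posterior mean at test inputs Xs.
    K = rbf_kernel(X, X, lengthscale, variance) + noise * np.eye(len(X))
    Ks = rbf_kernel(Xs, X, lengthscale, variance)
    return Ks @ np.linalg.solve(K, y)

# Hierarchical coupling (illustrative): all tasks share one set of kernel
# hyperparameters instead of a joint cross-covariance over all tasks, so
# each solve is O(n_t^3) in that task's own data, not O((sum_t n_t)^3).
rng = np.random.default_rng(0)
shifts = (0.0, 0.3, -0.2)          # three related synthetic tasks
tasks = []
for shift in shifts:
    X = rng.uniform(0, 5, size=(30, 1))
    y = np.sin(X[:, 0]) + shift + 0.1 * rng.standard_normal(30)
    tasks.append((X, y))

shared = dict(lengthscale=1.0, variance=1.0, noise=0.01)  # assumed values
Xs = np.linspace(0, 5, 50)[:, None]
preds = [gp_predict(X, y, Xs, **shared) for X, y in tasks]
```

A cross-covariance multi-task GP would instead build one kernel over all 90 points jointly; the sketch above keeps the three 30-point solves separate, which is the structural source of the lower cost the highlights refer to.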
Abstract:
Keywords: GP-LVM, Multi-task learning, Feature learning, Hierarchical model
Article history: Received 12 February 2017; Revised 31 August 2017; Accepted 12 September 2017; Available online 13 September 2017; Version of Record 21 September 2017.
DOI: https://doi.org/10.1016/j.patcog.2017.09.021