Flexible latent variable models for multi-task learning
Authors: Jian Zhang, Zoubin Ghahramani, Yiming Yang
Abstract
Given multiple prediction problems such as regression or classification, we are interested in a joint inference framework that can effectively share information between tasks to improve prediction accuracy, especially when the number of training examples per problem is small. In this paper we propose a probabilistic framework which can support a set of latent variable models for different multi-task learning scenarios. We show that the framework is a generalization of standard learning methods for single prediction problems and that it can effectively model the shared structure among different prediction tasks. Furthermore, we present efficient algorithms for the empirical Bayes method as well as for point estimation. Our experiments on both simulated datasets and real-world classification datasets show the effectiveness of the proposed models in two evaluation settings: a standard multi-task learning setting and a transfer learning setting.
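The core idea the abstract describes, namely that related tasks can share information through a common latent structure, can be illustrated with a minimal sketch. The code below is not the paper's model; it is a simplified hierarchical toy in which per-task regression weights are assumed to cluster around a shared mean, and a crude empirical-Bayes step estimates that mean from independent fits before re-shrinking each task toward it. All variable names and the simulation setup are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate T related regression tasks whose true weight vectors cluster
# around a shared latent mean (the "shared structure" across tasks).
T, d, n = 20, 5, 10
w_shared = rng.normal(size=d)
tasks = []
for _ in range(T):
    w_t = w_shared + 0.3 * rng.normal(size=d)      # task-specific deviation
    X = rng.normal(size=(n, d))
    y = X @ w_t + 0.1 * rng.normal(size=n)         # few, noisy examples per task
    tasks.append((X, y, w_t))

def ridge_to(X, y, mu, lam):
    """MAP estimate under a Gaussian prior N(mu, (1/lam) I): shrink toward mu."""
    k = X.shape[1]
    return mu + np.linalg.solve(X.T @ X + lam * np.eye(k), X.T @ (y - X @ mu))

# Independent learning: each task is fit alone, shrunk toward zero.
indep = [ridge_to(X, y, np.zeros(d), 1.0) for X, y, _ in tasks]

# Joint learning: estimate the shared mean from the independent fits,
# then re-fit each task shrunk toward it (a crude empirical-Bayes step).
mu_hat = np.mean(indep, axis=0)
joint = [ridge_to(X, y, mu_hat, 1.0) for X, y, _ in tasks]

err_indep = np.mean([np.sum((w - wt) ** 2) for w, (_, _, wt) in zip(indep, tasks)])
err_joint = np.mean([np.sum((w - wt) ** 2) for w, (_, _, wt) in zip(joint, tasks)])
print(f"independent: {err_indep:.4f}  joint: {err_joint:.4f}")
```

Because every task contributes to the estimate of the shared mean, the joint fit typically recovers the true per-task weights more accurately than the independent fits when training data per task is scarce, which is the regime the abstract targets.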
Keywords: Multi-task learning, Latent variable models, Hierarchical Bayesian models, Model selection, Transfer learning
Paper URL: https://doi.org/10.1007/s10994-008-5050-1