Scalable learning of probabilistic latent models for collaborative filtering
Authors:
Highlights:
• We propose a scalable learning scheme for a probabilistic generative model for collaborative filtering.
• The model's predictive accuracy improves on the current state of the art.
• The model is shown to work well in cold-start situations.
Abstract:
Collaborative filtering has emerged as a popular way of making user recommendations, but as the underlying databases grow in size, scalability becomes a crucial issue. In this paper we focus on a recently proposed probabilistic collaborative filtering model that explicitly represents all users and items simultaneously. This model class has several desirable properties, including high recommendation accuracy and principled support for group recommendations. Unfortunately, it also suffers from poor scalability. We address this issue by proposing a scalable variational Bayes learning and inference algorithm for these types of models. Empirical results show that the proposed algorithm achieves significantly better accuracy than straw-man baseline models on a collection of well-known data sets. We also demonstrate that the algorithm behaves highly favorably in cold-start situations.
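The abstract names the general technique (variational Bayes inference for a probabilistic latent-variable collaborative filtering model) without spelling out the updates. Below is a minimal mean-field variational Bayes sketch for a simple Gaussian matrix-factorization rating model, offered only to illustrate the style of algorithm; it is not the authors' model from the paper. The function name vb_pmf, the fixed precisions alpha and tau, and the synthetic data are all illustrative assumptions.

```python
import numpy as np

def vb_pmf(R, mask, k=10, alpha=1.0, tau=2.0, n_iters=20, seed=0):
    """Mean-field variational Bayes for a Gaussian matrix-factorization model
    (an illustrative stand-in, not the model from the paper).

    R    : (n_users, n_items) rating matrix (values ignored where unobserved)
    mask : boolean array, True where a rating is observed
    k    : latent dimension; alpha, tau: prior and noise precisions (fixed here).
    """
    rng = np.random.default_rng(seed)
    n_users, n_items = R.shape
    # Variational factors q(u_i) = N(mu_u[i], Su[i]), q(v_j) = N(mu_v[j], Sv[j]).
    mu_u = rng.normal(scale=0.1, size=(n_users, k))
    mu_v = rng.normal(scale=0.1, size=(n_items, k))
    Su = np.tile(np.eye(k) / alpha, (n_users, 1, 1))
    Sv = np.tile(np.eye(k) / alpha, (n_items, 1, 1))

    for _ in range(n_iters):
        # Coordinate-ascent update of each user posterior given the item posteriors.
        for i in range(n_users):
            obs = np.where(mask[i])[0]
            if obs.size == 0:
                continue
            # E[v v^T] = mu mu^T + Sigma for each observed item.
            EvvT = mu_v[obs][:, :, None] * mu_v[obs][:, None, :] + Sv[obs]
            prec = alpha * np.eye(k) + tau * EvvT.sum(axis=0)
            Su[i] = np.linalg.inv(prec)
            mu_u[i] = tau * Su[i] @ (R[i, obs] @ mu_v[obs])
        # Symmetric update of each item posterior.
        for j in range(n_items):
            obs = np.where(mask[:, j])[0]
            if obs.size == 0:
                continue
            EuuT = mu_u[obs][:, :, None] * mu_u[obs][:, None, :] + Su[obs]
            prec = alpha * np.eye(k) + tau * EuuT.sum(axis=0)
            Sv[j] = np.linalg.inv(prec)
            mu_v[j] = tau * Sv[j] @ (R[obs, j] @ mu_u[obs])
    return mu_u, mu_v

if __name__ == "__main__":
    # Tiny synthetic check: recover a low-rank signal from 30% observed entries.
    rng = np.random.default_rng(1)
    U, V = rng.normal(size=(50, 5)), rng.normal(size=(40, 5))
    R = U @ V.T + rng.normal(scale=0.5, size=(50, 40))
    mask = rng.random(R.shape) < 0.3
    mu_u, mu_v = vb_pmf(R, mask, k=5)
    pred = mu_u @ mu_v.T
    rmse = np.sqrt(np.mean((pred[~mask] - (U @ V.T)[~mask]) ** 2))
    print(f"held-out RMSE vs. noise-free signal: {rmse:.3f}")
```

Each user and item posterior has a closed-form Gaussian update that touches only that user's (or item's) observed ratings, which is the property that makes variational schemes of this kind attractive at scale on sparse rating data.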
Keywords: Collaborative filtering, Scalable learning, Probabilistic model, Latent variables, Variational Bayes
Article history: Received 29 April 2014, Revised 6 February 2015, Accepted 19 March 2015, Available online 30 March 2015.
DOI: https://doi.org/10.1016/j.dss.2015.03.006