Bayesian hybrid generative discriminative learning based on finite Liouville mixture models

Abstract:

Hybrid generative discriminative approaches have recently emerged as an efficient framework for knowledge representation and data classification. However, little attention has been devoted to the modeling and classification of non-Gaussian, and especially proportional, vectors. Our main goal in this paper is to discover the true structure of this kind of data by building probabilistic kernels from generative mixture models based on the Liouville family of distributions, from which we develop the Beta-Liouville distribution, which includes the well-known Dirichlet as a special case. The Beta-Liouville has a more general covariance structure than the Dirichlet, which makes it more practical and useful. Our learning technique follows a principled, purely Bayesian approach, and the resulting models are used to generate support vector machine (SVM) probabilistic kernels based on information divergence. In particular, we show the existence of closed-form expressions for the Kullback–Leibler and Rényi divergences between two Beta-Liouville distributions, and hence between two Dirichlet distributions as a special case. Through extensive simulations and a number of experiments involving synthetic data, visual scene classification, and texture image classification, we demonstrate the effectiveness of the proposed approaches.
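The paper's closed-form Beta-Liouville divergences are not reproduced in this abstract, but the Dirichlet special case it mentions has a well-known closed-form Kullback–Leibler divergence, and a probabilistic SVM kernel can be built from the symmetrized divergence. The sketch below illustrates that construction; the `lam` scaling hyperparameter and the `exp(-lam * sym)` kernel form are illustrative assumptions, not taken from the paper.

```python
import math

def digamma(x):
    """Digamma psi(x) via upward recurrence plus an asymptotic series."""
    r = 0.0
    while x < 6.0:  # shift the argument up so the asymptotic series applies
        r -= 1.0 / x
        x += 1.0
    f = 1.0 / (x * x)
    return r + math.log(x) - 0.5 / x - f * (1/12.0 - f * (1/120.0 - f / 252.0))

def dirichlet_kl(alpha, beta):
    """Closed-form KL(Dir(alpha) || Dir(beta)), the Dirichlet special case."""
    a0, b0 = sum(alpha), sum(beta)
    kl = math.lgamma(a0) - math.lgamma(b0)
    for a, b in zip(alpha, beta):
        kl += math.lgamma(b) - math.lgamma(a) + (a - b) * (digamma(a) - digamma(a0))
    return kl

def divergence_kernel(alpha, beta, lam=1.0):
    """SVM kernel from the symmetrized KL divergence between two Dirichlets.
    `lam` is a hypothetical scaling hyperparameter for illustration only."""
    sym = dirichlet_kl(alpha, beta) + dirichlet_kl(beta, alpha)
    return math.exp(-lam * sym)
```

The kernel equals 1 when the two fitted mixture components coincide and decays toward 0 as their divergence grows, which is what lets divergence-based kernels plug generative models into a discriminative SVM.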

Keywords: Liouville family of distributions, Generative models, Discriminative models, Mixture models, SVM, Bayesian inference, Exponential family, Conjugate prior, Gibbs sampling, Bayes factor, Image classification, Texture modeling

Article history: Received 30 November 2009, Revised 10 November 2010, Accepted 13 December 2010, Available online 19 December 2010.

DOI: https://doi.org/10.1016/j.patcog.2010.12.010