Parsimonious reduction of Gaussian mixture models with a variational-Bayes approach
Authors:
Abstract
Aggregating statistical representations of classes is an important task for current trends in scaling up learning and recognition, and for carrying these tasks out on distributed infrastructures. In this perspective, we address the problem of merging probabilistic Gaussian mixture models efficiently, by searching for a suitable combination of components from the mixtures to be merged. We propose a new Bayesian model of this combination problem, together with a variational estimation technique, that handles the model-complexity issue efficiently. A key feature of the proposed scheme is that it relies only on the parameters of the original mixtures, ensuring low computational cost and, in a distributed setting, low communication cost. Experimental results are reported on real data.
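To illustrate the general idea of aggregating mixtures from their parameters alone, below is a minimal sketch, restricted to 1-D Gaussians for brevity: it concatenates the components of two mixtures and then greedily merges the closest pair (moment-preserving merge) until a target size is reached. This is not the authors' variational-Bayes scheme; the pairwise symmetric-KL criterion and all function names are illustrative assumptions.

```python
# Sketch of parameter-level GMM aggregation (assumption: 1-D components,
# greedy moment-preserving reduction; NOT the paper's variational-Bayes method).
import numpy as np


def kl_gauss(m1, v1, m2, v2):
    """KL divergence between 1-D Gaussians N(m1, v1) || N(m2, v2)."""
    return 0.5 * (np.log(v2 / v1) + (v1 + (m1 - m2) ** 2) / v2 - 1.0)


def moment_merge(w1, m1, v1, w2, m2, v2):
    """Merge two weighted components while preserving mean and variance."""
    w = w1 + w2
    m = (w1 * m1 + w2 * m2) / w
    v = (w1 * (v1 + (m1 - m) ** 2) + w2 * (v2 + (m2 - m) ** 2)) / w
    return w, m, v


def reduce_mixture(weights, means, variances, target_k):
    """Greedily merge the pair of components with the smallest symmetric KL."""
    w, m, v = list(weights), list(means), list(variances)
    while len(w) > target_k:
        best, best_pair = np.inf, None
        for i in range(len(w)):
            for j in range(i + 1, len(w)):
                d = (kl_gauss(m[i], v[i], m[j], v[j])
                     + kl_gauss(m[j], v[j], m[i], v[i]))
                if d < best:
                    best, best_pair = d, (i, j)
        i, j = best_pair
        w[i], m[i], v[i] = moment_merge(w[i], m[i], v[i], w[j], m[j], v[j])
        del w[j], m[j], v[j]
    return np.array(w), np.array(m), np.array(v)


if __name__ == "__main__":
    # Two toy mixtures to aggregate; only their parameters are used.
    wa, ma, va = [0.6, 0.4], [0.0, 2.0], [1.0, 0.5]
    wb, mb, vb = [0.5, 0.5], [0.1, 5.0], [1.2, 0.8]
    # Concatenate with equal overall weight, then reduce to 3 components.
    w = np.array(wa + wb) * 0.5
    m = np.array(ma + mb)
    v = np.array(va + vb)
    print(reduce_mixture(w, m, v, target_k=3))
```

Unlike this greedy sketch, the paper's variational-Bayes formulation selects the number of retained components automatically through the model-complexity penalty, rather than requiring a fixed target size.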
Keywords: Mixture models, Bayesian estimation, Model aggregation
Article history: Received 7 January 2009; Revised 19 June 2009; Accepted 5 August 2009; Available online 13 August 2009.
DOI: https://doi.org/10.1016/j.patcog.2009.08.006