Approximate posterior inference for Bayesian models: black-box expectation propagation
Authors: Ximing Li, Changchun Li, Jinjin Chi, Jihong Ouyang
Abstract
Expectation propagation (EP) is a widely successful way to approximate the posteriors of complex Bayesian models. However, it incurs expensive memory and time overheads, since it involves local approximations with locally specific messages. A recent approach, averaged EP (AEP), upgrades EP by leveraging the average message effect on the posterior distribution, instead of the locally specific ones, so as to simultaneously reduce memory and time costs. In this paper, we extend AEP to a novel black-box expectation propagation (abbr. BBEP) algorithm, which can be directly applied to many Bayesian models without model-specific derivations. We leverage three ideas of black-box learning, leading to three versions of BBEP, referred to as BBEP^m, BBEP^g and BBEP^o, based on Monte Carlo moment matching, Monte Carlo gradients and the objective of AEP, respectively. For variance reduction, importance sampling is used, and proposal distribution selection as well as the high-dimensionality setting are discussed. Furthermore, we develop online versions of BBEP to speed up optimization on large-scale data sets. We empirically compare BBEP against state-of-the-art black-box baseline algorithms on both synthetic and real-world data sets. Experimental results demonstrate that BBEP outperforms the baseline algorithms and is even on a par with analytical solutions in some settings.
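The abstract's central black-box ingredient, Monte Carlo moment matching with importance sampling, can be illustrated with a minimal sketch. This is not the authors' code: the local likelihood factor, the Gaussian prior, and the proposal parameters below are all hypothetical choices for illustration. The idea is to estimate the mean and variance of an unnormalized tilted distribution (prior times a local factor) by sampling from a tractable Gaussian proposal and reweighting with self-normalized importance weights; the estimated moments would then define the updated Gaussian approximation.

```python
# Hedged sketch (assumed setup, not the paper's implementation):
# self-normalized importance sampling estimates of the first two moments
# of a tilted distribution p_tilde(x) ∝ N(x; 0, 1) * likelihood(x).
import math
import random

random.seed(0)

def likelihood(x):
    # Hypothetical local factor chosen for illustration: a logistic term.
    return 1.0 / (1.0 + math.exp(-2.0 * x))

def log_normal_pdf(x, mu, sigma):
    # Log density of N(mu, sigma^2) at x.
    return -0.5 * math.log(2 * math.pi * sigma ** 2) \
        - (x - mu) ** 2 / (2 * sigma ** 2)

def mc_moment_match(n_samples=100_000, prop_mu=0.0, prop_sigma=2.0):
    """Estimate the mean and variance of the tilted distribution by
    importance sampling from the proposal N(prop_mu, prop_sigma^2)."""
    xs, ws = [], []
    for _ in range(n_samples):
        x = random.gauss(prop_mu, prop_sigma)
        # Importance weight = target / proposal, computed in log space
        # for numerical stability.
        log_w = (log_normal_pdf(x, 0.0, 1.0) + math.log(likelihood(x))
                 - log_normal_pdf(x, prop_mu, prop_sigma))
        xs.append(x)
        ws.append(math.exp(log_w))
    total = sum(ws)  # self-normalization constant
    mean = sum(w * x for w, x in zip(ws, xs)) / total
    var = sum(w * (x - mean) ** 2 for w, x in zip(ws, xs)) / total
    return mean, var  # moments that would parametrize the new Gaussian

mean, var = mc_moment_match()
print(mean, var)
```

The choice of proposal matters for variance: a proposal that is too narrow relative to the tilted target produces heavy-tailed weights and unstable moment estimates, which is why the paper discusses proposal distribution selection, especially in high dimensions.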
Keywords: Black-box inference, Expectation propagation, Variance reduction, Importance sampling
DOI: https://doi.org/10.1007/s10115-022-01705-5