Proposals which speed up function-space MCMC
Abstract
Inverse problems lend themselves naturally to a Bayesian formulation, in which the quantity of interest is a posterior distribution of state and/or parameters given some uncertain observations. In the common case in which the forward operator is smoothing, the inverse problem is ill-posed. Well-posedness is imposed via regularization in the form of a prior, which is often Gaussian. Under quite general conditions, it can be shown that the posterior is absolutely continuous with respect to the prior and may be well-defined on function space in terms of its density with respect to the prior. In this case, by constructing a proposal for which the prior is invariant, one can define Metropolis–Hastings schemes for MCMC which are well-defined on function space (Stuart (2010) [1], Cotter et al. [2]), and hence do not degenerate as the dimension of the underlying quantity of interest increases to infinity, e.g. under mesh refinement when approximating a PDE in finite dimensions. However, in practice, despite the attractive theoretical properties of the currently available schemes, they may still suffer from long correlation times, particularly if the data are very informative about some of the unknown parameters. In fact, in this case it may be the directions of the posterior which coincide with the (already known) prior that decorrelate most slowly. The information incorporated into the posterior through the data is often contained within some finite-dimensional subspace, in an appropriate basis, perhaps even one defined by eigenfunctions of the prior. We aim to exploit this fact and improve the mixing time of function-space MCMC by careful rescaling of the proposal. To this end, we introduce two new basic methods of increasing complexity, involving (i) characteristic function truncation of high frequencies and (ii) Hessian information to interpolate between low and high frequencies. The second, more sophisticated version bears some similarity to recent methods which exploit local curvature information, for example RMHMC (Girolami and Calderhead (2011) [3]) and stochastic Newton (Martin et al. (2012) [4]). These ideas are illustrated with numerical experiments on Bayesian inversion of the heat equation and the Navier–Stokes equation, given noisy observations.
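To make the rescaling idea concrete, the following is a minimal illustrative sketch (not the authors' code) of a prior-invariant pCN-type proposal on the Karhunen–Loève coefficients of a Gaussian prior, where a frequency-dependent step size plays the role of the characteristic-function truncation described in the abstract: small steps in the few data-informed low modes, independence-sampler steps in the prior-dominated high modes. The misfit Phi, the eigenvalue decay, the cutoff k_cut, and the step sizes are all placeholder assumptions chosen for this example.

```python
# Sketch of a componentwise-scaled pCN Metropolis-Hastings step (assumed setup).
import numpy as np

rng = np.random.default_rng(0)

n_modes = 128                                   # truncation level of the KL expansion
lam = 1.0 / np.arange(1, n_modes + 1) ** 2      # assumed prior eigenvalues lambda_k

def Phi(u):
    """Negative log-likelihood (data misfit) at KL coefficients u.
    Placeholder: quadratic misfit acting only on the first few (data-informed) modes."""
    return 0.5 * np.sum((u[:10] - 1.0) ** 2) / 0.01

# Frequency-dependent step sizes: cautious in data-informed (low) modes,
# up to independence sampling (beta_k = 1) in prior-dominated (high) modes.
k_cut = 10
beta = np.where(np.arange(n_modes) < k_cut, 0.1, 1.0)

def pcn_step(u):
    """One MH step with a componentwise-scaled pCN proposal.
    The proposal preserves the prior N(0, diag(lam)), so the acceptance
    probability involves only the misfit Phi."""
    xi = rng.standard_normal(n_modes) * np.sqrt(lam)   # draw from the prior
    v = np.sqrt(1.0 - beta ** 2) * u + beta * xi       # scaled pCN proposal
    if np.log(rng.uniform()) < Phi(u) - Phi(v):        # accept/reject on the misfit
        return v
    return u

u = rng.standard_normal(n_modes) * np.sqrt(lam)        # initialize from a prior draw
for _ in range(1000):
    u = pcn_step(u)
```

Because each coordinate proposal leaves its Gaussian prior marginal invariant, the acceptance ratio depends only on the change in the data misfit; the Hessian-based variant mentioned in the abstract would instead choose the per-mode step sizes from local curvature information rather than a hard cutoff.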
Keywords: Function-space Markov chain Monte Carlo, Bayesian inverse problems, Bayesian non-parametrics, Metropolis–Hastings, Navier–Stokes equation
Article history: Received 18 December 2012, Revised 25 June 2013, Available online 7 August 2013.
DOI: https://doi.org/10.1016/j.cam.2013.07.026