## Scaling limits in computational Bayesian inversion

C. Schillings^{1} and Ch. Schwab^{2}

^{1} Mathematics Institute, University of Warwick, Coventry CV4 7AL, England. c.schillings@warwick.ac.uk
^{2} Seminar for Applied Mathematics, ETH Zürich, CH-8092 Zürich, Switzerland. christoph.schwab@sam.math.ethz.ch

Received: 16 October 2014
Revised: 29 July 2015
Accepted: 20 January 2016

Computational Bayesian inversion of operator equations with distributed uncertain input
parameters is based on an infinite-dimensional version of Bayes’ formula established in
M. Dashti and A.M. Stuart [*Handbook of Uncertainty Quantification*, edited by
R. Ghanem, D. Higdon and H. Owhadi. Springer (2015)] and on its numerical realization in
C. Schillings and Ch. Schwab [*Inverse Problems* **29** (2013) 065011; *Inverse Problems*
**30** (2014) 065007]. Based on the sparsity of the posterior density shown in
C. Schillings and Ch. Schwab [*Inverse Problems* **29** (2013) 065011; *Inverse Problems*
**30** (2014) 065007] and in C. Schwab and A.M. Stuart [*Inverse Problems* **28** (2012)
045003], dimension-adaptive Smolyak quadratures can afford higher convergence rates than
MCMC in terms of the number *M* of solutions of the forward (parametric operator)
equation. The error bounds and convergence rates obtained in C. Schillings and
Ch. Schwab [*Inverse Problems* **29** (2013) 065011; *Inverse Problems* **30** (2014)
065007] are independent of the parameter dimension (in particular, free from the curse of
dimensionality), but depend on the (co)variance *Γ* > 0 of the additive Gaussian
observation noise as exp(*bΓ*^{-1}) for some constant *b* > 0. It is proved that the
Bayesian estimates admit asymptotic expansions as *Γ* ↓ 0. Sufficient (nondegeneracy)
conditions for the existence of finite limits as *Γ* ↓ 0 are presented. For Gaussian
priors, these limits are shown to be related to MAP estimators obtained from
Tikhonov-regularized least-squares functionals. Quasi-Newton (QN) methods with symmetric
rank-1 updates are shown to identify the concentration points in a non-intrusive way, and
to obtain second-order information of the posterior density at these points. Based on
this theory, two novel computational algorithms for Bayesian estimation at small
observation noise covariance *Γ* > 0, with performance independent of *Γ* ↓ 0, are
proposed: first, the dimension-adaptive Smolyak quadrature of C. Schillings and
Ch. Schwab [*Inverse Problems* **29** (2013) 065011; *Inverse Problems* **30** (2014)
065007] combined with a reparametrization of the parametric Bayesian posterior density
near the MAP point (assumed unique) and, second, generalized Richardson extrapolation to
the limit of vanishing observation noise variance. Numerical experiments are presented
which confirm *Γ*-independent convergence of the curvature-rescaled, adaptive Smolyak
algorithm. Dimension truncation of the posterior density is justified by a general
compactness result for the posterior’s Hessian at the MAP point.
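The two algorithmic ingredients named in the abstract, namely SR1 quasi-Newton updates for recovering second-order (Hessian) information of the posterior at the MAP point from gradient evaluations only, and Richardson extrapolation in the noise level *Γ*, can be sketched in a few lines. The function names, the toy quadratic functional, and the assumed error model Q(*Γ*) = Q₀ + c*Γ*^p are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sr1_hessian(grad, x0, steps):
    """Approximate the Hessian B of a functional from gradient differences
    along given steps, via symmetric rank-1 (SR1) updates:
        B <- B + r r^T / (r^T s),   r = y - B s,   y = grad(x + s) - grad(x).
    Only gradient evaluations are required, so the update is non-intrusive."""
    B = np.eye(x0.size)
    x, g = x0.copy(), grad(x0)
    for s in steps:
        x_new = x + s
        g_new = grad(x_new)
        r = (g_new - g) - B @ s
        denom = r @ s
        # standard SR1 safeguard: skip the update when r^T s is near zero
        if abs(denom) > 1e-12 * np.linalg.norm(r) * np.linalg.norm(s):
            B = B + np.outer(r, r) / denom
        x, g = x_new, g_new
    return B

def richardson_limit(Q, gamma, p=1.0):
    """One step of (generalized) Richardson extrapolation toward Gamma -> 0,
    assuming the leading error model Q(Gamma) = Q0 + c * Gamma^p + o(Gamma^p)."""
    return (2.0**p * Q(gamma / 2.0) - Q(gamma)) / (2.0**p - 1.0)

# Toy check: for a quadratic (Tikhonov-type) functional with Hessian A,
# SR1 along n linearly independent steps recovers A exactly.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
grad = lambda x: A @ x - np.array([1.0, 2.0])
B = sr1_hessian(grad, np.zeros(2), [np.array([1.0, 0.0]), np.array([0.0, 1.0])])
print(np.allclose(B, A))

# Toy check: a quantity with a linear leading term in Gamma extrapolates
# to its Gamma -> 0 limit (here 5.0) up to floating-point rounding.
Q = lambda g: 5.0 + 2.0 * g
print(abs(richardson_limit(Q, 0.1) - 5.0) < 1e-12)
```

In the paper's setting, the SR1 Hessian approximation at the MAP point supplies the curvature used to rescale (reparametrize) the posterior before Smolyak quadrature, while the extrapolation step removes the leading *Γ*-dependence of the computed estimate.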

Mathematics Subject Classification: 65M32 / 65N35 / 65K10

Key words: Bayesian inverse problems / parametric operator equations / Smolyak quadrature / sparsity / non-Gaussian prior / quasi-Newton methods / SR1 update / posterior reparametrization / Richardson extrapolation

*© EDP Sciences, SMAI 2016*