%I IOP PUBLISHING LTD
%V 34
%L discovery1576406
%K variational Gaussian approximation, Poisson data, hierarchical modeling, Kullback–Leibler divergence, alternating direction maximization
%N 2
%J Inverse Problems
%A SR Arridge
%A K Ito
%A B Jin
%A C Zhang
%X The Poisson model is frequently employed to describe count data, but in a Bayesian context it leads to
 an analytically intractable posterior probability distribution. In this work, we analyze a variational Gaussian
 approximation to the posterior distribution arising from the Poisson model with a Gaussian prior. This is
 achieved by seeking an optimal Gaussian distribution that minimizes the Kullback–Leibler divergence from
 the posterior distribution to the approximation, or equivalently maximizes the lower bound on the model
 evidence. We derive an explicit expression for the lower bound and show the existence and uniqueness of
 the optimal Gaussian approximation. The lower bound functional can be viewed as a variant of classical
 Tikhonov regularization that also penalizes the covariance. We then develop an efficient alternating
 direction maximization algorithm for solving the optimization problem and analyze its convergence. We
 discuss strategies for reducing the computational complexity via the low-rank structure of the forward
 operator and the sparsity of the covariance. Further, as an application of the lower bound, we discuss
 hierarchical Bayesian modeling for selecting the hyperparameter in the prior distribution, and propose
 a monotonically convergent algorithm for determining the hyperparameter. We present extensive numerical
 experiments to illustrate the Gaussian approximation and the algorithms.
%O Original content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence (http://creativecommons.org/licenses/by/3.0). Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.
%D 2018
%T Variational Gaussian approximation for Poisson data