TY  - GEN
N2  - We propose conditional flows of the maximum mean discrepancy (MMD) with the negative distance kernel for posterior sampling and conditional generative modelling. This MMD, also known as the energy distance, has several advantageous properties, such as efficient computation via slicing and sorting. We approximate the joint distribution of the ground truth and the observations using discrete Wasserstein gradient flows and establish an error bound for the posterior distributions. Further, we prove that our particle flow is indeed a Wasserstein gradient flow of an appropriate functional. The power of our method is demonstrated by numerical examples including conditional image generation and inverse problems like super-resolution, inpainting, and computed tomography in low-dose and limited-angle settings.
CY  - Vienna, Austria
T3  - International Conference on Learning Representations
Y1  - 2024/04/08/
PB  - ICLR
A1  - Hagemann, Paul
A1  - Hertrich, Johannes
A1  - Altekrüger, Fabian
A1  - Beinert, Robert
A1  - Chemseddine, Jannis
A1  - Steidl, Gabriele
N1  - This version is the author accepted manuscript. For information on re-use, please refer to the publisher's terms and conditions.
ID  - discovery10194632
AV  - public
UR  - https://openreview.net/forum?id=YrXHEb2qMb
TI  - Posterior Sampling Based on Gradient Flows of the MMD with Negative Distance Kernel
KW  - Bayesian Inverse Problems
KW  - MMD
KW  - Gradient Flows
KW  - Deep Learning
ER  -