TY  - INPR
UR  - https://doi.org/10.1080/00401706.2022.2124311
PB  - Taylor & Francis
ID  - discovery10157434
N2  - Deep Gaussian processes (DGPs) provide a rich class of models that can better represent functions with varying regimes or sharp changes, compared to conventional GPs. In this work, we propose a novel inference method for DGPs for computer model emulation. By stochastically imputing the latent layers, our approach transforms a DGP into a linked GP: a novel emulator developed for systems of linked computer models. This transformation permits an efficient DGP training procedure that only involves optimizations of conventional GPs. In addition, predictions from DGP emulators can be made in a fast and analytically tractable manner by naturally using the closed form predictive means and variances of linked GP emulators. We demonstrate the method in a series of synthetic examples and empirical applications, and show that it is a competitive candidate for DGP surrogate inference, combining efficiency that is comparable to doubly stochastic variational inference and uncertainty quantification that is comparable to the fully-Bayesian approach. A Python package dgpsi implementing the method is also produced and available at https://github.com/mingdeyu/DGP.
KW  - Elliptical slice sampling
KW  - Linked Gaussian processes
KW  - Option Greeks
KW  - Surrogate model
KW  - Stochastic expectation maximization
A1  - Ming, Deyu
A1  - Williamson, Daniel
A1  - Guillas, Serge
JF  - Technometrics
Y1  - 2022/10/12/
AV  - public
TI  - Deep Gaussian Process Emulation using Stochastic Imputation
N1  - © 2022 The Author(s). Published with license by Taylor & Francis Group, LLC.
This is an Open Access article distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives License (http://creativecommons.org/licenses/by-nc-nd/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is properly cited, and is not altered, transformed, or built upon in any way.
ER  -