deepgp: Bayesian Deep Gaussian Processes using MCMC
Performs Bayesian posterior inference for deep Gaussian processes
following Sauer, Gramacy, and Higdon (2023, <doi:10.48550/arXiv.2012.08015>). See Sauer
(2023, <http://hdl.handle.net/10919/114845>) for comprehensive methodological
details and <https://bitbucket.org/gramacylab/deepgp-ex/> for a variety of
coding examples. Models are trained through MCMC including elliptical
slice sampling of latent Gaussian layers and Metropolis-Hastings
sampling of kernel hyperparameters. The Vecchia approximation for faster
computation is implemented following Sauer, Cooper, and Gramacy
(2023, <doi:10.48550/arXiv.2204.02904>). Optional monotonic warpings are implemented
following Barnett et al. (2024, <doi:10.48550/arXiv.2408.01540>). Downstream tasks include sequential design
through active learning Cohn/integrated mean squared error
(ALC/IMSE; Sauer, Gramacy, and Higdon, 2023), optimization through
expected improvement (EI; Gramacy, Sauer, and Wycoff, 2022 <doi:10.48550/arXiv.2112.07457>),
and contour location through entropy
(Booth, Renganathan, and Gramacy, 2024 <doi:10.48550/arXiv.2308.04420>). Models
extend up to three layers deep; a one-layer model is equivalent to typical
Gaussian process regression. Incorporates OpenMP and SNOW parallelization
and utilizes C/C++ under the hood.
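
A minimal sketch of the workflow described above, fitting a two-layer deep GP by MCMC and scoring candidate inputs by ALC for sequential design. The function names (fit_two_layer, trim, predict, ALC) follow the package documentation; the toy data, argument values, and the assumption that ALC returns a list with a $value component are illustrative, not definitive.

    library(deepgp)

    # Toy 1-d example (hypothetical data, for illustration only)
    f <- function(x) sin(4 * pi * x) + x
    x <- matrix(seq(0, 1, length.out = 20), ncol = 1)
    y <- f(x) + rnorm(20, sd = 0.05)

    # Fit a two-layer deep GP via MCMC (elliptical slice sampling of the
    # latent layer, Metropolis-Hastings for kernel hyperparameters);
    # vecchia = TRUE would request the Vecchia approximation instead
    fit <- fit_two_layer(x, y, nmcmc = 5000)

    # Discard burn-in and thin the retained samples
    fit <- trim(fit, burn = 1000, thin = 2)

    # Posterior predictive mean and variance at new locations
    x_new <- matrix(seq(0, 1, length.out = 100), ncol = 1)
    fit <- predict(fit, x_new)

    # Sequential design: pick the candidate maximizing ALC
    # (the $value field is an assumed return structure)
    alc <- ALC(fit)
    x_next <- x_new[which.max(alc$value), , drop = FALSE]

Analogous calls with fit_one_layer or fit_three_layer swap in shallower or deeper models, and IMSE can replace ALC as the acquisition criterion.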
Version: 1.1.3
Depends: R (≥ 3.6)
Imports: grDevices, graphics, stats, doParallel, foreach, parallel, GpGp, Matrix, Rcpp, mvtnorm, FNN
LinkingTo: Rcpp, RcppArmadillo
Suggests: interp, knitr, rmarkdown
Published: 2024-08-19
DOI: 10.32614/CRAN.package.deepgp
Author: Annie S. Booth [aut, cre]
Maintainer: Annie S. Booth <annie_booth at ncsu.edu>
License: LGPL-2 | LGPL-2.1 | LGPL-3 [expanded from: LGPL]
NeedsCompilation: yes
Materials: README
CRAN checks: deepgp results
Linking: Please use the canonical form https://CRAN.R-project.org/package=deepgp to link to this page.