Abstract
This report provides an in-depth overview of the implications and novelty that
Generalized Variational Inference (GVI) (Knoblauch et al., 2019) brings to Deep
Gaussian Processes (DGPs) (Damianou & Lawrence, 2013). Specifically, robustness
to model misspecification and principled alternatives for uncertainty
quantification are motivated from an information-geometric view. These
modifications have clear interpretations and can be implemented in fewer than
100 lines of Python code. Most importantly, the corresponding empirical results
show that DGPs can benefit greatly from the presented enhancements.