Article

Fast and Simple Natural-Gradient Variational Inference with Mixture of Exponential-family Approximations

Wu Lin, Mohammad Emtiyaz Khan, and Mark Schmidt.
(2019). arXiv:1906.02914. Comment: Accepted as a conference paper at ICML 2019.

Abstract

Natural-gradient methods enable fast and simple algorithms for variational inference, but due to computational difficulties, their use is mostly limited to minimal exponential-family (EF) approximations. In this paper, we extend their application to estimate structured approximations such as mixtures of EF distributions. Such approximations can fit complex, multimodal posterior distributions and are generally more accurate than unimodal EF approximations. By using a minimal conditional-EF representation of such approximations, we derive simple natural-gradient updates. Our empirical results demonstrate a faster convergence of our natural-gradient method compared to black-box gradient-based methods. Our work expands the scope of natural gradients for Bayesian inference and makes them more widely applicable than before.
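To make the idea concrete, below is a minimal, illustrative sketch of natural-gradient-style variational inference with a fixed-weight mixture-of-Gaussians approximation on a 1-D bimodal toy target. It is not the authors' algorithm or code: the toy target, step size `beta`, sample count, initialization, and the precision floor are all assumptions chosen for illustration, and the paper's actual updates (derived from the minimal conditional-EF representation) handle the mixture structure more carefully than this per-component sketch does.

```python
# Toy sketch (assumptions throughout): natural-gradient-style updates for a
# fixed-weight mixture-of-Gaussians approximation to a 1-D bimodal target.
import jax
import jax.numpy as jnp


def log_target(z):
    # Unnormalized bimodal toy target: equal mixture of N(-2, 1) and N(2, 1).
    return jnp.logaddexp(-0.5 * (z - 2.0) ** 2, -0.5 * (z + 2.0) ** 2)


def log_mixture(z, means, variances, weights):
    # log q(z) for a 1-D Gaussian mixture approximation.
    comps = -0.5 * ((z - means) ** 2 / variances + jnp.log(2 * jnp.pi * variances))
    return jax.scipy.special.logsumexp(comps, b=weights)


def f(z, means, variances, weights):
    # Integrand of the ELBO: log p(z) - log q(z).
    return log_target(z) - log_mixture(z, means, variances, weights)


df = jax.grad(f)     # d/dz [log p - log q]
d2f = jax.grad(df)   # d^2/dz^2 [log p - log q]


def ngvi_step(key, means, variances, weights, beta=0.05, n_samples=20):
    """One natural-gradient-style update of each component's mean and variance.
    Mixture weights are held fixed for simplicity (an assumption of this sketch)."""
    new_means, new_variances = [], []
    for c in range(means.shape[0]):
        key, sub = jax.random.split(key)
        z = means[c] + jnp.sqrt(variances[c]) * jax.random.normal(sub, (n_samples,))
        # Monte-Carlo estimates of E_{q_c}[f'(z)] and E_{q_c}[f''(z)].
        g1 = jnp.mean(jax.vmap(lambda s: df(s, means, variances, weights))(z))
        g2 = jnp.mean(jax.vmap(lambda s: d2f(s, means, variances, weights))(z))
        # Gaussian-style natural-gradient update: precision step, then a mean
        # step preconditioned by the new variance. The positivity floor is a
        # pragmatic safeguard added for this sketch, not part of the paper.
        new_prec = jnp.maximum(1.0 / variances[c] - beta * g2, 1e-3)
        new_var = 1.0 / new_prec
        new_means.append(means[c] + beta * new_var * g1)
        new_variances.append(new_var)
    return key, jnp.stack(new_means), jnp.stack(new_variances)


key = jax.random.PRNGKey(0)
means = jnp.array([-1.0, 1.0])
variances = jnp.array([1.0, 1.0])
weights = jnp.array([0.5, 0.5])
for _ in range(200):
    key, means, variances = ngvi_step(key, means, variances, weights)
print("component means:", means)        # expected to drift toward -2 and +2
print("component variances:", variances)
```

In this sketch each Gaussian component is updated in its natural-parameter space (precision first, then a variance-preconditioned mean step), which is the kind of simple update the abstract refers to; the mixture coupling enters only through the shared log q(z) term inside the Monte-Carlo gradients.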
