Abstract
A promising class of generative models maps points from a simple distribution
to a complex distribution through an invertible neural network.
Likelihood-based training of these models requires restricting their
architectures to allow cheap computation of Jacobian determinants.
Alternatively, the Jacobian trace can be used if the transformation is
specified by an ordinary differential equation. In this paper, we use
Hutchinson's trace estimator to give a scalable unbiased estimate of the
log-density. The result is a continuous-time invertible generative model with
unbiased density estimation and one-pass sampling, while allowing unrestricted
neural network architectures. We demonstrate our approach on high-dimensional
density estimation, image generation, and variational inference, achieving
state-of-the-art results among exact-likelihood methods with efficient sampling.
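The core trick mentioned above, Hutchinson's trace estimator, replaces an exact trace with a cheap unbiased Monte Carlo estimate: tr(A) = E[εᵀAε] for any noise ε with zero mean and identity covariance. A minimal sketch (illustrative only; the function name and Rademacher noise choice are ours, and in the actual model the matrix-vector products would come from vector-Jacobian products rather than an explicit matrix):

```python
import numpy as np

def hutchinson_trace(A, num_samples=10000, rng=None):
    """Unbiased estimate of tr(A) via Hutchinson's estimator.

    Uses Rademacher noise (entries +/-1), which satisfies the
    zero-mean, identity-covariance requirement and has low variance.
    """
    rng = rng or np.random.default_rng(0)
    d = A.shape[0]
    eps = rng.choice([-1.0, 1.0], size=(num_samples, d))
    # Each eps_i^T A eps_i is an unbiased estimate of tr(A);
    # average over samples to reduce variance.
    estimates = np.einsum('ni,ij,nj->n', eps, A, eps)
    return estimates.mean()

A = np.array([[2.0, 1.0],
              [0.5, 3.0]])
print(hutchinson_trace(A))  # close to tr(A) = 5.0
```

In the continuous-time setting, only εᵀ(∂f/∂z) is needed per sample, which a reverse-mode autodiff framework computes at roughly the cost of one extra backward pass, avoiding the O(d) cost of building the full Jacobian.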