
On $w$-mixtures: Finite convex combinations of prescribed component distributions

(2017). arXiv:1708.00568. Comment: 25 pages.

Abstract

We consider the space of $w$-mixtures, that is, the set of finite statistical mixtures sharing the same prescribed component distributions. The geometry induced by the Kullback-Leibler (KL) divergence on this family of $w$-mixtures is a dually flat space in information geometry, called the mixture family manifold. It follows that the KL divergence between two $w$-mixtures is equivalent to a Bregman divergence (BD) defined for the negative Shannon entropy generator. Thus the KL divergence between two Gaussian Mixture Models (GMMs) sharing the same components is (theoretically) a Bregman divergence. This KL-BD equivalence implies that we can perform optimal KL-averaging aggregation of $w$-mixtures without information loss. More generally, we prove that the skew Jensen-Shannon divergence between $w$-mixtures is equivalent to a skew Jensen divergence on their parameters. Finally, we state several divergence identities and inequalities relating $w$-mixtures.
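As a quick illustration of the KL-BD equivalence stated in the abstract, the sketch below (not from the paper; the Gaussian components, mixture weights, and quadrature grid are illustrative assumptions) builds two univariate $w$-mixtures on three fixed components and checks numerically that $\mathrm{KL}(m_1 : m_2)$ equals the Bregman divergence $B_F(\theta_1 : \theta_2)$ generated by the negative Shannon entropy $F(\theta) = -H(m_\theta)$.

```python
# Minimal numerical sketch (not the authors' code) of the KL-BD equivalence for w-mixtures.
# Assumptions: three fixed univariate Gaussian components, weights w1/w2, and a uniform
# quadrature grid, all chosen for illustration only.

import numpy as np
from scipy.stats import norm
from scipy.integrate import trapezoid

# Prescribed component densities p_1, p_2, p_3 shared by both w-mixtures.
components = [norm(-2.0, 1.0), norm(0.0, 0.5), norm(3.0, 1.5)]

# Uniform grid for 1D numerical integration.
x = np.linspace(-15.0, 15.0, 100001)


def mixture_pdf(weights):
    """Density of the w-mixture with the given component weights, evaluated on the grid."""
    return sum(w * c.pdf(x) for w, c in zip(weights, components))


def neg_entropy(weights):
    """Generator F(theta) = -H(m_theta) = integral of m_theta * log(m_theta)."""
    m = mixture_pdf(weights)
    return trapezoid(m * np.log(m), x)


def kl(weights1, weights2):
    """KL(m_1 : m_2) by direct numerical integration."""
    m1, m2 = mixture_pdf(weights1), mixture_pdf(weights2)
    return trapezoid(m1 * np.log(m1 / m2), x)


def bregman_neg_entropy(weights1, weights2):
    """B_F(theta_1 : theta_2), with theta the first k-1 weights (the last weight is dependent).

    grad F(theta)_i = integral of (p_i - p_k) * log(m_theta), hence
    B_F(theta1 : theta2) = F(theta1) - F(theta2) - <theta1 - theta2, grad F(theta2)>.
    """
    m2 = mixture_pdf(weights2)
    p_last = components[-1].pdf(x)
    grad = np.array([trapezoid((c.pdf(x) - p_last) * np.log(m2), x)
                     for c in components[:-1]])
    d_theta = np.array(weights1[:-1]) - np.array(weights2[:-1])
    return neg_entropy(weights1) - neg_entropy(weights2) - d_theta @ grad


w1 = [0.2, 0.5, 0.3]   # weights of m_1 (sum to 1)
w2 = [0.6, 0.1, 0.3]   # weights of m_2 (sum to 1)
print("KL(m1 : m2)          =", kl(w1, w2))
print("B_F(theta1 : theta2) =", bregman_neg_entropy(w1, w2))  # should agree up to quadrature error
```

Up to quadrature error, the two printed values coincide, which is the claimed KL-BD equivalence; the same check works for any prescribed components, since only the weights $\theta$ vary within a $w$-mixture family.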

