A series of maximum entropy upper bounds of the differential entropy
F. Nielsen and R. Nock (2016). arXiv:1612.02954. Comment: 18 pages.
Abstract
We present a series of closed-form maximum entropy upper bounds for the
differential entropy of a continuous univariate random variable and study the
properties of that series. We then show how to use those generic bounds for
upper bounding the differential entropy of Gaussian mixture models. This
requires calculating the raw moments and raw absolute moments of Gaussian
mixtures in closed form, which may also be handy in statistical machine learning
and information theory. We report on our experiments and discuss the
tightness of those bounds.
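The simplest member of such a series is the classical Gaussian maximum-entropy bound: among all densities with a fixed variance, the Gaussian maximizes differential entropy. Since the raw moments of a Gaussian mixture are closed-form weighted sums of the component moments, this bound is directly computable. The sketch below is illustrative only, assuming a univariate mixture given by weights, means, and standard deviations; it is not the paper's full series of bounds.

```python
import math

def gmm_entropy_upper_bound(weights, means, sigmas):
    """Gaussian maximum-entropy upper bound (in nats) on the differential
    entropy of a univariate Gaussian mixture.

    Closed-form raw moments of the mixture:
      E[X]   = sum_i w_i * mu_i
      E[X^2] = sum_i w_i * (mu_i^2 + sigma_i^2)
    Because the Gaussian maximizes entropy for a fixed variance,
      h(X) <= 0.5 * ln(2 * pi * e * Var[X]).
    """
    m1 = sum(w * mu for w, mu in zip(weights, means))
    m2 = sum(w * (mu**2 + s**2) for w, mu, s in zip(weights, means, sigmas))
    var = m2 - m1**2  # Var[X] = E[X^2] - E[X]^2
    return 0.5 * math.log(2 * math.pi * math.e * var)

# Example: an equal-weight two-component mixture with means -1 and 1
bound = gmm_entropy_upper_bound([0.5, 0.5], [-1.0, 1.0], [1.0, 1.0])
```

For a single standard Gaussian the bound is tight (it equals the exact entropy, 0.5 ln(2πe) ≈ 1.419 nats); for genuine mixtures it is a strict upper bound.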
@article{nielsen2016series,
abstract = {We present a series of closed-form maximum entropy upper bounds for the
differential entropy of a continuous univariate random variable and study the
properties of that series. We then show how to use those generic bounds for
upper bounding the differential entropy of Gaussian mixture models. This
requires calculating the raw moments and raw absolute moments of Gaussian
mixtures in closed form, which may also be handy in statistical machine learning
and information theory. We report on our experiments and discuss the
tightness of those bounds.},
added-at = {2019-12-11T14:33:37.000+0100},
author = {Nielsen, Frank and Nock, Richard},
biburl = {https://www.bibsonomy.org/bibtex/2174c83c7ba5fe2ae61c79c1c79ab4cb7/kirk86},
description = {[1612.02954] A series of maximum entropy upper bounds of the differential entropy},
interhash = {093bf5db38bfd542e08f01a21f170868},
intrahash = {174c83c7ba5fe2ae61c79c1c79ab4cb7},
keywords = {bounds entropy information theory},
note = {arXiv:1612.02954. Comment: 18 pages},
timestamp = {2019-12-11T14:33:37.000+0100},
title = {A series of maximum entropy upper bounds of the differential entropy},
url = {http://arxiv.org/abs/1612.02954},
year = 2016
}