Abstract
This paper develops systematic approaches to obtain $f$-divergence
inequalities, dealing with pairs of probability measures defined on arbitrary
alphabets. Functional domination is one such approach, where special emphasis
is placed on finding the best possible constant upper bounding a ratio of
$f$-divergences. Another approach used for the derivation of bounds among
$f$-divergences relies on moment inequalities and the logarithmic-convexity
property, which results in tight bounds on the relative entropy and
Bhattacharyya distance in terms of $\chi^2$ divergences. A rich variety of
bounds are shown to hold under boundedness assumptions on the relative
information. Special attention is devoted to the total variation distance and
its relation to the relative information and relative entropy, including
"reverse Pinsker inequalities," as well as on the $E_\gamma$ divergence, which
generalizes the total variation distance. Pinsker's inequality is extended for
this type of $f$-divergence, a result which leads to an inequality linking the
relative entropy and relative information spectrum. Integral expressions of the
Rényi divergence in terms of the relative information spectrum are derived,
leading to bounds on the Rényi divergence in terms of either the variational
distance or relative entropy.
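
For reference, the following are the standard definitions behind the divergences named above; they are not part of the abstract itself, and the notation ($P$, $Q$, $f$, $\gamma$, $\alpha$) is the conventional one, which may differ in minor details from the paper's:
\[
\begin{aligned}
D_f(P\|Q) &= \int f\!\left(\frac{\mathrm{d}P}{\mathrm{d}Q}\right)\mathrm{d}Q,
  && f \text{ convex on } (0,\infty),\ f(1)=0, \\
D(P\|Q) &= D_f(P\|Q) \text{ with } f(t)=t\log t,
  && \text{(relative entropy)} \\
\chi^2(P\|Q) &= D_f(P\|Q) \text{ with } f(t)=(t-1)^2,
  && \text{($\chi^2$ divergence)} \\
|P-Q| &= D_f(P\|Q) \text{ with } f(t)=|t-1|,
  && \text{(total variation distance)} \\
E_\gamma(P\|Q) &= D_f(P\|Q) \text{ with } f(t)=(t-\gamma)^+,\ \gamma \ge 1,
  && E_1(P\|Q)=\tfrac{1}{2}\,|P-Q|, \\
D_\alpha(P\|Q) &= \frac{1}{\alpha-1}\,\log \int \left(\frac{\mathrm{d}P}{\mathrm{d}Q}\right)^{\alpha}\mathrm{d}Q,
  && \alpha\in(0,1)\cup(1,\infty). \\
\end{aligned}
\]
The relative information is the log-likelihood ratio $\imath_{P\|Q}(x)=\log\frac{\mathrm{d}P}{\mathrm{d}Q}(x)$, and the classical Pinsker inequality, which the abstract reports is extended to the $E_\gamma$ divergence, reads $D(P\|Q) \ge \tfrac{1}{2}\,|P-Q|^2 \log e$.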