Abstract
In this paper, we initiate a principled study of how the generalization
properties of approximate differential privacy can be used to perform adaptive
hypothesis testing while giving statistically valid $p$-value corrections. We
do this by observing that the guarantees of algorithms with bounded approximate
max-information are sufficient to correct the $p$-values of adaptively chosen
hypotheses, and then by proving that algorithms satisfying
$(\epsilon,\delta)$-differential privacy have bounded approximate
max-information when their inputs are drawn from a product distribution. This
substantially extends the known connection between differential privacy and
max-information, which previously was known to hold only for (pure)
$(\epsilon,0)$-differential privacy. It also extends our understanding of
max-information as a partially unifying measure controlling the generalization
properties of adaptive data analyses. We also show a lower bound: despite the
strong composition properties of max-information, when data is drawn from a
product distribution, $(\epsilon,\delta)$-differentially private algorithms can
come first in a composition with other algorithms satisfying max-information
bounds, but not necessarily second if the composition itself is required to
satisfy a nontrivial max-information bound. In particular, this implies that
the connection between $(\epsilon,\delta)$-differential privacy and
max-information holds only for inputs drawn from product distributions, unlike
the connection for $(\epsilon,0)$-differential privacy.
Description
[1604.03924] Max-Information, Differential Privacy, and Post-Selection Hypothesis Testing