
Privately Learning High-Dimensional Distributions

Gautam Kamath, Jerry Li, Vikrant Singhal, and Jonathan Ullman.
(2018). arXiv:1805.00216.

Abstract

We present novel, computationally efficient, and differentially private algorithms for two fundamental high-dimensional learning problems: learning a multivariate Gaussian in $\mathbb{R}^d$ and learning a product distribution in $\{0,1\}^d$ in total variation distance. The sample complexity of our algorithms nearly matches the sample complexity of the optimal non-private learners for these tasks in a wide range of parameters. Thus, our results show that privacy comes essentially for free for these problems, providing a counterpoint to the many negative results showing that privacy is often costly in high dimensions. Our algorithms introduce a novel technical approach to reducing the sensitivity of the estimation procedure that we call recursive private preconditioning, which may find additional applications.
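For orientation only, the sketch below is a standard Gaussian-mechanism mean estimator (clip each sample to an $\ell_2$ ball, then add noise calibrated to the resulting sensitivity), not the paper's recursive private preconditioning algorithm; the clipping radius, privacy parameters, and function name are illustrative assumptions. It shows the sensitivity-versus-noise trade-off that sensitivity-reduction techniques like the paper's preconditioning aim to improve.

```python
import numpy as np


def dp_gaussian_mean(samples, radius, epsilon, delta, rng=None):
    """Estimate a d-dimensional mean with (epsilon, delta)-differential privacy.

    Illustrative baseline only (not the paper's algorithm): each sample is
    clipped to an l2 ball of the given radius, so changing one sample moves
    the empirical mean by at most 2 * radius / n in l2 norm. Gaussian noise
    calibrated to that sensitivity is then added (standard Gaussian mechanism).
    """
    rng = np.random.default_rng() if rng is None else rng
    samples = np.asarray(samples, dtype=float)
    n, d = samples.shape

    # Clip each row to the l2 ball of the chosen radius.
    norms = np.linalg.norm(samples, axis=1, keepdims=True)
    clipped = samples * np.minimum(1.0, radius / np.maximum(norms, 1e-12))

    # l2 sensitivity of the clipped empirical mean.
    sensitivity = 2.0 * radius / n

    # Gaussian-mechanism noise scale for (epsilon, delta)-DP.
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon

    return clipped.mean(axis=0) + rng.normal(scale=sigma, size=d)


if __name__ == "__main__":
    # Toy usage: recover the mean of a 50-dimensional Gaussian privately.
    rng = np.random.default_rng(0)
    data = rng.normal(loc=1.0, scale=1.0, size=(10_000, 50))
    est = dp_gaussian_mean(data, radius=10.0, epsilon=1.0, delta=1e-6, rng=rng)
    print(np.linalg.norm(est - 1.0))
```

Note the design tension this baseline exposes: a large clipping radius keeps the data intact but inflates the sensitivity (and hence the noise), while a small radius biases the estimate; privately adapting the data's scale is exactly the kind of issue the paper's preconditioning approach addresses.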

Tags

Users of this resource

  • @kirk86

Comments and reviews