C. Wei, J. Lee, Q. Liu, and T. Ma. Regularization Matters: Generalization and Optimization of Neural Nets v.s. their Induced Kernel. arXiv:1810.05369 (2018). Note: the title was changed in v2 from "On the Margin Theory of Feedforward Neural Networks", with substantial revisions including a new lower bound on NTK sample complexity; v3 reorganizes the NTK lower-bound proof.
M. Aldridge, O. Johnson, and J. Scarlett. Group Testing: An Information Theory Perspective. arXiv:1902.06002 (2019). Survey; published in Foundations and Trends in Communications and Information Theory.
J. Negrea, M. Haghifam, G. Dziugaite, A. Khisti, and D. Roy. Information-Theoretic Generalization Bounds for SGLD via Data-Dependent Estimates. arXiv:1911.02151 (2019). In Advances in Neural Information Processing Systems 32 (NeurIPS 2019).
D. Soudry, E. Hoffer, M. Nacson, S. Gunasekar, and N. Srebro. The Implicit Bias of Gradient Descent on Separable Data. arXiv:1710.10345 (2017). Final version in the Journal of Machine Learning Research; extends the ICLR conference version (v2) with a proof of the measure-zero case (with implications for the rates) and the multi-class case.
V. Papyan, J. Sulam, and M. Elad. Working Locally Thinking Globally: Theoretical Guarantees for Convolutional Sparse Coding. arXiv:1707.06066 (2017). Journal version of arXiv:1607.02005 and arXiv:1607.02009; accepted to IEEE Transactions on Signal Processing.
A. Genevay, L. Chizat, F. Bach, M. Cuturi, and G. Peyré. Sample Complexity of Sinkhorn Divergences. In Proceedings of Machine Learning Research (AISTATS), volume 89, pages 1574–1583. PMLR, April 2019.