
Please choose the person this publication should be related to.

To distinguish between persons with the same name, the academic degree and the title of an important publication will be displayed.

 

Other publications by persons with the same name

Uniform-in-time propagation of chaos for the mean-field gradient Langevin dynamics., , and . ICLR, OpenReview.net, (2023)

Stochastic Gradient Descent with Exponential Convergence Rates of Expected Classification Errors., and . AISTATS, volume 89 of Proceedings of Machine Learning Research, pp. 1417-1426. PMLR, (2019)

Accelerated Stochastic Gradient Descent for Minimizing Finite Sums.. AISTATS, volume 51 of JMLR Workshop and Conference Proceedings, pp. 195-203. JMLR.org, (2016)

Exponential Convergence Rates of Classification Errors on Learning with SGD and Random Features., , and . AISTATS, volume 130 of Proceedings of Machine Learning Research, pp. 1954-1962. PMLR, (2021)

Data Cleansing for Models Trained with SGD., , and . NeurIPS, pp. 4215-4224. (2019)

Generalization Bounds for Graph Embedding Using Negative Sampling: Linear vs Hyperbolic., , , , , and . NeurIPS, pp. 1243-1255. (2021)

Convex Analysis of the Mean Field Langevin Dynamics., , and . AISTATS, volume 151 of Proceedings of Machine Learning Research, pp. 9741-9757. PMLR, (2022)

Stochastic Difference of Convex Algorithm and its Application to Training Deep Boltzmann Machines., and . AISTATS, volume 54 of Proceedings of Machine Learning Research, pp. 470-478. PMLR, (2017)

Functional Gradient Boosting for Learning Residual-like Networks with Statistical Guarantees., and . AISTATS, volume 108 of Proceedings of Machine Learning Research, pp. 2981-2991. PMLR, (2020)

Particle Dual Averaging: Optimization of Mean Field Neural Network with Global Convergence Rate Analysis., , and . NeurIPS, pp. 19608-19621. (2021)