Please choose a person to relate this publication to

To distinguish between persons with the same name, the academic degree and the title of an important publication will be displayed.


Other publications by persons with the same name

Emergent properties of the local geometry of neural loss landscapes. arXiv:1910.05929 (2019). Comment: 10 pages, 8 figures.

Fundamental bounds on the fidelity of sensory cortical coding. Nature, 580 (7801): 100-105 (2020).

Deep learning versus kernel learning: an empirical study of loss landscape geometry and the time evolution of the Neural Tangent Kernel. NeurIPS (2020).

Pruning neural networks without any data by iteratively conserving synaptic flow. NeurIPS (2020).

Reverse engineering recurrent networks for sentiment classification reveals line attractor dynamics. NeurIPS, pp. 15670-15679 (2019).

Identifying Learning Rules From Neural Network Observables. NeurIPS (2020).

Short-term memory in neuronal networks through dynamical compressed sensing. NIPS, pp. 667-675. Curran Associates, Inc. (2010).

A theory of high dimensional regression with arbitrary correlations between input features and target functions: sample complexity, multiple descent curves and a hierarchy of phase transitions. ICML, volume 139 of Proceedings of Machine Learning Research, pp. 7578-7587. PMLR (2021).

Get rich quick: exact solutions reveal how unbalanced initializations promote rapid feature learning. CoRR (2024).

An analytic theory of generalization dynamics and transfer learning in deep linear networks. ICLR (Poster), OpenReview.net (2019).