
Rethinking Bias-Variance Trade-off for Generalization of Neural Networks

, , , , and . Proceedings of the 37th International Conference on Machine Learning, volume 119 of Proceedings of Machine Learning Research, pp. 10767--10777. PMLR, (13--18 Jul 2020)


Other publications by persons with the same name

White-Box Transformers via Sparse Rate Reduction: Compression Is All There Is?, , , , , , , , , and . CoRR, (2023)
TCT: Convexifying Federated Learning using Bootstrapped Neural Tangent Kernels., , , , and . NeurIPS, (2022)
Fast Distributionally Robust Learning with Variance-Reduced Min-Max Optimization., , , and . AISTATS, volume 151 of Proceedings of Machine Learning Research, pp. 1219-1250. PMLR, (2022)
Learning One-hidden-layer ReLU Networks via Gradient Descent., , , and . AISTATS, volume 89 of Proceedings of Machine Learning Research, pp. 1524-1534. PMLR, (2019)
Scaling White-Box Transformers for Vision., , , , , , and . CoRR, (2024)
What You See is What You Get: Principled Deep Learning via Distributional Generalization., , , , and . NeurIPS, (2022)
Robust Calibration with Multi-domain Temperature Scaling., , , and . NeurIPS, (2022)
Federated Conformal Predictors for Distributed Uncertainty Quantification., , , , and . ICML, volume 202 of Proceedings of Machine Learning Research, pp. 22942-22964. PMLR, (2023)
Accuracy on the wrong line: On the pitfalls of noisy data for out-of-distribution generalisation., , , , , and . CoRR, (2024)
CTRL: Closed-Loop Transcription to an LDR via Minimaxing Rate Reduction., , , , , , , , , and 1 other author(s). Entropy, 24 (4): 456 (2022)