Please choose the person to whom this publication relates.

To distinguish between persons with the same name, the academic degree and the title of an important publication will be displayed.


Other publications by persons with the same name

Inefficiency of K-FAC for Large Batch Size Training. CoRR (2019)
Rethinking Batch Normalization in Transformers. CoRR (2020)
JumpReLU: A Retrofit Defense Strategy for Adversarial Attacks. ICPRAM, pp. 103-114. SCITEPRESS (2020)
Improving Semi-supervised Federated Learning by Reducing the Gradient Diversity of Models. IEEE BigData, pp. 1214-1225. IEEE (2021)
Understanding Int4 Quantization for Language Models: Latency Speedup, Composability, and Failure Cases. ICML, volume 202 of Proceedings of Machine Learning Research, pp. 37524-37539. PMLR (2023)
MAF: Multimodal Alignment Framework for Weakly-Supervised Phrase Grounding. EMNLP (1), pp. 2030-2038. Association for Computational Linguistics (2020)
JumpReLU: A Retrofit Defense Strategy for Adversarial Attacks. CoRR (2019)
Inefficiency of K-FAC for Large Batch Size Training. AAAI, pp. 5053-5060. AAAI Press (2020)
DeepSpeed Data Efficiency: Improving Deep Learning Model Quality and Training Efficiency via Efficient Data Sampling and Routing. AAAI, pp. 18490-18498. AAAI Press (2024)
I-BERT: Integer-only BERT Quantization. ICML, volume 139 of Proceedings of Machine Learning Research, pp. 5506-5518. PMLR (2021)