Author of the publication

PowerNorm: Rethinking Batch Normalization in Transformers.

ICML, volume 119 of Proceedings of Machine Learning Research, page 8741-8751. PMLR, (2020)


Other publications of authors with the same name

Inefficiency of K-FAC for Large Batch Size Training. CoRR, (2019)

Rethinking Batch Normalization in Transformers. CoRR, (2020)

JumpReLU: A Retrofit Defense Strategy for Adversarial Attacks. ICPRAM, page 103-114. SCITEPRESS, (2020)

Improving Semi-supervised Federated Learning by Reducing the Gradient Diversity of Models. IEEE BigData, page 1214-1225. IEEE, (2021)

Understanding Int4 Quantization for Language Models: Latency Speedup, Composability, and Failure Cases. ICML, volume 202 of Proceedings of Machine Learning Research, page 37524-37539. PMLR, (2023)

MAF: Multimodal Alignment Framework for Weakly-Supervised Phrase Grounding. EMNLP (1), page 2030-2038. Association for Computational Linguistics, (2020)

JumpReLU: A Retrofit Defense Strategy for Adversarial Attacks. CoRR, (2019)

Inefficiency of K-FAC for Large Batch Size Training. AAAI, page 5053-5060. AAAI Press, (2020)

DeepSpeed Data Efficiency: Improving Deep Learning Model Quality and Training Efficiency via Efficient Data Sampling and Routing. AAAI, page 18490-18498. AAAI Press, (2024)

I-BERT: Integer-only BERT Quantization. ICML, volume 139 of Proceedings of Machine Learning Research, page 5506-5518. PMLR, (2021)