On the Multilingual Ability of Decoder-based Pre-trained Language Models: Finding and Controlling Language-Specific Neurons.

, , , , and . NAACL-HLT, pp. 6919-6971. Association for Computational Linguistics, (2024)


Other publications by persons with the same name

Noise-Robust Image Reconstruction Based on Minimizing Extended Class of Power-Divergence Measures., , , , and . Entropy, 23 (8): 1005 (2021)

Making Use of Latent Space in Language GANs for Generating Diverse Text without Pre-training., , and . EACL (Student Research Workshop), pp. 175-182. Association for Computational Linguistics, (2021)

Robustifying Vision Transformer without Retraining from Scratch by Test-Time Class-Conditional Feature Alignment., , and . IJCAI, pp. 1009-1016. ijcai.org, (2022)

On the Multilingual Ability of Decoder-based Pre-trained Language Models: Finding and Controlling Language-Specific Neurons., , , , and . CoRR, (2024)

Large Language Models are Zero-Shot Reasoners., , , , and . NeurIPS, (2022)

Block-Iterative Reconstruction from Dynamically Selected Sparse Projection Views Using Extended Power-Divergence Measure., , , , and . Entropy, 24 (5): 740 (2022)

Hierarchical concern-based pointcuts., and . AOAsia@AOSD, pp. 19-22. ACM, (2013)

Die zweite Lean-Revolution. Verl. Moderne Industrie, Landsberg, (1995)

Unnatural Error Correction: GPT-4 Can Almost Perfectly Handle Unnatural Scrambled Text., , , and . EMNLP, pp. 8898-8913. Association for Computational Linguistics, (2023)