
Which *BERT? A Survey Organizing Contextualized Encoders

Patrick Xia, Shijie Wu, and Benjamin Van Durme. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 7516--7533. Association for Computational Linguistics, (November 2020)
DOI: 10.18653/v1/2020.emnlp-main.608

Abstract

Pretrained contextualized text encoders are now a staple of the NLP community. We present a survey on language representation learning with the aim of consolidating a series of shared lessons learned across a variety of recent efforts. While significant advancements continue at a rapid pace, we find that enough has now been discovered, in different directions, that we can begin to organize advances according to common themes. Through this organization, we highlight important considerations when interpreting recent contributions and choosing which model to use.

Description

Which *BERT? A Survey Organizing Contextualized Encoders - ACL Anthology

Links and resources

Tags

Community

  • @jaeschke
  • @rikbose
  • @nosebrain
  • @dblp