
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova. Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, NAACL-HLT 2019, Minneapolis, MN, USA, June 2-7, 2019, Volume 1 (Long and Short Papers), pp. 4171--4186. Association for Computational Linguistics, (2019)
DOI: 10.18653/V1/N19-1423

Links and resources

tags

community

  • @mnoukhov
  • @lea-w
  • @tobias.koopmann
  • @lepsky
  • @jonaskaiser
  • @michan
  • @hotho
  • @nosebrain
  • @parismic
  • @dblp