BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova. NAACL-HLT (1), pages 4171-4186. Association for Computational Linguistics, 2019.
