Inproceedings

KoreALBERT: Pretraining a Lite BERT Model for Korean Language Understanding.

, , , , , and .
ICPR, pages 5551-5557. IEEE, 2020.
