DNABERT: pre-trained Bidirectional Encoder Representations from Transformers model for DNA-language in genome. GitHub repository, jerryji1993/DNABERT.
J. Devlin, M. Chang, K. Lee, and K. Toutanova. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 4171--4186. Minneapolis, Minnesota, Association for Computational Linguistics, (June 2019)
K. Kobs, T. Koopmann, A. Zehe, D. Fernes, P. Krop, and A. Hotho. Where to Submit? Helping Researchers to Choose the Right Venue. Findings of the Association for Computational Linguistics: EMNLP 2020, pages 878--883. Online, Association for Computational Linguistics, (November 2020)