DNABERT: pre-trained Bidirectional Encoder Representations from Transformers model for DNA-language in genome. GitHub repository: jerryji1993/DNABERT.
J. Devlin, M. Chang, K. Lee, and K. Toutanova. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171--4186. Minneapolis, Minnesota, Association for Computational Linguistics, (June 2019).
K. Kobs, T. Koopmann, A. Zehe, D. Fernes, P. Krop, and A. Hotho. Findings of the Association for Computational Linguistics: EMNLP 2020, pp. 878--883. Online, Association for Computational Linguistics, (November 2020).