DNABERT: pre-trained Bidirectional Encoder Representations from Transformers model for DNA-language in genome. GitHub repository: jerryji1993/DNABERT.
J. Devlin, M. Chang, K. Lee, and K. Toutanova. Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 4171--4186. Minneapolis, Minnesota, Association for Computational Linguistics, (June 2019)
K. Kobs, T. Koopmann, A. Zehe, D. Fernes, P. Krop, and A. Hotho. Findings of the Association for Computational Linguistics: EMNLP 2020, pages 878--883. Online, Association for Computational Linguistics, (November 2020)