DNABERT: pre-trained Bidirectional Encoder Representations from Transformers model for DNA-language in genome. GitHub: jerryji1993/DNABERT.
J. Lin, R. Nogueira, and A. Yates. Pretrained Transformers for Text Ranking: BERT and Beyond. (2020). arXiv:2010.06467. Comment: Final preproduction version of volume in Synthesis Lectures on Human Language Technologies by Morgan & Claypool.
M. Paris and R. Jäschke. Proceedings of the 14th International Conference on Knowledge Science, Engineering and Management, volume 12816 of Lecture Notes in Artificial Intelligence, pp. 1--14. Springer, (2021)
P. Xia, S. Wu, and B. Van Durme. Which *BERT? A Survey Organizing Contextualized Encoders. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pp. 7516--7533. Association for Computational Linguistics, (November 2020)
J. Devlin, M. Chang, K. Lee, and K. Toutanova. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp. 4171--4186. Minneapolis, Minnesota, Association for Computational Linguistics, (June 2019)
K. Kobs, T. Koopmann, A. Zehe, D. Fernes, P. Krop, and A. Hotho. Where to Submit? Helping Researchers to Choose the Right Venue. Findings of the Association for Computational Linguistics: EMNLP 2020, pp. 878--883. Online, Association for Computational Linguistics, (November 2020)