DNABERT: pre-trained Bidirectional Encoder Representations from Transformers model for DNA-language in genome. GitHub repository: jerryji1993/DNABERT
J. Lin, R. Nogueira, and A. Yates. Pretrained Transformers for Text Ranking: BERT and Beyond. arXiv:2010.06467, (2020). Note: final preproduction version of the volume in Synthesis Lectures on Human Language Technologies, Morgan & Claypool
M. Paris and R. Jäschke. Proceedings of the 14th International Conference on Knowledge Science, Engineering and Management, volume 12816 of Lecture Notes in Artificial Intelligence, pages 1--14. Springer, (2021)
P. Xia, S. Wu, and B. Van Durme. Which *BERT? A Survey Organizing Contextualized Encoders. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 7516--7533. Association for Computational Linguistics, (November 2020)
J. Devlin, M. Chang, K. Lee, and K. Toutanova. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 4171--4186. Minneapolis, Minnesota, Association for Computational Linguistics, (June 2019)
K. Kobs, T. Koopmann, A. Zehe, D. Fernes, P. Krop, and A. Hotho. Findings of the Association for Computational Linguistics: EMNLP 2020, pages 878--883. Online, Association for Computational Linguistics, (November 2020)