DNABERT: pre-trained Bidirectional Encoder Representations from Transformers model for DNA-language in genome. GitHub repository: jerryji1993/DNABERT.
J. Lin, R. Nogueira, and A. Yates. (2020). arXiv:2010.06467. Comment: Final preproduction version of volume in Synthesis Lectures on Human Language Technologies by Morgan & Claypool.
M. Paris and R. Jäschke. Proceedings of the 14th International Conference on Knowledge Science, Engineering and Management, Volume 12816 of Lecture Notes in Artificial Intelligence, pages 1--14. Springer, (2021)
P. Xia, S. Wu, and B. Van Durme. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 7516--7533. Association for Computational Linguistics, (November 2020)
J. Devlin, M. Chang, K. Lee, and K. Toutanova. Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 4171--4186. Minneapolis, Minnesota, Association for Computational Linguistics, (June 2019)
K. Kobs, T. Koopmann, A. Zehe, D. Fernes, P. Krop, and A. Hotho. Findings of the Association for Computational Linguistics: EMNLP 2020, pages 878--883. Online, Association for Computational Linguistics, (November 2020)