DNABERT: pre-trained Bidirectional Encoder Representations from Transformers model for DNA-language in genome. GitHub repository: jerryji1993/DNABERT.
P. Xia, S. Wu, and B. Van Durme. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 7516–7533. Association for Computational Linguistics, November 2020.
J. Huang, Z. Zhuang, J. Li, and C. Giles. Proceedings of the 2008 International Conference on Web Search and Data Mining, pages 107–116. ACM, New York, NY, USA, 2008.