DNABERT: pre-trained Bidirectional Encoder Representations from Transformers model for DNA-language in genome. GitHub repository: jerryji1993/DNABERT.
A. Sonnenbichler. arXiv:1006.4271, (2010). Comment: Presented at the International Network For Social Network Analysis (INSNA): Sunbelt Conference 2009, San Diego, California, USA. 9 pages, 6 figures.
O. Corby, R. Dieng, and C. Hébert. Conceptual Structures: Logical, Linguistic, and Computational Issues, 8th International Conference on Conceptual Structures, ICCS 2000, Darmstadt, Germany, August 14-18, 2000, Proceedings, volume 1867 of LNCS, pages 468--482. Springer, (2000)
D. Nguyen, N. Smith, and C. Rosé. Proceedings of the 5th ACL-HLT Workshop on Language Technology for Cultural Heritage, Social Sciences, and Humanities, pages 115--123. Stroudsburg, PA, USA, Association for Computational Linguistics, (2011)
M. Tambuscio, G. Ruffo, A. Flammini, and F. Menczer. Proceedings of the 24th International Conference on World Wide Web, pages 977--982. New York, NY, USA, ACM, (2015)