Author of the publication

Active Learning for Sequence Tagging with Deep Pre-trained Models and Bayesian Uncertainty Estimates.

EACL, page 1698-1712. Association for Computational Linguistics, (2021)

Please choose a person to relate this publication to

To distinguish between persons with the same name, the academic degree and the title of an important publication are displayed. You can also use the button next to a name to display some of the publications already assigned to that person.

 

Other publications of authors with the same name

RuPAWS: A Russian Adversarial Dataset for Paraphrase Identification. LREC, page 5683-5691. European Language Resources Association, (2022)

TeXDYNA: Hierarchical Reinforcement Learning in Factored MDPs. SAB, volume 6226 of Lecture Notes in Computer Science, page 489-500. Springer, (2010)

Considering Unseen States as Impossible in Factored Reinforcement Learning. ECML/PKDD (1), volume 5781 of Lecture Notes in Computer Science, page 721-735. Springer, (2009)

SkoltechNLP at SemEval-2021 Task 5: Leveraging Sentence-level Pre-training for Toxic Span Detection. SemEval@ACL/IJCNLP, page 927-934. Association for Computational Linguistics, (2021)

Hierarchical & Factored Reinforcement Learning (Apprentissage par renforcement hiérarchique et factorisé). Pierre and Marie Curie University, Paris, France, (2010)

Anticipatory Learning Classifier Systems and Factored Reinforcement Learning. ABiALS, volume 5499 of Lecture Notes in Computer Science, page 321-333. Springer, (2008)

Anticipatory Learning Classifier Systems and Factored Reinforcement Learning. Springer-Verlag, Berlin, Heidelberg, (2009)

Crowdsourcing of Parallel Corpora: the Case of Style Transfer for Detoxification. CSW@VLDB, volume 2932 of CEUR Workshop Proceedings, page 35-49. CEUR-WS.org, (2021)

Methods for Detoxification of Texts for the Russian Language. Multimodal Technol. Interact., 5 (9): 54 (2021)