Inproceedings

Co²PT: Mitigating Bias in Pre-trained Language Models through Counterfactual Contrastive Prompt Tuning.

Xiangjue Dong, Ziwei Zhu, Zhuoer Wang, Maria Teleki, and James Caverlee.
EMNLP (Findings), pages 5859-5871. Association for Computational Linguistics, (2023)
