Fine-tuned BERT Model for Multi-Label Tweets Classification
H. Zahera, I. Elgendy, R. Jalota, and M. Sherif. Proceedings of the Twenty-Eighth Text REtrieval Conference, TREC 2019, Gaithersburg, Maryland, USA, November 13-15, 2019, (2019)
Abstract
In this paper, we describe our approach to classifying disaster-related tweets into multi-label information types (i.e., labels). We first filter tweets that are relevant during disasters, and then assign relevant information types, such as SearchAndRescue, MovePeople, and Volunteer, to them. We employ a fine-tuned BERT model with 10 BERT layers. We submitted our approach to the TREC-IS 2019 challenge, and the evaluation results showed that it outperforms the median F1-score in identifying actionable information.
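The multi-label setup described in the abstract (a tweet may carry several information types at once) can be sketched as a per-label sigmoid decision over classifier logits, as produced by a BERT classification head. This is a minimal illustrative sketch, not the paper's implementation: the label names come from the abstract, but the 0.5 threshold and the example logits are assumptions.

```python
import math

# Label names taken from the abstract; in the full TREC-IS task there are
# many more information types.
LABELS = ["SearchAndRescue", "MovePeople", "Volunteer"]

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def predict_labels(logits: list[float], threshold: float = 0.5) -> list[str]:
    """Map per-label logits (e.g. from a BERT classification head) to the
    subset of labels whose sigmoid score clears the threshold. Unlike a
    softmax, each label is decided independently, so a tweet can receive
    zero, one, or several information types."""
    return [label for label, z in zip(LABELS, logits)
            if sigmoid(z) >= threshold]

# Strong signal for SearchAndRescue, weak for MovePeople, borderline for
# Volunteer: two labels are assigned.
print(predict_labels([2.3, -1.1, 0.4]))  # ['SearchAndRescue', 'Volunteer']
```

The independent-sigmoid decision is the standard way to turn a single encoder into a multi-label classifier; the corresponding training objective is a per-label binary cross-entropy rather than a categorical cross-entropy.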
%0 Conference Paper
%1 DBLP:conf/trec/ZaheraEJS19
%A Zahera, Hamada M.
%A Elgendy, Ibrahim A.
%A Jalota, Rricha
%A Sherif, Mohamed Ahmed
%B Proceedings of the Twenty-Eighth Text REtrieval Conference, TREC 2019, Gaithersburg, Maryland, USA, November 13-15, 2019
%D 2019
%K dice elgendy jalota sherif zahera
%T Fine-tuned BERT Model for Multi-Label Tweets Classification
%U https://svn-serv.cs.uni-paderborn.de/datascience/papers/2019/TREC-IS_2019_BertMutliLabelClassification
%X In this paper, we describe our approach to classifying disaster-related tweets into multi-label information types (i.e., labels). We first filter tweets that are relevant during disasters, and then assign relevant information types, such as SearchAndRescue, MovePeople, and Volunteer, to them. We employ a fine-tuned BERT model with 10 BERT layers. We submitted our approach to the TREC-IS 2019 challenge, and the evaluation results showed that it outperforms the median F1-score in identifying actionable information.
@inproceedings{DBLP:conf/trec/ZaheraEJS19,
abstract = {In this paper, we describe our approach to classifying disaster-related tweets into multi-label information types (i.e., labels). We first filter tweets that are relevant during disasters, and then assign relevant information types, such as SearchAndRescue, MovePeople, and Volunteer, to them. We employ a fine-tuned BERT model with 10 BERT layers. We submitted our approach to the TREC-IS 2019 challenge, and the evaluation results showed that it outperforms the median F1-score in identifying actionable information.},
added-at = {2024-06-18T09:43:55.000+0200},
author = {Zahera, Hamada M. and Elgendy, Ibrahim A. and Jalota, Rricha and Sherif, Mohamed Ahmed},
bibsource = {dblp computer science bibliography, https://dblp.org},
biburl = {https://www.bibsonomy.org/bibtex/2605f0d664a0289126671f0a80545ab50/aksw},
booktitle = {Proceedings of the Twenty-Eighth Text REtrieval Conference, {TREC} 2019, Gaithersburg, Maryland, USA, November 13-15, 2019},
crossref = {DBLP:conf/trec/2019},
interhash = {3d593fe82e45f255a31a27bd28010a0d},
intrahash = {605f0d664a0289126671f0a80545ab50},
keywords = {dice elgendy jalota sherif zahera},
timestamp = {2024-06-18T09:43:55.000+0200},
title = {Fine-tuned {BERT} Model for Multi-Label Tweets Classification},
url = {https://svn-serv.cs.uni-paderborn.de/datascience/papers/2019/TREC-IS_2019_BertMutliLabelClassification},
year = 2019
}