TY - GEN
T1 - Team DiSaster at SemEval-2020 Task 11: Combining BERT and hand-crafted Features for Identifying Propaganda Techniques in News
AU - Kaas, Anders
AU - Torp Thomsen, Viktor
AU - Plank, Barbara
PY - 2020
Y1 - 2020
AB - The identification of communication techniques such as propaganda in news articles is important, as such techniques can influence the opinions of large numbers of people. Most work so far has focused on identification at the news article level. Recently, a new dataset and shared task have been proposed for the identification of propaganda techniques at the finer-grained span level. This paper describes our system submission to the subtask of technique classification (TC) for the SemEval 2020 shared task on detection of propaganda techniques in news articles. We propose a method of combining neural BERT representations with hand-crafted features via stacked generalization. Our model has the added advantage that it combines the power of contextual representations from BERT with simple span-based and article-based global features. We present an ablation study which shows that even though BERT representations are very powerful for this task as well, BERT still benefits from being combined with carefully designed task-specific features.
KW - Propaganda detection
KW - Span-level classification
KW - Neural BERT representations
KW - Hand-crafted features
KW - Stacked generalization
M3 - Article in proceedings
BT - SemEval 2020
PB - Association for Computational Linguistics
ER -