Strong Baselines for Neural Semi-Supervised Learning under Domain Shift

Sebastian Ruder, Barbara Plank

Publication: Conference article in proceedings or book/report chapter – Conference contribution in proceedings – Research – peer-reviewed

Abstract

Novel neural models have been proposed in recent years for learning under domain shift. Most models, however, only evaluate on a single task, on proprietary datasets, or compare to weak baselines, which makes comparison of models difficult. In this paper, we re-evaluate classic general-purpose bootstrapping approaches in the context of neural networks under domain shifts vs. recent neural approaches and propose a novel multi-task tri-training method that reduces the time and space complexity of classic tri-training. Extensive experiments on two benchmarks are negative: while our novel method establishes a new state-of-the-art for sentiment analysis, it does not fare consistently the best. More importantly, we arrive at the somewhat surprising conclusion that classic tri-training, with some additions, outperforms the state of the art. We conclude that classic approaches constitute an important and strong baseline.
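
For context, the classic tri-training baseline that the paper re-evaluates works roughly as follows: three classifiers are trained on bootstrap samples of the labeled data, and an unlabeled example is added, with its predicted label, to the training set of one classifier whenever the other two classifiers agree on it; the final prediction is a majority vote. The sketch below is a minimal illustration of that loop under simplifying assumptions: the logistic-regression models, the fixed number of rounds, and the function names are illustrative choices, not the neural models or the additions studied in the paper.

# Minimal sketch of classic tri-training (Zhou & Li, 2005), the general-purpose
# bootstrapping baseline the paper re-evaluates. The classifier choice and the
# fixed round count are illustrative assumptions, not the paper's neural setup.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.utils import resample


def tri_train(X_lab, y_lab, X_unlab, n_rounds=5, seed=0):
    # Initialise three classifiers on bootstrap samples of the labeled data.
    clfs = []
    for i in range(3):
        Xb, yb = resample(X_lab, y_lab, random_state=seed + i)
        clfs.append(LogisticRegression(max_iter=1000).fit(Xb, yb))

    for _ in range(n_rounds):
        for i in range(3):
            j, k = [m for m in range(3) if m != i]
            pred_j = clfs[j].predict(X_unlab)
            pred_k = clfs[k].predict(X_unlab)
            # Pseudo-label the unlabeled points on which the other two agree
            # and retrain classifier i on labeled + pseudo-labeled data.
            agree = pred_j == pred_k
            if not agree.any():
                continue
            X_aug = np.vstack([X_lab, X_unlab[agree]])
            y_aug = np.concatenate([y_lab, pred_j[agree]])
            clfs[i] = LogisticRegression(max_iter=1000).fit(X_aug, y_aug)
    return clfs


def predict_majority(clfs, X):
    # Majority vote over the three classifiers; labels are assumed to be
    # non-negative integer class ids.
    votes = np.stack([clf.predict(X) for clf in clfs])
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)

A usage example would call tri_train(X_lab, y_lab, X_unlab) and then predict_majority(clfs, X_test). The paper's multi-task tri-training variant differs from this sketch in that it shares parameters across the three models, which is how it reduces the time and space complexity of classic tri-training.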
Original language: English
Title: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics
Publisher: Association for Computational Linguistics
Publication date: 2018
Status: Published - 2018
Event: The 56th Annual Meeting of the Association for Computational Linguistics - Melbourne, Australia
Duration: 15 Jul 2018 – 20 Jul 2018
http://acl2018.org/

