Abstract
Novel neural models have been proposed in recent years for learning under domain shift. Most models, however, are evaluated only on a single task or on proprietary datasets, or are compared to weak baselines, which makes comparison across models difficult. In this paper, we re-evaluate classic general-purpose bootstrapping approaches in the context of neural networks under domain shift against recent neural approaches, and propose a novel multi-task tri-training method that reduces the time and space complexity of classic tri-training. Extensive experiments on two benchmarks are negative: while our novel method establishes a new state of the art for sentiment analysis, it does not consistently perform best. More importantly, we arrive at the somewhat surprising conclusion that classic tri-training, with some additions, outperforms the state of the art. We conclude that classic approaches constitute an important and strong baseline.
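The abstract refers to classic tri-training without spelling out the procedure. The sketch below illustrates its core loop under simplifying assumptions: a scikit-learn `LogisticRegression` stands in for the paper's neural models, a fixed round count replaces the original error-based stopping and example-selection criteria, and the names `tri_train` and `vote` are illustrative, not from the paper.

```python
# A minimal sketch of classic tri-training (three learners that teach each
# other via agreement on unlabeled data). Simplified assumptions throughout;
# not the paper's exact training setup.
import numpy as np
from sklearn.base import clone
from sklearn.linear_model import LogisticRegression


def tri_train(X_lab, y_lab, X_unlab, base=None, rounds=5, seed=0):
    base = base or LogisticRegression(max_iter=1000)
    rng = np.random.default_rng(seed)
    # Train three initial models on bootstrap samples of the labeled set.
    models = []
    for _ in range(3):
        idx = rng.integers(0, len(X_lab), size=len(X_lab))
        models.append(clone(base).fit(X_lab[idx], y_lab[idx]))
    for _ in range(rounds):
        preds = [m.predict(X_unlab) for m in models]
        changed = False
        for i in range(3):
            j, k = [m for m in range(3) if m != i]
            # Pseudo-label the unlabeled points on which the other two agree.
            agree = preds[j] == preds[k]
            if agree.any():
                X_aug = np.vstack([X_lab, X_unlab[agree]])
                y_aug = np.concatenate([y_lab, preds[j][agree]])
                models[i] = clone(base).fit(X_aug, y_aug)
                changed = True
        if not changed:  # no model received new pseudo-labels this round
            break
    return models


def vote(models, X):
    # Final label = majority vote of the three models (labels assumed 0..K-1).
    votes = np.stack([m.predict(X) for m in models]).astype(int)
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
```

The multi-task variant the paper proposes reduces time and space complexity by sharing parameters across the three models rather than training them fully independently as above.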
| Original language | English |
|---|---|
| Title | Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics |
| Publisher | Association for Computational Linguistics |
| Publication date | 2018 |
| Status | Published - 2018 |
| Event | The 56th Annual Meeting of the Association for Computational Linguistics - Melbourne, Australia. Duration: 15 Jul 2018 → 20 Jul 2018. http://acl2018.org/ |
Conference
| Conference | The 56th Annual Meeting of the Association for Computational Linguistics |
|---|---|
| Location | Melbourne |
| Country/Territory | Australia |
| City | Melbourne |
| Period | 15/07/2018 → 20/07/2018 |
| Internet address | http://acl2018.org/ |
Keywords
- Neural models
- Domain shift
- Bootstrapping approaches
- Tri-training
- Sentiment analysis