Abstract
Novel neural models have been proposed in recent years for learning under domain shift. Most models, however, evaluate on only a single task, use proprietary datasets, or compare against weak baselines, which makes it difficult to compare models. In this paper, we re-evaluate classic general-purpose bootstrapping approaches in the context of neural networks under domain shift against recent neural approaches, and propose a novel multi-task tri-training method that reduces the time and space complexity of classic tri-training. Extensive experiments on two benchmarks are negative: while our novel method establishes a new state of the art for sentiment analysis, it does not consistently fare best. More importantly, we arrive at the somewhat surprising conclusion that classic tri-training, with some additions, outperforms the state of the art. We conclude that classic approaches constitute an important and strong baseline.
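For readers unfamiliar with the classic tri-training the abstract re-evaluates, the core loop can be sketched as follows. This is a minimal illustration, not the paper's implementation: the toy 1-nearest-neighbour base learner, the single scalar feature, the deterministic leave-one-out splits (used in place of bootstrap sampling for reproducibility), and the fixed round count are all assumptions, and the error-rate guards of the full algorithm are omitted.

```python
# Sketch of classic tri-training: three models are trained on different
# subsets of the labeled data; an unlabeled point is pseudo-labeled for
# model i whenever the other two models agree on its label.

def nn_classifier(train):
    """1-nearest-neighbour on one scalar feature; stands in for any base learner."""
    def predict(x):
        return min(train, key=lambda ex: abs(ex[0] - x))[1]
    return predict

def tri_train(labeled, unlabeled, rounds=3):
    # Diversify the three models by leaving out one labeled example each
    # (an assumption for determinism; the classic algorithm bootstraps).
    views = [[ex for k, ex in enumerate(labeled) if k != i] for i in range(3)]
    for _ in range(rounds):
        models = [nn_classifier(v) for v in views]
        for i in range(3):
            j, k = [m for m in range(3) if m != i]
            for x in unlabeled:
                # If the other two models agree, add the point with
                # that pseudo-label to model i's training set.
                if models[j](x) == models[k](x):
                    views[i].append((x, models[j](x)))
    return nn_classifier(views[0])  # sketch: return the first model

labeled = [(0.0, "neg"), (0.2, "neg"), (0.8, "pos"), (1.0, "pos")]
model = tri_train(labeled, unlabeled=[0.1, 0.9])
```

The agreement test is what makes tri-training general-purpose: it needs no model-specific confidence scores, which is why it transfers so directly to neural models. The paper's multi-task variant reduces cost by sharing parameters across the three models.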
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics |
| Publisher | Association for Computational Linguistics |
| Publication date | 2018 |
| Publication status | Published - 2018 |
| Event | 56th Annual Meeting of the Association for Computational Linguistics, Melbourne, Australia, 15 Jul 2018 → 20 Jul 2018, http://acl2018.org/ |
Conference
| Conference | Annual Meeting of the Association for Computational Linguistics |
|---|---|
| Number | 56 |
| Location | Melbourne |
| Country/Territory | Australia |
| City | Melbourne |
| Period | 15/07/2018 → 20/07/2018 |
| Internet address | http://acl2018.org/ |
Keywords
- Neural models
- Domain shift
- Bootstrapping approaches
- Tri-training
- Sentiment analysis
Fingerprint
Dive into the research topics of 'Strong Baselines for Neural Semi-Supervised Learning under Domain Shift'. Together they form a unique fingerprint.