Neural Unsupervised Domain Adaptation in NLP—A Survey
Research output: Article in proceedings › Research › peer-review
Motivated by the latest advances, in this survey we review neural unsupervised domain adaptation techniques that do not require labeled target domain data. This is a more challenging yet more widely applicable setup. We outline methods, from early traditional non-neural approaches to pre-trained model transfer. We also revisit the notion of domain, and we uncover a bias in the types of Natural Language Processing tasks that have received the most attention. Lastly, we outline future directions, particularly the broader need for out-of-distribution generalization in future NLP.
Title of host publication: The 28th International Conference on Computational Linguistics
Publisher: Association for Computational Linguistics
Publication date: Dec 2020
Publication status: Published - Dec 2020