Abstract
Transfer learning, particularly approaches that combine multi-task learning with pretrained contextualized embeddings and fine-tuning, has advanced the field of Natural Language Processing tremendously in recent years. In this paper we present MaChAmp, a toolkit for easy fine-tuning of contextualized embeddings in multi-task settings. The benefits of MaChAmp are its flexible configuration options and its support for a variety of natural language processing tasks in a uniform toolkit, from text classification and sequence labeling to dependency parsing, masked language modeling, and text generation.
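The "flexible configuration options" mentioned above refer to MaChAmp's JSON-based dataset configuration, in which each dataset lists its data files and one or more tasks. The sketch below illustrates the general shape of such a file for a single POS-tagging task; the exact keys and their spelling should be verified against the current MaChAmp repository, as the schema may have changed since publication.

```json
{
    "UD_EWT": {
        "train_data_path": "data/ewt.train",
        "dev_data_path": "data/ewt.dev",
        "word_idx": 1,
        "tasks": {
            "upos": {
                "task_type": "seq",
                "column_idx": 3
            }
        }
    }
}
```

Adding a second task under `"tasks"` (or a second dataset at the top level) is what turns a single-task run into a multi-task one, with all tasks sharing the same pretrained encoder.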
Original language | English |
---|---|
Title | Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: System Demonstrations |
Publisher | Association for Computational Linguistics |
Publication date | 2021 |
Pages | 176-197 |
DOI | |
Status | Published - 2021 |
Keywords
- Transfer Learning
- Multi-Task Learning
- Pretrained Contextualized Embeddings
- Fine-Tuning
- Natural Language Processing Toolkit
Prizes
- Outstanding paper award, EACL 2021 demo track
van der Goot, R. (Recipient), Üstün, A. (Recipient), Ramponi, A. (Recipient), Sharaf, I. (Recipient) & Plank, B. (Recipient), 2021
Prize: Prizes, scholarships, appointments
Projects
- 1 Completed
- Multi-Task Sequence Labeling Under Adverse Conditions
Plank, B. (PI) & van der Goot, R. (CoI)
01/04/2019 → 31/08/2020
Projects: Project › Other