Abstract
Transfer learning, particularly approaches that combine multi-task learning with pretrained contextualized embeddings and fine-tuning, has advanced the field of Natural Language Processing tremendously in recent years. In this paper we present MaChAmp, a toolkit for easy fine-tuning of contextualized embeddings in multi-task settings. The benefits of MaChAmp are its flexible configuration options and its support for a variety of natural language processing tasks in a uniform toolkit, from text classification and sequence labeling to dependency parsing, masked language modeling, and text generation.
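As a concrete illustration of the configuration-driven setup the abstract describes, below is a minimal sketch of a MaChAmp-style dataset configuration that combines a sequence labeling task (POS tagging) with dependency parsing on a single dataset. The dataset name, file paths, and column indices here are hypothetical, and exact key names may differ across toolkit versions; the MaChAmp repository documents the authoritative format.

```json
{
    "UD_EWT": {
        "train_data_path": "data/ud-ewt.train.conllu",
        "validation_data_path": "data/ud-ewt.dev.conllu",
        "word_idx": 1,
        "tasks": {
            "upos": {
                "task_type": "seq",
                "column_idx": 3
            },
            "dependency": {
                "task_type": "dependency",
                "column_idx": 6
            }
        }
    }
}
```

Training on such a configuration is then launched with a single command along the lines of `python3 train.py --dataset_config configs/ud-ewt.json` (the flag name is an assumption here), with the toolkit handling the shared encoder and the task-specific decoders.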
| Original language | English |
| --- | --- |
| Title of host publication | Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: System Demonstrations |
| Publisher | Association for Computational Linguistics |
| Publication date | 2021 |
| Pages | 176-197 |
| Publication status | Published - 2021 |
Prizes
- Outstanding paper award, EACL 2021 demo track
  van der Goot, Rob (Recipient), Üstün, Ahmet (Recipient), Ramponi, Alan (Recipient), Sharaf, Ibrahim (Recipient) & Plank, Barbara (Recipient), 2021
  Prize: Prizes, scholarships, distinctions