Massive Choice, Ample Tasks (MaChAmp): A Toolkit for Multi-task Learning in NLP

Rob van der Goot, Ahmet Üstün, Alan Ramponi, Ibrahim Sharaf, Barbara Plank

Research output: Conference Article in Proceeding or Book/Report chapter › Article in proceedings › Research › peer-review

Abstract

Transfer learning, particularly approaches that combine multi-task learning with pretrained contextualized embeddings and fine-tuning, has advanced the field of Natural Language Processing tremendously in recent years. In this paper, we present MaChAmp, a toolkit for easy fine-tuning of contextualized embeddings in multi-task settings. The benefits of MaChAmp are its flexible configuration options and its support for a variety of natural language processing tasks in a uniform toolkit, from text classification and sequence labeling to dependency parsing, masked language modeling, and text generation.
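The "flexible configuration options" mentioned above refer to MaChAmp's JSON dataset configurations, which map columns of a tab-separated input file to one or more tasks trained jointly. Below is a minimal sketch, written as a short Python script that emits such a configuration file. The key names (train_data_path, validation_data_path, word_idx, tasks, task_type, column_idx) follow the examples in the paper, but the dataset name, file paths, and the train.py flag are assumptions; consult the MaChAmp repository for the authoritative format.

    import json
    import os

    # Hypothetical MaChAmp dataset configuration: one dataset ("UD_EWT") read
    # as CoNLL-like columns, with two tasks trained jointly over its columns.
    config = {
        "UD_EWT": {
            "train_data_path": "data/ewt.train",      # tab-separated training file (path assumed)
            "validation_data_path": "data/ewt.dev",   # development file (path assumed)
            "word_idx": 1,                            # column holding the input words
            "tasks": {
                # Sequence labeling over column 3 (e.g., UPOS tags)
                "upos": {"task_type": "seq", "column_idx": 3},
                # Lemmatization as a string-to-string transformation of column 2
                "lemma": {"task_type": "string2string", "column_idx": 2},
            },
        }
    }

    os.makedirs("configs", exist_ok=True)
    with open("configs/upos_lemma.json", "w") as f:
        json.dump(config, f, indent=4)

    # Training would then be a single command along the lines of
    #   python3 train.py --dataset_configs configs/upos_lemma.json
    # (flag name as recalled from the MaChAmp documentation; verify locally).

Because each task only declares a type and a column, adding or removing tasks amounts to editing this file rather than changing any code, which is the uniform-toolkit design the abstract describes.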
Original language: English
Title of host publication: Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: System Demonstrations
Publisher: Association for Computational Linguistics
Publication date: 2021
Pages: 176-197
Publication status: Published - 2021

Keywords

  • Transfer Learning
  • Multi-Task Learning
  • Pretrained Contextualized Embeddings
  • Fine-Tuning
  • Natural Language Processing Toolkit
