Massive Choice, Ample Tasks (MaChAmp): A Toolkit for Multi-task Learning in NLP

Rob van der Goot, Ahmet Üstün, Alan Ramponi, Ibrahim Sharaf, Barbara Plank

Research output: Conference article in proceedings › Research › peer-review


Transfer learning, particularly approaches that combine multi-task learning with pretrained contextualized embeddings and fine-tuning, has advanced the field of Natural Language Processing (NLP) tremendously in recent years. In this paper we present MaChAmp, a toolkit for the easy fine-tuning of contextualized embeddings in multi-task settings. The benefits of MaChAmp are its flexible configuration options and its support for a variety of NLP tasks in a uniform toolkit, from text classification and sequence labeling to dependency parsing, masked language modeling, and text generation.
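MaChAmp is driven by JSON dataset configuration files that declare which tasks are trained jointly on a given corpus. As a minimal sketch only, the snippet below builds such a configuration as a Python dict and serializes it; the dataset name and file paths are hypothetical, and the exact field names should be checked against the toolkit's documentation.

```python
import json

# Hedged sketch of a MaChAmp-style multi-task dataset configuration.
# "UD_English-EWT" and the data paths are illustrative placeholders;
# treat the exact keys as assumptions, not an authoritative schema.
config = {
    "UD_English-EWT": {
        "train_data_path": "data/ewt.train",      # CoNLL-style training file
        "validation_data_path": "data/ewt.dev",   # development file
        "word_idx": 1,                            # column holding the input words
        "tasks": {
            # Two tasks learned jointly over the same input columns:
            "upos": {"task_type": "seq", "column_idx": 3},  # sequence labeling
            "deps": {"task_type": "dependency", "column_idx": 6},  # parsing
        },
    }
}

# Serialize to the JSON form a configuration file would contain.
print(json.dumps(config, indent=2))
```

The point of the sketch is the structure: one top-level entry per dataset, with a `tasks` mapping that lets several task types (here sequence labeling and dependency parsing) share one encoder during fine-tuning.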
Original language: English
Title of host publication: Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: System Demonstrations
Publisher: Association for Computational Linguistics
Publication date: 2021
Publication status: Published - 2021


  • Outstanding paper award, EACL 2021 demo track
    van der Goot, Rob (Recipient), Üstün, Ahmet (Recipient), Ramponi, Alan (Recipient), Sharaf, Ibrahim (Recipient) & Plank, Barbara (Recipient), 2021

