ITU

Cross-lingual Multi-task Transfer for Zero-shot Task-oriented Dialog

Research output: Contribution to conference - NOT published in proceeding or journal › Conference abstract for conference › Research › peer-review


Digital assistants are becoming an integral part of everyday life. However, commercial digital assistants are available for only a limited set of languages, so a vast number of people cannot use these devices in their native tongue.
In this work, we focus on two core tasks within the digital assistant pipeline: intent classification and slot detection. Intent classification recovers the goal of the utterance, whereas slot detection identifies important properties regarding this goal. Besides introducing a novel cross-lingual dataset for these tasks, covering 11 languages, we evaluate a variety of approaches: 1) multilingually pretrained transformer-based models, 2) supplementing these models with auxiliary tasks to evaluate whether multi-task learning is beneficial, and 3) annotation transfer with neural machine translation.
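To make the two tasks concrete, here is a minimal sketch of how a single utterance is annotated: one intent label for the whole utterance, and one BIO slot tag per token. The utterance, the intent label set, and the slot names are hypothetical illustrations, not taken from the dataset described above.

```python
# Hypothetical annotated utterance: intent classification assigns one
# label to the whole utterance; slot detection tags each token.
utterance = ["wake", "me", "up", "at", "nine", "am", "on", "friday"]

# Intent label for the entire utterance (hypothetical label set).
intent = "alarm/set_alarm"

# One BIO tag per token: "B-" opens a slot span, "I-" continues it,
# and "O" marks tokens outside any slot.
slots = ["O", "O", "O", "O", "B-datetime", "I-datetime", "I-datetime", "I-datetime"]


def extract_slots(tokens, tags):
    """Group contiguous B-/I- tagged tokens into (slot_type, text) spans."""
    spans = []
    current_type, current_tokens = None, []
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            # A new span starts; close any span in progress first.
            if current_type is not None:
                spans.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = tag[2:], [token]
        elif tag.startswith("I-") and current_type == tag[2:]:
            # Continuation of the current span.
            current_tokens.append(token)
        else:
            # "O" tag (or an inconsistent "I-") ends the current span.
            if current_type is not None:
                spans.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = None, []
    if current_type is not None:
        spans.append((current_type, " ".join(current_tokens)))
    return spans


print(intent)                              # alarm/set_alarm
print(extract_slots(utterance, slots))     # [('datetime', 'nine am on friday')]
```

A model for these tasks would predict `intent` from the full utterance and the `slots` sequence token by token; the span extraction above only shows how the token-level tags map back to the "important properties" the abstract mentions.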
Original language: English
Publication date: 25 Sep 2021
Publication status: Published - 25 Sep 2021
Event: RESOURCEFUL-2020: RESOURCEs and representations For Under-resourced Languages and domains - Gothenburg, Sweden
Duration: 25 Nov 2020 → …
Internet address: https://gu-clasp.github.io/resourceful-2020/

