Mention Attention for Pronoun Translation

Gongbo Tang, Christian Hardmeier

Research output: Article in proceedings › Research › peer-review

Abstract

Most pronouns are referring expressions: computers need to resolve what each pronoun refers to, and pronoun usage diverges across languages. Dealing with these divergences makes pronoun translation a challenge in machine translation. Mentions are the referring candidates of pronouns and are more closely related to pronouns than general tokens are. We assume that extracting additional mention features can help pronoun translation. Therefore, we introduce an additional mention attention module in the decoder that pays extra attention to source mentions rather than to non-mention tokens. Our mention attention module not only extracts features from source mentions, but also considers target-side context, which benefits pronoun translation. In addition, we introduce two mention classifiers to train models to recognize mentions, whose outputs guide the mention attention. We conduct experiments on the WMT17 English–German translation task and evaluate our models on general translation and pronoun translation, using BLEU, APT, and contrastive evaluation metrics. Our proposed model outperforms the baseline Transformer model in terms of APT and BLEU scores. This confirms our hypothesis that pronoun translation can be improved by paying additional attention to source mentions, and shows that the added modules have no negative effect on general translation quality.
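The core idea above, attending only to source positions marked as mentions, can be illustrated with a minimal sketch. This is not the authors' implementation; the function and parameter names (`mention_attention`, `mention_mask`) are hypothetical, and it shows plain scaled dot-product attention in which non-mention source positions are masked out, so all attention weight is distributed over mentions only.

```python
import math

def softmax(xs):
    """Numerically stable softmax; -inf scores get zero weight."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def mention_attention(query, keys, values, mention_mask):
    """Scaled dot-product attention restricted to mention positions.

    query:        decoder-side vector (list of floats)
    keys, values: one vector per source token
    mention_mask: True where the source token is (part of) a mention;
                  non-mention positions receive a score of -inf.
    """
    d = len(query)
    scores = []
    for k, is_mention in zip(keys, mention_mask):
        if is_mention:
            scores.append(sum(q * ki for q, ki in zip(query, k)) / math.sqrt(d))
        else:
            scores.append(float("-inf"))  # masked out: weight becomes 0
    weights = softmax(scores)
    # Weighted sum of value vectors over mention positions only.
    out = [0.0] * len(values[0])
    for w, v in zip(weights, values):
        for i, vi in enumerate(v):
            out[i] += w * vi
    return out, weights
```

In the paper's setting the mask would come from the two mention classifiers rather than being given; here it is simply passed in to keep the sketch self-contained.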
Original language: English
Title of host publication: ACM International Conference Proceeding Series
Number of pages: 4
Publisher: Association for Computing Machinery
Publication date: 2023
Pages: 161–164
ISBN (Print): 9798400707704
DOIs
Publication status: Published - 2023
Event: 2023 International Joint Conference on Robotics and Artificial Intelligence, Shanghai, China
Duration: 7 Jul 2023 – 9 Jul 2023


Keywords

  • Mention attention
  • Pronoun translation
