Multi-Document Summarization with Centroid-Based Pretraining.

Ratish Surendran Puduppully, Parag Jain, Nancy Chen, Mark Steedman

Publication: Conference article in proceedings › Research › Peer-reviewed

Abstract

In Multi-Document Summarization (MDS), the input can be modeled as a set of documents, and the output is its summary. In this paper, we focus on pretraining objectives for MDS. Specifically, we introduce a novel pretraining objective, which involves selecting the ROUGE-based centroid of each document cluster as a proxy for its summary. Our objective thus does not require human-written summaries and can be utilized for pretraining on a dataset consisting solely of document sets. Through zero-shot, few-shot, and fully supervised experiments on multiple MDS datasets, we show that our model Centrum is better than or comparable to a state-of-the-art model. We make the pretrained and fine-tuned models freely available to the research community.
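For illustration, a minimal sketch of the centroid selection the abstract describes: within each cluster, the document with the highest average ROUGE against the remaining documents is taken as the proxy summary. The simplified ROUGE-1 F1 metric and the helper names (rouge1_f1, select_centroid) below are assumptions for exposition, not the paper's implementation, which may combine other ROUGE variants.

```python
from collections import Counter

def rouge1_f1(candidate: str, reference: str) -> float:
    """Simplified ROUGE-1: unigram-overlap F1 between two texts."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

def select_centroid(cluster: list[str]) -> str:
    """Pick the document with the highest average ROUGE against the
    rest of its cluster; it serves as the proxy (pseudo) summary.
    Assumes the cluster holds at least two documents."""
    def avg_score(i: int) -> float:
        others = [doc for j, doc in enumerate(cluster) if j != i]
        return sum(rouge1_f1(cluster[i], doc) for doc in others) / len(others)
    centroid = max(range(len(cluster)), key=avg_score)
    return cluster[centroid]

# Usage sketch: the centroid becomes the pretraining target, and the
# remaining documents in the cluster form the model input.
cluster = [
    "a severe storm hit the coast overnight",
    "the coastal storm caused severe damage overnight",
    "stocks rallied after the earnings report",
]
print(select_centroid(cluster))
```

Because the target is derived from the cluster itself, any collection of document sets can serve as pretraining data, which is what removes the need for human-written summaries.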
Original language: English
Title: Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Publisher: Association for Computational Linguistics
Publication date: 2023
Pages: 128-138
DOI
Status: Published - 2023
Published externally: Yes
