Data-to-text Generation with Variational Sequential Planning.

Research output: Journal article or conference article in journal › Journal article › Research › peer-review

Abstract

We consider the task of data-to-text generation, which aims to create textual output from non-linguistic input. We focus on generating long-form text, that is, documents with multiple paragraphs, and propose a neural model enhanced with a planning component responsible for organizing high-level information in a coherent and meaningful way. We infer latent plans sequentially with a structured variational model, while interleaving the steps of planning and generation. Text is generated by conditioning on previous variational decisions and previously generated text. Experiments on two data-to-text benchmarks (RotoWire and MLB) show that our model outperforms strong baselines and is sample-efficient in the face of limited training data (e.g., a few hundred instances).
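The abstract describes an architecture that interleaves a latent planning step with paragraph-level generation, conditioning each step on previous plan decisions and previously generated text. The sketch below is an illustrative, minimal rendering of that idea, not the authors' released code: the class and variable names (SequentialPlanGenerator, PlanInference-style components), the use of a pooled record encoding, a Gumbel-softmax relaxation for the discrete plan variable, and all dimensions are assumptions chosen only to make the interleaving of planning and generation concrete.

```python
# Minimal, illustrative sketch (NOT the paper's implementation) of interleaved
# sequential planning and generation with a discrete latent plan variable.
# All names and hyperparameters here are assumptions made for this example.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SequentialPlanGenerator(nn.Module):
    def __init__(self, record_dim=64, hidden_dim=128, vocab_size=1000, num_plans=16):
        super().__init__()
        self.record_enc = nn.Linear(record_dim, hidden_dim)              # encode input records
        self.plan_rnn = nn.GRUCell(2 * hidden_dim, hidden_dim)           # tracks planning history
        self.plan_logits = nn.Linear(hidden_dim, num_plans)              # distribution over latent plans z_t
        self.plan_emb = nn.Embedding(num_plans, hidden_dim)              # embedding of the chosen plan
        self.decoder = nn.GRU(hidden_dim, hidden_dim, batch_first=True)  # paragraph decoder
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, records, num_paragraphs=3, para_len=20):
        # records: (batch, num_records, record_dim), e.g. box-score entries
        batch = records.size(0)
        ctx = self.record_enc(records).mean(dim=1)            # pooled record context
        h_plan = torch.zeros(batch, ctx.size(1))
        text_summary = torch.zeros_like(ctx)                  # summary of text generated so far
        outputs, plans = [], []
        for _ in range(num_paragraphs):
            # Planning step: pick a latent plan z_t given records and generation history.
            h_plan = self.plan_rnn(torch.cat([ctx, text_summary], dim=-1), h_plan)
            logits = self.plan_logits(h_plan)
            z = F.gumbel_softmax(logits, tau=1.0, hard=True)  # differentiable discrete sample
            plans.append(z.argmax(dim=-1))
            plan_vec = z @ self.plan_emb.weight               # embedding of the sampled plan
            # Generation step: decode one paragraph conditioned on the sampled plan.
            dec_in = plan_vec.unsqueeze(1).repeat(1, para_len, 1)
            dec_out, _ = self.decoder(dec_in)
            outputs.append(self.out(dec_out))
            # Feed the generated paragraph back into the next planning decision.
            text_summary = dec_out.mean(dim=1)
        return torch.stack(outputs, dim=1), torch.stack(plans, dim=1)

model = SequentialPlanGenerator()
records = torch.randn(2, 10, 64)            # toy batch: 2 games, 10 records each
token_logits, plan_ids = model(records)
print(token_logits.shape, plan_ids.shape)   # (2, 3, 20, 1000), (2, 3)
```

In the paper's formulation the plan variables are inferred with a structured variational model trained against an ELBO-style objective, with the inference network also conditioned on the gold text; the Gumbel-softmax sampling above merely stands in for that inference procedure so the planning-then-generation loop can run end to end in a self-contained example.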
Original language: English
Journal: Transactions of the Association for Computational Linguistics
Volume: 10
Pages (from-to): 697-715
ISSN: 2307-387X
DOIs
Publication status: Published - 2022
Externally published: Yes
