Abstract
The current state-of-the-art in neural graph-based parsing uses only approximate decoding at the training phase. In this paper we aim to understand this result better. We show how recurrent models can carry out projective maximum spanning tree decoding. This result holds for both current state-of-the-art models for shift-reduce and graph-based parsers, projective or not. We also provide the first proof on the lower bounds of projective maximum spanning tree, DAG, and digraph decoding.
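The abstract refers to projective maximum spanning tree decoding; the standard exhaustive algorithm for this problem is Eisner's O(n³) dynamic program over complete and incomplete spans. Below is a minimal NumPy sketch for orientation only: it is not the paper's recurrent-model construction, and the function name `eisner_decode` and the `scores[h, m]` layout are illustrative assumptions.

```python
import numpy as np

def eisner_decode(scores):
    """Projective maximum spanning tree decoding via Eisner's O(n^3)
    dynamic program. scores[h, m] is the arc score for head h -> modifier m,
    with position 0 acting as the artificial root. Returns an array `heads`
    where heads[m] is the predicted head of token m (heads[0] is unused)."""
    n = scores.shape[0]
    # complete[s, t, d] / incomplete[s, t, d]: best score of span s..t
    # headed at its left end (d = 1) or its right end (d = 0).
    complete = np.zeros((n, n, 2))
    incomplete = np.full((n, n, 2), -np.inf)
    com_bp = np.zeros((n, n, 2), dtype=int)  # split backpointers
    inc_bp = np.zeros((n, n, 2), dtype=int)

    for k in range(1, n):                    # span width
        for s in range(n - k):
            t = s + k
            # Incomplete spans: add arc s -> t or t -> s over the best split.
            r = max(range(s, t),
                    key=lambda r: complete[s, r, 1] + complete[r + 1, t, 0])
            base = complete[s, r, 1] + complete[r + 1, t, 0]
            incomplete[s, t, 0] = base + scores[t, s]   # head at t
            incomplete[s, t, 1] = base + scores[s, t]   # head at s
            inc_bp[s, t, 0] = inc_bp[s, t, 1] = r
            # Complete spans: absorb a finished sub-span on each side.
            r = max(range(s, t),
                    key=lambda r: complete[s, r, 0] + incomplete[r, t, 0])
            complete[s, t, 0] = complete[s, r, 0] + incomplete[r, t, 0]
            com_bp[s, t, 0] = r
            r = max(range(s + 1, t + 1),
                    key=lambda r: incomplete[s, r, 1] + complete[r, t, 1])
            complete[s, t, 1] = incomplete[s, r, 1] + complete[r, t, 1]
            com_bp[s, t, 1] = r

    heads = np.zeros(n, dtype=int)

    def backtrack(s, t, d, is_complete):
        if s == t:
            return
        if is_complete:
            r = com_bp[s, t, d]
            if d == 1:
                backtrack(s, r, 1, False)
                backtrack(r, t, 1, True)
            else:
                backtrack(s, r, 0, True)
                backtrack(r, t, 0, False)
        else:
            heads[t if d == 1 else s] = s if d == 1 else t  # record the arc
            r = inc_bp[s, t, d]
            backtrack(s, r, 1, True)
            backtrack(r + 1, t, 0, True)

    backtrack(0, n - 1, 1, True)             # full sentence, headed at root
    return heads

# Toy usage: an artificial root plus three tokens with random arc scores.
rng = np.random.default_rng(0)
print(eisner_decode(rng.standard_normal((4, 4))))  # heads[1:] are predicted heads
```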
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies |
| Number of pages | 10 |
| Volume | Volume 1 (Long and Short Papers) |
| Publisher | Association for Computational Linguistics |
| Publication date | 2019 |
| Pages | 251-260 |
| DOIs | |
| Publication status | Published - 2019 |
Keywords
- Neural Graph-based Parsing
- Approximate Decoding
- Recurrent Models
- Projective Maximum Spanning Tree
- Shift-Reduce Parsers