Abstract
The current state of the art in neural graph-based parsing uses only approximate decoding at the training phase. In this paper we aim to understand this result better. We show how recurrent models can carry out projective maximum spanning tree decoding. This result holds for both current state-of-the-art models for shift-reduce and graph-based parsers, projective or not. We also provide the first proof of lower bounds for projective maximum spanning tree, DAG, and digraph decoding.
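The abstract refers to projective maximum spanning tree decoding. For readers unfamiliar with the problem, below is a minimal sketch of the classical Eisner dynamic program, the standard O(n³) exact decoder for projective dependency trees. This is textbook background, not the paper's construction; the function name and score-matrix convention are illustrative assumptions.

```python
import numpy as np

def eisner_decode(scores):
    """Return the highest-scoring projective dependency tree.

    scores[h][m] is the score of an arc from head h to modifier m;
    index 0 is an artificial root. heads[m] gives the head of token m.
    Standard O(n^3) Eisner dynamic program over complete/incomplete spans.
    """
    n = scores.shape[0]  # number of nodes, including the root
    NEG = float("-inf")
    # d = 0: left-headed span (head at right end t); d = 1: right-headed (head at s)
    complete = np.full((n, n, 2), NEG)
    incomplete = np.full((n, n, 2), NEG)
    bp_c = np.zeros((n, n, 2), dtype=int)  # split-point backpointers
    bp_i = np.zeros((n, n, 2), dtype=int)

    for s in range(n):  # single-word spans score zero
        complete[s, s, 0] = complete[s, s, 1] = 0.0

    for k in range(1, n):
        for s in range(n - k):
            t = s + k
            # incomplete spans: add the arc between s and t
            best = NEG
            for r in range(s, t):
                val = complete[s, r, 1] + complete[r + 1, t, 0]
                if val > best:
                    best = val
                    bp_i[s, t, 0] = bp_i[s, t, 1] = r
            incomplete[s, t, 0] = best + scores[t, s]  # head t -> modifier s
            incomplete[s, t, 1] = best + scores[s, t]  # head s -> modifier t
            # complete spans: extend an incomplete span with a complete one
            for r in range(s, t):
                val = complete[s, r, 0] + incomplete[r, t, 0]
                if val > complete[s, t, 0]:
                    complete[s, t, 0], bp_c[s, t, 0] = val, r
            for r in range(s + 1, t + 1):
                val = incomplete[s, r, 1] + complete[r, t, 1]
                if val > complete[s, t, 1]:
                    complete[s, t, 1], bp_c[s, t, 1] = val, r

    heads = [0] * n

    def backtrack(s, t, d, comp):
        if s == t:
            return
        if comp:
            r = bp_c[s, t, d]
            if d == 0:
                backtrack(s, r, 0, True); backtrack(r, t, 0, False)
            else:
                backtrack(s, r, 1, False); backtrack(r, t, 1, True)
        else:
            heads[s if d == 0 else t] = t if d == 0 else s  # record the arc
            r = bp_i[s, t, d]
            backtrack(s, r, 1, True); backtrack(r + 1, t, 0, True)

    backtrack(0, n - 1, 1, True)  # root spans the whole sentence
    return heads
```

The non-projective variant of the same problem is solved by the Chu-Liu/Edmonds algorithm instead; the projective decoder above is the combinatorial problem whose lower bounds the paper analyzes.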
| Original language | English |
| --- | --- |
| Title | Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies |
| Number of pages | 10 |
| Volume | Volume 1 (Long and Short Papers) |
| Publisher | Association for Computational Linguistics |
| Publication date | 2019 |
| Pages | 251-260 |
| DOI | |
| Status | Published - 2019 |
Keywords
- Neural Graph-based Parsing
- Approximate Decoding
- Recurrent Models
- Projective Maximum Spanning Tree
- Shift-Reduce Parsers