Recurrent models and lower bounds for projective syntactic decoding

Publication: Conference article in proceedings or book/report chapter › Conference contribution in proceedings › Research › peer-reviewed

Abstract

The current state-of-the-art in neural graph-based parsing uses only approximate decoding at the training phase. In this paper we aim to understand this result better. We show how recurrent models can carry out projective maximum spanning tree decoding. This result holds for current state-of-the-art shift-reduce and graph-based parsers, projective or not. We also provide the first proof of lower bounds for projective maximum spanning tree, DAG, and digraph decoding.
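For context on the decoding task named in the abstract: projective maximum spanning tree decoding is classically solved exactly by Eisner's O(n^3) dynamic program over complete and incomplete spans. Below is a minimal NumPy sketch of that classical decoder as a reference point; the function name eisner_decode and the scores[head, modifier] convention are illustrative assumptions, and this is not the recurrent construction or the lower-bound argument from the paper.

```python
import numpy as np

def eisner_decode(scores):
    """Exact projective MST decoding with Eisner's O(n^3) dynamic program.

    scores[h, m] is the score of the arc h -> m; index 0 is an artificial
    root. Returns heads[m] = head of token m (heads[0] stays -1).
    """
    n = scores.shape[0]
    # Chart items over spans (s, t); direction 0 = head at the right end t,
    # direction 1 = head at the left end s.
    complete = np.zeros((n, n, 2))
    incomplete = np.zeros((n, n, 2))
    complete_bp = -np.ones((n, n, 2), dtype=int)
    incomplete_bp = -np.ones((n, n, 2), dtype=int)

    for k in range(1, n):                  # span width
        for s in range(n - k):             # span start
            t = s + k                      # span end
            # Incomplete items: add the arc t -> s or s -> t over a split r.
            vals = complete[s, s:t, 1] + complete[s + 1:t + 1, t, 0]
            r = int(np.argmax(vals))
            incomplete[s, t, 0] = vals[r] + scores[t, s]
            incomplete[s, t, 1] = vals[r] + scores[s, t]
            incomplete_bp[s, t, :] = s + r
            # Complete items: absorb an already-attached subtree.
            vals = complete[s, s:t, 0] + incomplete[s:t, t, 0]
            r = int(np.argmax(vals))
            complete[s, t, 0] = vals[r]
            complete_bp[s, t, 0] = s + r
            vals = incomplete[s, s + 1:t + 1, 1] + complete[s + 1:t + 1, t, 1]
            r = int(np.argmax(vals))
            complete[s, t, 1] = vals[r]
            complete_bp[s, t, 1] = s + 1 + r

    heads = -np.ones(n, dtype=int)

    def backtrack(s, t, direction, is_complete):
        if s == t:
            return
        if is_complete:
            r = complete_bp[s, t, direction]
            if direction == 0:
                backtrack(s, r, 0, True)
                backtrack(r, t, 0, False)
            else:
                backtrack(s, r, 1, False)
                backtrack(r, t, 1, True)
        else:
            # The incomplete item fixes one arc inside the span.
            heads[s if direction == 0 else t] = t if direction == 0 else s
            r = incomplete_bp[s, t, direction]
            backtrack(s, r, 1, True)
            backtrack(r + 1, t, 0, True)

    backtrack(0, n - 1, 1, True)           # best tree rooted at index 0
    return heads

if __name__ == "__main__":
    # Toy usage: random arc scores for a 3-word sentence plus root.
    rng = np.random.default_rng(0)
    print(eisner_decode(rng.standard_normal((4, 4))))
```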
Original language: English
Title: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Number of pages: 10
Volume: 1 (Long and Short Papers)
Publisher: Association for Computational Linguistics
Publication date: 2019
Pages: 251-260
DOI
Status: Published - 2019

Keywords

  • Neural Graph-based Parsing
  • Approximate Decoding
  • Recurrent Models
  • Projective Maximum Spanning Tree
  • Shift-Reduce Parsers
