ITU

Recurrent models and lower bounds for projective syntactic decoding

Research output: Conference Article in Proceeding or Book/Report chapter › Article in proceedings › Research › peer-review

Standard

Recurrent models and lower bounds for projective syntactic decoding. / Schluter, Natalie.

Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Vol. 1 (Long and Short Papers). Association for Computational Linguistics, 2019. p. 251-260.

Harvard

Schluter, N 2019, Recurrent models and lower bounds for projective syntactic decoding. in Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. vol. 1 (Long and Short Papers), Association for Computational Linguistics, pp. 251-260. https://doi.org/10.18653/v1/N19-1022

APA

Schluter, N. (2019). Recurrent models and lower bounds for projective syntactic decoding. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Vol. 1 (Long and Short Papers), pp. 251-260). Association for Computational Linguistics. https://doi.org/10.18653/v1/N19-1022

Vancouver

Schluter N. Recurrent models and lower bounds for projective syntactic decoding. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Vol. 1 (Long and Short Papers). Association for Computational Linguistics. 2019. p. 251-260. https://doi.org/10.18653/v1/N19-1022

Author

Schluter, Natalie. / Recurrent models and lower bounds for projective syntactic decoding. Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Vol. 1 (Long and Short Papers). Association for Computational Linguistics, 2019. pp. 251-260

Bibtex

@inproceedings{b4ca3ad8e91f444d990b57eb0b7b7510,
title = "Recurrent models and lower bounds for projective syntactic decoding",
abstract = "The current state-of-the-art in neural graph-based parsing uses only approximate decoding at the training phase. In this paper we aim to understand this result better. We show how recurrent models can carry out projective maximum spanning tree decoding. This result holds for both current state-of-the-art models for shift-reduce and graph-based parsers, projective or not. We also provide the first proof on the lower bounds of projective maximum spanning tree, DAG, and digraph decoding.",
author = "Natalie Schluter",
year = "2019",
doi = "10.18653/v1/N19-1022",
language = "English",
volume = "Volume 1 (Long and Short Papers)",
pages = "251--260",
booktitle = "Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies",
publisher = "Association for Computational Linguistics",
address = "United States",

}

RIS

TY - GEN

T1 - Recurrent models and lower bounds for projective syntactic decoding

AU - Schluter, Natalie

PY - 2019

Y1 - 2019

N2 - The current state-of-the-art in neural graph-based parsing uses only approximate decoding at the training phase. In this paper we aim to understand this result better. We show how recurrent models can carry out projective maximum spanning tree decoding. This result holds for both current state-of-the-art models for shift-reduce and graph-based parsers, projective or not. We also provide the first proof on the lower bounds of projective maximum spanning tree, DAG, and digraph decoding.

AB - The current state-of-the-art in neural graph-based parsing uses only approximate decoding at the training phase. In this paper we aim to understand this result better. We show how recurrent models can carry out projective maximum spanning tree decoding. This result holds for both current state-of-the-art models for shift-reduce and graph-based parsers, projective or not. We also provide the first proof on the lower bounds of projective maximum spanning tree, DAG, and digraph decoding.

UR - https://www.aclweb.org/anthology/N19-1022/

U2 - 10.18653/v1/N19-1022

DO - 10.18653/v1/N19-1022

M3 - Article in proceedings

VL - Volume 1 (Long and Short Papers)

SP - 251

EP - 260

BT - Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies

PB - Association for Computational Linguistics

ER -

ID: 84679221