
Strongly Incremental Constituency Parsing with Graph Neural Networks

Author(s): Yang, Kaiyu; Deng, Jia

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1tp0k
Full metadata record

dc.contributor.author: Yang, Kaiyu
dc.contributor.author: Deng, Jia
dc.date.accessioned: 2021-10-08T19:50:43Z
dc.date.available: 2021-10-08T19:50:43Z
dc.date.issued: 2020
dc.identifier.citation: Yang, Kaiyu, and Jia Deng. "Strongly Incremental Constituency Parsing with Graph Neural Networks." Advances in Neural Information Processing Systems 33 (2020): pp. 21687–21698.
dc.identifier.issn: 1049-5258
dc.identifier.uri: https://proceedings.neurips.cc/paper/2020/file/f7177163c833dff4b38fc8d2872f1ec6-Paper.pdf
dc.identifier.uri: http://arks.princeton.edu/ark:/88435/pr1tp0k
dc.description.abstract: Parsing sentences into syntax trees can benefit downstream applications in NLP. Transition-based parsers build trees by executing actions in a state transition system. They are computationally efficient, and can leverage machine learning to predict actions based on partial trees. However, existing transition-based parsers are predominantly based on the shift-reduce transition system, which does not align with how humans are known to parse sentences. Psycholinguistic research suggests that human parsing is strongly incremental—humans grow a single parse tree by adding exactly one token at each step. In this paper, we propose a novel transition system called attach-juxtapose. It is strongly incremental; it represents a partial sentence using a single tree; each action adds exactly one token into the partial tree. Based on our transition system, we develop a strongly incremental parser. At each step, it encodes the partial tree using a graph neural network and predicts an action. We evaluate our parser on Penn Treebank (PTB) and Chinese Treebank (CTB). On PTB, it outperforms existing parsers trained with only constituency trees; and it performs on par with state-of-the-art parsers that use dependency trees as additional training data. On CTB, our parser establishes a new state of the art. Code is available at https://github.com/princeton-vl/attach-juxtapose-parser.
dc.format.extent: 21687 - 21698
dc.language.iso: en_US
dc.relation.ispartof: Advances in Neural Information Processing Systems
dc.rights: Final published version. Article is made available in OAR by the publisher's permission or policy.
dc.title: Strongly Incremental Constituency Parsing with Graph Neural Networks
dc.type: Conference Article
pu.type.symplectic: http://www.symplectic.co.uk/publications/atom-terms/1.0/conference-proceeding
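
The abstract above describes a transition system in which a single partial tree grows by exactly one token per action. The following is a minimal conceptual sketch of what such "attach"- and "juxtapose"-style actions on the rightmost chain of a partial tree could look like. It is reconstructed only from the abstract: the class, function names, and action semantics are illustrative assumptions, not the authors' implementation (their code is at https://github.com/princeton-vl/attach-juxtapose-parser).

# Hypothetical sketch of a strongly incremental transition system.
# Assumption-laden illustration; not the authors' code.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Node:
    label: str                                   # constituent label (e.g. "NP") or a word
    children: List["Node"] = field(default_factory=list)

    def pretty(self) -> str:
        if not self.children:
            return self.label
        return f"({self.label} {' '.join(c.pretty() for c in self.children)})"


def rightmost_chain(root: Node) -> List[Node]:
    """Internal nodes on the path from the root down to the rightmost leaf."""
    chain, node = [], root
    while node.children:
        chain.append(node)
        node = node.children[-1]
    return chain


def attach(root: Node, new_subtree: Node, target_index: int) -> None:
    """Add one new token (possibly wrapped in a new constituent) as the
    rightmost child of a node on the rightmost chain."""
    rightmost_chain(root)[target_index].children.append(new_subtree)


def juxtapose(root: Node, new_subtree: Node, target_index: int, new_label: str) -> None:
    """Wrap a node on the rightmost chain in a new constituent whose children
    are that node and the new token (creating a sibling for the target)."""
    target = rightmost_chain(root)[target_index]
    old_copy = Node(target.label, target.children)
    # Mutate the target in place so its parent still points at it.
    target.label, target.children = new_label, [old_copy, new_subtree]


if __name__ == "__main__":
    # Grow a tree for "she reads books and papers", one token per action.
    tree = Node("S", [Node("NP", [Node("she")])])
    attach(tree, Node("VP", [Node("reads")]), target_index=0)          # adds "reads"
    attach(tree, Node("NP", [Node("books")]), target_index=1)          # adds "books"
    juxtapose(tree, Node("and"), target_index=2, new_label="NP")       # adds "and"
    attach(tree, Node("NP", [Node("papers")]), target_index=2)         # adds "papers"
    print(tree.pretty())
    # (S (NP she) (VP reads (NP (NP books) and (NP papers))))

In the paper's parser, as summarized in the abstract, the choice of action at each step is not hand-written as above but predicted by a model that encodes the current partial tree with a graph neural network.
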

Files in This Item:

File: StronglyIncremental.pdf
Size: 348.64 kB
Format: Adobe PDF


Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.