
Position-aware Attention and Supervised Data Improve Slot Filling

Author(s): Zhang, Yuhao; Zhong, Victor; Chen, Danqi; Angeli, Gabor; Manning, Christopher D

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1q266
Full metadata record
DC Field | Value | Language
dc.contributor.author | Zhang, Yuhao | -
dc.contributor.author | Zhong, Victor | -
dc.contributor.author | Chen, Danqi | -
dc.contributor.author | Angeli, Gabor | -
dc.contributor.author | Manning, Christopher D | -
dc.date.accessioned | 2021-10-08T19:50:28Z | -
dc.date.available | 2021-10-08T19:50:28Z | -
dc.date.issued | 2017 | en_US
dc.identifier.citation | Zhang, Yuhao, Victor Zhong, Danqi Chen, Gabor Angeli, and Christopher D. Manning. "Position-aware Attention and Supervised Data Improve Slot Filling." In Conference on Empirical Methods in Natural Language Processing (2017): pp. 35-45. doi:10.18653/v1/D17-1004 | en_US
dc.identifier.uri | https://www.cs.princeton.edu/~danqic/papers/emnlp2017.pdf | -
dc.identifier.uri | http://arks.princeton.edu/ark:/88435/pr1q266 | -
dc.description.abstract | Organized relational knowledge in the form of "knowledge graphs" is important for many applications. However, the ability to populate knowledge bases with facts automatically extracted from documents has improved frustratingly slowly. This paper simultaneously addresses two issues that have held back prior work. We first propose an effective new model, which combines an LSTM sequence model with a form of entity position-aware attention that is better suited to relation extraction. Then we build TACRED, a large (119,474 examples) supervised relation extraction dataset obtained via crowdsourcing and targeted towards TAC KBP relations. The combination of better supervised data and a more appropriate high-capacity model enables much better relation extraction performance. When the model trained on this new dataset replaces the previous relation extraction component of the best TAC KBP 2015 slot filling system, its F1 score increases markedly from 22.2% to 26.7%. | en_US
dc.format.extent | 35 - 45 | en_US
dc.language.iso | en_US | en_US
dc.relation.ispartof | Conference on Empirical Methods in Natural Language Processing | en_US
dc.rights | Author's manuscript | en_US
dc.title | Position-aware Attention and Supervised Data Improve Slot Filling | en_US
dc.type | Conference Article | en_US
dc.identifier.doi | 10.18653/v1/D17-1004 | -
pu.type.symplectic | http://www.symplectic.co.uk/publications/atom-terms/1.0/conference-proceeding | en_US
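
The abstract above describes a model that combines an LSTM sequence encoder with entity position-aware attention for relation extraction. As a rough illustration of that idea, the sketch below scores each LSTM output using the hidden state, a sentence summary vector, and embeddings of each token's position relative to the subject and object entities, then pools the outputs with the resulting attention weights. This is a minimal sketch assuming PyTorch; the layer names, dimensions, and exact scoring function are illustrative assumptions, not the authors' released implementation.

    # Minimal sketch of entity position-aware attention over LSTM outputs.
    # Assumes PyTorch; parameter names and the scoring function are assumptions.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class PositionAwareAttention(nn.Module):
        def __init__(self, hidden_dim, pos_dim, attn_dim):
            super().__init__()
            self.w_h = nn.Linear(hidden_dim, attn_dim, bias=False)   # scores LSTM hidden states
            self.w_q = nn.Linear(hidden_dim, attn_dim, bias=False)   # scores a sentence summary vector
            self.w_p = nn.Linear(2 * pos_dim, attn_dim, bias=False)  # scores subject/object position embeddings
            self.v = nn.Linear(attn_dim, 1, bias=False)

        def forward(self, h, q, subj_pos_emb, obj_pos_emb, pad_mask):
            # h: (batch, seq_len, hidden_dim) LSTM outputs
            # q: (batch, hidden_dim) summary vector, e.g. the final hidden state
            # subj_pos_emb, obj_pos_emb: (batch, seq_len, pos_dim) embeddings of each
            #   token's relative distance to the subject / object entity
            # pad_mask: (batch, seq_len) boolean, True at padding positions
            pos = torch.cat([subj_pos_emb, obj_pos_emb], dim=-1)
            scores = self.v(torch.tanh(
                self.w_h(h) + self.w_q(q).unsqueeze(1) + self.w_p(pos)
            )).squeeze(-1)                                    # (batch, seq_len)
            scores = scores.masked_fill(pad_mask, float('-inf'))
            attn = F.softmax(scores, dim=-1)                  # attention weights over tokens
            return torch.bmm(attn.unsqueeze(1), h).squeeze(1) # (batch, hidden_dim) pooled sentence representation

The pooled representation would then feed a linear classifier over the TAC KBP relation labels; that classification head and the LSTM encoder are omitted here.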

Files in This Item:
File | Description | Size | Format
PositionAwareAttention.pdf | - | 705.93 kB | Adobe PDF


Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.