Cross-Sentence N-ary Relation Extraction with Graph LSTMs
N. Peng, H. Poon, C. Quirk, K. Toutanova, and W. Yih. ACL, (2017). arXiv:1708.03743. Comment: Conditionally accepted by TACL in December 2016; published in April 2017; presented at ACL in August 2017.
Abstract
Past work in relation extraction has focused on binary relations in single
sentences. Recent NLP inroads in high-value domains have sparked interest in
the more general setting of extracting n-ary relations that span multiple
sentences. In this paper, we explore a general relation extraction framework
based on graph long short-term memory networks (graph LSTMs) that can be easily
extended to cross-sentence n-ary relation extraction. The graph formulation
provides a unified way of exploring different LSTM approaches and incorporating
various intra-sentential and inter-sentential dependencies, such as sequential,
syntactic, and discourse relations. A robust contextual representation is
learned for the entities, which serves as input to the relation classifier.
This simplifies handling of relations with arbitrary arity, and enables
multi-task learning with related relations. We evaluate this framework in two
important precision medicine settings, demonstrating its effectiveness with
both conventional supervised learning and distant supervision. Cross-sentence
extraction produced larger knowledge bases, and multi-task learning
significantly improved extraction accuracy. A thorough analysis of various LSTM
approaches yielded useful insight into the impact of linguistic analysis on
extraction accuracy.
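The framework the abstract describes can be sketched roughly as follows. This is a minimal, hypothetical illustration in plain Python/numpy, not the authors' implementation: word nodes are linked by sequential, syntactic, or discourse edges into a document graph; a simplified single-pass gated update (standing in for the paper's graph LSTM) aggregates each node's graph neighbours; and the resulting entity representations feed a relation classifier. All names, sizes, edges, and the single-pass update are illustrative assumptions.

```python
# Hypothetical sketch of the graph-LSTM idea: gated message passing over a
# document graph, then a relation classifier over entity representations.
import numpy as np

rng = np.random.default_rng(0)
D = 8  # hidden size (illustrative)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def graph_lstm_pass(embeddings, edges):
    """One simplified gated pass over a document graph.

    embeddings: (n_nodes, D) word vectors.
    edges: list of (i, j) pairs; j is a graph neighbour of i
           (sequential, dependency, or discourse link).
    """
    n = embeddings.shape[0]
    # Shared parameters for input (i), forget (f), output (o), cell (c) gates.
    W = {g: rng.standard_normal((D, 2 * D)) * 0.1 for g in "ifoc"}
    b = {g: np.zeros(D) for g in "ifoc"}

    neighbours = {i: [] for i in range(n)}
    for i, j in edges:
        neighbours[i].append(j)

    h = np.zeros((n, D))  # hidden states
    c = np.zeros((n, D))  # cell states
    for i in range(n):
        # Aggregate neighbour states (zero vectors if the node has none).
        if neighbours[i]:
            h_nbr = np.mean([h[j] for j in neighbours[i]], axis=0)
            c_nbr = np.mean([c[j] for j in neighbours[i]], axis=0)
        else:
            h_nbr = np.zeros(D)
            c_nbr = np.zeros(D)
        x = np.concatenate([embeddings[i], h_nbr])
        i_g = sigmoid(W["i"] @ x + b["i"])
        f_g = sigmoid(W["f"] @ x + b["f"])
        o_g = sigmoid(W["o"] @ x + b["o"])
        c_t = np.tanh(W["c"] @ x + b["c"])
        c[i] = f_g * c_nbr + i_g * c_t
        h[i] = o_g * np.tanh(c[i])
    return h

def classify_relation(h, entity_nodes, n_relations=3):
    """Concatenate entity representations and score relation types."""
    feats = np.concatenate([h[i] for i in entity_nodes])
    W_cls = rng.standard_normal((n_relations, feats.size)) * 0.1
    logits = W_cls @ feats
    probs = np.exp(logits - logits.max())
    return probs / probs.sum()

# Toy document: 5 word nodes; edges mix sequential links and one
# cross-sentence "discourse" link (4, 0).
emb = rng.standard_normal((5, D))
edges = [(1, 0), (2, 1), (3, 2), (4, 3), (4, 0)]
h = graph_lstm_pass(emb, edges)
# Hypothetical ternary (drug-gene-mutation style) relation over three entities.
probs = classify_relation(h, entity_nodes=[0, 2, 4])
print(probs.shape)
```

Note that where this sketch concatenates a fixed set of entity vectors, the paper's point is that learning per-entity contextual representations is what makes arbitrary arity and multi-task learning straightforward; a real system would also train the parameters rather than sample them.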
Description
Cross-Sentence N-ary Relation Extraction with Graph LSTMs
%0 Conference Paper
%1 peng2017crosssentence
%A Peng, Nanyun
%A Poon, Hoifung
%A Quirk, Chris
%A Toutanova, Kristina
%A Yih, Wen-tau
%B ACL
%D 2017
%K extraction graph lstm network neural pubmed relation
%T Cross-Sentence N-ary Relation Extraction with Graph LSTMs
%U http://arxiv.org/abs/1708.03743
%X Past work in relation extraction has focused on binary relations in single
sentences. Recent NLP inroads in high-value domains have sparked interest in
the more general setting of extracting n-ary relations that span multiple
sentences. In this paper, we explore a general relation extraction framework
based on graph long short-term memory networks (graph LSTMs) that can be easily
extended to cross-sentence n-ary relation extraction. The graph formulation
provides a unified way of exploring different LSTM approaches and incorporating
various intra-sentential and inter-sentential dependencies, such as sequential,
syntactic, and discourse relations. A robust contextual representation is
learned for the entities, which serves as input to the relation classifier.
This simplifies handling of relations with arbitrary arity, and enables
multi-task learning with related relations. We evaluate this framework in two
important precision medicine settings, demonstrating its effectiveness with
both conventional supervised learning and distant supervision. Cross-sentence
extraction produced larger knowledge bases, and multi-task learning
significantly improved extraction accuracy. A thorough analysis of various LSTM
approaches yielded useful insight into the impact of linguistic analysis on
extraction accuracy.
@inproceedings{peng2017crosssentence,
abstract = {Past work in relation extraction has focused on binary relations in single
sentences. Recent NLP inroads in high-value domains have sparked interest in
the more general setting of extracting n-ary relations that span multiple
sentences. In this paper, we explore a general relation extraction framework
based on graph long short-term memory networks (graph LSTMs) that can be easily
extended to cross-sentence n-ary relation extraction. The graph formulation
provides a unified way of exploring different LSTM approaches and incorporating
various intra-sentential and inter-sentential dependencies, such as sequential,
syntactic, and discourse relations. A robust contextual representation is
learned for the entities, which serves as input to the relation classifier.
This simplifies handling of relations with arbitrary arity, and enables
multi-task learning with related relations. We evaluate this framework in two
important precision medicine settings, demonstrating its effectiveness with
both conventional supervised learning and distant supervision. Cross-sentence
extraction produced larger knowledge bases, and multi-task learning
significantly improved extraction accuracy. A thorough analysis of various LSTM
approaches yielded useful insight into the impact of linguistic analysis on
extraction accuracy.},
added-at = {2018-07-02T13:17:31.000+0200},
author = {Peng, Nanyun and Poon, Hoifung and Quirk, Chris and Toutanova, Kristina and Yih, Wen-tau},
biburl = {https://www.bibsonomy.org/bibtex/2579fb969679799515b8f72bc7c751bb0/schwemmlein},
booktitle = {ACL},
description = {Cross-Sentence N-ary Relation Extraction with Graph LSTMs},
interhash = {a8d8870408d4b2df4b06001e1723ec2e},
intrahash = {579fb969679799515b8f72bc7c751bb0},
keywords = {extraction graph lstm network neural pubmed relation},
note = {arXiv:1708.03743. Conditionally accepted by TACL in December 2016; published in April 2017; presented at ACL in August 2017},
timestamp = {2018-07-02T13:17:31.000+0200},
title = {Cross-Sentence N-ary Relation Extraction with Graph LSTMs},
url = {http://arxiv.org/abs/1708.03743},
year = 2017
}