Tree-structured composition in neural networks without tree-structured
architectures
S. Bowman, C. Manning, and C. Potts. (2015). arXiv:1506.04834. Comment: To appear in the proceedings of the 2015 NIPS Workshop on Cognitive Computation: Integrating Neural and Symbolic Approaches.
Abstract
Tree-structured neural networks encode a particular tree geometry for a
sentence in the network design. However, these models have at best only
slightly outperformed simpler sequence-based models. We hypothesize that neural
sequence models like LSTMs are in fact able to discover and implicitly use
recursive compositional structure, at least for tasks with clear cues to that
structure in the data. We demonstrate this possibility using an artificial data
task for which recursive compositional structure is crucial, and find an
LSTM-based sequence model can indeed learn to exploit the underlying tree
structure. However, its performance consistently lags behind that of tree
models, even on large training sets, suggesting that tree-structured models are
more effective at exploiting recursive structure.
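To make the architectural contrast in the abstract concrete, the sketch below (not from the paper; the dimensions, random weights, tanh composition function, and example parse are illustrative assumptions, and a plain recurrent update stands in for the LSTM) shows the two composition strategies side by side: the tree-structured model is handed the parse and composes recursively over it, while the sequence model must recover any such structure from word order alone.

# Minimal sketch of tree-structured vs. sequential composition (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
D = 8                                              # hidden/embedding size (assumed)
W_tree = rng.standard_normal((D, 2 * D)) * 0.1     # weights for recursive composition
W_seq = rng.standard_normal((D, 2 * D)) * 0.1      # weights for left-to-right composition
embed = {w: rng.standard_normal(D) * 0.1 for w in ["the", "cat", "sat"]}

def compose_tree(node):
    # A node is either a word or a (left, right) pair; composition follows the parse.
    if isinstance(node, str):
        return embed[node]
    left, right = node
    return np.tanh(W_tree @ np.concatenate([compose_tree(left), compose_tree(right)]))

def compose_sequence(words):
    # Fold the sentence left to right with a single recurrent cell (stand-in for an LSTM).
    h = np.zeros(D)
    for w in words:
        h = np.tanh(W_seq @ np.concatenate([h, embed[w]]))
    return h

# The tree model is told the structure explicitly; the sequence model is not.
tree = (("the", "cat"), "sat")
print(compose_tree(tree))
print(compose_sequence(["the", "cat", "sat"]))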
%0 Generic
%1 bowman2015treestructured
%A Bowman, Samuel R.
%A Manning, Christopher D.
%A Potts, Christopher
%D 2015
%K ling lstm nn rnn
%T Tree-structured composition in neural networks without tree-structured
architectures
%U http://arxiv.org/abs/1506.04834
%X Tree-structured neural networks encode a particular tree geometry for a
sentence in the network design. However, these models have at best only
slightly outperformed simpler sequence-based models. We hypothesize that neural
sequence models like LSTMs are in fact able to discover and implicitly use
recursive compositional structure, at least for tasks with clear cues to that
structure in the data. We demonstrate this possibility using an artificial data
task for which recursive compositional structure is crucial, and find an
LSTM-based sequence model can indeed learn to exploit the underlying tree
structure. However, its performance consistently lags behind that of tree
models, even on large training sets, suggesting that tree-structured models are
more effective at exploiting recursive structure.
@preprint{bowman2015treestructured,
abstract = {Tree-structured neural networks encode a particular tree geometry for a
sentence in the network design. However, these models have at best only
slightly outperformed simpler sequence-based models. We hypothesize that neural
sequence models like LSTMs are in fact able to discover and implicitly use
recursive compositional structure, at least for tasks with clear cues to that
structure in the data. We demonstrate this possibility using an artificial data
task for which recursive compositional structure is crucial, and find an
LSTM-based sequence model can indeed learn to exploit the underlying tree
structure. However, its performance consistently lags behind that of tree
models, even on large training sets, suggesting that tree-structured models are
more effective at exploiting recursive structure.},
added-at = {2016-10-30T22:42:59.000+0100},
author = {Bowman, Samuel R. and Manning, Christopher D. and Potts, Christopher},
biburl = {https://www.bibsonomy.org/bibtex/2ffdd012eb09bbe32b665d8d772eddd52/jkan},
description = {Tree-structured composition in neural networks without tree-structured
architectures},
interhash = {66bff5f6fc4fd15ab4452b608a2660ca},
intrahash = {ffdd012eb09bbe32b665d8d772eddd52},
keywords = {ling lstm nn rnn},
note = {arXiv:1506.04834. Comment: To appear in the proceedings of the 2015 NIPS Workshop on Cognitive Computation: Integrating Neural and Symbolic Approaches},
timestamp = {2016-10-30T22:42:59.000+0100},
title = {Tree-structured composition in neural networks without tree-structured
architectures},
url = {http://arxiv.org/abs/1506.04834},
year = 2015
}