Towards Abstraction from Extraction: Multiple Timescale Gated Recurrent
Unit for Summarization
M. Kim, M. Singh, and M. Lee. (2016). arXiv:1607.00718. Comment: To appear in RepL4NLP at ACL 2016.
Abstract
In this work, we introduce temporal hierarchies to the sequence to sequence
(seq2seq) model to tackle the problem of abstractive summarization of
scientific articles. The proposed Multiple Timescale model of the Gated
Recurrent Unit (MTGRU) is implemented in the encoder-decoder setting to better
deal with the presence of multiple compositionalities in larger texts. The
proposed model is compared to the conventional RNN encoder-decoder, and the
results demonstrate that our model trains faster and shows significant
performance gains. The results also show that the temporal hierarchies help
improve the ability of seq2seq models to capture compositionalities better
without the presence of highly complex architectural hierarchies.
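The core idea of the abstract can be illustrated with a small sketch: a multiple-timescale variant of the GRU blends the standard GRU update into the previous hidden state at a rate 1/τ, so layers with larger τ evolve more slowly and can capture longer-range structure. This is a minimal illustration under that assumption, not the paper's implementation; all names (`MTGRUCell`, `tau`, `step`) are illustrative, and the paper's exact formulation may differ.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MTGRUCell:
    """GRU cell with a timescale constant tau (tau >= 1).

    The output of a standard GRU update is blended into the previous
    hidden state with rate 1/tau; tau = 1 recovers the ordinary GRU,
    while larger tau gives slower dynamics.
    """

    def __init__(self, input_size, hidden_size, tau=1.0, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        # Weights for update gate z, reset gate r, and candidate state
        self.Wz = rng.uniform(-s, s, (hidden_size, input_size))
        self.Uz = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.Wr = rng.uniform(-s, s, (hidden_size, input_size))
        self.Ur = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.Wh = rng.uniform(-s, s, (hidden_size, input_size))
        self.Uh = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.tau = tau

    def step(self, x, h):
        z = sigmoid(self.Wz @ x + self.Uz @ h)          # update gate
        r = sigmoid(self.Wr @ x + self.Ur @ h)          # reset gate
        h_tilde = np.tanh(self.Wh @ x + self.Uh @ (r * h))
        gru_out = (1 - z) * h + z * h_tilde             # standard GRU update
        # Multiple-timescale blend: slow layers integrate with rate 1/tau
        return (1 - 1.0 / self.tau) * h + (1.0 / self.tau) * gru_out
```

With identical weights, a cell with τ = 4 moves its hidden state only a quarter as far per step as a τ = 1 cell, which is how a stack of such cells can hold representations at different temporal resolutions without extra architectural machinery.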
Description
[1607.00718] Towards Abstraction from Extraction: Multiple Timescale Gated Recurrent Unit for Summarization
%0 Generic
%1 kim2016towards
%A Kim, Minsoo
%A Singh, Moirangthem Dennis
%A Lee, Minho
%D 2016
%K kallimachos rnn seq2seq summarisation
%T Towards Abstraction from Extraction: Multiple Timescale Gated Recurrent
Unit for Summarization
%U http://arxiv.org/abs/1607.00718
%X In this work, we introduce temporal hierarchies to the sequence to sequence
(seq2seq) model to tackle the problem of abstractive summarization of
scientific articles. The proposed Multiple Timescale model of the Gated
Recurrent Unit (MTGRU) is implemented in the encoder-decoder setting to better
deal with the presence of multiple compositionalities in larger texts. The
proposed model is compared to the conventional RNN encoder-decoder, and the
results demonstrate that our model trains faster and shows significant
performance gains. The results also show that the temporal hierarchies help
improve the ability of seq2seq models to capture compositionalities better
without the presence of highly complex architectural hierarchies.
@misc{kim2016towards,
abstract = {In this work, we introduce temporal hierarchies to the sequence to sequence
(seq2seq) model to tackle the problem of abstractive summarization of
scientific articles. The proposed Multiple Timescale model of the Gated
Recurrent Unit (MTGRU) is implemented in the encoder-decoder setting to better
deal with the presence of multiple compositionalities in larger texts. The
proposed model is compared to the conventional RNN encoder-decoder, and the
results demonstrate that our model trains faster and shows significant
performance gains. The results also show that the temporal hierarchies help
improve the ability of seq2seq models to capture compositionalities better
without the presence of highly complex architectural hierarchies.},
added-at = {2017-11-09T08:54:05.000+0100},
author = {Kim, Minsoo and Singh, Moirangthem Dennis and Lee, Minho},
biburl = {https://www.bibsonomy.org/bibtex/2e93a321eacaa8c9c2b563af8a1044f16/albinzehe},
description = {[1607.00718] Towards Abstraction from Extraction: Multiple Timescale Gated Recurrent Unit for Summarization},
interhash = {caea25b16a4c8e7696942cfcdeab37cf},
intrahash = {e93a321eacaa8c9c2b563af8a1044f16},
keywords = {kallimachos rnn seq2seq summarisation},
note = {arXiv:1607.00718. Comment: To appear in RepL4NLP at ACL 2016},
timestamp = {2017-11-09T08:54:05.000+0100},
title = {Towards Abstraction from Extraction: Multiple Timescale Gated Recurrent
Unit for Summarization},
url = {http://arxiv.org/abs/1607.00718},
year = 2016
}