X. Zhang and Y. LeCun (2015). cite arXiv:1502.01710. Comment: This technical report is superseded by a paper entitled "Character-level Convolutional Networks for Text Classification", arXiv:1509.01626, which has considerably more experimental results and a rewritten introduction.
Abstract
This article demonstrates that we can apply deep learning to text
understanding from character-level inputs all the way up to abstract text
concepts, using temporal convolutional networks (ConvNets). We apply ConvNets
to various large-scale datasets, including ontology classification, sentiment
analysis, and text categorization. We show that temporal ConvNets can achieve
astonishing performance without knowledge of words, phrases, sentences, or
any other syntactic or semantic structures of a human language.
Evidence shows that our models work for both English and Chinese.
%0 Generic
%1 zhang2015understanding
%A Zhang, Xiang
%A LeCun, Yann
%D 2015
%K neural_networks thema
%T Text Understanding from Scratch
%U http://arxiv.org/abs/1502.01710
%X This article demonstrates that we can apply deep learning to text
understanding from character-level inputs all the way up to abstract text
concepts, using temporal convolutional networks (ConvNets). We apply ConvNets
to various large-scale datasets, including ontology classification, sentiment
analysis, and text categorization. We show that temporal ConvNets can achieve
astonishing performance without knowledge of words, phrases, sentences, or
any other syntactic or semantic structures of a human language.
Evidence shows that our models work for both English and Chinese.
@misc{zhang2015understanding,
abstract = {This article demonstrates that we can apply deep learning to text
understanding from character-level inputs all the way up to abstract text
concepts, using temporal convolutional networks (ConvNets). We apply ConvNets
to various large-scale datasets, including ontology classification, sentiment
analysis, and text categorization. We show that temporal ConvNets can achieve
astonishing performance without knowledge of words, phrases, sentences, or
any other syntactic or semantic structures of a human language.
Evidence shows that our models work for both English and Chinese.},
added-at = {2016-04-06T14:12:22.000+0200},
author = {Zhang, Xiang and LeCun, Yann},
biburl = {https://www.bibsonomy.org/bibtex/21d0df246b898e2d24a2c078727ce71ab/dallmann},
interhash = {61b1c92fccbb83e1bda75cee7c105600},
intrahash = {1d0df246b898e2d24a2c078727ce71ab},
keywords = {neural_networks thema},
note = {cite arXiv:1502.01710. Comment: This technical report is superseded by a paper entitled "Character-level Convolutional Networks for Text Classification", arXiv:1509.01626, which has considerably more experimental results and a rewritten introduction},
timestamp = {2018-02-28T19:55:21.000+0100},
title = {Text Understanding from Scratch},
url = {http://arxiv.org/abs/1502.01710},
year = 2015
}