Climbing towards NLU: On Meaning, Form, and Understanding in the Age of Data
E. M. Bender and A. Koller. Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 5185–5198. Association for Computational Linguistics, July 2020.
DOI: 10.18653/v1/2020.acl-main.463
Abstract
The success of the large neural language models on many NLP tasks is exciting. However, we find that these successes sometimes lead to hype in which these models are being described as “understanding” language or capturing “meaning”. In this position paper, we argue that a system trained only on form has a priori no way to learn meaning. In keeping with the ACL 2020 theme of “Taking Stock of Where We’ve Been and Where We’re Going”, we argue that a clear understanding of the distinction between form and meaning will help guide the field towards better science around natural language understanding.
%0 Conference Paper
%1 bender2020climbing
%A Bender, Emily M.
%A Koller, Alexander
%B Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
%D 2020
%I Association for Computational Linguistics
%K digital_linguistic meaning natural_language_processing understanding
%P 5185-5198
%R 10.18653/v1/2020.acl-main.463
%T Climbing towards NLU: On Meaning, Form, and Understanding in the Age of Data
%U https://doi.org/10.18653/v1/2020.acl-main.463
%X The success of the large neural language models on many NLP tasks is exciting. However, we find that these successes sometimes lead to hype in which these models are being described as “understanding” language or capturing “meaning”. In this position paper, we argue that a system trained only on form has a priori no way to learn meaning. In keeping with the ACL 2020 theme of “Taking Stock of Where We’ve Been and Where We’re Going”, we argue that a clear understanding of the distinction between form and meaning will help guide the field towards better science around natural language understanding.
@inproceedings{bender2020climbing,
abstract = {The success of the large neural language models on many NLP tasks is exciting. However, we find that these successes sometimes lead to hype in which these models are being described as “understanding” language or capturing “meaning”. In this position paper, we argue that a system trained only on form has a priori no way to learn meaning. In keeping with the ACL 2020 theme of “Taking Stock of Where We’ve Been and Where We’re Going”, we argue that a clear understanding of the distinction between form and meaning will help guide the field towards better science around natural language understanding.},
author = {Bender, Emily M. and Koller, Alexander},
booktitle = {Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics},
doi = {10.18653/v1/2020.acl-main.463},
eventtitle = {ACL},
keywords = {digital_linguistic meaning natural_language_processing understanding},
language = {en-US},
month = jul,
pages = {5185--5198},
publisher = {Association for Computational Linguistics},
title = {Climbing towards {NLU}: On Meaning, Form, and Understanding in the Age of Data},
url = {https://doi.org/10.18653/v1/2020.acl-main.463},
venue = {Online},
year = 2020
}