Evaluating Features for Identifying Japanese-Chinese Bilingual Synonymous Technical Terms from Patent Families.

, , , and . BUCC@ACL/IJCNLP, page 52-61. Association for Computational Linguistics, (2015)


Other publications of authors with the same name

Patent NMT integrated with Large Vocabulary Phrase Translation by SMT at WAT 2017., , , , and . WAT@IJCNLP, page 110-118. Asian Federation of Natural Language Processing, (2017)

Automatic Identification of Indicators of Compromise using Neural-Based Sequence Labelling., , , and . PACLIC, Association for Computational Linguistics, (2018)

Multimodal Neural Machine Translation with Search Engine Based Image Retrieval., , , and . WAT@COLING, page 89-98. International Conference on Computational Linguistics, (2022)

Exploring the Necessity of Visual Modality in Multimodal Machine Translation using Authentic Datasets., , , , , and . CoRR, (2024)

Neural Machine Translation Model with a Large Vocabulary Selected by Branching Entropy., , , and . CoRR, (2017)

Evaluating Features for Identifying Japanese-Chinese Bilingual Synonymous Technical Terms from Patent Families., , , and . BUCC@ACL/IJCNLP, page 52-61. Association for Computational Linguistics, (2015)

Textbook Question Answering with Multi-type Question Learning and Contextualized Diagram Representation., , , , , and . ICANN (4), volume 12894 of Lecture Notes in Computer Science, page 86-98. Springer, (2021)

Neural Machine Translation Model with a Large Vocabulary Selected by Branching Entropy., , , , and . MTSummit (1), page 227-240. (2017)

Translation of Patent Sentences with a Large Vocabulary of Technical Terms Using Neural Machine Translation., , , and . CoRR, (2017)

Generalization algorithm of multimodal pre-training model based on graph-text self-supervised training., , , and . CoRR, (2023)