Author of the publication

Attempts to recognize anomalously deformed Kana in Japanese historical documents.

HIP@ICDAR, page 31-36. ACM, (2017)

Other publications of authors with the same name

Attempts to recognize anomalously deformed Kana in Japanese historical documents. HIP@ICDAR, page 31-36. ACM, (2017)
An End-to-End Local Attention Based Model for Table Recognition. ICDAR (2), volume 14188 of Lecture Notes in Computer Science, page 20-36. Springer, (2023)
Digitalizing educational workbooks and collecting handwritten answers for automatic scoring. iTextbooks@AIED, volume 3444 of CEUR Workshop Proceedings, page 78-87. CEUR-WS.org, (2023)
A Self-attention Based Model for Offline Handwritten Text Recognition. ACPR (2), volume 13189 of Lecture Notes in Computer Science, page 356-369. Springer, (2021)
Attention Augmented Convolutional Recurrent Network for Handwritten Japanese Text Recognition. ICFHR, page 163-168. IEEE, (2020)
Deep Convolutional Recurrent Network for Segmentation-Free Offline Handwritten Japanese Text Recognition. MOCR@ICDAR, page 5-9. IEEE, (2017). ISBN 978-1-5386-3586-5
TabIQA: Table Questions Answering on Business Document Images. CoRR, (2023)
2D Self-attention Convolutional Recurrent Network for Offline Handwritten Text Recognition. ICDAR (1), volume 12821 of Lecture Notes in Computer Science, page 191-204. Springer, (2021)
Recognition of Japanese historical text lines by an attention-based encoder-decoder and text line generation. HIP@ICDAR, page 37-41. ACM, (2019)
An attention-based row-column encoder-decoder model for text recognition in Japanese historical documents. Pattern Recognit. Lett., (2020)