SingApp: um modelo de identificação de língua de sinais através de captura de movimento em tempo real
M. Leal. Universidade do Vale do Rio dos Sinos (Unisinos), São Leopoldo, Rio Grande do Sul, Brazil, master's thesis, (2018)
Abstract
Sign language recognition aims to broaden the social and digital inclusion of deaf people by having computers interpret their language. This work presents a model for recognizing two global parameters of sign languages: hand configurations and hand movements. Using infrared capture technology, the hand structure is reconstructed in a virtual three-dimensional space, and a Multilayer Perceptron neural network classifies the hand configurations and movements. Beyond the recognition method itself, this work also provides a set of data representative of everyday conditions: a database of hand configurations and motion captures validated by professionals fluent in sign languages. Brazilian Sign Language (Libras) was used as the case study, and the neural networks achieved accuracy rates of 99.8% and 86.7% for classifying hand configurations and hand movements, respectively.
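As a rough illustration of the classification step described above, the following is a minimal sketch of training a Multilayer Perceptron on 3D hand-skeleton features. It is not the thesis' actual pipeline: the feature layout (21 joints per hand with x/y/z coordinates, as reported by typical infrared hand trackers), the number of Libras hand-configuration classes, and the network architecture are assumptions for illustration, and the data here is synthetic.

```python
# Sketch: hand-configuration classification with an MLP over 3D joint
# positions. Feature layout, class count, and architecture are assumed,
# not taken from the thesis; the data below is a synthetic stand-in.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

N_JOINTS = 21      # assumed joints reported by an infrared hand tracker
N_CLASSES = 61     # assumed number of Libras hand configurations
N_SAMPLES = 5000   # stand-in for the validated capture database

rng = np.random.default_rng(0)
X = rng.normal(size=(N_SAMPLES, N_JOINTS * 3))   # flattened x/y/z per joint
y = rng.integers(0, N_CLASSES, size=N_SAMPLES)   # hand-configuration labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# Normalize joint coordinates so the MLP trains on comparable scales.
scaler = StandardScaler().fit(X_train)

# Single hidden layer for illustration only.
clf = MLPClassifier(hidden_layer_sizes=(128,), max_iter=300, random_state=0)
clf.fit(scaler.transform(X_train), y_train)

print("hand-configuration accuracy:",
      clf.score(scaler.transform(X_test), y_test))
```

With real captured joint positions and validated labels, the same fit/score flow would apply; the movement classifier described in the abstract would additionally need features spanning a time window rather than a single frame.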
%0 Thesis
%1 leal2018singapp
%A Leal, Márcio Moura
%C São Leopoldo, Rio Grande do Sul, Brazil
%D 2018
%E Villamil, Marta Becker
%I Universidade do Vale do Rio dos Sinos (Unisinos)
%K artificial-neural-network brazilian-sign-language brazilian-sign-language-recognition bsl bslr convolutional-neural-network gesture-recognition infrared libras real sign-language-recognition slr
%T SingApp: um modelo de identificação de língua de sinais através de captura de movimento em tempo real
%U http://www.repositorio.jesuita.org.br/handle/UNISINOS/7124
%X Sign language recognition aims to broaden the social and digital inclusion of deaf people by having computers interpret their language. This work presents a model for recognizing two global parameters of sign languages: hand configurations and hand movements. Using infrared capture technology, the hand structure is reconstructed in a virtual three-dimensional space, and a Multilayer Perceptron neural network classifies the hand configurations and movements. Beyond the recognition method itself, this work also provides a set of data representative of everyday conditions: a database of hand configurations and motion captures validated by professionals fluent in sign languages. Brazilian Sign Language (Libras) was used as the case study, and the neural networks achieved accuracy rates of 99.8% and 86.7% for classifying hand configurations and hand movements, respectively.
@mastersthesis{leal2018singapp,
abstract = {Sign language recognition aims to broaden the social and digital inclusion of deaf people by having computers interpret their language. This work presents a model for recognizing two global parameters of sign languages: hand configurations and hand movements. Using infrared capture technology, the hand structure is reconstructed in a virtual three-dimensional space, and a Multilayer Perceptron neural network classifies the hand configurations and movements. Beyond the recognition method itself, this work also provides a set of data representative of everyday conditions: a database of hand configurations and motion captures validated by professionals fluent in sign languages. Brazilian Sign Language (Libras) was used as the case study, and the neural networks achieved accuracy rates of 99.8% and 86.7% for classifying hand configurations and hand movements, respectively.},
added-at = {2019-09-15T05:36:28.000+0200},
address = {São Leopoldo, Rio Grande do Sul, Brazil},
author = {Leal, Márcio Moura},
biburl = {https://www.bibsonomy.org/bibtex/231b78188b6204a29baea6c10b6545eca/jpmor},
editor = {Villamil, Marta Becker},
id = {http://www.repositorio.jesuita.org.br/handle/UNISINOS/7124},
interhash = {0280e79424fc7daa526206172433f2e5},
intrahash = {31b78188b6204a29baea6c10b6545eca},
keywords = {artificial-neural-network brazilian-sign-language brazilian-sign-language-recognition bsl bslr convolutional-neural-network gesture-recognition infrared libras real sign-language-recognition slr},
publisher = {Universidade do Vale do Rio dos Sinos (Unisinos)},
school = {Universidade do Vale do Rio dos Sinos (Unisinos)},
timestamp = {2020-10-07T13:36:50.000+0200},
title = {SingApp: um modelo de identificação de língua de sinais através de captura de movimento em tempo real},
type = {mastersthesis},
url = {http://www.repositorio.jesuita.org.br/handle/UNISINOS/7124},
year = 2018
}