A Gesture Processing Framework for Multimodal Interaction in Virtual Reality
M. Latoschik. Proceedings of the 1st International Conference on Computer Graphics, Virtual Reality and Visualisation in Africa, AFRIGRAPH 2001, pages 95-100. ACM SIGGRAPH, (2001)
Abstract
This article presents a gesture detection and analysis framework for modelling multimodal interactions. It is particularly designed for use in Virtual Reality (VR) applications and contains an abstraction layer for different sensor hardware. Using the framework, gestures are described by their characteristic spatio-temporal features, which are calculated at the lowest level by simple predefined detector modules or nodes. These nodes can be connected by a data routing mechanism to perform more elaborate evaluation functions, thereby establishing complex detector nets. Typical problems that arise from the time-dependent invalidation of multimodal utterances under immersive conditions led to the development of pre-evaluation concepts, which also support integration into scene-graph-based systems for traversal-type access. Examples of realized interactions illustrate applications that make use of the described concepts.
%0 Conference Paper
%1 latoschik:gestureprocessing:01
%A Latoschik, Marc Erich
%B Proceedings of the 1st International Conference on Computer Graphics, Virtual Reality and Visualisation in Africa, AFRIGRAPH 2001
%D 2001
%I ACM SIGGRAPH
%K myown
%P 95-100
%T A Gesture Processing Framework for Multimodal Interaction in Virtual Reality
%U http://www.hci.uni-wuerzburg.de/download/A_gesture_processing_framework.pdf
%X This article presents a gesture detection and analysis framework for modelling multimodal interactions. It is particularly designed for use in Virtual Reality (VR) applications and contains an abstraction layer for different sensor hardware. Using the framework, gestures are described by their characteristic spatio-temporal features, which are calculated at the lowest level by simple predefined detector modules or nodes. These nodes can be connected by a data routing mechanism to perform more elaborate evaluation functions, thereby establishing complex detector nets. Typical problems that arise from the time-dependent invalidation of multimodal utterances under immersive conditions led to the development of pre-evaluation concepts, which also support integration into scene-graph-based systems for traversal-type access. Examples of realized interactions illustrate applications that make use of the described concepts.
@inproceedings{latoschik:gestureprocessing:01,
abstract = {This article presents a gesture detection and analysis framework for modelling multimodal interactions. It is particularly designed for use in Virtual Reality (VR) applications and contains an abstraction layer for different sensor hardware. Using the framework, gestures are described by their characteristic spatio-temporal features, which are calculated at the lowest level by simple predefined detector modules or nodes. These nodes can be connected by a data routing mechanism to perform more elaborate evaluation functions, thereby establishing complex detector nets. Typical problems that arise from the time-dependent invalidation of multimodal utterances under immersive conditions led to the development of pre-evaluation concepts, which also support integration into scene-graph-based systems for traversal-type access. Examples of realized interactions illustrate applications that make use of the described concepts.},
added-at = {2012-05-03T15:56:47.000+0200},
author = {Latoschik, Marc Erich},
biburl = {https://www.bibsonomy.org/bibtex/21b119525ecdcd11753bbef47bda6b963/marcerich},
booktitle = {Proceedings of the 1st International Conference on Computer Graphics, Virtual Reality and Visualisation in Africa, AFRIGRAPH 2001},
interhash = {0e7a50feee8d939cb896e55ef2b1603e},
intrahash = {1b119525ecdcd11753bbef47bda6b963},
keywords = {myown},
pages = {95--100},
publisher = {ACM SIGGRAPH},
timestamp = {2012-05-03T15:56:49.000+0200},
title = {A Gesture Processing Framework for Multimodal Interaction in Virtual Reality},
url = {http://www.hci.uni-wuerzburg.de/download/A_gesture_processing_framework.pdf},
year = 2001
}