
A Gesture Processing Framework for Multimodal Interaction in Virtual Reality

Proceedings of the 1st International Conference on Computer Graphics, Virtual Reality and Visualisation in Africa (AFRIGRAPH 2001), pages 95-100. ACM SIGGRAPH, 2001.

Abstract

This article presents a gesture detection and analysis framework for modelling multimodal interactions. It is particularly designed for use in Virtual Reality (VR) applications and contains an abstraction layer for different sensor hardware. Using the framework, gestures are described by their characteristic spatio-temporal features, which are calculated at the lowest level by simple predefined detector modules or nodes. These nodes can be connected by a data routing mechanism to perform more elaborate evaluation functions, thereby establishing complex detector nets. Typical problems arising from the time-dependent invalidation of multimodal utterances under immersive conditions led to the development of pre-evaluation concepts, which also support integration into scene-graph-based systems with traversal-type access. Examples of realized interactions illustrate applications that make use of the described concepts.
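
To give a rough idea of the detector-node architecture the abstract describes, the sketch below shows how a low-level spatio-temporal feature detector might forward its results to connected nodes via a data routing mechanism, forming a small detector net. This is a minimal illustration only; all class names (DetectorNode, SpeedDetector, HandSample) and the API are assumptions, not the framework's actual interface.

    #include <cmath>
    #include <memory>
    #include <vector>

    // Hypothetical sketch: one sensor sample from the hardware abstraction layer.
    struct HandSample {
        double t;          // timestamp in seconds
        double x, y, z;    // hand position
    };

    // A detector node evaluates one spatio-temporal feature and routes
    // its output to connected nodes, so nodes can be chained into nets.
    class DetectorNode {
    public:
        virtual ~DetectorNode() = default;
        virtual void evaluate(const HandSample& s) = 0;
        void connect(std::shared_ptr<DetectorNode> next) { outputs_.push_back(next); }
    protected:
        void route(const HandSample& s) {
            for (auto& n : outputs_) n->evaluate(s);
        }
        std::vector<std::shared_ptr<DetectorNode>> outputs_;
    };

    // Lowest-level example detector: estimates hand speed between samples
    // and forwards only samples whose speed exceeds a threshold.
    class SpeedDetector : public DetectorNode {
    public:
        explicit SpeedDetector(double threshold) : threshold_(threshold) {}
        void evaluate(const HandSample& s) override {
            if (hasPrev_) {
                double dt = s.t - prev_.t;
                double dx = s.x - prev_.x, dy = s.y - prev_.y, dz = s.z - prev_.z;
                double speed = dt > 0.0 ? std::sqrt(dx*dx + dy*dy + dz*dz) / dt : 0.0;
                if (speed > threshold_) route(s);   // pass "fast" motion downstream
            }
            prev_ = s;
            hasPrev_ = true;
        }
    private:
        HandSample prev_{};
        bool hasPrev_ = false;
        double threshold_;
    };

A more elaborate gesture (for example a pointing or stop gesture) would then be modelled by connecting several such feature nodes into a net rather than by a single monolithic recognizer; again, this composition is inferred from the abstract, not taken from the paper's code.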
