Abstract
This paper proposes a novel interaction paradigm for multi-touch interfaces that integrates both touch and near-touch interactions. The paper describes the hardware prototype that we have built, as well as the computer vision approach that we propose for real-time hand tracking and for differentiating between near-touch and touch events. We also present a case study showing how near-touch and touch interactions can be successfully integrated in an information visualization application.