Abstract

In this paper the term "implicit human-computer interaction" is defined. It is discussed how the availability of processing power and advanced sensing technology can enable a shift in HCI from explicit interaction, such as direct-manipulation GUIs, towards a more implicit interaction based on situational context. An algorithm based on a number of questions is given to identify applications that can facilitate implicit interaction. An XML-based language to describe implicit HCI is proposed. The language uses contextual variables that can be grouped using different types of semantics, as well as actions that are called by triggers. The notion of perception is discussed and four basic approaches are identified that are useful when building context-aware applications. Two examples, a wearable context-awareness component and a sensor board, show how sensor-based perception can be implemented. It is also discussed how situational context can be exploited to improve input and output of mobile devices.
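The abstract only summarizes the proposed XML-based description language. As a rough, hypothetical illustration of the core idea of contextual variables and trigger-driven actions, the Python sketch below evaluates a small, invented context description; the element names, attribute names, and action names are assumptions for this example and do not reproduce the paper's actual syntax.

```python
# Hypothetical sketch: contextual variables describe a situation, and
# triggers call actions when their condition matches the sensed context.
# Element/attribute names are invented and not the paper's actual syntax;
# grouping semantics (e.g. AND/OR over variables) are not modeled here.
import xml.etree.ElementTree as ET

DESCRIPTION = """
<implicit_interaction>
  <context group="environment">
    <variable name="light" value="dark"/>
    <variable name="user_activity" value="walking"/>
  </context>
  <trigger when="light=dark" action="increase_display_brightness"/>
  <trigger when="user_activity=walking" action="enlarge_font"/>
</implicit_interaction>
"""

def current_situation():
    # Stand-in for sensor-based perception (e.g. a wearable sensor board);
    # a real system would derive these values from sensor data.
    return {"light": "dark", "user_activity": "walking"}

def evaluate(description_xml, situation):
    """Return the actions whose trigger conditions hold in the situation."""
    root = ET.fromstring(description_xml)
    fired = []
    for trigger in root.findall("trigger"):
        name, _, expected = trigger.get("when").partition("=")
        if situation.get(name) == expected:
            fired.append(trigger.get("action"))
    return fired

if __name__ == "__main__":
    for action in evaluate(DESCRIPTION, current_situation()):
        print("calling action:", action)
```

Running the sketch prints the two actions, since both invented conditions match the stubbed sensor readings; in the paper's model such actions would adapt the device's input and output implicitly, without explicit user commands.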

Description

Context-aware business processes
