@duerrschnabel

Neural Networks for Semantic Gaze Analysis in XR Settings

(2021). arXiv:2103.10451. Comment: 16 pages, 6 figures, 1 table. Accepted to: ETRA 2021, ACM Symposium on Eye Tracking Research and Applications.
DOI: 10.1145/3448017.3457380

Abstract

Virtual-reality (VR) and augmented-reality (AR) technology is increasingly combined with eye tracking. This combination broadens both fields and opens up new areas of application, in which visual perception and related cognitive processes can be studied in interactive but still well-controlled settings. However, performing a semantic gaze analysis of eye-tracking data from interactive three-dimensional scenes is a resource-intensive task, which so far has been an obstacle to economical use. In this paper, we present a novel approach that minimizes the time and information necessary to annotate volumes of interest (VOIs) by using techniques from object recognition. To do so, we train convolutional neural networks (CNNs) on synthetic data sets derived from virtual models using image augmentation techniques. We evaluate our method in real and virtual environments, showing that it can compete with state-of-the-art approaches while not relying on additional markers or preexisting databases, instead offering cross-platform use.
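The augmentation step mentioned in the abstract (deriving varied synthetic training images from renders of virtual models) can be sketched roughly as follows. This is a minimal illustration, not the authors' actual pipeline: the specific transformations, parameter ranges, and image sizes are assumptions chosen for demonstration.

```python
import numpy as np

def augment(image: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Apply simple augmentations to one synthetic image.

    The chosen transforms (flip, brightness jitter, small shift) are
    illustrative stand-ins, not the paper's exact augmentation set.
    """
    out = image.copy()
    if rng.random() < 0.5:
        out = out[:, ::-1]                         # random horizontal flip
    out = np.clip(out * rng.uniform(0.8, 1.2), 0.0, 1.0)  # brightness jitter
    out = np.roll(out, int(rng.integers(-2, 3)), axis=1)  # small translation
    return out

rng = np.random.default_rng(0)
# Stand-in for a grayscale render of a virtual model, values in [0, 1].
synthetic_render = rng.random((64, 64))
batch = [augment(synthetic_render, rng) for _ in range(8)]
print(len(batch), batch[0].shape)
```

In the paper's setting, many such augmented views per VOI would then serve as CNN training data, avoiding manual annotation of real recordings.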

Links and resources

tags

community

  • @duerrschnabel
  • @dblp
@duerrschnabel: this user's tags are highlighted