
Eye Tracker in the Wild: Studying the Delta Between What is Said and Measured in a Crowdsourcing Experiment

Workshop on Crowdsourcing for Multimedia (CrowdMM), Brisbane, Australia, October 2015.

Abstract

Self-reported metrics collected in crowdsourcing experiments do not always match actual user behaviour. In laboratory studies, visual attention, that is, the human capability to selectively process the visual information one is confronted with, is therefore traditionally measured by means of eye trackers. Visual attention has typically not been considered in crowdsourcing environments, mainly because of the need for specific hardware and the challenge of gaze calibration. This paper proposes the use of a non-intrusive eye tracking crowdsourcing framework, in which the only technical requirements on the users’ side are a webcam and an HTML5-compatible web browser, to study the differences between what a participant implicitly and explicitly does during a crowdsourcing experiment. To demonstrate the feasibility of this approach, an exemplary crowdsourcing campaign was launched to collect and compare both eye tracking data and self-reported metrics from the users. Participants performed a movie selection task in which they were asked about the main reasons motivating them to choose a particular movie. The results demonstrate the added value of monitoring gaze in crowdsourcing contexts: consciously or not, users behave differently from what they report through questionnaires.
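
The abstract's claim that a webcam and an HTML5-compatible browser suffice rests on the standard MediaDevices.getUserMedia browser API. The TypeScript sketch below illustrates the kind of client-side webcam access such a framework would need; the element id "preview" and the gaze-sampling comment are illustrative assumptions, not details taken from the paper.

// Minimal sketch: requesting webcam access in an HTML5 browser,
// the client-side step a webcam-based eye tracking framework relies on.
// The <video id="preview"> element is a hypothetical page element.
async function startWebcam(): Promise<void> {
  try {
    // Ask the browser for a video stream from the user's webcam.
    const stream = await navigator.mediaDevices.getUserMedia({ video: true });
    const video = document.getElementById("preview") as HTMLVideoElement;
    video.srcObject = stream;
    await video.play();
    // Frames could then be sampled (e.g. drawn onto a <canvas>) and
    // passed to a gaze estimation model running in the page.
  } catch (err) {
    // The participant denied camera access or no webcam is available.
    console.error("Webcam access failed:", err);
  }
}

startWebcam();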
