Is affective crowdsourcing reliable?

5th International Conference on Communications and Electronics (ICCE 2014), Da Nang, Vietnam, July 2014

Abstract

Affective content annotations are typically acquired through subjective manual assessments by experts in supervised laboratory tests. While easy to manage, such campaigns are expensive and time-consuming, and their results may not generalize to larger audiences. Crowdsourcing constitutes a promising approach for quickly collecting data with wide demographic scope at reasonable cost. Affective crowdsourcing is, however, particularly challenging in that it attempts to collect subjective perceptions from humans with different cultures, languages, and knowledge backgrounds. In this study we analyze the validity of well-known affective rating scales in a crowdsourcing context by comparing the results with those obtained in laboratory tests. Experimental results demonstrate that pictorial scales possess promising features for affective crowdsourcing.