Evaluation in the Crowd. Crowdsourcing and Human-Centered Experiments Dagstuhl Seminar 15481, Dagstuhl Castle, Germany, November 22 – 27, 2015, Revised Contributions

Bibliographic Details
Other Authors: Archambault, Daniel (Editor), Purchase, Helen (Editor), Hoßfeld, Tobias (Editor)
Format: eBook
Language: English
Published: Cham: Springer International Publishing, 2017
Edition: 1st ed. 2017
Series: Information Systems and Applications, incl. Internet/Web, and HCI
Collection: Springer eBooks (2005–)
Description
Summary: As the outcome of the Dagstuhl Seminar 15481 on Crowdsourcing and Human-Centered Experiments, this book is a primer for computer science researchers who intend to use crowdsourcing technology for human-centered experiments. The focus of this Dagstuhl seminar, held in Dagstuhl Castle in November 2015, was to discuss experiences and methodological considerations when using crowdsourcing platforms to run human-centered experiments that test the effectiveness of visual representations. The inspiring Dagstuhl atmosphere fostered discussions and brought together researchers from different research directions. The papers provide information on crowdsourcing technology and experimental methodologies; comparisons between crowdsourcing and lab experiments; the use of crowdsourcing for empirical studies in visualisation, psychology, QoE, and HCI; and finally the nature of crowdworkers and their work, their motivation and demographic background, and the relationships among the people forming the crowdsourcing community.
Physical Description: VII, 191 p., 15 illus., online resource
ISBN: 9783319664354