DREC: Towards a datasheet for reporting experiments in crowdsourcing / Ramirez, Jorge; Baez, Marcos; Casati, Fabio; Cernuzzi, Luca; Benatallah, Boualem. - (2020), pp. 377-382. (Paper presented at the 3rd ACM Conference on Computer-Supported Cooperative Work and Social Computing, CSCW 2020, held in the USA in 2020) [10.1145/3406865.3418318].
DREC: Towards a datasheet for reporting experiments in crowdsourcing
Ramirez, Jorge; Casati, Fabio
2020-01-01
Abstract
Factors such as instructions, payment schemes, and platform demographics, along with strategies for mapping studies into crowdsourcing environments, play an important role in the reproducibility of results. However, inferring these details from scientific articles is often a challenging endeavor, calling for the development of proper reporting guidelines. This paper takes the first steps toward this goal by describing an initial taxonomy of relevant attributes for crowdsourcing experiments and providing a glimpse into the state of reporting by analyzing a sample of CSCW papers.