Crowd-based Multi-Predicate Screening of Papers in Literature Reviews / Krivosheev, Evgeny; Casati, Fabio; Benatallah, Boualem. - (2018), pp. 55-64. (27th International World Wide Web Conference, WWW 2018, Lyon, France, April 2018) [10.1145/3178876.3186036].
Crowd-based Multi-Predicate Screening of Papers in Literature Reviews
Krivosheev, Evgeny; Casati, Fabio; Benatallah, Boualem
2018-01-01
Abstract
Systematic literature reviews (SLRs) are one of the most common and useful forms of scientific research and publication. Tens of thousands of SLRs are published each year, and this rate is growing across all fields of science. Performing an accurate, complete, and unbiased SLR is, however, a difficult and expensive endeavor. This is true for all phases of a literature review in general, and for the paper screening phase in particular, where authors filter a set of potentially in-scope papers based on a number of exclusion criteria. To address this problem, in recent years the research community has begun to explore the use of the crowd to allow for faster, more accurate, cheaper, and less biased screening of papers. Initial results show that crowdsourcing can be effective, even for relatively complex reviews. In this paper we derive and analyze a set of strategies for crowd-based screening, and show that an adaptive strategy, which continuously re-assesses the statistical properties of the problem ...