Budget-Optimal Task Allocation for Reliable Crowdsourcing Systems

Crowdsourcing systems, in which numerous tasks are electronically distributed to numerous “information pieceworkers,” have emerged as an effective paradigm for human-powered solving of large-scale problems in domains such as image classification, data entry, optical character recognition, recommendation, and proofreading. Because these low-paid workers can be unreliable, nearly all such systems must devise schemes to increase confidence in their answers, typically by assigning each task multiple times and combining the answers in an appropriate manner, e.g., majority voting. In this paper, we consider a general model of such crowdsourcing tasks and pose the problem of minimizing the total price (i.e., number of task assignments) that must be paid to achieve a target overall reliability. We give a new algorithm for deciding which tasks to assign to which workers and for inferring correct answers from the workers' answers. We show that our algorithm, inspired by belief propagation and low-rank matrix approximation, significantly outperforms majority voting and, in fact, is optimal through comparison to an oracle that knows the reliability of every worker. Further, we compare our approach with a more general class of algorithms that can dynamically assign tasks. By adaptively deciding which questions to ask to the next set of arriving workers, one might hope to reduce uncertainty more efficiently. We show that, perhaps surprisingly, the minimum price necessary to achieve a target reliability scales in the same manner under both adaptive and nonadaptive scenarios. Hence, our nonadaptive approach is order optimal under both scenarios. This strongly relies on the fact that workers are fleeting and cannot be exploited. Therefore, architecturally, our results suggest that building a reliable worker-reputation system is essential to fully harnessing the potential of adaptive designs.
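Although this record carries only the abstract, the abstract is concrete enough to illustrate the two decision rules it contrasts. The Python sketch below compares plain majority voting with a belief-propagation-style iterative estimator that reweights each worker's answers by an inferred reliability, in the spirit of the algorithm the paper proposes. It assumes binary tasks with answers coded as +1/-1 and zeros marking unassigned task-worker pairs; the function names, dense-matrix representation, initialization, and iteration count are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def majority_vote(A):
    """Baseline from the abstract: unweighted majority over each task's answers.

    A is an (n_tasks x n_workers) array with entries in {-1, 0, +1};
    0 marks a task-worker pair that was never assigned.
    """
    return np.sign(A.sum(axis=1))

def iterative_inference(A, num_iters=20, seed=0):
    """Reliability-weighted estimate via message passing (illustrative sketch).

    Task messages aggregate answers weighted by worker messages; worker
    messages accumulate agreement with the current task messages. Each
    message excludes its recipient's own contribution, as is standard in
    belief-propagation-style updates.
    """
    rng = np.random.default_rng(seed)
    n, m = A.shape
    mask = (A != 0).astype(float)
    # y[j, i]: worker j's message to task i, randomly initialized.
    y = rng.normal(loc=1.0, scale=1.0, size=(m, n)) * mask.T
    for _ in range(num_iters):
        # Task-to-worker messages: leave-one-out weighted sum of answers.
        weighted = A * y.T  # zero wherever the pair was unassigned
        x = (weighted.sum(axis=1, keepdims=True) - weighted) * mask
        # Worker-to-task messages: leave-one-out agreement with task messages.
        agreement = A * x
        y = ((agreement.sum(axis=0, keepdims=True) - agreement) * mask).T
    # Final decision: sign of the reliability-weighted vote (0 means a tie).
    return np.sign((A * y.T).sum(axis=1))
```

Under the paper's nonadaptive scheme, each task would be assigned to a fixed number of workers chosen via a random regular bipartite graph; on such simulated data with heterogeneous worker reliabilities, the weighted estimate typically recovers more tasks than the unweighted vote.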

Bibliographic Details
Main Authors: Karger, David R.; Oh, Sewoong; Shah, Devavrat
Other Authors: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Format: Article
Language: en_US
Published: Institute for Operations Research and the Management Sciences (INFORMS), 2014
Online Access: http://hdl.handle.net/1721.1/87088
https://orcid.org/0000-0003-0737-3259
https://orcid.org/0000-0002-0024-5847
Citation: Karger, David R., Sewoong Oh, and Devavrat Shah. “Budget-Optimal Task Allocation for Reliable Crowdsourcing Systems.” Operations Research 62, no. 1 (February 2014): 1–24.
Journal: Operations Research
ISSN: 0030-364X, 1526-5463
DOI: http://dx.doi.org/10.1287/opre.2013.1235
Rights: Creative Commons Attribution-Noncommercial-Share Alike (http://creativecommons.org/licenses/by-nc-sa/4.0/)
Funding: National Science Foundation (U.S.) (Grant 1117381); National Science Foundation (U.S.) (EMT project); United States. Air Force Office of Scientific Research (Complex Networks project); United States. Army Research Office (Multidisciplinary University Research Initiative Award 58153-MA-MUR)