Lessons Learned from Crowdsourcing Complex Engineering Tasks.
Main Authors: | Matthew Staffelbach, Peter Sempolinski, Tracy Kijewski-Correa, Douglas Thain, Daniel Wei, Ahsan Kareem, Gregory Madey
Format: | Article
Language: | English
Published: | Public Library of Science (PLoS), 2015-01-01
Series: | PLoS ONE
Online Access: | http://europepmc.org/articles/PMC4575153?pdf=render
author | Matthew Staffelbach Peter Sempolinski Tracy Kijewski-Correa Douglas Thain Daniel Wei Ahsan Kareem Gregory Madey |
collection | DOAJ |
description | Crowdsourcing is the practice of obtaining needed ideas, services, or content by requesting contributions from a large group of people. Amazon Mechanical Turk is a web marketplace for crowdsourcing microtasks, such as answering surveys and tagging images. We explored the limits of crowdsourcing by using Mechanical Turk for a more complicated task: the analysis and creation of wind simulations. Our investigation examined the feasibility of using crowdsourcing for complex, highly technical tasks, to determine whether its benefits could be harnessed to contribute accurately and effectively to solving complex real-world engineering problems. Of course, untrained crowds cannot be used as a mere substitute for trained expertise. Rather, we sought to understand how crowd workers can serve as a large pool of labor for a preliminary analysis of complex data. We compared the skill of anonymous crowd workers from Amazon Mechanical Turk with that of civil engineering graduate students in making a first pass at analyzing wind simulation data. In the first phase, we posted analysis questions to Amazon crowd workers and to two groups of civil engineering graduate students. In a second phase, we instructed crowd workers and students to create simulations on our Virtual Wind Tunnel website to solve a more complex task. With a sufficiently comprehensive tutorial and compensation similar to typical crowdsourcing wages, we were able to enlist crowd workers to effectively complete longer, more complex tasks with competence comparable to that of graduate students with more comprehensive, expert-level knowledge. Furthermore, more complex tasks require increased communication with the workers; as tasks become more complex, the employment relationship becomes more akin to outsourcing than crowdsourcing. Through this investigation, we stretched and explored the limits of crowdsourcing as a tool for solving complex problems.
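As context for the workflow described in the abstract, the sketch below shows how analysis questions of this kind could be posted to Amazon Mechanical Turk programmatically. This is a minimal, hypothetical illustration using the AWS boto3 SDK, not the authors' actual setup; the HIT title, reward, worker count, and question text are placeholder assumptions.

```python
# Minimal sketch (not from the paper): posting one analysis question to
# Amazon Mechanical Turk as a HIT via boto3. All task parameters below
# are illustrative placeholders, not the study's actual values.
import boto3

# The sandbox endpoint lets you test HITs without paying real workers.
mturk = boto3.client(
    "mturk",
    region_name="us-east-1",
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

# QuestionForm XML: a single free-text question about simulation output.
question_xml = """<?xml version="1.0" encoding="UTF-8"?>
<QuestionForm xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2005-10-01/QuestionForm.xsd">
  <Question>
    <QuestionIdentifier>pressure-analysis</QuestionIdentifier>
    <QuestionContent>
      <Text>Which face of the building shows the highest mean pressure coefficient?</Text>
    </QuestionContent>
    <AnswerSpecification>
      <FreeTextAnswer/>
    </AnswerSpecification>
  </Question>
</QuestionForm>"""

response = mturk.create_hit(
    Title="Analyze wind simulation output (tutorial provided)",
    Description="Answer one question about a computational wind simulation.",
    Keywords="analysis, engineering, simulation",
    Reward="0.50",                     # USD per assignment (placeholder)
    MaxAssignments=30,                 # number of distinct workers
    LifetimeInSeconds=7 * 24 * 3600,   # how long the HIT stays visible
    AssignmentDurationInSeconds=3600,  # time a worker has to finish
    Question=question_xml,
)
print("HIT created:", response["HIT"]["HITId"])
```

For the simulation-creation phase described above, a richer task hosted on an external site such as the Virtual Wind Tunnel would typically be embedded in a HIT via MTurk's ExternalQuestion mechanism rather than an inline QuestionForm.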
format | Article |
id | doaj.art-c5db2f209abc4ad8adf55859a87a8ea9 |
institution | Directory Open Access Journal |
issn | 1932-6203 |
language | English |
publishDate | 2015-01-01 |
publisher | Public Library of Science (PLoS) |
record_format | Article |
series | PLoS ONE |
doi | 10.1371/journal.pone.0134978
citation | PLoS ONE, Vol 10, Iss 9, p e0134978 (2015)
title | Lessons Learned from Crowdsourcing Complex Engineering Tasks. |
url | http://europepmc.org/articles/PMC4575153?pdf=render |