IEEE Internet Computing, vol. 16, no. 5, Sept.-Oct. 2012, pp. 10-12
Ed H. Chi , Google
Michael S. Bernstein , Stanford University
Crowdsourcing involves outsourcing a job to a distributed group of people online, typically by breaking the job down into microtasks. Online markets pay human workers for completing these small tasks, or users can participate in unpaid platforms such as games and volunteer sites. These platforms' general availability has enabled researchers to recruit large numbers of participants for user studies, generate third-party content and assessments, and even build novel user experiences. This special issue provides a snapshot of the most recent crowdsourcing research.
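The microtask pattern described above can be sketched in a few lines: a large job is split into small redundant units, each unit is answered by several independent workers, and answers are aggregated. This is a hypothetical illustration, not any particular platform's API; the function names and the majority-vote aggregation are assumptions chosen for clarity.

```python
# Minimal sketch of the microtask pattern: split a job into small,
# redundant units, collect independent answers, aggregate by majority vote.
# All names here are illustrative, not a real crowdsourcing API.
from collections import Counter

def split_into_microtasks(items, redundancy=3):
    """Turn each item into `redundancy` identical microtask assignments."""
    return [(i, item) for i, item in enumerate(items) for _ in range(redundancy)]

def aggregate(assignments, answers):
    """Majority-vote the answers collected for each item."""
    votes = {}
    for (i, _), answer in zip(assignments, answers):
        votes.setdefault(i, []).append(answer)
    return {i: Counter(a).most_common(1)[0][0] for i, a in votes.items()}

# Example: label three images, three simulated workers each.
tasks = split_into_microtasks(["img1", "img2", "img3"])
simulated_answers = ["cat", "cat", "dog",   # img1
                     "dog", "dog", "dog",   # img2
                     "cat", "bird", "cat"]  # img3
print(aggregate(tasks, simulated_answers))
# → {0: 'cat', 1: 'dog', 2: 'cat'}
```

Redundancy plus majority voting is the simplest quality-control scheme for noisy worker answers; real deployments often layer on worker qualification and weighted voting.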
Keywords: human computation, crowdsourcing, microtask, Amazon Mechanical Turk
Ed H. Chi, Michael S. Bernstein, "Leveraging Online Populations for Crowdsourcing", IEEE Internet Computing, vol.16, no. 5, pp. 10-12, Sept.-Oct. 2012, doi:10.1109/MIC.2012.111