
View Full Version : Crowd Work Statistics and Other Data



  1. Amazon Mechanical Turk (mturk) workers are highly educated - sources
  2. Demographics of Mechanical Turk
  3. Maps of Turker Locations
  4. Aaron Sojourner's paper is out (he's the one who put fake reviews on TO)
  5. Turker Tools survey results
  6. Turkers are more distracted, more often look for outside help on survey questions
  7. Human-Computer Interaction and Collective Intelligence
  8. Number of Turkers on forums (and Microworkers!)
  9. Number of articles including the words "Mechanical Turk"/"mTurk" on PubMed/PsycINFO
  10. The Dynamics of Micro-Task Crowdsourcing
  11. Collecting Psycholinguistic Response Time Data Using Amazon Mechanical Turk
  12. User Perceptions of Associative Thesaural Relationships: A Preliminary Study
  13. Using Mechanical Turk to Study Family Processes and Youth Mental Health: A Test of Feasibility
  14. Assessment of household appliance surveys collected with Amazon Mechanical Turk
  15. Using Crowdsourcing Websites for Sociological Research: The Case of Amazon Mechanical Turk
  16. 50c FOR 30 MINUTES IS NOT THE TYPICAL RATE!
  17. Log in, tune out, turn off
  18. Think Outside the Boss: Cooperative Alternatives to the Sharing Economy
  19. Are annoying ads good for business?
  20. Understanding Health Information Intent via Crowdsourcing
  21. An online daily diary study of alcohol use using Amazon's Mechanical Turk
  22. Crowdsourcing Uncovered - A Literature Review
  23. $16 per hour, but based on the AMT calculator? Probably wrong :(
  24. Yay they paid $10/hr, but they really wasted their money on Masters
  25. "... some workers do not provide quality responses when given low monetary rewards."
  26. Amazon Customer Service drives away a potential Requester, should have come to TN :)
  27. AMT consistency 90%+. 90% of tasks done by 23% of workers (Super Turkers)
  28. Ethical Turkers do exist, and increasing pay DOUBLES # of workers
  29. Turkers are very reliable due to fear of rejections, few 70+ Turkers
  30. Preserving Privacy in Crowd-Powered Systems
  31. 40c for 15+ minutes of task, and they hid the video length so workers didn't know
  32. Researcher says mTurk is no good for longitudinal studies (he's wrong)
  33. Use AMT to find phone numbers of sex traffickers on Craigslist?
  34. 20 minutes, 50c :( The need to belong and the value of belongings
  35. Assessing work-asthma interaction with Amazon Mechanical Turk
  36. Problem with Cross-Cultural Comparison of User-Generated Ratings on Mechanical Turk
  37. Comparing the Similarity of Responses Received from Studies in AMT to online
  38. IT'S A TRAP!: Instructional manipulation checks prompt systematic thinking on...
  39. Systemic Metaphors Promote Systems Thinking
  40. CrowdTrust: A Context-Aware Trust Model for Workers Selection in Crowdsourcing Environments
  41. Notes from a Day on the Forums: Recommendations for Maintaining a Good Reputation as an AMT Requester
  42. 8% of respondents claimed to be medical PhDs, 200+ Turkers claimed grad degrees
  43. A map of Turker respondents
  44. Estimated time: 60 minutes; Pay: $1; PhD defense passed: priceless
  45. We would like to acknowledge the countless hours of effort provided by AMT workers
  46. Turkers paid the same as offline participants, nice!
  47. Only 5% missed the AC, that's a great result
  48. Used your queue? Well, that's a rejection... or a hundred
  49. $11.35 an hour - great stuff, Todd Gureckis et al!
  50. $10 for a study, that's not too shabby!
  51. On Wayfaring in Social Machines
  52. "workers ... show greater task affinity (i.e., they like the tasks) offer relatively higher monetary rewards"
  53. Swapping bricks for clicks: Crowdsourcing longitudinal data on Amazon Turk
  54. The Global Opportunity in Online Outsourcing - World Bank
  55. Notes from a Day on the Forums: Recommendations for Maintaining a Good Reputation as an AMT Requester
  56. Style Compatibility for 3D Furniture Models
  57. Informing Protocol Design Through Crowdsourcing: the Case of Pervasive Encryption
  58. Emergent Collective Sensing in Human Groups
  59. $15 per hour, yay @thecloflo & @ProfJimRoberts - Does Thin Always Sell?
  60. Visual Semantic Role Labeling - why do researchers think that pay is not necessary to mention in their paper?
  61. Make Hay While the Crowd Shines: Towards Efficient Crowdsourcing on the Web
  62. Improving Paid Microtasks through Gamification and Adaptive Furtherance Incentives
  63. An Online Learning Approach to Improving the Quality of Crowd-Sourcing
  64. Crowdsourcing Entity Resolution: a Short Overview and Open Issues
  65. Got Many Labels? (Turkers offer high quality results)
  66. An Explorative Approach for Crowdsourcing Tasks Design
  67. "The proposed study had few ethical concerns to address" (except paying 75c for 10-20 minutes)
  68. ASSETS: U: Web Accessibility Evaluation with the Crowd: Rapidly Coding User Testing Video ($9/hr)
  69. The narrative bias revisited ($4.14/hr, and they admit it with no disclaimer)
  70. "one dollar was the maximum typical wage for Mechanical Turk" LOL misinformed academia strikes again
  71. "companies use crowdsourcing for small tasks ... using the Amazon Mechanical Turkey platform" Srsly?
  72. Exposing ambiguities in a relation-extraction gold standard with crowdsourcing - Mass rejected at 70% approval
  73. Conducting Longitudinal Experiments with Behavioral Models in Repeated Stackelberg Security Games on AMT
  74. Getting More for Less: Optimized Crowdsourcing with Dynamic Tasks and Goals
  75. The Mobile Revolution Is Here: Are You Ready?
  76. iCrowd: An Adaptive Crowdsourcing Framework
  77. QASCA: A Quality-Aware Task Assignment System for Crowdsourcing Applications
  78. Crowdsourcing IT Work: A Three-Fold Perspective from the Workers, Buyers, and Platform Providers
  79. Mobile crowdsourcing - activation of smartphone users to elicit specialized knowledge through worker profiles
  80. Cost Optimization for Crowdsourcing Translation
  81. Perceived Robot Capability (I remember doing this study lol)
  82. A System for Scalable and Reliable Technical-Skill Testing in Online Labor Markets
  83. Preliminary Experiments on Crowdsourced Evaluation of Feedback Granularity
  84. CommuniSense: Crowdsourcing Road Hazards in Nairobi
  85. Assessing Collaborative Problem Solving with Simulation Based Tasks
  86. Using Nonnaive Participants Can Reduce Effect Sizes
  87. Effect of Being Observed on the Reliability of Responses in Crowdsourcing Micro-task Platforms
  88. Auction Based Mechanisms for Dynamic Task Assignments in Expert Crowdsourcing
  89. Trust and distrust on the web: User experiences and website characteristics
  90. Knowledge and risk perceptions of the Ebola virus in the United States
  91. Turkers are "bored" people doing surveys for "meager" pay (researcher blames us for his lack of results)
  92. We were surprised to find that Master workers are not much superior in quality to regular workers
  93. We did not observe that any forum members [posted any] "spoilers" that might have confounded the study
  94. "As an exploited worker, one can earn $4.95 for 4 hours 30 minutes of microtasking"
  95. Reputation Transferability in Online Labor Markets
  96. Methods for the Detection of Carelessly Invalid Responses in Survey Data
  97. Turkers aren't considered human by IRBs
  98. Experience in using MTurk for Network Measurement
  99. Screening Amazon's Mechanical Turk for Adults With ADHD
  100. Feelings of Pride and Respect as Drivers of Ongoing Member Activity on Crowdsourcing Platforms
  101. Measuring the Crowd – A Preliminary Taxonomy of Crowdsourcing Metrics
  102. *SHOCK* As pay rate went down, percentage of unusable submissions & rate of attrition increased
  103. "~40% (Turkers) consider mTurk primary job, we strongly encourage (researchers) to act as reputable employers" - @IOAliceB
  104. YAY! It's finally recognized that the Turkers knew Worker IDs weren't anonymous before the academics published about it
  105. The Rise of the "Just-in-Time Workforce": On-Demand Work, Crowd Work and Labour Protection in the "Gig-Economy".
  106. Re-assembling the Assembly Line: Digital Labor Economies and Demands for an Ambient Workforce
  107. On the reception and detection of pseudo-profound bullshit
  108. The Crowd is a Collaborative Network
  109. Parietal and prefrontal: categorical differences?
  110. Data on screen sizes for a sample of 700 workers
  111. The Generalizability of Survey Experiments - effects from AMT convenience samples match those from population samples
  112. Top Requesters on Turk Alert
  113. Data quality is directly affected by compensation in a study of Indian workers, so higher pay = higher quality
  114. Then & Now: mTurk last year at this time versus now
  115. Social scientists ... are participating in a market that leaves the people we study in precarity and poverty
  116. Give it a week, 70% of Turkers will have CRT2 exposure - "Investigating an alternate form of the cognitive reflection test"
  117. Survey Satisficing Inflates Reliability and Validity Measures: An Experimental Comparison of College and Amazon Mechanical Turk Samples
  118. Are samples drawn from Mechanical Turk valid for research on political ideology?
  119. Curiosity Killed the Cat, but Makes Crowdwork Better
  120. The data speaks for itself - number of Requesters dropped from a peak of ~9,000 in May 2015 to ~4,000 today, still dropping
  121. THE FUTURE OF EMPLOYMENT: HOW SUSCEPTIBLE ARE JOBS TO COMPUTERISATION? (47% of US jobs at risk)
  122. The Reliability of Crowdsourcing: Latent Trait Modeling with Mechanical Turk
  123. Pretty damning proof that #mTurk isn't always representative when money is involved (proof it's LABOUR, not a subject pool!)
  124. The Communication Network Within the Crowd (presented today at #WWW16)
  125. Canada and the Changing Nature of Work
  126. The platform economy and the disruption of the employment relationship
  127. The Impact of the Collaborative Economy on the Labour Market
  128. The Changing Workplaces Review
  129. Are all "research fields" equal? Rethinking practice for the use of data from crowdsourcing market places
  130. Languages spoken on AMT (n=335)
  131. Microworkers of the Gig Economy: Separate and Precarious
  132. Running (an #mTurk) study meant we weren't just researchers interacting with p's; we were employers interacting with employees
  133. Survey Satisficing Inflates Reliability and Validity Measures: An Experimental Comparison of College and AMT Samples
  134. 161007 - Results from ERB minimum wage rule poll of workers and Requesters
  135. NY State Rules 2 Former Uber Drivers Eligible for Jobless Payments
  136. Collaboration among workers
  137. A Mechanical Turk Worker's Perspective
  138. How Participant Characteristics and Behavior Vary Over Time and Day on MTurk
  139. The humans behind Mechanical Turk's artificial intelligence (now available without having to pay to read it)
  140. Peter Thiel, Trump and Facebook vs Cambridge Analytica's Steve Bannon
  141. Uber begins background collection of rider location data
  142. In the Future Graphic Design Layout Will Be Automated
  143. TaskRabbit Quietly Doubled The Cut It Takes From Many Of Its Workers
  144. Avoiding misrepresentations on MTurk
  145. Time To Turn The Page Of Platform Capitalism?
  146. The Billion-Dollar Question: How To Fix Gig Work
  147. Paying minimum wage on online labor platforms: A request for comments
  148. The Gig Economy: Research And Policy Implications Of Regional, Economic, And Demographic Trends
  149. POLICY BRIEF ON THE FUTURE OF WORK: Automation and Independent Work in a Digital Economy
  150. Mapping Twitter Followers in R
  151. Tourism and the sharing economy
  152. The sharing economy in Canada
  153. Comparing the Reliability of Amazon Mechanical Turk and Survey Monkey to Traditional Market Research Surveys
  154. Lie for a Dime: When most prescreening responses are honest but most "eligible" respondents are liars
  155. Turking overtime: how participant characteristics and behavior vary over time and day on Amazon Mechanical Turk
  156. MTurk Character Misrepresentation: Assessment and Solutions
  157. Intertemporal Differences Among MTurk Workers: Time-Based Sample Variations and Implications for Online Data Collection
  158. Systems Perspective of Amazon Mechanical Turk for Organizational Research: Review and Recommendations
  159. The Social Protection of Workers in the Platform Economy
  160. "We Regret to Inform You": Managing Reactions to Rejection in Crowdsourcing