




Thread: "We Regret to Inform You": Managing Reactions to Rejection in Crowdsourcing

  1. #1
    Community Manager spamgirl

    "We Regret to Inform You": Managing Reactions to Rejection in Crowdsourcing

    Crowdsourcing has been used successfully to help overcome important Semantic Web challenges such as factual data collection, knowledge base curation, and ontology management. In current microtask crowdsourcing systems, requesters typically exercise the power to decide whether or not to accept the tasks completed by crowd workers. Rejecting work has a direct impact on workers: (i) they may not be rewarded for work that has actually been done and for the effort they have exerted, and (ii) rejection affects worker reputation and may limit their access to future work opportunities. This paper presents a comprehensive study that aims to understand how workers react to rejections in microtask crowdsourcing. We investigate the effect of workers' mood on their performance, as well as the interaction of their mood with their reactions to rejection. Finally, we explore techniques such as social comparison that can be used to foster positive reactions. Our findings bear important implications for maintaining positive interactions between workers and requesters in microtask crowdsourcing systems, thereby improving the effectiveness of the paradigm.

    https://2018.eswc-conferences.org/wp..._paper_204.pdf

  2. #2
    Member nel

    Hmm.
    From that link, page 6:

    However, after the pre-screening phase, immaterial of how the workers performed, they were presented with the following message: “Sorry! You did not succeed in passing the pre-screening phase.”

    Not cool.

  3. #3
    Community Manager spamgirl

    Quote Originally Posted by nel View Post
    Hmm.
    From that link, page 6:

    However, after the pre-screening phase, immaterial of how the workers performed, they were presented with the following message: “Sorry! You did not succeed in passing the pre-screening phase.”

    Not cool.
    Exactly. So cruel.


  4. #4
    Member nel

    Quote Originally Posted by spamgirl View Post
    Exactly. So cruel.

    Figured that was what they were up to, but was hoping they had come up with a kinder way.

    Wonder if they went back and explained, given one of their conclusions is:

    Thus, a further recommendation for crowdsourcing task design we can draw from these results is the importance of requesters providing detailed explanations to crowd workers on the errors which have been made rather than, as commonly done, simply reject work.
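
    If a requester actually wanted to follow that advice on MTurk, the rejection call itself has room for it. Below is a minimal sketch (not from the paper), assuming the boto3 MTurk client; the assignment ID and feedback text are placeholders.

    # Minimal sketch: rejecting an MTurk assignment while attaching a detailed
    # explanation instead of a one-line "rejected". Not from the paper; assumes
    # the boto3 MTurk client, with placeholder assignment ID and message text.
    import boto3

    mturk = boto3.client("mturk", region_name="us-east-1")

    # RejectAssignment carries a RequesterFeedback message, so the explanation
    # can travel with the rejection itself rather than never reaching the worker.
    mturk.reject_assignment(
        AssignmentId="EXAMPLE_ASSIGNMENT_ID",  # placeholder
        RequesterFeedback=(
            "Your answers to items 3 and 4 did not match the expected format; "
            "see step 2 of the task instructions."  # placeholder feedback text
        ),
    )

    Approving work (approve_assignment) also accepts a RequesterFeedback message, so the same channel works when explaining why something was accepted despite minor issues.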



  5. #5
    Community Manager spamgirl

    Quote Originally Posted by nel View Post
    Figured that was what they were up to, but was hoping they had come up with a kinder way.

    Wonder if they went back and explained, given one of their conclusions is:

    Thus, a further recommendation for crowdsourcing task design we can draw from these results is the importance of requesters providing detailed explanations to crowd workers on the errors which have been made rather than, as commonly done, simply reject work.
    It'd be hard for them to explain, I guess... unless there was a debrief after that, but even then most people would have returned the HIT.
