The Negative Effect of Feedback on Performance in Crowd Labor Tournaments

2014 
With crowd labor markets such as Amazon Mechanical Turk (MTurk for short), oDesk, or Clickworker, it is relatively easy to access a large, distributed workforce with different skill and pay levels on demand. The crowd processes tasks ranging from simple repetitive email tagging to creative and complex jobs such as designing logos or websites. In crowd labor, crowdsourcing, and other collective intelligence settings such as prediction markets, one challenge is to properly incentivize worker effort and quality of work. Besides intrinsic motivation, these systems typically hand out monetary incentives: often a flat fee plus an additional bonus for high-quality work. Bonus payments can be handed out through various ranking or linear payment schemes known from labor economics. Which incentive structure works best in crowd work is still an open question. In this paper we address the question of whether performance feedback about a worker's relative position in a rank-order tournament influences their behavior in crowd labor settings. We conducted a real-effort experiment on MTurk analyzing the effect of performance feedback on worker effort in rank-order tournaments. In line with standard theory, we observe that, on average, rank-order tournaments improve performance compared to a piece-rate payment. In rank-order tournaments, however, feedback has on average a negative effect on performance. In a nutshell, the root of this counterintuitive result is participant heterogeneity: while comparatively low-performing workers stop working altogether, comparatively high-performing workers, knowing that they will be rewarded, reduce their effort.
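
To make the contrast between the two payment schemes concrete, the following is a minimal sketch in Python. The parameter values (flat fee, per-task rate, bonus amounts) and function names are illustrative assumptions, not the values or implementation used in the experiment; the sketch only captures the structural difference that a piece rate pays per unit of output while a rank-order tournament pays fixed bonuses based on relative position.

```python
# Minimal sketch of the two payment schemes compared in the paper.
# All parameter values (flat fee, per-task rate, bonus amounts) are
# illustrative assumptions, not the values used in the experiment.

FLAT_FEE = 0.50         # assumed base payment per worker, in USD
PIECE_RATE = 0.02       # assumed payment per completed task
BONUSES = [1.00, 0.50]  # assumed bonuses for the top-ranked workers


def piece_rate_pay(tasks_completed: int) -> float:
    """Linear scheme: pay grows with each completed task."""
    return FLAT_FEE + PIECE_RATE * tasks_completed


def tournament_pay(scores: list[float]) -> list[float]:
    """Rank-order tournament: only a worker's relative position
    matters; the top-ranked workers receive fixed bonuses."""
    # Rank workers from best to worst score.
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    pay = [FLAT_FEE] * len(scores)
    for rank, worker in enumerate(order):
        if rank < len(BONUSES):
            pay[worker] += BONUSES[rank]
    return pay


if __name__ == "__main__":
    print(piece_rate_pay(40))            # 0.50 + 0.02 * 40 = 1.30
    print(tournament_pay([40, 25, 33]))  # [1.50, 0.50, 1.00]
```

Under the tournament scheme, a worker who learns from feedback that they are far below the bonus ranks gains nothing from further effort, while a worker who learns they are safely in the lead can coast; this is the heterogeneity mechanism the paper identifies.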