Identifying experts in the crowd for evaluation of engineering designs

2017 
Abstract
Crowdsourcing offers the opportunity to gather evaluations of concept designs from evaluators who might otherwise not have been considered, thus leveraging additional expertise to improve decision making during early stages of the design process. Previous research has shown that crowdsourcing may fail to evaluate even 'simple' engineering design concepts correctly, because non-expert evaluations can overwhelm the crowd evaluation. This article proposes using expertise prediction heuristics to automatically identify experts and filter out non-experts prior to a crowdsourced evaluation. We conducted an experiment to test four common expertise prediction heuristics: (1) evaluator demographics, (2) evaluation reaction time, (3) mechanical reasoning aptitude, and (4) 'easy and known' versions of the actual 'difficult and unknown' design evaluation task. The results show statistically significant relationships between variables for all four heuristics; however, most of the predictive power is garnered going from easy to diffic...
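The abstract describes pre-filtering a crowd by predicted expertise before aggregating evaluations. The following is a minimal sketch of that idea, not the paper's implementation: the heuristic fields (mechanical reasoning score, performance on an 'easy and known' screening task), the threshold value, and the simple averaging aggregation are all illustrative assumptions.

```python
# Hypothetical sketch: filter crowd evaluators by expertise-prediction heuristics
# before aggregating their concept-design ratings. Field names, the threshold,
# and the aggregation rule are assumptions for illustration only.

from dataclasses import dataclass
from statistics import mean


@dataclass
class Evaluator:
    rating: float                 # evaluation given to the design concept
    mech_reasoning_score: float   # score on a mechanical reasoning aptitude test (0-1)
    easy_task_correct: bool       # answered the 'easy and known' screening task correctly


def filter_experts(evaluators, min_reasoning=0.7):
    """Keep only evaluators whose heuristic signals suggest expertise."""
    return [
        e for e in evaluators
        if e.easy_task_correct and e.mech_reasoning_score >= min_reasoning
    ]


def crowd_estimate(evaluators):
    """Aggregate the (possibly filtered) crowd's ratings by simple averaging."""
    return mean(e.rating for e in evaluators)


if __name__ == "__main__":
    crowd = [
        Evaluator(rating=8.0, mech_reasoning_score=0.9, easy_task_correct=True),
        Evaluator(rating=3.0, mech_reasoning_score=0.4, easy_task_correct=False),
        Evaluator(rating=7.5, mech_reasoning_score=0.8, easy_task_correct=True),
    ]
    print(f"Unfiltered crowd estimate:      {crowd_estimate(crowd):.2f}")
    print(f"Expert-filtered crowd estimate: {crowd_estimate(filter_experts(crowd)):.2f}")
```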