Web Accessibility Evaluation in a Crowdsourcing-Based System with Expertise-Based Decision Strategy

2018 
The rising awareness of accessibility increases the demand for Web accessibility evaluation projects that verify the implementation of Web accessibility guidelines and identify accessibility barriers in websites. However, the complexity of accessibility evaluation tasks and the lack of experts limit the scope of such projects and reduce their significance. Because of this complexity, they cannot directly rely on crowdsourcing, a technique that has made great contributions in many fields by dividing a problem into many tedious micro-tasks and solving those tasks in parallel. Addressing this issue, we develop a new crowdsourcing-based Web accessibility evaluation system with two novel decision strategies: a golden set strategy and a time-based golden set strategy. These strategies enable the generation of highly accurate task results synthesized from micro-tasks solved by workers with heterogeneous expertise. An accessibility evaluation of 98 websites by 55 workers with varying experience verifies that our system can complete the evaluation in half the time with a 7.2% improvement in accuracy over the current approach.
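The abstract does not spell out how a golden set strategy aggregates answers, but a common pattern is to estimate each worker's reliability from "golden" tasks with known answers and then weight their votes accordingly. The sketch below illustrates that general idea in Python; the function names, the 0.5 default accuracy for ungraded workers, and the weighted-majority aggregation are illustrative assumptions, not the paper's actual method.

```python
from collections import defaultdict


def worker_accuracy(golden_answers, worker_responses):
    """Estimate each worker's accuracy from golden (known-answer) tasks.

    golden_answers:   {task_id: correct_answer} -- the hidden golden set
    worker_responses: {worker_id: {task_id: answer}}
    """
    acc = {}
    for worker, responses in worker_responses.items():
        graded = [(t, a) for t, a in responses.items() if t in golden_answers]
        if graded:
            correct = sum(1 for t, a in graded if a == golden_answers[t])
            acc[worker] = correct / len(graded)
        else:
            # No golden evidence for this worker: assume coin-flip reliability.
            acc[worker] = 0.5
    return acc


def weighted_vote(task, worker_responses, acc):
    """Pick the answer for one task by accuracy-weighted majority vote."""
    scores = defaultdict(float)
    for worker, responses in worker_responses.items():
        if task in responses:
            scores[responses[task]] += acc.get(worker, 0.5)
    return max(scores, key=scores.get) if scores else None


# Hypothetical micro-task data: workers judge whether a page element
# violates an accessibility guideline.
golden = {"t1": "violation", "t2": "pass"}
responses = {
    "w1": {"t1": "violation", "t2": "pass", "t3": "violation"},
    "w2": {"t1": "pass", "t2": "pass", "t3": "pass"},
    "w3": {"t1": "violation", "t3": "violation"},
}

acc = worker_accuracy(golden, responses)
result = weighted_vote("t3", responses, acc)
```

Here `w1` and `w3` answer all their golden tasks correctly (accuracy 1.0) while `w2` gets one of two right (0.5), so on the non-golden task `t3` the "violation" votes outweigh `w2`'s dissent. A time-based variant could additionally discount answers submitted implausibly fast.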