Integration of Automated Essay Scoring Models Using Item Response Theory

2021 
Automated essay scoring (AES) is the task of automatically grading essays without human raters. Many AES models offering different benefits have been proposed over the past few decades. This study proposes a new framework for integrating AES models that uses item response theory (IRT). Specifically, the proposed framework uses IRT to average the prediction scores from various AES models while accounting for the characteristics of each model when evaluating examinee ability. This study demonstrates that the proposed framework achieves higher accuracy than individual AES models and simple averaging methods.
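The idea of averaging model scores through IRT can be sketched by treating each AES model as a "rater" with its own discrimination and severity parameters, and recovering examinee ability from all models jointly. The following is a minimal illustrative sketch, not the authors' exact model: the function name, the linear rater formulation `s_ij = alpha_j * (theta_i - beta_j) + noise`, and the alternating least-squares fitting procedure are all assumptions made for illustration.

```python
import numpy as np

def fit_irt_ensemble(S, n_iter=50):
    """Hypothetical IRT-style aggregation of AES model scores.

    S: (n_examinees, n_models) matrix of standardized prediction scores.
    Each model j is treated as a rater with discrimination alpha_j and
    severity beta_j; examinee ability theta_i is estimated by
    alternating least squares. This is an illustrative sketch only.
    """
    n, m = S.shape
    theta = S.mean(axis=1)   # initialize ability with the plain average
    alpha = np.ones(m)       # per-model discrimination
    beta = np.zeros(m)       # per-model severity
    for _ in range(n_iter):
        # Update each model's parameters given current abilities:
        # regress column j on [theta, -1] to get alpha_j, alpha_j*beta_j.
        for j in range(m):
            A = np.vstack([theta, -np.ones(n)]).T
            coef, *_ = np.linalg.lstsq(A, S[:, j], rcond=None)
            alpha[j] = max(coef[0], 1e-6)
            beta[j] = coef[1] / alpha[j]
        # Update abilities given model parameters: each model implies
        # theta ~= s_ij / alpha_j + beta_j; combine with precision
        # weights proportional to alpha_j**2.
        w = alpha ** 2
        theta = ((S / alpha + beta) * w).sum(axis=1) / w.sum()
        # Fix the scale indeterminacy of the latent trait.
        theta = (theta - theta.mean()) / (theta.std() + 1e-12)
    return theta, alpha, beta
```

In contrast to a simple average, this weighting downweights models whose scores discriminate poorly among examinees and corrects for systematic leniency or severity, which is the intuition behind the IRT-based integration described in the abstract.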