The Contribution of Constructed Response Items to Large Scale Assessment: Measuring and Understanding Their Impact.

2012 
This article investigates several questions regarding the impact of different item formats on measurement characteristics. Constructed response (CR) items and multiple choice (MC) items differ both in format and in the resources required to score them, and there has been considerable discussion about the consequences of using each format and about the potential effects of dropping one or the other from an assessment. In particular, this study examines differences in the constructs measured across domains, changes in test reliability and test characteristic curves, and interactions of item format with race and gender. The data come from the Maryland High School Assessments, high-stakes state examinations that students must pass in order to obtain a high school diploma. Our results indicate subtle differences in the impact of CR and MC items; these differences appear in dimensionality, particularly for English and Government, and in differential performance by ethnicity and gender across the two item types.