Exploring higher education students’ critical thinking skills through content analysis

2021 
Abstract

Given the psychometric limitations of many existing critical thinking assessments, as well as the scarcity of assessments that utilise constructed-response items, there is a need for critical thinking assessments that are not only psychometrically sound but also allow test-takers to articulate their reasoning. Whereas selected-response items require test-takers to select the 'correct' answer from a list of alternatives, constructed-response items require test-takers to construct, or generate, their own answer. This study had two aims: (1) to evaluate the reliability and validity of a pilot critical thinking assessment tool that predominantly consists of constructed-response items, and (2) to conduct an exploratory content analysis of higher education students' responses, including a comparison between participants' selected and constructed responses. A total of 95 undergraduate students participated, recruited from psychology, chiropractic and computer science programs. The study found preliminary evidence of inter-rater reliability for the constructed-response portion of the assessment tool, but mixed evidence regarding the tool's validity. It also found several discrepancies between participants' selected and constructed responses, raising concerns about the use of selected-response items in critical thinking assessments. These findings have important implications for assessing higher education students' critical thinking skills based on the quality of their reasoning rather than merely their ability to 'select' the correct answer.
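The abstract does not state how inter-rater reliability was quantified for the constructed-response items. Purely as an illustration of this kind of check, the sketch below computes Cohen's kappa, one widely used agreement statistic for two raters scoring the same set of responses; the 0-3 rubric scale and all rater scores are hypothetical and not drawn from the study.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)

    # Observed agreement: proportion of items given identical scores.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Expected chance agreement, from each rater's marginal score distribution.
    count_a = Counter(rater_a)
    count_b = Counter(rater_b)
    p_expected = sum((count_a[c] / n) * (count_b[c] / n) for c in categories)

    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical rubric scores (0-3) for ten constructed responses.
scores_rater_1 = [2, 3, 1, 0, 2, 2, 3, 1, 0, 2]
scores_rater_2 = [2, 3, 1, 1, 2, 2, 3, 1, 0, 3]
print(f"kappa = {cohens_kappa(scores_rater_1, scores_rater_2):.2f}")  # kappa = 0.73
```

Unlike raw percentage agreement (0.80 here), kappa discounts the agreement two raters would reach by chance alone, which is why it is often preferred when reporting reliability for rubric-scored open-ended items.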