Answer script evaluator: A literature survey

2019 
Every college, university, and school conducts exams, and the most important outcome of an exam is the result. To obtain these results, the answer papers must be evaluated one by one manually, a process that is time-consuming and labour-intensive. To overcome this problem, we propose a system that removes the manual evaluation step: it evaluates an answer script against a pre-uploaded marking scheme. The answers are first captured in digital form and then processed with algorithms such as word2vec, which maps each word to a vector so that semantically similar words can be identified through their similarity index. From these word similarities the meaning of a paragraph is approximated and matched against the answer key to obtain a match percentage, from which the marks to be awarded are calculated.
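The matching step the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the word vectors below are hypothetical toy values standing in for a trained word2vec model, and the answers are represented by averaging their word vectors before computing a cosine-similarity match percentage.

```python
import numpy as np

# Toy word vectors standing in for a trained word2vec model (hypothetical values).
vectors = {
    "photosynthesis": np.array([0.9, 0.1, 0.0]),
    "plants":         np.array([0.8, 0.3, 0.1]),
    "make":           np.array([0.1, 0.7, 0.2]),
    "produce":        np.array([0.2, 0.8, 0.1]),
    "food":           np.array([0.3, 0.2, 0.9]),
    "energy":         np.array([0.4, 0.1, 0.8]),
}

def answer_vector(words):
    # Represent a whole answer by averaging the vectors of its known words.
    known = [vectors[w] for w in words if w in vectors]
    return np.mean(known, axis=0)

def match_percentage(answer, key):
    # Cosine similarity between the averaged vectors, scaled to a percentage.
    a, k = answer_vector(answer), answer_vector(key)
    cos = float(np.dot(a, k) / (np.linalg.norm(a) * np.linalg.norm(k)))
    return 100.0 * cos

key    = ["plants", "make", "food", "photosynthesis"]
answer = ["plants", "produce", "energy", "photosynthesis"]

pct = match_percentage(answer, key)
# Award marks proportional to the match, e.g. for a 5-mark question.
marks = round(5 * pct / 100, 1)
```

A student answer using synonyms ("produce energy" instead of "make food") still scores a high match percentage because the synonymous words lie close together in the vector space, which is the property the abstract relies on.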