An Innovative Approach For Automatically Grading Spelling In Essays Using Rubric-based Scoring

Publisher:
Academic Press Inc Elsevier Science
Publication Type:
Journal Article
Citation:
Journal of Computer and System Sciences, 2013, 79 (7), pp. 1040 - 1056
Issue Date:
2013-01
Automated Essay Grading (AEG) is defined as a computer technology that evaluates and scores written prose. A number of AEG systems have been developed since the 1960s, and most of them use an ad-hoc or generalized approach to grade spelling, even though spelling is an important element of an essay-scoring rubric. Existing approaches therefore do not give an accurate representation or measure of spelling in essays. According to the rubric-based scoring method used in the National Assessment Program - Literacy and Numeracy (NAPLAN) in Australia, spelling is marked in three steps: first, by identifying the correct and incorrect words in the essay; second, by categorizing each word by difficulty level into one of four classes - Simple, Common, Difficult, or Challenging - and counting the number of correct and incorrect words in each category; and finally, by using the pre-defined NAPLAN rubric scale to assign the mark. Only a small number of existing AEG systems can be used for rubric-based scoring, and none can grade spelling according to the NAPLAN rubric. In this paper, we address this shortcoming in the existing literature and present an innovative approach to automatically mark spelling using rubric-based scoring. We develop two algorithms based on the rules and heuristics of the English language to formalize the rubric for spelling, implement these algorithms in Java, and perform a series of evaluations of our system using an essay dataset. Our results are very promising, even though this is the first system of its kind.
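The three-step marking procedure described in the abstract could be sketched roughly as follows. This is an illustrative outline only, not the paper's implementation: the class name, the length-based difficulty heuristic, and the band thresholds are placeholders, whereas the paper derives word categories from English spelling rules and applies the actual NAPLAN rubric scale.

```java
import java.util.*;

// Illustrative sketch of three-step, rubric-based spelling marking.
// Category heuristic and band thresholds are hypothetical placeholders.
public class SpellingMarker {

    enum Difficulty { SIMPLE, COMMON, DIFFICULT, CHALLENGING }

    // Step 2 stand-in: classify a word with a crude length heuristic.
    // (The paper instead uses rules and heuristics of English spelling.)
    static Difficulty categorize(String word) {
        int n = word.length();
        if (n <= 3) return Difficulty.SIMPLE;
        if (n <= 6) return Difficulty.COMMON;
        if (n <= 9) return Difficulty.DIFFICULT;
        return Difficulty.CHALLENGING;
    }

    // Steps 1-3: check each word against a dictionary, tally correctly
    // spelled words per difficulty class, then map the tallies to a band.
    static int markSpelling(List<String> essayWords, Set<String> dictionary) {
        Map<Difficulty, Integer> correct = new EnumMap<>(Difficulty.class);
        for (String w : essayWords) {
            String word = w.toLowerCase();
            boolean ok = dictionary.contains(word);  // step 1: correct/incorrect
            Difficulty d = categorize(word);         // step 2: difficulty class
            if (ok) correct.merge(d, 1, Integer::sum);
        }
        // Step 3 (illustrative scale): award a higher band when harder
        // categories contain correctly spelled words.
        if (correct.getOrDefault(Difficulty.CHALLENGING, 0) > 0) return 4;
        if (correct.getOrDefault(Difficulty.DIFFICULT, 0) > 0) return 3;
        if (correct.getOrDefault(Difficulty.COMMON, 0) > 0) return 2;
        if (correct.getOrDefault(Difficulty.SIMPLE, 0) > 0) return 1;
        return 0;
    }

    public static void main(String[] args) {
        Set<String> dict = new HashSet<>(Arrays.asList("the", "garden", "beautiful"));
        List<String> essay = Arrays.asList("the", "garden", "is", "beautifull");
        System.out.println(markSpelling(essay, dict));
    }
}
```

In this toy run, "the" and "garden" are found in the dictionary while "is" and the misspelled "beautifull" are not, so the highest category containing a correct word is Common and a band of 2 is assigned under the placeholder scale.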