Automated essay scoring
ACARA has undertaken research reviews and studies showing that automated essay scoring (AES) is a viable solution for marking NAPLAN Online writing tasks and achieves scores comparable to those of human markers. These findings mirror numerous studies on automated essay scoring in Australia and internationally.
The AES system marks writing tests using the same NAPLAN marking criteria used by human markers. The AES system is trained using more than 1,000 NAPLAN writing tests scored by human markers. It is trained to apply the marking criteria to a broad range of narrative and persuasive writing tasks. If the AES system cannot score a piece of writing, it will ‘red flag’ it and a human marker will score the essay.
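The 'red flag' fallback described above can be sketched as a simple routing step. The function and threshold below are hypothetical illustrations, not details from ACARA's system: the idea is only that an essay the model cannot score confidently is routed to a human marker.

```python
# A minimal sketch (hypothetical names and threshold) of the workflow
# described above: the AES model returns a score plus a confidence, and
# low-confidence essays are 'red flagged' for a human marker.

CONFIDENCE_THRESHOLD = 0.8  # assumed value, for illustration only

def route_essay(essay_text, model):
    """Return (score, source), where source is 'aes' or 'human'."""
    score, confidence = model(essay_text)
    if confidence < CONFIDENCE_THRESHOLD:
        return (None, "human")  # red flag: send to a human marker
    return (score, "aes")

# Toy stand-in model: only confident on essays of 40+ words.
def toy_model(text):
    words = len(text.split())
    return (min(words // 20, 6), 1.0 if words >= 40 else 0.5)
```

In practice the confidence signal and threshold would come from the vendor's trained system; the point of the sketch is the two-way routing, not the scoring itself.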
The research, which began in 2012 and was released in 2015, found that four separate and independent automated essay scoring systems were able to mark NAPLAN persuasive writing tasks as reliably as expert human markers.
Four experienced vendors were engaged to score a broad sample of NAPLAN persuasive essays in 2013, using the current NAPLAN writing rubric. The vendors represented a cross-section of different approaches and methods for automated assessment of writing. They were provided with 1,014 essays, along with scores given by human markers, to train and validate their automated essay scoring systems. After training and validating the systems, the vendors used them to mark 339 tests without knowing the human-provided marks. On overall scores and on each writing criterion assessed, the four automated essay scoring systems achieved levels of agreement comparable with those of the human markers. Of particular significance, the AES systems were even able to match human markers on the 'creative' rubric criteria: audience and ideas.
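Agreement between automated and human scores in AES studies is commonly measured with quadratic weighted kappa, which penalises large score discrepancies more heavily than small ones. The report may use different statistics; the sketch below simply shows how such an agreement measure is computed for two sets of integer rubric scores.

```python
# Quadratic weighted kappa: a standard agreement statistic for ordinal
# scores such as writing-rubric marks. 1.0 means perfect agreement,
# 0.0 means agreement no better than chance.
from collections import Counter

def quadratic_weighted_kappa(a, b, min_rating, max_rating):
    n = max_rating - min_rating + 1
    total = len(a)
    # Observed agreement matrix: counts of (score_a, score_b) pairs.
    observed = [[0] * n for _ in range(n)]
    for x, y in zip(a, b):
        observed[x - min_rating][y - min_rating] += 1
    # Marginal score histograms for each rater.
    hist_a = Counter(x - min_rating for x in a)
    hist_b = Counter(y - min_rating for y in b)
    num = den = 0.0
    for i in range(n):
        for j in range(n):
            weight = ((i - j) ** 2) / ((n - 1) ** 2)
            expected = hist_a[i] * hist_b[j] / total  # chance agreement
            num += weight * observed[i][j]
            den += weight * expected
    return 1.0 - num / den
```

For example, identical score lists yield a kappa of 1.0, while any disagreement lowers the value, with large gaps between a machine score and a human score costing the most.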
Read the research report (883 KB)
Read our fact sheet on AES (482 KB)
See our infographic, which illustrates the automated essay scoring research (1.26 MB)
Updated: 24 April 2017