Automated essay scoring
On 30 November 2015, ACARA released the results of research providing specific evidence that automated essay scoring (AES) is a viable solution for marking NAPLAN online writing tasks.
The research, which began in 2012, concluded that four separate and independent automated essay scoring systems were able to mark NAPLAN persuasive writing tasks as reliably as expert human markers.
Four experienced vendors were engaged to score a broad sample of NAPLAN persuasive essays in 2013, using the current NAPLAN writing rubric. The vendors represented a cross-section of different approaches and methods for automated assessment of writing. They were provided with 1,014 essays, along with scores given by human markers, to train and validate their automated essay scoring systems. After training and validating the systems, the vendors used them to mark 339 tests without knowing the human-provided mark. On overall scores and on each writing criterion assessed, the four automated essay scoring systems achieved levels of agreement comparable with the human markers. Notably, the AES systems were even able to match human markers on the ‘creative’ rubric criteria: audience and ideas.
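To illustrate how human–machine agreement of this kind can be quantified, here is a minimal sketch using quadratic weighted kappa, an agreement statistic commonly used in essay-scoring research. The report may well use different measures, and the scores below are invented for illustration only.

```python
# Illustrative sketch: quadratic weighted kappa between two raters.
# The measure used in the ACARA research is not specified here; the
# rubric range and the score lists below are hypothetical examples.

def quadratic_weighted_kappa(rater_a, rater_b, min_score, max_score):
    """Quadratic weighted kappa between two lists of integer scores."""
    n = max_score - min_score + 1
    # Observed agreement matrix: counts of (score_a, score_b) pairs.
    observed = [[0.0] * n for _ in range(n)]
    for a, b in zip(rater_a, rater_b):
        observed[a - min_score][b - min_score] += 1
    total = len(rater_a)
    # Marginal score distributions for each rater.
    hist_a = [sum(row) for row in observed]
    hist_b = [sum(observed[i][j] for i in range(n)) for j in range(n)]
    numerator = 0.0
    denominator = 0.0
    for i in range(n):
        for j in range(n):
            # Quadratic penalty grows with the squared score difference.
            weight = ((i - j) ** 2) / ((n - 1) ** 2)
            # Expected count under chance agreement.
            expected = hist_a[i] * hist_b[j] / total
            numerator += weight * observed[i][j]
            denominator += weight * expected
    return 1.0 - numerator / denominator

# Invented example: human vs automated scores on a 0-5 rubric criterion.
human = [3, 2, 4, 3, 5, 1, 2, 4, 3, 0]
machine = [3, 2, 4, 2, 5, 1, 3, 4, 3, 1]
print(round(quadratic_weighted_kappa(human, machine, 0, 5), 3))  # prints 0.916
```

A kappa of 1.0 indicates perfect agreement and 0.0 indicates agreement no better than chance; in AES validation, an automated system is typically judged viable when its kappa against human markers matches the kappa observed between two independent human markers.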
More research is planned for 2016. It will include a larger sample of students, multiple prompts within and across writing genres (persuasive and narrative), and key validity questions, such as whether the use of AES affects features of student writing and writing instruction. The findings will inform a recommendation to Education Ministers about the approach to be used in 2017.
Read the research report (883 KB)
See our infographic, which illustrates the automated essay scoring research (1.26 MB)
Updated: 23 May 2016