April 5, 2013
EdX, which provides a platform for massive open online courses (or MOOCs), recently waded into the fray surrounding automated grading systems by introducing software that employs artificial intelligence to grade student essays and short written answers and provide an immediate assessment, The New York Times reported. The nonprofit organization, founded by Harvard and the Massachusetts Institute of Technology, is making the tool available for free via the Internet to colleges and other institutions. Perhaps not surprisingly, the technology, while not altogether new, has already stirred controversy.
According to EdX president Dr. Anant Agarwal, the software is beneficial in that it enables students to learn from instantaneous feedback, ultimately improving their writing as a result. They can write and rewrite without having to wait for their instructor's input, a period that can span days or even weeks.
For students taking a MOOC with hundreds and sometimes thousands of others, the automatic grading software with its immediate feedback would seem to be preferable to human grading, Boston Magazine noted. For professors teaching MOOCs, the technology just might be a necessity.
How does it work? A teacher must first grade 100 essays or essay answers by hand; from these examples, the software trains itself to evaluate any subsequent essay or answer, explained The New York Times. When the software grades, it delivers an assessment based on the scoring system the teacher created, either a numerical score or a letter grade. It also indicates whether a student's answer is on topic.
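To make the train-then-grade idea concrete, here is a deliberately toy sketch, not edX's actual system: a scorer is "trained" on a handful of instructor-graded answers (standing in for the 100 hand-graded essays the article describes) and then assigns a new answer a similarity-weighted score plus a crude on-topic flag. The class name, the word-overlap scoring method, and the sample answers are all illustrative assumptions.

```python
from collections import Counter
import math

def vectorize(text):
    """Bag-of-words vector: a Counter of lowercase word counts."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two Counter vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class ToyGrader:
    def __init__(self, graded_examples):
        # graded_examples: list of (answer_text, instructor_score) pairs,
        # the hand-graded training set the teacher supplies up front.
        self.examples = [(vectorize(t), s) for t, s in graded_examples]
        # Topic vocabulary: every word seen in the graded answers.
        self.topic_words = set()
        for vec, _ in self.examples:
            self.topic_words.update(vec)

    def grade(self, answer):
        vec = vectorize(answer)
        # Predicted score: similarity-weighted average of instructor scores.
        weights = [(cosine(vec, ex), s) for ex, s in self.examples]
        total = sum(w for w, _ in weights)
        score = sum(w * s for w, s in weights) / total if total else 0.0
        # Crude on-topic check: share of the answer's distinct words
        # that also appear in the training answers.
        overlap = (len(set(vec) & self.topic_words) / len(set(vec))
                   if vec else 0.0)
        return round(score, 1), overlap >= 0.5

grader = ToyGrader([
    ("photosynthesis converts light energy into chemical energy", 5),
    ("plants use sunlight water and carbon dioxide to make glucose", 4),
    ("plants are green", 1),
])
score, on_topic = grader.grade("plants convert light energy into glucose")
```

A real system would use far richer features and a learned model rather than raw word overlap, but the workflow is the same: instructor-graded examples in, instant scores and an on-topic judgment out.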
Critics say the automated system pales in comparison with human grading. Les Perelman, a retired director of writing and current researcher at MIT and a longstanding critic, is among a group of educators who last month began circulating a petition against the use of automated assessment software. Calling themselves Professionals Against Machine Scoring of Student Essays in High-Stakes Assessment, the group to date has garnered about 2,000 signatures.
"Let's face the realities of automatic essay scoring," the group's statement declares. "Computers cannot 'read.' They cannot measure the essentials of effective written communication: accuracy, reasoning, adequacy of evidence, good sense, ethical stance, convincing argument, meaningful organization, clarity and veracity, among others."
However, edX's Dr. Agarwal said he believed the software was approaching a level of grading comparable to that of humans.
"This is machine learning and there is a long way to go, but it's good enough, and the upside is huge," he said. "We found that the quality of the grading is similar to the variation you find from instructor to instructor."
Mother Jones' Kevin Drum argued that an automated system has huge potential if it can assess short student writings and provide immediate feedback, even imperfect feedback, because great value lies in having pupils write frequently and receive critiques soon afterward.
"This software may not be 100 percent ready for prime time yet, but it's getting there," he added. And it could be a game changer."
Compiled by Doresa Banning
"Can Computers Teach Students to Write Better?" motherjones.com, April 4, 2013, Kevin Drum
"EdX Now Has Software to Grade Your Essays," bostonmagazine.com, April 4, 2013, Eric Randall
"Essay-Grading Software Offers Professors a Break," nytimes.com, April 4, 2013, John Markoff