
Dean Ramser, a writing instructor at California State University, Dominguez Hills, reviews a student’s work. Credit: Siskanna Naynaha.

For decades, a stubborn truth has dogged efforts to create automated writing assessment tools, aka robo-readers: Computers are stupid. True, their digital brains can process torrents of data in milliseconds without a misstep. But the essence of writing — to convey meaning — remains well beyond their grasp.

“Right now, the idea that a computer could actually understand an essay is laughable,” said Matthew Ramirez, CEO of a startup called WriteLab. A few years ago, Ramirez decided to postpone his doctoral studies in English at the University of California, Berkeley, to build a better robo-reader — not to score writing but to improve it. Even if a computer can’t really understand your writing, Ramirez thinks it might understand enough to help you write better.

The idea sprouted during Ramirez’s first semester teaching undergraduates. He struggled to give students timely feedback on their drafts — the kind he’d always craved. By the time he did, they’d usually moved on to the next assignment. That same semester, Ramirez took a seminar with Donald McQuade, an English professor who chaired the board of directors for the National Writing Project, a network that supports writing teachers with research and professional development.

“My first paper back from Don was just flourishing with feedback,” Ramirez said. “They were really thoughtful comments, not directives. They made me feel valued as a writer.”

Ramirez sought McQuade’s advice on responding to student writing, and after many discussions, they decided to create WriteLab, which rolled out its Web-based writing tutor in 2015.

A technophile with coding experience, Ramirez knew that existing robo-readers tracked quantifiable proxy measures correlated with writing quality, such as word count, the number of complex sentences and the frequency of esoteric vocabulary.
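To make that concrete, here is a rough sketch, in Python, of the kind of surface features such a scoring engine might compute; the word lists and cutoffs are illustrative assumptions, not any vendor’s actual code.

```python
# A rough sketch (illustrative assumptions, not any vendor's actual code) of the
# surface "proxy" features older essay-scoring engines computed: length, sentence
# complexity, and vocabulary rarity. The subordinator list and length cutoff are made up.
import re

SUBORDINATORS = {"because", "although", "while", "since", "whereas", "unless"}

def proxy_features(essay: str) -> dict:
    sentences = [s for s in re.split(r"[.!?]+", essay) if s.strip()]
    words = re.findall(r"[A-Za-z']+", essay.lower())
    # "Complex sentence" here just means one containing a subordinating conjunction.
    complex_sentences = sum(1 for s in sentences if SUBORDINATORS & set(s.lower().split()))
    # Crude stand-in for "esoteric vocabulary": unusually long words.
    rare_words = [w for w in words if len(w) > 9]
    return {
        "word_count": len(words),
        "complex_sentence_ratio": complex_sentences / max(len(sentences), 1),
        "rare_word_ratio": len(rare_words) / max(len(words), 1),
    }

print(proxy_features("Although the experiment failed, the researchers persevered. They documented everything."))
```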

Lacking true understanding, robo-readers did their best to imitate it, propelled by the testing industry’s push for fast, reliable essay-scorers. Even when the developers modified their programs for classrooms, Ramirez said, “It was always about getting a score. The feedback was mostly decoration.”

WriteLab takes a different approach. Dropping the pretense of scoring, it aims algorithms and machine learning at a more modest goal — giving writers timely, specific feedback to spur review and revision. WriteLab has spread to classrooms and writing centers in more than 50 high schools and colleges. But it’s still a work in progress.

For starters, revision only seems like a modest goal if you ignore how rare it is in student writing, where the typical pattern is: procrastinate, panic, write all night, turn it in, and forget it. A week or two later, when teachers finally pass back the work, any feedback they offer is more or less ignored.

“I mean, once you get a graded essay back, where do you put it? You don’t keep it!” said Dean Ramser, who teaches first-year composition at California State University, Dominguez Hills.

Since Ramser’s students started using WriteLab, they often revise at least ten times before sharing essays with him. Teachers using WriteLab have a dashboard that shows student revisions and writing choices, but they can’t see the writing until a student shares it.

“When I see a student doing many drafts of a piece of writing, that means they thought about their essay that many times. That’s really cool,” Ramser said.

“Writing takes practice,” said McQuade. “My metaphor is sports. Once you’ve practiced something sufficiently, you begin to internalize the kind of moves you need to make.”


WriteLab goes sentence by sentence, using what’s known as practical style, which Ramirez sums up as “strong, punchy verbs, getting subjects and verbs close to the start of sentences, transitioning cleanly from old to new information and writing concisely.”

Most importantly, the program takes a Socratic approach. The color-coded markers peppering the text never label a word or phrase as wrong. Instead, they ask questions. Would this sentence be stronger or weaker if you dropped this adverb? What if you switched from the passive to the active voice?
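A toy version of that question-asking behavior might look like the sketch below; the pattern rules are deliberately crude assumptions for illustration, not how WriteLab itself parses sentences.

```python
# A toy illustration (an assumption, not WriteLab's implementation) of sentence-by-
# sentence checks that raise questions instead of making corrections. The pattern
# rules are deliberately crude regular expressions.
import re

def socratic_feedback(essay: str):
    feedback = []
    for sentence in (s.strip() for s in re.split(r"(?<=[.!?])\s+", essay) if s.strip()):
        # Flag "-ly" adverbs, but ask about them rather than deleting them.
        for adverb in re.findall(r"\b\w+ly\b", sentence):
            feedback.append((sentence, f"Would this sentence be stronger or weaker if you dropped '{adverb}'?"))
        # Very rough passive-voice cue: a form of "to be" followed by a word ending in "-ed".
        if re.search(r"\b(?:is|are|was|were|been|being)\s+\w+ed\b", sentence):
            feedback.append((sentence, "What if you switched from the passive to the active voice?"))
    return feedback

for sentence, question in socratic_feedback("The data was analyzed quickly by the team."):
    print(f"{question}  ({sentence})")
```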

Adán Olmedo, an English instructor at Berkeley City College who piloted the program with his first-year composition class last fall, said that “hands-off style” suits students for whom “getting feedback on their writing is a scary thing.”

Also, writers can simply dismiss suggestions that don’t make sense or go against their authorial intentions. In beta tests, they did so about 25 percent of the time. Psychologically, it’s a lot easier to challenge a machine than a person, especially a person who’s grading you.

“If you tell most college students they should make a change to their writing, then they’ll make that change,” said Ramirez. “They’re not thinking about what it is they want to do with their writing if it seems it’s at all to the teacher’s displeasure. But software is something that they’re totally comfortable fighting.”

WriteLab’s robo-reader isn’t just deferential to a writer’s choices; it learns from them. Every response is fed into a cloud-based analyzer that uses the data to bend and shape future questions and suggestions.

Ramirez compares the process to a Google search. “The recommendations that appear when you start typing are most heavily weighted by your recent searches,” he said. “But they’re also influenced by what everyone else is searching for.”
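In spirit, that weighting could be as simple as the sketch below, which tracks how often a writer accepts each kind of suggestion and blends it with everyone else’s responses; it is an illustrative guess at the mechanism, not a description of WriteLab’s analyzer.

```python
# An illustrative guess at the weighting idea (not WriteLab's actual analyzer):
# track how often each kind of suggestion gets accepted, per writer and overall,
# and blend the two rates when deciding what to surface next.
from collections import defaultdict

class SuggestionRanker:
    def __init__(self, personal_weight: float = 0.7):
        self.personal_weight = personal_weight       # how heavily the writer's own history counts
        self.personal = defaultdict(lambda: [0, 0])  # suggestion type -> [accepted, shown]
        self.global_ = defaultdict(lambda: [0, 0])   # in a real system, aggregated across all writers

    def record(self, suggestion_type: str, accepted: bool) -> None:
        for counts in (self.personal[suggestion_type], self.global_[suggestion_type]):
            counts[0] += int(accepted)
            counts[1] += 1

    def score(self, suggestion_type: str) -> float:
        def rate(counts):
            accepted, shown = counts
            return accepted / shown if shown else 0.5  # neutral prior for unseen suggestion types
        return (self.personal_weight * rate(self.personal[suggestion_type])
                + (1 - self.personal_weight) * rate(self.global_[suggestion_type]))

ranker = SuggestionRanker()
ranker.record("drop_adverb", accepted=False)   # writer dismissed the adverb question
ranker.record("active_voice", accepted=True)   # writer took the passive-voice suggestion
print(ranker.score("drop_adverb"), ranker.score("active_voice"))
```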


One big lesson the WriteLab team has learned is that some students can take only so much Socratic feedback. Within a few months after the rollout, a majority of students had stopped using their accounts. When WriteLab surveyed the dropouts, Ramirez said, “What we heard, especially from high school students, was, ‘Why can’t you just give us the revision? What do you want me to do here?’ ”

Those complaints convinced them to add a strong grammar- and spell-checker (currently under development). They’re piloting a pre-writing tool called the “Essay Builder,” which, again, uses questions to help students think through topics, bolster arguments and plan essays.

“We learned from our writing center partners that 40 percent of students show up there with just a blank page,” said Ramirez. “They haven’t even gotten started.”

Of course, it’s still a teacher’s job to judge the final product. “It’s one thing to look at sentence-level style and structure, and another to analyze whether, say, Walt Whitman really argues something or explores a particular theme in his writing,” noted Ramser.

“It’s important to say that this program is meant to supplement teacher feedback, not replace it,” said Ramirez. “It enables students to turn in prose that’s much more refined, but not by any means finished.”

This story was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education.
