A high-tech program designed to predict which students are at risk of failure might sound like a way to extend a helping hand.
Some students don’t see it that way.
They worry that the data will be used to label them before they have a chance to make their own impression on a teacher. That it will relegate them to the heap of students who are written off. That it will follow them long after they have improved, as a stigma that won’t allow them to chart their own course. They want their teachers to get to know them personally, not blindly usher them down a path set into motion by an algorithm.
“We don’t know who is choosing it and who is pulling the strings,” said Luis (known as Adrian) Manzano-Anzures, a student at Warren, Michigan’s Macomb Community College, who spoke last month on a panel at EduCon 2.9, an annual conference about education and technology at Philadelphia’s Science Leadership Academy, a public magnet school.
The panel of several students from Macomb Community College explored “digital redlining and privacy.” The students said stereotypes and systems already in place can hold them back from reaching their full potential, and some said they are not inclined to trust digital systems either.
Advocates of these predictive programs say they help educators find and help students at risk of failure, but the students on the panel presented another side of the story. What happens if this information is used against us? Will a digital dossier, possibly with inaccurate, incomplete or out-of-context data, follow us forever?
These skeptical questions aren’t totally unexpected. Around this time last year, Mount St. Mary’s University in Maryland made national news after its president suggested using a survey to find weak students and persuade them to leave. (His reported phrasing, “drown the bunnies,” made the stories go viral.)
This drew criticism from educators who said the plan would violate the trust of students, and that any information gleaned should be used to help students, not cull them.
During the panel discussion at EduCon, students worried that they would be subjected to similar programs. They said they often don’t get to see the data being kept on them, and that computers lack the human relationships needed to gather context about their lives. Bad grades one semester might not predict academic ability if, for example, a student was failing because of a problem outside of school.
The students invited to speak at EduCon had been discussing the use of data in education in class and doing their own research. That included guidance from Common Sense Media’s Bill Fitzgerald, and from Chris Gilliard and Hugh Culik, professors at Macomb Community College. Gilliard and Culik recently wrote about the drawbacks to predictive software in education.
“At the community college where we teach – as at many community colleges nationwide – where digital resources are scarce and the students and faculty are embedded in working class realities, digital redlining imposes losses that directly limit the futures of our students,” they wrote. “But the issue isn’t confined to community colleges. Its pervasive role in educational technologies needs to be recognized and integrated into the judgments we make about how edtech can categorize students and limit their choices.”
The Macomb students’ education has made them better consumers of all things digital. One student panelist, Orion Jacobs, for instance, said he actually reads the terms of service offered by digital programs – a radical shift from when he would blindly “just click accept.” Students say they are armed with information they can use to ask questions and to advocate for themselves.
“I am thinking outside the box of the system,” said Kelsey Larson, another student panelist.