Computerized instruction offers the promise of a technological version of a personal tutor, giving instant feedback and tailoring lessons for each child’s needs. Yet even advocates of educational technology recognize the motivating power of a human teacher to encourage a demoralized student or clear up a point of confusion. To that end, software programmers have designed all sorts of data dashboards for classroom teachers. These are computer screens, akin to a dashboard in a pilot’s cockpit, that visually depict students’ progress as they learn online and flag areas of concern that a teacher can address. But the dashboards can sometimes look like an overwhelming array of color-coded bars, dots, lines and circles, which require a lot of eye movement and clicking to take them all in.
“There’s been a lot of time, effort and money expended by companies and by researchers on creating learning analytics dashboards for teachers,” said Ken Holstein, an assistant professor in the Human-Computer Interaction Institute at Carnegie Mellon University. “If only teachers could be more informed during class about things they can’t normally easily monitor, then students might learn better. But that hadn’t been established.”
It’s been difficult for the research community to prove that all of this student data actually improves teaching and helps students learn more. So Holstein, along with two colleagues, set up an unusual middle school experiment in the Pittsburgh area. He invented a pair of goggles that allowed teachers to see their student data dashboards without a tablet or a computer screen. Instead, just by scanning a classroom, a teacher could see a virtual reality dashboard in the air, with information hovering above children’s heads as they toiled on a computer. When a teacher clicked on a student’s indicator by making a tap gesture in midair, detailed data screens on academic progress were displayed for that student, visible only to the teacher on the inside of his or her smart glasses. The goggles also alerted a teacher to student misbehavior. If a student wasn’t working for a two-minute stretch, a “Zzz” would pop up above the student’s head. A student who was clicking on too many hints in rapid succession to solve problems without thinking generated a “hint abuse” alert. Additional flags indicated if a student was not clicking on enough hints or was struggling unproductively, repeatedly getting wrong answers.
During the experiment, more than 300 seventh and eighth graders in 18 classrooms spent a week using instructional math software that had performed well in previous studies. In some classes, the teachers were randomly assigned to use the smart glasses and, in others, they were told not to use them. At first, the results seemed to confirm the usefulness of student data. Students in classrooms where teachers wore the glasses improved more on a math test than students in classrooms where teachers didn’t wear the glasses. Low-achieving students especially benefited. The researchers hypothesized that the data alerted teachers to quiet students who needed extra help, not just the students who were raising their hands and demanding attention.
But there was a fascinating third situation. The researchers assigned some teachers to wear glasses with the data analytics turned off. There were no “deep dive” screens for teachers to click on and no behavior alerts for sleeping. All the glasses let a teacher do was peek at a student’s screen from afar, seeing only what the naked eye could see when physically hovering behind a student.
The students in classrooms with teachers wearing turned-off glasses learned a lot, too. Indeed, more than half the learning gains that the researchers documented from access to the data were achieved merely by the teachers wearing the functionally limited glasses.
“You need to account for the possibility that just holding a dashboard, or just wearing glasses in our case, might have pretty substantial effects,” said Holstein. “We included the condition just as a sanity test to make sure we’re not overattributing the power of learning analytics.”
Student psychology appears to be at play. Students were more diligent and less likely to avoid work or game the system when they saw their teacher wearing the glasses, regardless of whether the teacher was actually using them. The phenomenon is similar to what industrial researchers have found in the workplace: productivity increases when workers think their boss is monitoring them.
Holstein advises against using this study, published in the proceedings of the 2018 Artificial Intelligence in Education conference, as a reason for teachers and schools to increase monitoring of students. Holstein suspects the positive learning gains would dissipate over time as the novelty of the experiment and the teacher’s headgear fades. A longer study would be needed to see if the psychological trick persists.
But more importantly, Holstein thinks it would ultimately fail. “In terms of fostering a good classroom climate, I don’t think I would encourage the feeling of being policed by the teacher,” he said. “Although you might argue that the students already feel this, to some extent, these tools might amplify that.”
What Holstein wants is for the data itself, rather than the psychology of surveillance, to be what helps students. For him, the study was proof that at least some of the students’ learning gains could be attributed to teachers’ use of data analytics. “It was really a big positive result for us,” said Holstein.
Holstein had been planning to launch a large national study of his smart glasses in April, but that experiment was suddenly postponed when schools closed for the coronavirus pandemic. While Holstein waits for schools to reopen, he is designing data dashboards that can be embedded in internet browsers while teachers are teaching remotely.
This story about monitoring student progress was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education.