The Hechinger Report is a national nonprofit newsroom that reports on one topic: education.


Education theories come and go. Experts seem to advocate for polar opposites, from student discovery to direct teacher instruction, from typing to cursive handwriting, and from memorizing times tables to using calculators. Who can blame a school system for not knowing what works?

One big problem is that education scholars don’t bother to replicate each other’s studies. And you can’t figure out which teaching methods are most effective unless a method is reproduced in more than one setting with the same results. A new study, “Facts Are More Important Than Novelty: Replication in the Education Sciences,” published August 14, 2014, in Educational Researcher, found that education researchers have attempted to replicate other researchers’ results only a scant 0.13 percent of the time. Compare that with the field of psychology, where the replication rate is 1.07 percent, roughly eight times the rate in education. By contrast, replications within the field of medicine are commonplace and expected.

“When we teach science, we teach students that it’s important for other people to get the same findings as you,” said Matthew C. Makel of Duke University, one of the study’s co-authors. “Replication is a key part of the error-finding process. In education, if our findings cannot be replicated, we lose a lot of credibility with the scientific world and the greater public.”

“Error — or limited generalizability — won’t be found if no one looks,” he added. “And our findings show that, for the most part, in education research, we aren’t looking.”

Makel and his co-author Jonathan Plucker of the University of Connecticut conducted a text search through the entire publication history of the top 100 education journals and found that only 221 out of more than 165,000 scholarly articles were replication studies, in which researchers tried to reproduce the results of earlier studies. (The 221 includes both exact replications and approximate ones in which the experiments were tweaked a bit, say, to see if the intervention would work with a different type of student.)
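The arithmetic behind the study’s headline figures checks out, as a quick sketch shows (the study says “more than 165,000” articles; the round 165,000 is used here as an approximation):

```python
# Counts reported from the Makel & Plucker study.
education_replications = 221
education_articles = 165_000  # approximation of "more than 165,000"

# Education's replication rate as a percentage.
education_rate = education_replications / education_articles * 100
print(f"Education replication rate: {education_rate:.2f}%")  # ~0.13%

# Psychology's reported rate, for comparison.
psychology_rate = 1.07
ratio = psychology_rate / education_rate
print(f"Psychology replicates about {ratio:.0f}x as often")  # ~8x
```

The ratio comes out just under 8, consistent with the article’s “eight times” comparison.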

You’d think in education, where best practices could actually help millions of children, there would be a priority on reproducing results. So why so little replication?

Part of it is unique to education. In psychology, for example, you can reproduce results fairly easily using another group of 25 undergraduates in a laboratory or clinic setting. In education, it’s far more complicated to find similar groups of students in similar school settings. Often poverty levels and racial makeups vary. And no two teachers are the same: each invariably puts his or her own spin on the teaching method being tested. Many parents and school leaders are understandably reluctant to experiment on children at all.

The culture of the Ivory Tower is also to blame. Professors live by a “publish or perish” mentality. Their tenure, prestige and research funding are often based on how many articles they can get published in leading journals. And the editors and reviewers of these journals (staffed by fellow university professors) have a bias toward the new and the novel.

I talked with Steve Graham, an Arizona State University professor of education, who has edited five education journals. He says he gets 600 submissions a year for Educational Psychology alone, and thus has “the luxury to be very choosy.” He says he doesn’t publish replication studies “unless they cover new ground” (a sort of contradiction in terms). “We want studies that have significant new impact,” he said. “There’s a bias built in. I’m not saying it’s a good thing. It’s a problem. I recognize it.”

The problem affects Graham directly because his own research involves meta-analyses of how to teach writing — that is, he synthesizes other researchers’ papers on effective writing instruction to figure out what works. The lack of replications makes his work difficult. “I got a lot of noise in my meta-analyses,” he explained.

Graham suspects that if there were more research funds earmarked for replications, more academics would apply for them and conduct replication studies. (Foundations out there: hark, there’s a new way to fix education!) The American Psychological Association, also worried about the dearth of replication in its field, is looking to launch a new journal exclusively devoted to publishing replication studies. Perhaps education can create one too.

To be sure, Makel’s and Plucker’s word-search methodology — they looked only for variants of the word “replicate” — may exaggerate the lack of replication in education research. Neil Seftor, an economist at Mathematica, runs the What Works Clearinghouse for the Department of Education. He specifically examines what the majority of scientific studies say about the best way to teach, or about a particular curriculum or textbook. He admits that exact replications are rare, but says he wouldn’t be interested in exact replications such as those in a laboratory setting. “What you want in education is evidence over a variety of settings in the real world — urban areas, special ed,” Seftor said. When he searches for studies on new interventions, he said, he often finds dozens of papers on each one, but they might not have the word “replication” anywhere in their text.

“I don’t think it would be fair to say that there are all these educational approaches out there and they’ve only been studied one time,” Seftor said. (Admittedly, many of the studies Seftor looks at are unpublished and financed by the developer of the curriculum.)

Seftor, of course, would welcome more scientific studies on education theories. Often, when he is developing practice guides for teachers, the teaching methods recommended by experts don’t have much scientific evidence to support them.

In the meantime, the American Educational Research Association (AERA), which publishes a number of top education journals including Educational Researcher, decided last year (2013) to publish AERA Open as a new open-access research journal, and is specifically encouraging the publication of peer-reviewed replication studies in it. It just began accepting submissions on Sept. 15.

Maybe, once we see more replication studies in print, we’ll be able to judge whether they can filter down to the classroom and improve instruction.

