Ah, spring approaching: warm weather, birds chirping, and a new round of standardized tests tied to the Common Core. Since they were first rolled out three years ago, the tests have been hugely controversial. Parent-led “opt-out” movements have disrupted school districts, with as many as 80 percent of students in some districts sitting out the test, and some teacher associations have supported them.
But a new study claims that all the trouble might be worth it and that the tests are a good metric by which to measure Common Core standards. A second study cautioned, however, that there is benefit to the testing only if the pass scores are set higher.
Over the past five years, the Common Core standards, a set of educational guidelines for what students should know in English and math, have been adopted in 44 states plus the District of Columbia. To evaluate how well students are learning the new material, states have turned to new standardized tests.
Most states using the Common Core standards (or a version of them) signed up for one of two tests that were developed with the support of federal funding: Smarter Balanced Assessment Consortium and PARCC, or Partnership for Assessment of Readiness for College and Careers. Several states have dropped out to create their own tests, but this spring, there are still 17 states giving Smarter Balanced tests and nine using PARCC. A smaller contingent of three states has adopted an alternative, Act Aspire.
While the tests have elicited a lot of attention, the comprehensive study released last month by the Thomas B. Fordham Institute, an advocate for the Common Core standards,* actually looked under the hood and evaluated whether the tests do indeed test the content of the standards.
Morgan Polikoff, assistant professor of education at the University of Southern California, who coordinated the study together with educational consultant Nancy Doorey, said the imperative to do so was strong: “It’s well known that teachers are influenced by the tests and tailor their teaching to assessments. The tests not only drive instruction; often, they are the real standards.”
Over the past 22 months, the researchers examined the actual tests to see if they specifically targeted a main focus of the English language arts standards – close reading of a text and writing an essay using evidence from it. They found that PARCC and Smarter Balanced did indeed test for the new standard. Act Aspire did so to a lesser degree. They also included a state-designed test, Massachusetts’s MCAS, for comparison, and found the exam barely covered the new standard.
States that have adopted one of the new tests that passed the content bar shouldn’t necessarily rest easy, however. A second study out this week from the American Institutes for Research (AIR), conducted by its vice president Gary Phillips, says that it’s not only the quality of the tests that matters: “The main point of Common Core tests is to give information about whether a student is college and career ready. What matters there is not the test, but the cut score.”
In a four-month-long study, he examined data from about a million students who took math and English tests in grades 4 and 8 produced by Smarter Balanced, PARCC, and Act Aspire. He found that the different test groups had different proficiency levels, and that all three often had lower levels than what is considered “college and career ready” by the National Assessment of Educational Progress (NAEP), better known as the Nation’s Report Card.
PARCC was the only testing group that had a cut score comparable in difficulty to NAEP proficient in both grades—but only in math. Overall, Phillips discovered that PARCC tests were the most rigorous of the three. (Results from tests last spring showed students performed better on Smarter Balanced than they did on the NAEP exam, but some researchers have suggested NAEP is outdated and isn’t a good measure of whether students have mastered Common Core standards.)
“These differences undermine the goal of Common Core,” he said in a call, adding that he did find one state – Florida – that had college-ready standards as stringent as the NAEP. It is not affiliated with any of the three testing groups.
Both researchers said the goal of their studies was to provide state lawmakers with a detailed picture of the tests. “I worry about states making shortsighted decisions because of politics,” added Polikoff.
*Clarification: This story has been updated to clarify the Thomas B. Fordham Institute’s position on Common Core.