Parents hoping to get their kids into what they consider the best Los Angeles elementary school enter a lottery and pray, cross their fingers and do whatever else they can to improve their odds. But, in terms of gains in student achievement, some of the district’s most popular schools are not the most effective. That is one of the insights from a new value-added analysis by the Los Angeles Times.
The Times’ new article examines the relative effectiveness of Los Angeles elementary schools, based on seven years of individual student performance data.
The “value-added” data analysis, partly supported by a grant from The Hechinger Report, was based on “test scores in Grade 2 through 5 at 450 of Los Angeles’ approximately 500 elementary schools. It substantially changes the picture of which schools are succeeding and which are not,” the paper reported.
“The approach generally doesn’t penalize schools for things beyond their control — students’ poverty, English-language ability, previous achievement or other factors commonly used to explain schools’ success or failure. That’s because each student’s progress is measured against his or her own past performance, not that of other children.
“Value-added has many critics who consider it unreliable and a narrow gauge of performance. It looks, in this instance, only at math and English scores, and it ignores many other factors that parents consider when choosing a school. Most of the controversy over value-added, however, has centered on whether it should be used to assess individual teachers, not schools.
“Last Sunday, The Times published findings from a value-added analysis of more than 6,000 teachers in L.A. Unified, which noted that it matters much more which teacher a child gets than which school he or she attends. But parents don’t usually pick a school for a single teacher; this analysis points to schools where teachers overall tend to be more successful at raising scores year after year.
“Troubled by the exclusive focus on achievement under the federal No Child Left Behind law, the Obama administration has made analysis of student progress a priority for both teachers and schools. Several states, including California, are moving in that direction.
“‘I’m much less interested in absolute test scores and more interested in how kids are improving,’ U.S. Education Secretary Arne Duncan told The Times last week.
“The results of such a shift are sure to be surprising.”
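The core idea the Times describes — measuring each student against his or her own past performance rather than against other children — can be illustrated with a deliberately simplified sketch. The school names and scores below are invented, and the prediction rule (expecting each student to match last year’s score) stands in for the regression models a real value-added analysis would fit:

```python
def value_added(students):
    """Toy value-added calculation.

    students: list of (school, prior_score, current_score) tuples.
    Each student's current score is compared with a naive prediction
    (expected score equals prior score -- an assumption made here for
    simplicity; real models fit statistical controls), and the
    residuals (actual minus expected) are averaged per school.
    """
    totals = {}
    for school, prior, current in students:
        gain = current - prior  # residual versus the naive prediction
        total, n = totals.get(school, (0.0, 0))
        totals[school] = (total + gain, n + 1)
    return {school: total / n for school, (total, n) in totals.items()}

# Hypothetical data: a school whose students start low but gain a lot
# outranks one with high absolute scores and flat growth.
scores = [
    ("Elm St.", 520, 560),   # +40
    ("Elm St.", 480, 510),   # +30
    ("Oak Ave.", 720, 715),  # -5
    ("Oak Ave.", 700, 705),  # +5
]
print(value_added(scores))  # Elm St. averages +35; Oak Ave. averages 0
```

This is why, as the article notes, such rankings “substantially change the picture”: a school can post modest absolute scores yet rank highly on growth, and vice versa.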
Last week’s story generated strong reactions nationally and internationally from editorial pages, bloggers, advocates, teachers, union leaders and politicians. Secretary Duncan said he supported the use of such analyses. But few places use value-added analysis in evaluating teachers or schools.
In California, schools are held accountable for their Academic Performance Index scores.
The Times reports: “In elementary and middle school, the 1,000-point index is based entirely on how high students score on the state’s annual tests, given in Grades 2 through 12. According to state data, 81% of the differences among schools reflect socioeconomic factors such as poverty and parents’ education.
“The benefit of the API is that it reflects state standards, helping to maintain clear and common goals for all schools. California third-graders, for instance, are expected to be able to add and subtract simple fractions.
“But even those who designed the API more than a decade ago say it was never meant to be used alone. They recommended measuring student progress as soon as possible.
“‘The superiority of looking at student growth was recognized from the very beginning,’ said Ed Haertel, a Stanford professor and testing expert who helped develop the API for the state. ‘It’s much more sensitive and accurate than the current system.’”
As a result of last week’s analysis of teacher effectiveness, the Los Angeles school district announced it would try to introduce value-added data into teacher evaluations.
The leader of the United Teachers Los Angeles union said he’d consider agreeing to such a plan.
Later this month, The Times plans to publish a database showing the relative effectiveness of the 6,000 teachers whose performance was analyzed.