The Hechinger Report is a national nonprofit newsroom that reports on one topic: education.

New Jersey’s first effort to correlate student test scores to individual teachers quietly began this month, as nearly a dozen districts were sent data that does exactly that.

This story also appeared in New Jersey Spotlight.

The new data, which uses a measure called “student growth percentiles” (SGP), was mailed early this month to the first 11 districts involved in the state’s teacher evaluation pilot — from large ones like Newark and Elizabeth to smaller ones like Bergenfield and Alexandria.

The measure compares a student’s progress against that of comparable children statewide, and pairs the median growth of a teacher’s students — or the lack of it — with that specific teacher. The progress reports cover only teachers working in math and language arts in grades 3 through 8, the grades tested by the state’s annual NJASK (New Jersey Assessment of Skills and Knowledge).

Assistant Commissioner Peter Shulman said in his letter to districts that the initial data is meant as a first foray into the information that will be central to the state’s evaluation system.

“The purpose of this data distribution is to help pilot districts and the Department to learn more about this student achievement measure as part of the piloting process,” he wrote.

The data is almost surely the most controversial component of the evaluations. And this release was just the start of a torrent of similar numbers to be provided to districts over the next year, using SGP to assess not just teachers but also schools and entire districts.

The stakes will be especially high for teachers. Evaluation systems will be mandated for every district by next fall, although student performance will not necessarily be part of a teacher’s grade from the start.

Student test scores are only one of several measures to be used to grade teachers, and then only for those whose students take the state’s tests. Shulman repeatedly stressed that the pilots are a work in progress, a template to help determine the state’s guidelines on which factors are to be used. Those much-anticipated regulations are expected this spring.

“The purpose of our current pilot is to collaboratively develop more meaningful evaluation systems that will help all educators continuously improve their practice,” Shulman wrote to the pilot districts. “As a central … tenet, we believe that educators should never be evaluated on a single factor or test score alone, but on multiple measures of both effective practice and student learning outcomes.”

The early reaction from a handful of local administrators contacted last week was generally positive.

By and large, the administrators said the data looked consistent with their own measures of individual teacher performance, as observed in classrooms and schools.

“Our early data is pretty positive in connecting our own feedback on teachers with their student outcomes,” said Rachel Goldberg, Elizabeth’s director of staff development, whose district received the data on 278 of its more than 2,000 certificated staff.

“It was pretty consistent,” she said. “It has been a definite reinforcement in terms of the feedback we were already giving to teachers.”

John Mazzei, director of human resources in Pemberton Township, said he also was encouraged by the early results for about 300 of his teachers, out of an overall staff of 800.

“They are correlating fairly closely,” he said. “There looks like a pretty good area of reliability between scores and practice.”

Still, Mazzei was among those cautioning against drawing too many conclusions this early in the game, so much so that the administrators had yet to share results with individual teachers.

He and others said there is always a question of data integrity in any first run, including the district’s own information on teachers. And the scores, after all, cover only a small sample of teachers and a single year, they said.

“We’re not ready” to share data with the teachers, Mazzei said. “We want to make sure everything is correct.”

Nevertheless, the data has introduced some new qualms into the debate over teacher quality that has roiled state education circles for the better part of the past three years, culminating in the passage of the new tenure law last summer.

Advocates and others said while they were still exploring the data themselves, they were already getting nervous about its effect on teachers.

“There is no doubt that this pits teacher against teacher and will have a negative impact on collaboration,” said Rosemary Knab, associate director of the New Jersey Education Association, the teachers union.

“Teachers will be blamed for the poor performance of students on standardized tests, not the dozens of other reasons that impact test performance,” she said.

This story appears courtesy of the New Jersey Spotlight.

