

Controversial Teacher Evaluation Data Riddled With Problems, Experts Warn

By Jill Colvin | February 23, 2012 11:10am
Critics say the data set to be released by the city's Department of Education has a disturbing degree of error. (Getty Images)

MANHATTAN — Teachers across the city are bracing for the release of new evaluation data many say is deeply flawed and threatens to shame them publicly — and they're not the only ones with concerns.

Everyone from Schools Chancellor Dennis Walcott to those who helped the city create the "Teacher Data Reports" has spoken out about the results, which could be released by the Department of Education as soon as Thursday.

The data ranks about 18,000 middle school teachers based on how well their students performed on state math and English tests — but it's saddled with a margin of error of up to 66 percentile points.

"Everyone's pretty aware of how faulty it is,” said eighth-grade math teacher Jose Vilson, 30, who teaches at a school in Washington Heights and expects his name and test data to be included in the release.

The statistical model used by the Department of Education to rate teachers based on student test scores. (UFT)

Vilson said he ranked "average" in the five-tiered scoring range, which he said isn't a fair representation of the work he does with students who include English language learners and children facing disabilities or serious problems at home.

He said the shortfalls of the ranking system — which shoehorns teachers into five categories, "Low," "Below Average," "Average," "Above Average" or "High" — are distressing for colleagues who are scared about their names being linked to poor test scores they say are not a fair representation of their skills.

In addition, many school advocates say they're not convinced by arguments from proponents of the scoring system that the formula takes school resources and student limitations into account.

“A lot of [teachers] feel like it’s completely unfair,” Vilson said. “Where’s my name going to end up? How are people going to interpret what they see? Is it going to be a reflection of my practice? What does it all mean?”

The pilot "Teacher Data Reports," compiled from 2007-2008 through 2009-2010, rate teachers using a so-called "value-added formula," which compares students' state test scores before and after they are taught by a particular teacher.

The complicated algorithm tries to capture the extent to which a teacher has contributed to students' success during a given school year, adjusted for factors such as the teacher's years of experience. The push to measure teachers based on test scores has become a controversial and growing movement across the nation.
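As a rough illustration only — not the DOE's actual formula, which adjusts for many more factors and has not been published here in full — a value-added score can be sketched as the gap between students' actual test results and the results a statistical model predicted for them. The Python snippet below uses invented student scores and an assumed flat 12-point predicted gain to show the basic idea.

```python
import numpy as np

# Illustrative only: hypothetical scores for four students, before and
# after a year with the teacher. These numbers are invented.
pre = np.array([640.0, 655.0, 610.0, 700.0])
actual_post = np.array([660.0, 662.0, 640.0, 705.0])

# Assumption for this sketch: a district-wide model predicts every
# student will gain 12 points. The real DOE model conditions on many
# more factors (prior performance, demographics, class size, etc.).
predicted_post = pre + 12.0

# The teacher's "value added" is how much better (or worse) students
# did than predicted, averaged over the class.
value_added = (actual_post - predicted_post).mean()
print(f"Value added: {value_added:+.1f} points")  # prints "+3.5 points"
```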

But critics — including those who helped in its development — say the measure is deeply flawed and question the value of its release.

"I think it's a terrible idea," said Sean Corcoran, an associate professor at New York University who has studied the city's value-added system and is deeply critical of the approach, which he said is far from an objective measure of job performance.

Corcoran analyzed the test data and found what he called an "astronomical" rate of error. In math, scores had a 61-point margin of error for a single year, and a 34-point margin when scores were averaged over four years. In English language arts, the numbers were even worse, with a margin of error of 44 points for a four-year average and a whopping 66 points for a single year.

"That means if you have one teacher that’s at the 50th percentile and another that’s at the 75th percentile you really can’t statistically distinguish between the two," he said.

"That’s a huge grain of salt that these numbers need to be taken with."

Douglas Staiger, an economics professor at Dartmouth College who developed the pilot program that was the precursor to the data about to be released, said the value-added measures were never intended to be viewed publicly.

While he defended the measure as the best available for predicting teachers' ability to help students master tests, he said the scores are never a perfect measure of how teachers may perform, even in the long run, and cautioned parents to be skeptical about what they might read.

The tests, he noted, fail to capture facets of teacher performance that parents care about beyond test preparation, like how well teachers foster a love of learning, whether they encourage kids to work hard, or whether they teach dimensions of learning that aren’t covered on state tests.

"People should think of this as just one piece of information," he said. "It is not perfect. It has lots of problems. But it’s useful."

Others have noted the numbers are based solely on old state tests, which both the state and city education departments have criticized as flawed.

The fundamental concerns about the data's reliability have prompted many advocates, teachers and parents to urge the media not to publish the results; a change.org petition making that request has gathered more than 500 signatures.

Walcott advocated caution in reading the numbers during an appearance on NY1 Wednesday evening.

"I've been very open around my conflict and my conundrum around the release of the data about employees,” Walcott said. "On one hand, I definitely have talked about not being transparent…but on the other hand, I’m very conscious of the potential impact on the employees."

"I don’t want to have our employees denigrated by the press," he added, cautioning that the data "is just one snapshot of how a teacher is doing."

The teachers’ union has also cited several egregious cases in which reports mistakenly placed students in the wrong classes and credited teachers with classes they'd never taught.

"This is a ridiculous way to evaluate teachers, and the city should be ashamed that it's releasing information that belongs in a personnel file," New York University professor Diane Ravitch said, slamming the data as "inaccurate and useless."

But others, including Robert Freeman, executive director of the Department of State's Committee on Open Government, believe the information should be made public, no matter what, since parents are entitled to know how effectively teachers are instructing their kids.

Freeman argued that public employees have never been granted the same privacy rights as private workers and have always been subject to public performance evaluations. Even if there are certain concerns about the data, he said, that doesn't mean the information shouldn't be released. 

"What could be more important to a parent of a child than learning how well or poorly a child’s teacher performs in the classroom?" he asked.

"This kind of information goes to the heart, it seems to me, of accountability."

But that's not enough to satisfy teachers like Dennis Gault, a special education teacher at P.S. 19 and a member of Community Board 1, who said he found the release infuriating after all the mistakes he's heard his colleagues describe.

He said teachers also feel violated by the DOE when it turns over data they'd been told would be collected for purely internal use.

"It wouldn’t be really a concern or problem for me if I knew the information was accurate or fair," he said, and urged news organizations not to publish the data.

John Elfrank-Dana, a social studies and history teacher at Murry Bergtraum High School, went further, accusing the DOE of "reneg[ing] on its promise of confidentiality" by releasing the results of what was supposed to be an experimental pilot program.

"How many times do we have to be stabbed in the back?" he asked, accusing the DOE of trying to "discredit" teachers by publicly shaming them.

"I’m aghast."