The DNAinfo archives brought to you by WNYC.

City Releases Controversial Teacher Evaluation Data

By Jill Colvin | February 24, 2012 1:46pm
Schools Chancellor Dennis Walcott briefed reporters on the data Friday. (DNAinfo/Jill Colvin)

MANHATTAN — The Department of Education released controversial evaluation data Friday for thousands of teachers that includes not only achievement scores, but also teachers’ names.

The release, which comes after years of litigation, is unprecedented, giving parents access — for the first time — to the best available predictor of how individual teachers will affect their children's performance on state math and reading exams.

The data reveals, for instance, that 521 middle school teachers across the city consistently scored in the bottom 5 percent over the five-year period ending in 2009-2010, while 696 teachers consistently ranked in the top category, outperforming 95 percent of their colleagues over the same period. Teachers who did not teach all five years were scored only for the years within that window when they did teach.

Schools Chancellor Dennis Walcott said he has mixed feelings about the data, which was released to reporters Friday. (DNAinfo/Jill Colvin)

Taken as a whole, the data has the potential to reveal important patterns, like how teacher effectiveness varies from the best to worst schools, and how the city chooses to reassign individual teachers once a school has closed, proponents say.

But experts, teachers and officials warn there are serious caveats to interpreting the data, including disturbingly large margins of error. The data ranks 18,000 middle school teachers based on how well their students performed on state math and English tests.

DOE officials cautioned parents Friday to remember that the data, which the courts ordered released following a Freedom of Information Law request from media organizations, was never intended to be made public and was never supposed to provide a full accounting of teachers’ skills.

Schools Chancellor Dennis Walcott told reporters Friday that he remains torn about the data being made public with names attached.

“I just don’t want our teachers put in a position where they have a small piece of outdated information characterizing their performance,” Walcott told reporters Friday at a briefing on the data, which he nonetheless believes is a “rich tool” for principals and teachers trying to identify strengths and weaknesses.

Because sample sizes were so small, the average margin of error in the ratings was 53 percentile points for an English teacher and 35 for a math teacher. That means a teacher with a reported score of 50 may actually have scored anywhere between 32 and 68 points — both ends considered “average,” but still a very wide range, DOE officials said.

The margins also vary depending on how many students an individual teacher taught. Teachers with fewer students could have margins of error of up to 75 points for math and a whopping 87 points for English.
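The arithmetic behind those ranges is straightforward to sketch. Here is a minimal illustration in Python — the function name and the clamping to the 0-100 percentile scale are my own framing, not the DOE's computation; the numbers follow the article's example:

```python
# Illustrative only: shows how a reported percentile score plus a total
# margin of error (in percentile points) translates into a range of
# plausible rankings. This is not the DOE's actual methodology.

def score_range(score, margin):
    """Return the plausible percentile range for a reported score,
    given a total margin of error in percentile points."""
    half = margin / 2
    low = max(0, score - half)      # percentiles cannot go below 0
    high = min(100, score + half)   # or above 100
    return low, high

# A teacher reported at the 50th percentile, with the 35-point average
# margin cited for math, could plausibly rank anywhere from roughly
# the 32nd to the 68th percentile.
print(score_range(50, 35))  # (32.5, 67.5)
```

For a teacher with few students and an 87-point margin, the same calculation yields a range spanning nearly the entire percentile scale — which is why officials warned against reading any single score literally.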

Teachers were given the opportunity last winter to report errors in their 2008-2009 and 2009-2010 data to the DOE. About 11,000 teachers — or 37 percent — chose to participate, and their reports flagged, on average, one incorrect student each. About 3 percent of the reports they vetted were for courses they did not teach.

Officials also cautioned that the data was never intended to be used in isolation.

They noted, for instance, that last year, of the 133 teachers up for tenure who had ranked in the lowest 5 percent, more than one in three — 36 percent — were awarded tenure because they had proved themselves in other areas.

“No principal would ever make a decision based on this score alone and we would never advise anyone — parent, reporter, principal, teacher — to draw a conclusion based on this score alone,” Senior Deputy Schools Chancellor Shael Polakow-Suransky said Friday.

The pilot “Teacher Data Reports,” compiled for the 2007-2008, 2008-2009 and 2009-2010 school years, rate teachers using a “value-added formula,” which compares students’ state test scores before and after they are taught by a particular teacher.

It shoehorns them into five categories: “Low,” “Below Average,” “Average,” “Above Average” or “High.”

The complicated algorithm is intended to capture the extent to which teachers have contributed to their students' success during a given school year. It's adjusted for factors such as how many special needs students are in a class and how many years a teacher has taught.
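In rough outline, a value-added calculation compares each student's actual score against a score predicted from prior results, and credits (or debits) the teacher with the average difference. The sketch below, in Python, is a deliberately stripped-down illustration — the prediction step is a bare placeholder, and the DOE's actual formula involves many more adjustments:

```python
# Simplified sketch of a value-added calculation. The real DOE formula is
# far more elaborate (it adjusts for class composition, special needs
# students, teacher experience, and more); this shows only the basic
# before/after comparison at the heart of the approach.

def value_added(prior_scores, actual_scores, predict):
    """Average difference between students' actual scores and the
    scores predicted from their prior-year results."""
    diffs = [actual - predict(prior)
             for prior, actual in zip(prior_scores, actual_scores)]
    return sum(diffs) / len(diffs)

# Hypothetical prediction rule: assume each student would simply
# repeat last year's score.
identity_prediction = lambda prior: prior

# Three hypothetical students improved by 5, 0, and 10 points.
gain = value_added([60, 70, 55], [65, 70, 65], identity_prediction)
print(gain)  # 5.0
```

Teachers' raw value-added results are then ranked against one another to produce the percentile scores — which is also why small classes produce such large margins of error: an average over a handful of students swings easily on a few unusual results.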

A similar model will be used to determine 20 percent of a teacher’s rating under the new city-wide evaluation system that is currently in the works.

The United Federation of Teachers has blasted the city for releasing the data, and began running ads in local papers Friday criticizing what it described as “inaccurate and misleading information” intended to tear teachers down.

“This is no way to rate a teacher,” it says, over an image of the complex mathematical formula used by the DOE to compute teachers’ scores.

“The Department of Education should be ashamed of itself,” UFT President Michael Mulgrew said in a statement, charging that the data combines “bad tests, a flawed formula and incorrect data to mislead tens of thousands of parents about their children’s teachers.”