NEW YORK CITY — When the city announced the results of its gifted and talented exam earlier this month, Rares Benga’s 4-year-old son, Luca, had scored in the 99th percentile.
That put Luca in an elite group of just 1,363 New York City kids who got the best possible score on this year's test, giving them first pick of the city's most sought-after public gifted programs.
But a week later, the Department of Education uncovered a scoring error at testing company Pearson and announced that the number of kids scoring in the 99th percentile had swelled to more than 2,560. That vastly increased Luca's competition for a school spot.
Now, Benga and other parents are questioning the newly released scores, saying there are so many high-scoring kids that there must be another mistake at Pearson.
“The 99 percentile bracket is absurdly large,” said Benga, an Upper West Side dad who works in marketing analytics at a financial firm. “All I want is fairness.”
He fired off letters Monday to the City Council’s Education Committee and the DOE, calling for an independent commission to audit the scores and for the release of detailed data on the scoring methodology.
He said that releasing the data was the only way to ensure “credibility” in this year’s admissions process.
"Ninety-nine is meaningless the way they do it," said Benga, who hopes his son will win a spot at the ultra-competitive Anderson School, one of five citywide gifted programs. "The entire methodology is highly suspect.”
In the wake of the Pearson errors, many parents are questioning the validity of this year's record-high number of students qualifying for the city’s gifted and talented programs. Some are calling it Testing GATE — a clever play on the acronym for Gifted and Talented Exam.
The Department of Education changed this year’s G&T test in the hopes of making it more difficult to prepare for, after too many kids qualified for the limited number of seats in previous years. Yet the new, harder test resulted in even more children qualifying — a jump of nearly 33 percent in the qualification rate — once the DOE announced that Pearson had made scoring errors.
Overall, more than 11,700 children were deemed eligible out of 36,012 test takers — or 32.5 percent — versus last year’s 9,644 out of 39,353 — or 24.5 percent.
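Those figures can be sanity-checked with quick arithmetic (a sketch using the rounded numbers as reported; note that the "nearly 33 percent" spike matches the relative jump in the qualification rate, not the raw head count, which rose only about 21 percent):

```python
# Figures as reported in the article (rounded; "more than 11,700" etc.)
eligible_2013, takers_2013 = 11_700, 36_012  # this year
eligible_2012, takers_2012 = 9_644, 39_353   # last year

rate_2013 = eligible_2013 / takers_2013  # ~32.5 percent
rate_2012 = eligible_2012 / takers_2012  # ~24.5 percent

# Relative increase in the qualification rate: ~32.6 percent,
# i.e. the "nearly 33 percent spike" the article describes.
rate_spike = (rate_2013 - rate_2012) / rate_2012

# By contrast, the raw count of eligible students rose only ~21 percent.
count_spike = (eligible_2013 - eligible_2012) / eligible_2012

print(f"rates: {rate_2013:.1%} vs {rate_2012:.1%}")
print(f"rate spike: {rate_spike:.1%}, count spike: {count_spike:.1%}")
```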
The DOE found that an additional 2,700 students qualified for district seats, and that more than 2,000 others scored in the 97th percentile, making them eligible for the five elite citywide schools.
Only six students would have lost their eligibility because of the scoring error, DOE officials said. The department decided not to lower those children’s percentile ranks over Pearson’s mistakes, so they were allowed to keep their initial, higher scores, officials said.
Many parents on Internet forums across the city were outraged when they learned of the errors, especially those with kids in the 99th percentile, where competition for limited seats grew even fiercer.
A Park Slope lawyer whose daughter got a perfect score said she is even exploring legal options over the results; she asked to remain anonymous.
Another parent, a mathematician, began analyzing the scores and thought the big increase in the number of qualifying kids suggested some red flags.
Alexey Kuptsov, a professor at NYU’s Courant Institute of Mathematical Sciences whose 4-year-old daughter Sofia received a perfect score on the gifted test, scratched his head at the low number of students — just six — who would have been ineligible because of the scoring error.
He said it would have made more sense if there were either zero students who became ineligible or thousands, just as there were thousands who became eligible because of the mistake.
Without having access to the data, Kuptsov couldn’t know for sure what happened, so he wrote to Schools Chancellor Dennis Walcott asking for it.
“I believe that there is still a mistake in their calculation methodology,” he wrote last week. “Is it possible to initiate a check in the calculations by Pearson?”
Kuptsov, a 33-year-old Manhattan Beach resident, told DNAinfo.com New York he has not received any data despite requests.
He's concerned that even though his daughter Sofia scored 160 out of 160 on the nonverbal part of the test and 150 out of 150 on the verbal part, she will have a slim chance of getting a gifted seat since so many other children also did well.
“With Sofia, I feel she would be bored in a general education class,” he said. “With probability, I feel we won’t have a chance [at a citywide program]. I don’t mind a lottery, but I think the DOE should be consistent [about scoring].”
Pearson officials said they made three separate errors: the way kids’ ages were used to calculate scores, a mistake in the score-conversion tables and a mistake in the mathematical formula for combining the verbal and nonverbal portions of the test.
Kuptsov believes that when Pearson fixed its mistakes in calculating the New York City scores, the company did not fix similar mistakes in the calculation of national averages, which would affect the number of local kids considered high-scoring.
“What I suspect is that they were using exactly the same methodology [nationwide] and had this error in their system forever, but noticed the problem only when New York City parents came forward and challenged the results," Kuptsov said, adding that he could not be sure without seeing the data.
Even before Pearson’s errors were made public, the local group Parents for Fair Education was pushing for the DOE to use composite scores, so someone who got no questions wrong would be ranked above a child who got one question wrong rather than placed in a lottery with others in the 99th percentile.
“If the methodology is wrong,” said Benga, the Upper West Side parent, “then they should use the composite scores since it’s more likely those are correct.”
Michael McCurdy, co-founder of TestingMom.com, a test preparation website, also questioned the results and said parents were fuming.
"Basically one in three qualify," he said. "How could more kids qualify than last year? Even adults have to do double takes on the questions [because they're so difficult]. It doesn’t make sense.”
Neither Pearson nor the DOE responded to questions about the results.