Saturday, January 04, 2014

‘Small typo’ casts big doubt on teacher evaluations

This from Politico:
A single missing suffix among thousands of lines of programming code led a public school teacher in Washington, D.C., to be erroneously fired for incompetence, three teachers to miss out on $15,000 bonuses and 40 others to receive inaccurate job evaluations.
The miscalculation has raised alarms about the increasing reliance nationwide on complex “value-added” formulas that use student test scores to attempt to quantify precisely how much value teachers have added to their students’ academic performance. Those value-added metrics often carry high stakes: Teachers’ employment, pay and even their professional licenses can depend on them. 
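
Politico doesn't describe Mathematica's actual model, which is proprietary, but the basic idea behind a value-added score can be shown with a toy example: predict each student's score from prior performance, then credit the teacher with the average gap between actual and predicted results. Here is a minimal sketch in Python, with every name and number invented for illustration:

```python
# Toy value-added calculation -- NOT the D.C. model, which is proprietary.
# Idea: predict each student's current score from last year's score, then
# credit the teacher with the average gap between actual and predicted.

def predict(prior_score, slope=0.9, intercept=5.0):
    """Hypothetical regression predicting this year's score from last year's."""
    return slope * prior_score + intercept

def value_added(students):
    """students: list of (prior_score, actual_score) pairs for one teacher."""
    residuals = [actual - predict(prior) for prior, actual in students]
    return sum(residuals) / len(residuals)

# Two hypothetical rosters: one beats the predictions, one falls short.
print(value_added([(70, 75), (80, 83), (60, 64)]))   # 6.0 -> above expectation
print(value_added([(70, 66), (80, 76), (60, 58)]))   # about -1.3 -> below expectation
```

Real models layer on controls for demographics, class composition and measurement error, which is where the complexity, and the room for a stray typo, comes in.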
The Obama administration has used financial and policy levers, including Race to the Top grants and No Child Left Behind waivers, to nudge more states to rate teachers in part based on value-added formulas or other measures of student achievement. Education Secretary Arne Duncan has credited D.C.’s strong recent gains on national standardized tests in part to the district’s tough teacher evaluation policy, which was launched by former Chancellor Michelle Rhee.

But teachers have complained that the results fluctuate wildly from year to year — and can be affected by human error, like the missing suffix in the programming code for D.C. schools.

“You can’t simply take a bunch of data, apply an algorithm and use whatever pops out of a black box to judge teachers, students and our schools,” Randi Weingarten, president of the American Federation of Teachers, said this week. The AFT and its affiliates have signed off on contracts that use value-added measures as a significant portion of teacher evaluations — including in D.C. — but Weingarten called the trend “very troubling” nonetheless.

The problem in D.C. stemmed from “a very small typo” inserted into complex programming code during an upgrade earlier this year, said Barbara Devaney, chief operating officer of Mathematica Policy Research, the private firm that holds the contract to calculate value-added scores for the district.
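
The article doesn't say what the suffix was or what language the code was written in, so the following is purely a hypothetical illustration of how one missing suffix can change results without triggering any error:

```python
# Hypothetical sketch of how one missing suffix slips past review: suppose
# a results table holds raw and demographically adjusted growth scores in
# columns "growth" and "growth_adj". Dropping the "_adj" suffix silently
# feeds the wrong column into the ratings -- the code still runs cleanly.

records = [
    {"teacher": "t01", "growth": 3.1, "growth_adj": 2.8},
    {"teacher": "t02", "growth": 1.9, "growth_adj": 2.3},
]

column = "growth"      # BUG: intended "growth_adj"; the suffix is missing
ratings = {r["teacher"]: r[column] for r in records}

print(ratings)         # {'t01': 3.1, 't02': 1.9} -- subtly wrong, no crash
```

Because nothing crashes and the numbers look plausible, even a line-by-line review can miss a bug like this.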

Devaney said the firm employs stringent quality control, which in this case included 40 hours of meetings to review the updated model and an analysis by independent programmers paid to comb through the code line by line. Yet no one noticed the missing suffix until yet another routine quality review took place this November — after the district had already distributed bonuses, layoff notices and evaluation scores based on the value-added data for the 2012-13 school year, Devaney said.

Jason Kamras, chief of human capital for the district, said Mathematica had certified that its results were accurate and had passed its quality control inspection before the district acted on the scores. When the error was belatedly discovered, the firm immediately recalculated those scores.

The recalculations produced “very small differences” in individual teachers’ scores, Devaney said. “But small differences can sometimes have big implications,” she added.
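
The reason small shifts carry such weight is that ratings are assigned by fixed cutoffs, so a fractional change in a recalculated score can push a teacher into a different category. The cutoff values below are invented for illustration; the article does not give D.C.'s actual bands:

```python
# Ratings are assigned by fixed cutoffs, so a fractional change in a
# recalculated score can flip the category. Cutoffs below are invented
# for illustration; the article does not give D.C.'s actual bands.

def rating(score):
    if score < 175:
        return "ineffective"          # grounds for dismissal
    elif score < 250:
        return "minimally effective"
    elif score < 350:
        return "effective"
    return "highly effective"         # qualifies for a bonus

print(rating(174.6))   # ineffective
print(rating(175.2))   # minimally effective -- a 0.6-point swing saves a job
```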

Mathematica’s other clients use different programming code, Devaney said, so the error should not affect other districts’ teacher ratings.

In all, the error affected 44 teachers in D.C. — about 10 percent of those who receive value-added scores based on their students’ standardized tests. Half were rated higher than they should have been and half were rated lower, Kamras said.

The teacher who was fired for an “ineffective” rating in fact should have been rated “minimally effective,” Kamras said. Three other teachers who scored “effective” were in fact “highly effective” on the district’s scale and deserved bonuses of $15,000 apiece.

Kamras said the district has already contacted the mistakenly fired teacher with a job offer and a promise of salary payments retroactive to the start of the school year. He said the bonuses for the three highly effective teachers will be distributed immediately.

Kamras said he didn’t know if any of the teachers whose ratings were inflated by the Mathematica error received bonuses they didn’t deserve. Even if they did, he said, the district will not ask them to repay the money. No one’s evaluation will be lowered as a result of the new calculations, he said.

2 comments:

Anonymous said...

Oh come on, we all know that the best way to determine if someone is a good teacher is by some algorithm programmed into a computer that quantitatively determines whether a human is doing their job well. Better yet, we should use that program as the basis for creating a robot teacher to teach kids. Then all the unemployed teachers with their college degrees can draw endless unemployment and qualify for free Obamacare. We won't have to pay robots, and they will never make mistakes because they are programmed with ideal qualities.

Anonymous said...

I personally would rather have a robot than some of the teachers that we have teaching our children. By the way, it seems all they care about is testing.

Why not just go back to actually teaching these children, then test them at the end of the year to see whether you did your job? They spend half the year preparing for different tests and miss half of the educational process.