
Indefatigable blogger Reality-Based Educator often pegs the margin of error of these numbers at 12-35%, while the UFT "...claims the average margin of error is plus or minus 27 points, or a spread of 54 points." Even the sample TDR the DOE provides shows a margin of error of +/- 25 points (although, in typical DOE doublespeak, the report calls it a "range," not a margin of error).
Now, if any of these margin-of-error figures is correct, or anywhere near correct, it's clear that the value-added numbers are garbage. The sample DOE report shows a teacher ranked at the 50th percentile whose true rank, given that margin, could be anywhere from about the 22nd percentile to the 72nd.
I'm no statistician, but I am a baseball fan, so I can understand and explain why these numbers stink. A baseball team ranked at the 50th percentile would be perfectly average. But a team that won 22% of its games would be one of the worst teams in major league history, while a team with a .720 winning percentage would rank among the greatest of all time. Baseball fans, who tend to eat up crazy stats, would spit on value-added because it doesn't mean anything.
I understand that the "margin of error" is meant to show the range into which a teacher may fall in a given year. But I would argue that the number is even more meaningless than it appears once we look across multiple years. I'll use myself as an example. Two years ago, my TDR placed me at the very bottom of the pile, with a single-digit score. According to the report, the highest score I could have attained, given the margin of error, was a 33. Yet this year I scored at the very top, and the lowest score I could have attained, according to the report, was an 83.
So, according to these reports, even allowing for the margin of error, there was a 50-point gap between the best teacher I could have been one year and the worst teacher I could have been the next.
That is 50 points beyond the margin of error.
Some math maven will likely point out that this result is over two years, and the value-added score only measures one year, but I really don't see how that matters.
I am the same teacher, in the same school, teaching the same subject to the same grade, using the same curriculum and lessons, and yet my score changed by almost 90 percentile points.
Perhaps my results are extreme, but they happened. I've spoken to many teachers who've had drops or spikes nearly as large. To me, that means just about anyone can find himself in danger of hitting the bottom and becoming a target of administrators.
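For the mathematically inclined, here's a rough back-of-the-envelope sketch of what I mean. This is my own toy simulation, not the DOE's actual model: I simply assume a teacher's "true" percentile never changes, treat the +/- 25-point figure from the sample TDR as a 95% interval with roughly normal noise, and then look at how far the reported rank wanders from one year to the next.

```python
# Toy simulation (my assumptions, not the DOE's model): a teacher's "true"
# percentile is fixed; reported scores add noise wide enough to produce
# roughly a +/- 25-point margin of error, read as a 95% interval.
import random
import statistics

random.seed(0)
TRIALS = 100_000
MOE = 25                # assumed +/- 25 percentile points, per the sample TDR
SIGMA = MOE / 1.96      # implied standard deviation of the noise

swings = []
for _ in range(TRIALS):
    true_rank = random.uniform(1, 99)   # the teacher never changes
    year1 = min(max(true_rank + random.gauss(0, SIGMA), 0), 100)
    year2 = min(max(true_rank + random.gauss(0, SIGMA), 0), 100)
    swings.append(abs(year1 - year2))

print(f"median year-to-year swing: {statistics.median(swings):.1f} points")
print(f"swings of 30+ points:      {sum(s >= 30 for s in swings) / TRIALS:.1%}")
print(f"swings of 50+ points:      {sum(s >= 50 for s in swings) / TRIALS:.1%}")
```

Under those assumptions, a teacher who hasn't changed at all still drifts around a dozen points in a typical year, and roughly one in ten shows a swing of 30 points or more from noise alone. My own swing sits well outside even that.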
If any math teachers care to explain where my analysis went wrong, I'd like to hear it. Or perhaps I'm right, and the value-added numbers just don't add up to much.