In an e-mail response to Deborah Tonguis, the 2009 Teacher of the Year, regarding the obvious inflation of SPS scores, John White claims he doesn't understand simple mathematical formulae or obvious statistical anomalies, and couldn't possibly be expected to, since all the decisions that went into them were made long before he got there.
To your point, and to her analysis, I think I've been as up front about the situation as possible. See this article: http://theadvocate.com/home/4474689-125/high-schools-raise-scores. The story is, as is said there, that three factors, each determined years in advance of my arrival in this role, led to a significant increase in scores. You can argue whether they are inflated per se or not, of course. [John White]
I think it’s safe to say many people have argued they are inflated. Frankly, only a complete idiot would think they weren’t. It’s a crying shame there aren’t tests or educational requirements for State Superintendents, eh? You got lucky there, bud.
John White somehow thinks that directing Deborah to an article in the Advocate, where he makes the point that only 4 out of 130 high schools in the entire state declined, somehow gets him off the hook for not noticing, and simultaneously excuses him for taking credit for the performance because he's increased expectations? (While actually doing the exact opposite, formulaically.)
White pointed out that growth in performance scores was not confined to high schools. Indeed, 76 percent of the almost 1,300 schools that earned scores in 2011-12 improved compared with the year before.
High schools’ performance score growth, however, far outpaced the growth in elementary and middle schools.
For instance, while 74 percent of elementary and middle schools improved at least some compared with 2010-11, more than 97 percent of high schools improved their scores during that same time period.
Indeed, only four out of 130 high schools declined, while 252 out of 956 elementary and middle schools declined. [The Advocate]
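The Advocate's figures are easy to check for yourself. Here's a quick sketch, using only the counts quoted above, that shows just how lopsided the high school gains were:

```python
# Counts quoted from The Advocate's 2011-12 SPS coverage (above).
hs_total, hs_declined = 130, 4
ems_total, ems_declined = 956, 252

hs_improved_pct = 100 * (hs_total - hs_declined) / hs_total
ems_improved_pct = 100 * (ems_total - ems_declined) / ems_total

print(f"High schools improved:      {hs_improved_pct:.1f}%")  # ~96.9%, the article's "more than 97 percent" after rounding
print(f"Elem/middle schools improved: {ems_improved_pct:.1f}%")  # ~73.6%, the article's "74 percent"
```

A 97-to-74 gap between high schools and everyone else is not subtle, which is rather the point.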
White points out that 97% of high schools improved their SPS score. He also admits in his letter to Deborah that he fully understood all the changes that were implemented, and that those changes would significantly impact the scores, making them useless for comparison purposes without proper recalibration, which he did not perform and which was fully within his power to request.
The factors are:
- The decision to count graduation rate in the high school SPS (the grad rate grew in response by more than four percentage points, increasing schools’ numbers dramatically)
- The inclusion of a bonus for all schools with rates over 65 percent (the increase in the rate above was compounded in its impact through this bonus)
- The onset of the EOC tests, and schools becoming accustomed to these tests, perhaps faster than anticipated
In each case, the Department made a decision years ago to give points in specific ways. We now see the results of those decisions. In one sense, the decisions achieved exactly what they were supposed to achieve: schools focused on graduation and on EOC tests. On the other hand, according to some, they skewed the results. One way or the other, they were decisions made for valid reasons with the best information the Department had at the time. [John White]
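To see how White's first two factors compound each other, here is a deliberately hypothetical sketch. The weight and bonus values below are invented for illustration (the actual SPS formula is not given in the letter); only the mechanism, graduation-rate points plus a bonus above a 65 percent threshold, comes from the list above.

```python
# HYPOTHETICAL illustration only: the weight and bonus values are invented,
# not the actual Louisiana SPS formula. The mechanism (grad-rate points plus
# a bonus above a 65% threshold) is taken from White's list of factors.

def sps_grad_component(grad_rate, weight=1.0, bonus=5.0, threshold=65.0):
    """Points a school earns from its graduation rate, with a threshold bonus."""
    points = grad_rate * weight
    if grad_rate > threshold:
        points += bonus  # the bonus compounds any gain that crosses the threshold
    return points

# A school whose rate rises from 63% to 67% (roughly the four-point statewide
# gain the letter describes) crosses the threshold and picks up the bonus too.
before = sps_grad_component(63.0)  # 63 points, no bonus
after = sps_grad_component(67.0)   # 67 + 5 = 72 points
print(after - before)
```

Under these made-up numbers, a 4-point rise in the underlying graduation rate shows up as a 9-point rise in the score component. That is the compounding White's second bullet concedes, and it is exactly why comparing this year's scores to last year's without recalibration is meaningless.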
“The schools met two different challenges at the same time,” White said.
The challenges are growing more challenging. Consequently, it’s possible that schools that improved a letter grade this year may drop back to where they were, or even decline. [The Advocate]