The Cincinnati Post reported:
While there's been much talk in Kentucky education circles that there were an unusual number of complaints from schools, department of education spokeswoman Lisa Gross said the number wasn't extraordinary given that the state changed the company doing the test and increased the number of grades tested.
"The breadth of the issues with the NCLB data was typical of that in previous years and had no major statewide effects," she said.
I'd like to be comforted by that thought, but I'm not sure I am. From the minute NCLB scores were released some western Kentucky superintendents claimed the data were contaminated.
Is Gross saying it's always this bad?
If so, why did KDE decide to delay release of the CATS data this year, as opposed to last year...or the year before?
Is Gross blaming the new company?
Covington District Assessment Coordinator Bill Grien told the Cincinnati Post, "When (the state department of education) listened to all the complaints from around the state, they decided to take time to make sure data was correct."
The clear message for every school in the state is, "Don't trust the data!" Double check it yourself.
Changes in the testing program this past year make all of the data suspect. Errors undermine confidence.
In addition, Kentucky School News and Commentary began hearing reports from teachers last spring (when the new tests were first being administered) that the new test seemed easier on its face - an unverifiable opinion, but one shared by numerous teachers, and suspicious nonetheless.
Even if the released data had been pristine, the fact that the assessment itself has changed has caused the Kentucky Department of Education to warn schools that the data cannot be compared to past years.
Of course that did not stop district superintendents from proclaiming progress based on the new data. It's the only data they have.
Around the state local superintendents took bows and praised teachers for district progress. Much of that progress may well be accounted for by the new test alone. Superintendent Stu Silberman claimed that Fayette County had "broken some records" with this year's data. If the data cannot be compared...how can that be?
The question is...has Kentucky joined the growing list of states that have lowered standards to avoid looking bad under NCLB's accountability system?
It looks like the answer might be yes.
Dick Innes, over at the Bluegrass Institute, has been dissecting the assessment program for years now. See his MUST READ post on the current KDE testing woes. His preliminary data on 2007 NCLB proficiency rates seem to confirm my own suspicions - that those teachers who said the new test is less demanding are correct.
Consider: Reading proficiency among all Kentucky students stood at 48.24% in 2002. It rose to 50.16% in 2003, 53.45% in 2004, 55.95% in 2005, and 56.84% in 2006 - an average gain of 2.15 points per year.
Then the test was changed.
In 2007 the percentage of students scoring proficient in reading jumped to 66.12% - a 9.28-point increase in one year, which would really be worth celebrating if we knew it was the result of increased student achievement.
By changing the test, Kentucky made more "progress" toward meeting its NCLB goals in one year than it had over the previous four years.
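Those figures are easy to verify. Here is a quick Python sketch (my own check, not part of the Bluegrass Institute analysis) that recomputes the average pre-2007 gain and the size of the 2007 jump from the proficiency rates quoted above:

```python
# Reading proficiency rates (percent proficient, all Kentucky students)
# as quoted above from the KDE NCLB progress reports.
reading = {2002: 48.24, 2003: 50.16, 2004: 53.45, 2005: 55.95, 2006: 56.84}

# Average yearly gain under the old test: four year-over-year steps, 2002-2006.
avg_gain = (reading[2006] - reading[2002]) / 4
print(round(avg_gain, 2))   # 2.15 points per year

# One-year jump when the new test arrived in 2007.
jump_2007 = 66.12 - reading[2006]
print(round(jump_2007, 2))  # 9.28 points

# How much of that jump exceeds the old trend - roughly the amount a
# changed test would have to explain.
print(round(jump_2007 - avg_gain, 2))  # about 7.13 points
```

That last number is where the "around 7 points of inflation" suspicion comes from: the 2007 jump minus the established yearly trend.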
The results are similar for Reading and Math and for African American and Free Lunch subgroups.
Average yearly gains for "all students" in Math were 2.49% until the new test. This year it jumped to 9.86%.
Average yearly gains for "African Americans" in Math were 2.24% until the new test. This year it jumped to 10.15%.
Average yearly gains for "Free Lunch" students in Math were 2.73% until the new test. This year it jumped to 10.28%.
Average yearly gains for "African Americans" in Reading were 2.40% until the new test. This year it jumped to 10.03%.
Average yearly gains for "Free Lunch" students in Reading were 2.65% until the new test. This year it jumped to 10.14%.
(The sources for his data: KDE, NCLB Progress Reports for 2003 - 2007.)
Is it safe to assume (after multiple administrations of the old test) that the established pattern of roughly 2.5 points of growth per year was about right? If so, the new data should be viewed with suspicion. Reported student achievement growth may be inflated by something around 7 points. If that's true...the celebrating ought not get too big unless a school's growth exceeds 7 points this year.

If my suspicions are correct, when the CATS data are released on October 2nd, we ought to see some monster "progress." Annual increases may well break some records. In baseball parlance, the dead ball era is over. And in places where progress has historically been slow, but where the instructional focus has been the most intense, we may see numbers that are huge.
Any Fayette County folks want to guess what kind of gains Booker T Washington Academy will post when the data are released next month?
Principal Peggy Petrilli recently resigned (or was released, depending upon who you ask) amid a lot of fussing from parents about her methods. Petrilli was changing a school climate whose legacy was one of long-standing failure. She rubbed a lot of folks the wrong way. She canned some politically connected people. Lots of claims. A few investigations. So far...apparently nothing...no wrongdoing substantiated.
But what about the results?
The CATS results for 2007 will show that the percentage of proficient readers at Booker T Washington Academy will...
a. decline
b. stay about the same
c. increase by less than 7 points
d. increase by a little more than 7 points
e. increase by more than 10 points
f. increase by more than 15 points
The correct answer is.... f.