We have this conversation at least three times a year, when the various "yardsticks" (NAEP, CATS, NCLB) trot out their measurement data. Then there's SAT, ACT... EIEIO....
Wouldn't it be great if such data were integrated into a comprehensive value-added system? But I digress.
NAEP scores nationally, and in many individual states, showed modest gains from 2005 to 2007.
As Diane Ravitch explains in today's New York Post,
The federally sponsored National Assessment of Educational Progress (NAEP) is known in the education world as the gold standard of testing. In 2002, Congress authorized NAEP testing in every state to serve as a check of the states' own claims about their progress. (Congress rightly worried that individual states would dumb down tests that they themselves develop and administer.)

And there is at least reason to be suspicious of Kentucky's "new and improved" test. It appears Kentucky may have joined a number of other states in a race to the bottom by redefining proficiency.
Whenever test score data are released, the spinning begins. The Kentucky Department of Education has an interest (some might say a duty) in pointing out the progress made by the schools. So they publicly shine a light on the best numbers, and privately express concern about the worst.
It's a little thing called spin. Everybody seems to delight in the practice these days.
In a Tuesday press release, the Kentucky Department of Education said,
"The results of the 2007 National Assessment of Educational Progress (NAEP) in reading and mathematics show that Kentucky's 4th and 8th-graders made gains when compared to the state's performance in previous NAEP assessments..."
True. Gains were made. Kentucky's student achievement, as measured by the NAEP, has trended steadily upward overall. (See charts below.)
So that's KDE's headline: progress over time.

On the other side of the argument, assessment watchdogs are sniffing out specific areas of concern. Writing for the Bluegrass Institute this week, Richard Innes took issue with KDE's discounting of declines in 8th grade reading.
Of eighth grade reading since 1998, KDE claims, “Kentucky’s 8th-graders’ scores have remained steady, with minor gains and losses.”
Is that a fair description? Let’s examine the facts.
In the new ... NAEP assessments ... Kentucky had a reading proficiency rate of 30 percent in 1998. That rose to 32 percent in 2002 and went up again to 34 percent in 2003.
Then, things came unglued.
Eighth grade reading proficiency decayed to 31 percent in 2005, and in 2007 it slid again to just 28 percent. The 2007 proficiency rate is statistically significantly lower than both the 2002 and 2003 scores and is clearly six points lower than the 2003 performance. That six point difference isn’t just statistically significant – it’s just plain SIGNIFICANT.
No other state lost more ground in this time frame.
What’s more, during the same time period, the CATS Kentucky Core Content Test reading proficiency rates for eighth graders continuously rose. Do you believe that?
CATS up 10 points while NAEP declined six?
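Innes's "statistically significant" claim is, at bottom, checkable arithmetic. NAEP reports proficiency as a percentage of a sampled group, so a drop from 34 percent to 28 percent can be sanity-checked with a standard two-proportion z-test. Here's a minimal sketch; the sample size of 2,500 students per administration is my assumption for illustration only (NAEP's actual significance tests also account for its complex sampling design):

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    # Is the gap between two sample proportions bigger than
    # sampling noise alone would explain?
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

n = 2500  # assumed per-year state sample size, illustrative only

# Kentucky 8th grade reading proficiency: 34% in 2003 vs. 28% in 2007
z = two_proportion_z(0.34, n, 0.28, n)
print(f"z = {z:.2f}")  # about 4.6; anything past ~1.96 is significant at p < .05
```

Under those assumptions the drop is far too large to be sampling noise, which squares with Innes's characterization.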
While Kentucky has progressed steadily, so have other states. Growth is a vital factor to consider, but so is excellence. Kentucky's relative standing among the states frequently leaves it in all-too-familiar territory.
For example, who do Kentucky students outperform in 4th grade math? (See map below)
New Mexico, Louisiana, Mississippi and Alabama. All other states either roughly match Kentucky's performance (9 states) or exceed it (36 states). You're not going to hear that in a KDE headline.
It's a little better at 8th grade. Add California, Nevada, Oklahoma, Tennessee, Hawaii and West Virginia to the list.
Kentucky only outscores eight states in 8th grade reading.
But clearly the best news for Kentucky is in 4th grade reading, where Kentucky joins the national leaders and is outscored by only seven states. What happens to reading between 4th grade and 8th grade ought to be of concern.
We're less than a week away from KDE's next big announcement of progress. I predict the new CATS assessment will show average performance gains of 7% or so across the board, and in some places the jumps will be huge, based at least partly upon changes...
a) to the test itself
b) to the "cut scores" used to define proficiency
The new test data cannot be compared to the previous tests - but they will be. It's the data school folks have. (The sketch below shows why the comparison breaks down.)
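Changing cut scores alone can manufacture exactly the kind of jump I'm predicting. Assume scale scores are roughly bell-shaped and that the underlying distribution doesn't move at all between years; the mean, spread and cut scores below are invented purely for illustration:

```python
from statistics import NormalDist

# Invented distribution: identical student performance both years.
scores = NormalDist(mu=500, sigma=50)

old_cut = 520  # hypothetical old proficiency cut score
new_cut = 510  # hypothetical new, slightly lower cut score

print(f"Old cut: {1 - scores.cdf(old_cut):.1%} proficient")  # ~34.5%
print(f"New cut: {1 - scores.cdf(new_cut):.1%} proficient")  # ~42.1%
# A ~7-point "gain" with zero change in what students actually know.
```

Drop the cut score by a fifth of a standard deviation and proficiency "rises" about seven points. That is why year-over-year comparisons across a redefinition of proficiency are meaningless.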
We discussed the NCLB data situation last night at UK. Without advance comment, I asked a group of graduate students (and future principals) to analyze the NCLB proficiency rates in Kentucky. The general reaction to the sharp increase was "Wow!"

One of the students shared her experience working with the assessment company to establish the new cut scores. We discussed changes to the system that might account for the dramatic increases, and how school leaders could "present" the data.

That's when one of the students came up with the best spin ever. (Pay attention, Lisa. Here's your angle.) The new assessment is a truer reflection of the content actually taught in Kentucky's schools, and therefore the 7-point spike in proficiency levels is a fairer measure of the actual progress Kentucky students have made than it was under the old test.
Terrific.
Now, if we can only get the NAEP data to bear that out....
We have a fundamental problem in our current accountability system. Its initial purpose was political (to garner the support of the business community for KERA's big price tag). It was not focused so much on student achievement and curriculum; the focus was school accountability.
Better would have been an assessment system that began with content and then folded the data into a value-added system, such as the one used in Tennessee. If CATS had been designed to improve instruction for individual students, it would have looked very different.
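For readers unfamiliar with the term, the core of a value-added calculation is simple, even though real systems like Tennessee's TVAAS are statistically elaborate: predict each student's current score from his or her prior score, and credit the school with the average amount by which its students beat the prediction. A bare-bones sketch with made-up numbers:

```python
from statistics import mean

def value_added(prior, current):
    # Fit current = a + b * prior by least squares, then return each
    # student's residual (actual minus predicted score). Real systems
    # such as TVAAS use far richer models; this is only the core idea.
    pbar, cbar = mean(prior), mean(current)
    b = (sum((p - pbar) * (c - cbar) for p, c in zip(prior, current))
         / sum((p - pbar) ** 2 for p in prior))
    a = cbar - b * pbar
    return [c - (a + b * p) for p, c in zip(prior, current)]

# Made-up scores for students at two hypothetical schools.
prior   = [480, 500, 520, 540, 560, 470, 495, 525, 545, 565]
current = [495, 512, 538, 555, 580, 472, 493, 520, 541, 560]
school  = ["A"] * 5 + ["B"] * 5

residuals = value_added(prior, current)
for s in ("A", "B"):
    va = mean(r for r, sch in zip(residuals, school) if sch == s)
    print(f"School {s}: mean value-added = {va:+.1f} points")
```

The point isn't the particular formula. A system built this way asks "did these students grow as much as expected?" rather than "did the school clear an arbitrary proficiency bar?" - and it keeps working even when the bar moves.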
To their credit, and after the fact, many educators began to look at interim assessment systems that would help teachers identify learning problems early and intervene quickly. There has been a lot of good work done in the trenches, but the state system has become a hodgepodge under NCLB.
Interpreting test data for the public is a national problem, and "interested" parties will always spin the data to suit their own purposes. What we really need is a "disinterested" assessment/accountability reporting source.
As Ravitch understands, we need...
an independent, nonpartisan, professional audit agency to administer tests and report results to the public.
Such an agency should be staffed by testing professionals without a vested interest in whether the scores go up or down. Right now, when scores go down, the public is told that the test was harder this year - but when scores rise, state officials never speculate that the test might have been easier. Instead, they high-five one another and congratulate the state Board...for their wise policies and programs.
What the public needs are the facts. No spin, no creative explanations, no cherry-picking of data for nuggets of good news. Just the facts.
Student Characteristics
Number enrolled: 679,878
Percent in Title I schools: 60.6%
With Individualized Education Programs (IEP): 16.0%
Percent in limited-English proficiency programs: 1.5%
Percent eligible for free/reduced lunch: 52.4%
School/District Characteristics
Number of school districts: 176
Number of schools: 1,426
Number of charter schools: N/A
Per-pupil expenditures: $7,254
Pupil/teacher ratio: 16.0
Number of FTE teachers: 42,413
Racial/Ethnic Background
White: 86.3%
Black: 10.6%
Hispanic: 2.1%
Asian/Pacific Islander: 0.9%
American Indian/Alaskan Native: 0.2%
[Chart: NAEP scale scores for mathematics. Blue = Kentucky; green = states above Kentucky; yellow = states about the same as Kentucky; red = states below Kentucky.]