Saturday, October 04, 2014

Predictable Program Review Flaws Now on Display

For the first time, Kentucky schools conducted Program Reviews in which they graded themselves, providing the scores for nearly a quarter of the state accountability index.


According to KRS 158.6453(1)(i), a Program Review is "...a systematic method of analyzing components of an instructional program, including instructional practices, aligned and enacted curriculum, student work samples, formative and summative assessments, professional development and support services, and administrative support and monitoring." Program Reviews are being used in three areas: Arts & Humanities, Writing, and Practical Living and Career Studies.

Oh, and by the way, the schools grade themselves, and KDE lacks sufficient staffing to perform any meaningful oversight.


Last November, as districts were working through the reviews, Commissioner Terry Holliday expressed reservations about inconsistent ratings that were showing up. 
"We do have concerns about the quality control on program reviews," Holliday said. Several schools gave themselves high Program Review scores while their students' writing scores were very low.
Without saying that other districts cheated (64 percent of Kentucky schools reportedly gave themselves perfect marks), Fayette County Superintendent Tom Shelton suggested to the Herald-Leader that this year's decline in Fayette County's accountability index was a product of Fayette's world-class standards.

"We held ourselves to a very high standard," Shelton said. "Our goal is to provide world-class programs and instruction, and we graded ourselves with that in mind."

State education officials cautioned against making comparisons with last year's scores because of the addition of the Program Reviews. But of course, that's the first thing everybody did.

Program Reviews were the product of a squabble over writing portfolios back in 2008. Kentucky was the only state using writing portfolios; scoring was difficult, inter-rater reliability was suspect, and the portfolios came under attack from the Family Foundation, the Bluegrass Institute, and a group of legislators led by then-Sen. Dan Kelly (R-Springfield). The biggest fear was that removing writing portfolios from the state assessment would cause teachers to take their eyes off this critical skill set. Illustrating the point, Frankfort Independent Superintendent Rich Crowe said, "If they're not going to score it, and if it's not going to count for anything, I don't know why we should waste our time."


Writing consultant Dr. Charles Whitaker warned Commissioner Jon Draud's CATS Task Force that if Program Reviews were not properly implemented (and proper implementation would be neither cheap nor easy), we would only be trading one doubtful measure for another. He told the panel that Program Reviews can be slop or can be high quality; a lot would depend on the details of who does the reviewing. "I'd be very careful to ensure that we had a pretty clear idea of who's going to do it, what kind of training they've had," Whitaker said. "I can see the same problems appearing in a program review."

Ultimately, when the KEA threw its support behind Senate Bill 1 (the original intent of which was to kill CATS, not to produce college- and career-ready students), portfolios went away and the compromise solution was the Program Review - which shared every potential flaw of the portfolio it was meant to improve upon. The Courier-Journal noted the odd political "alliance of (1) Republicans who have opposed the Kentucky Education Reform Act since its passage; (2) reflexive right-wing opponents of public schools, and (3) teacher groups that find KERA too demanding."

At the time, Shelton - then superintendent of the Daviess County schools - opposed the switch from Writing Portfolios to Program Reviews and would have preferred to keep the CATS exams until this year - the year by which Kentucky legislators said all students should be proficient.
"If I was among the public, I would be saying, 'You promised me proficiency by 2014...Now what?'" Shelton said.
Well, it's 2014. And, with the possible exception of the Fayette County Board of Education, the public is asking...
Read more here: http://www.kentucky.com/2014/10/03/3460434/fayette-schools-raise-overall.html?sp=/99/164/142/#storylink=cpy

1 comment:

Anonymous said...

Ditch them; self-evaluation has no place in accountability, especially not a quarter of your school's overall index. Same with PGES - the same folks evaluating Program Reviews are completing TPGES, and their jobs are on the line, not just a school score index. Next we will hear of some folks just lowballing student growth/achievement expectations on TPGES and overwhelming the principal with an overburdensome teacher evaluation system - boom, another 10% of your school accountability down the drain.

Why are we trying to mash so much into what should be a meaningful measurement of whether students have learned anything? How can we pretend to evaluate whether students are learning in a school if we count only 15-24% of a school's score on student achievement? You beat GAP by increasing Free/Reduced lunch numbers so that achievement and gap are basically the same thing. Similarly, growth (though well intentioned) is the biggest misnomer you could use for the concocted idea that students in shared baseline cohort groups who perform below the contrived 40th percentile mark (regardless of their score) didn't demonstrate growth.

The sad part is that all the public does is look once a year at the overall score for their school, compare it to the other guys in the neighboring county, and then draw some overly simplistic impression of whether their school is doing well or not.

What next - weighting student health scores, or exams for teachers?