A while back Rick Hess published a piece titled "The New Stupid" in Educational Leadership, and he has since revisited it at his blog, Rick Hess Straight Up. In part one, Hess wrote, "It is hard to attend an education conference or read an education magazine without encountering broad claims for data-based decision making and research-based practice. Yet these phrases can too readily morph into convenient buzzwords that obscure rather than clarify." He then sets off to provide examples of how misused data can lead to unintended consequences.
The first element of the new stupid is Using Data in Half-Baked Ways. I first encountered the inclination to energetically misuse data a few years ago, while giving a presentation to a group of aspiring superintendents. They were passionate, eager to make data-driven decisions and employ research, and committed to leaving no child behind. We had clearly left the old stupid in the rearview mirror. New grounds for concern emerged, however, as we discussed value-added assessment and teacher assignments.
The group had recently read a research brief highlighting the effect of teachers on student achievement as well as the inequitable distribution of teachers within districts, with higher-income, higher-performing schools getting the pick of the litter. The aspirants were fired up and ready to put this knowledge to use. To a roomful of nods, one declared, "Day one, we're going to start identifying those high value-added teachers and moving them to the schools that aren't making AYP."
Now, although I was generally sympathetic to the premise, the certainty of the stance provoked me to ask a series of questions: Can we be confident that teachers who are effective in their current classrooms would be equally effective elsewhere? What effect would shifting teachers to different schools have on the likelihood that teachers would remain in the district? Are the measures in question good proxies for teacher quality? What steps might either encourage teachers to accept reassignment or improve recruiting for underserved schools?
My concern was not that the would-be superintendents lacked firm answers to these questions--that's natural even for veteran big-district superintendents who are able to lean on research and assessment departments. It was that they seemingly regarded such questions as distractions.
In part two, Hess takes on those who oversimplify what the data mean and who underestimate the value of a well-run school.
The second element of the new stupid is Translating Research Simplistically. For two decades, advocates of class-size reduction have referenced the findings from the Student Teacher Achievement Ratio (STAR) project, a class-size experiment conducted in Tennessee in the late 1980s. Researchers found significant achievement gains for students in small kindergarten classes and additional gains in 1st grade, especially for black students. The results seemed to validate a crowd-pleasing reform and were famously embraced in California, where in 1996 legislators adopted a program to reduce class sizes that cost nearly $800 million in its first year and billions in its first decade. The dollars ultimately yielded disappointing results, however, with the only major evaluation (a joint American Institutes for Research and RAND study) finding no effect on student achievement.
What happened? Policymakers ignored nuance and context. California encouraged districts to place students in classes of no more than 20--but that class size was substantially larger than those for which STAR found benefits. Moreover, STAR was a pilot program serving a limited population, which minimized the need for new teachers. California's statewide effort created a voracious appetite for new educators, diluting teacher quality and encouraging well-off districts to strip-mine teachers from less affluent communities. The moral is that even policies or practices informed by rigorous research can prove ineffective if the translation is clumsy or ill considered.
When it comes to "research-based practice," the most vexing problem may be the failure to recognize the limits of what even rigorous scientific research can tell us...
The third and final element of the new stupid is Giving Short Shrift to Management Data. School and district leaders have embraced student achievement data but have paid scant attention to collecting or using data that are more relevant to improving the performance of schools and school systems. The result is "data-driven" systems in which leaders give short shrift to the operations, hiring, and financial practices that are the backbone of any well-run organization and that are crucial to supporting educators.
Existing achievement data are of limited utility for management purposes. State tests tend to provide results that are too coarse to offer more than a snapshot of student and school performance, and few district data systems link student achievement metrics to teachers, practices, or programs in a way that can help determine what is working. More significant, successful public and private organizations monitor their operations extensively and intensively. FedEx and UPS know at any given time where millions of packages are across the United States and around the globe. Yet few districts know how long it takes to respond to a teaching applicant, how frequently teachers use formative assessments, or how rapidly school requests for supplies are processed and fulfilled.
For all of our attention to testing and assessment, student achievement measures are largely irrelevant to judging the performance of many school district employees...