Today’s Herald-Leader opinion page contained a Pro and Con feature on charter schools. Did either argument win?
The pro side was held down by WKU’s Gary Houchins who declared that if we want to see bigger gains in student learning, schools need “autonomy and accountability.” And for some reason he thinks Brad Montell’s charter school bill will get us there.
Writing in opposition, JCTA President Brent McKim claims that charters would “divert critical funding from public schools” and complains about recent pro-charter “advertisements designed to make our public schools look worse and charter schools look better than they really are.”
Houchins’s imagined correlation between autonomy and student achievement is never established, mostly, one assumes, because it doesn’t exist in the literature. I do imagine there is some correlation between accountability and productivity, in general, and that may well hold true for schools - but broadly, accountability is antithetical to freedom, right?
Houchins erroneously claims that charter schools operate under “far more autonomy and accountability.” That’s a half-truth by definition. While charters do enjoy more autonomy, the accountability standards charter schools bravely claim they will meet are exactly the same as those required of every public school in the state. That traditional public schools are not routinely closed says more about the persistent educational needs of students in a given community than about any lack of accountability. Closing a school does not cure the actual problem. It only makes victims of children.
McKim claims that Kentucky’s school-based councils already empower administrators to cut through red tape, but we know that’s not true. SBDMs might nibble at the edges, but no real departure from the state/district plan will occur there. Houchins correctly points out that charter schools are much freer to “innovate curriculum, teaching methods, and the length and structure of the school day.” But are we to believe that charter school operators will turn teachers loose? I doubt that. That’s not how it’s done at KIPP schools. Theirs is a national program.
Both men acknowledge the lackluster results produced by charters, but McKim goes further, pointing out other problems with charters, such as the general lack of oversight, resistance to open records, and the tendency to counsel out (or throw out) non-conforming students.
Although charter schools are a centerpiece of the current NCLB corporate school reform movement, Houchins complains that “One-size-fits-all state and federal education mandates, while well intentioned, have caused a serious narrowing of curricula and a near-total focus on testing. Individualized student learning, long a goal of good teachers, is harder than ever.”
If Kentucky wants a charter school bill, we would be better off returning to the bill Commissioner Holliday promoted in the last term. It did not abandon oversight, nor did it leave so much to chance.
11 comments:
First, even the CREDO 2009 report that McKim and others like to incompletely cite has a most important finding starting on page 32 -- once students spend enough time in charter schools (3 years), they do outperform their traditional public school peers. That's not lackluster.
Second, today and tomorrow the NAEP results for charters versus non-charters for poor students are being released on the Bluegrass Policy Blog. Those results indicate charters are moving ahead of non-charters. Read that here: www.bipps.org/bipps-blog.
Third, Richard, you like to do research. See if there is a linkage between Western Michigan Studies, the person who creates them, the NEPC group where he is a fellow, and the union financial connection to that group. Just in the interests of full disclosure.
Thanks, Richard.
1) Out of all of the studies - some of which we have both cited, and which have varying degrees of problems - one bright spot hardly establishes conclusive evidence. I'm happy with "lackluster" as a description of the overall performance. Some places seem better than others, but overall, it is very clear to me that charter schools are not game changers.
2) I'll give your stuff a look.
3) I'm not likely to get into that. I've got a few higher priorities at present. But you are correct to infer that I am always suspicious of political connections to research. I see it as a huge bias - and that goes for any point on the political spectrum.
Richard,
There is more than the 2009 CREDO study; I was just citing one example.
In fact, CREDO has issued a series of studies on individual states and some cities. In many cases, charters outperform once students spend sufficient time in them.
I'm actually not a fan of CREDO's methodology. Fabricating "virtual" regular public school students using somewhat limited demographic data is potentially problematic.
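To make the concern concrete, here is a rough sketch of the virtual-twin idea as I understand it from the report's description. The field names and matching rule below are my own invention, not CREDO's actual procedure:

# Rough sketch only: build a "virtual" comparison score for each charter student by
# averaging the scores of regular-school students who match on a handful of fields.
# Anything not listed (motivation, family circumstances, etc.) is invisible to the match.
MATCH_FIELDS = ("grade", "gender", "race", "poverty", "ell", "sped")

def virtual_twin_score(charter_student, regular_students):
    matches = [s["score"] for s in regular_students
               if all(s[f] == charter_student[f] for f in MATCH_FIELDS)]
    return sum(matches) / len(matches) if matches else None

def estimated_charter_effect(charter_students, regular_students):
    # Mean gap between each charter student and his or her "virtual" counterpart.
    gaps = []
    for c in charter_students:
        twin = virtual_twin_score(c, regular_students)
        if twin is not None:
            gaps.append(c["score"] - twin)
    return sum(gaps) / len(gaps) if gaps else None

The comparison can only be as good as the few fields it matches on, which is my worry.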
Two studies using what is probably the best approach available include Hoxby's 2009 study on New York City charters:
http://www.nber.org/~schools/charterschoolseval/how_NYC_charter_schools_affect_achievement_sept2009.pdf
and the Boston Foundation/Mass. Dept. of Ed. study in Boston:
http://www.tbf.org/uploadedFiles/tbforg/Utility_Navigation/Multimedia_Library/Reports/InformingTheDebate_Final.pdf Accessed 19Jan10
Both of these studies show that as students spend more time in charters, they start to outperform their peers.
I certainly have not read every charter report out there, but I suspect that many which don't show advantages for charters don't look at charter performance for a long enough period of time.
For example, a new report from Mathematica for the US Dept. of Ed. came out in December, but it only looks at two years of data. CREDO says you have to look longer than that.
If you have some 'con' studies in mind, let me know and I'll take a look at them.
Richard,
Oh No. You were doing fine. Then this?
So what’s the argument now? …that charters are a good idea because they can be easily closed if they are not producing. But before we open the first one in Kentucky, you are angling for more time. C’mon.
Are charter schools magical or not? : )
Seriously, you started off fine. You mentioned CREDO’s best bit for the pro-charter position and raised appropriate questions about methodology.
Then you mentioned Hoxby?!!! …and called it the best available approach?!!! It is never the best approach for a researcher to conduct sponsored research and then withhold the data so that it can’t be scrutinized.
Jesse Rothstein and Gerald Bracey pointed out multiple flaws in the Hoxby study (http://theprincipal.blogspot.com/2010/02/fools-gold-from-hoxby.html?z):
1) Many of the data that would be needed to draw conclusions are not presented. Hoxby has not made available the data that were used for her original paper.
2) Hoxby's basic result holds only when we rely on her specific construction of the larger streams variable. She has never presented estimates that do not rely on this variable; all such estimates that Rothstein computed yielded small, insignificant effects of choice on test scores.
3) There are several odd aspects of Hoxby’s particular larger streams variable that cast doubt on its validity. And when I presented her data to a math PhD at another Kentucky university, who cared nothing for the political issue, she verified numerous problems. (http://theprincipal.blogspot.com/2010/02/hoxbys-hocus-pocus.html?z)
4) There are serious errors in both the program code and data that Hoxby has distributed.
5) Hoxby has not released the code that was used in her Reply, and she offers no indication that these errors have been repaired.
6) The study is limited to New York City.
7) The study has not been peer reviewed.
8) The study was published by a pro-charter advocacy group staffed with people who used to work in charter schools.
9) The editorial writer who promoted the study lacks the background in econometric research to know how to interpret it. She was, therefore, engaging in faith-based editorializing while passing it off as evidence-based.
10) Even if the study had proved to be sound, it is only ONE study. Strong conclusions in any field should never be drawn on the basis of only one study.
As long as I’m on a roll…
• Between CREDO and Hoxby, only one of the studies has shared its data so that it could be confirmed by other researchers - CREDO.
• Only one of the studies has been peer-reviewed to protect against politically motivated bias - CREDO.
• Only one of the studies reports evidence on both sides of the issue - CREDO.
• On the other hand, only one study was ordered up by a rich guy for the express purpose of finding "positive" data to report - Hoxby.
• Only one of the studies falsely claims to be a clinical trial (the gold standard), which it clearly is not - Hoxby.
• If Hoxby is willing to exaggerate her method, why should we believe anything else from her study?
• Is there some reason she should not be held to the same high standards as other researchers?
• Plus, there are many other studies that have been peer-reviewed which report mixed results - reports BIPPS religiously ignores.
• Neither study is perfect. The results are all over the place. Yet you show no reluctance to praise Hoxby's study whenever you can. Why is that?
• The CREDO study showed that learning improved the longer students were in charters but that fewer than 20% of the charter schools offer a better education than comparable local schools.
As you know, CREDO’s 2009 study found wide variation in performance. Why wouldn’t it? The study revealed that 17 percent of charter schools provided superior education opportunities for their students. Nearly half of the charter schools nationwide had results that were no different from the local public school options, and 37 percent delivered learning results that were significantly worse than their students would have realized had they remained in traditional public schools. But it’s just one study.
Here are the individual CREDO studies you referred to: http://credo.stanford.edu/research-reports.html
It looks like CREDO has added studies from NY, PA, and IN since 2009.
It seems to me that McKim’s just doing what you are doing. …from the other angle, of course. You both cite the studies that support your previously held positions and tout them as best evidence.
For myself, I favor a very tight charter law that would allow district leadership to try new approaches in places of long-standing failure. But what I find most difficult is getting worked up about charters as an approach. It is always the energy and resources of the adults in a community that determine the best schools. That can happen in any school district right now, and it does in many.
Richard,
You did indeed get on a wordy roll. Too many words, too little research. I’ll let the silly comment comparing me to Brent McKim pass and get on to more factual stuff.
Let’s talk Hoxby first.
Everyone calls her 2009 report the “Hoxby Report,” but the fact is she had two co-authors, Sonali Murarka and Jenny Kang. Both co-authors are associated with the National Bureau of Economic Research (NBER; see below).
So, the report is a team product. Using a team might somewhat offset the fact that there is no peer review for Hoxby’s report (at least none that was publicly announced).
You say Hoxby’s study was ordered up by “a rich guy.” Really?
Of interest, the report’s introductory pages say funding for Hoxby’s study came from multiple sources, such as the US Department of Education’s Institute of Education Sciences under Contract R305A040043, a subcontract of the National Center on School Choice at Vanderbilt University. There was also grant and administrative help from staff of the National Bureau of Economic Research (NBER).
http://www.nber.org/~schools/charterschoolseval/how_NYC_charter_schools_affect_achievement_sept2009.pdf
I doubt one “rich guy” controlled those rather different funding sources.
Regarding the NBER: its web site says it is governed by a Board of Directors with representatives from the leading U.S. research universities and major national economics organizations. Other prominent economists from business, trade unions, and academe also sit on the Bureau's Board. With trade unions on its board, this does not sound like a union-hating operation.
http://www.nber.org/info.html
Another point. You say I cite only a single study – Hoxby’s.
Pardon me, but I cited and provided links to another random-lottery-based study, from the Boston Foundation and the Massachusetts Department of Education. You ‘conveniently’ overlooked that fact. The Boston results were similar to Hoxby’s in NYC: charters outperform and get better as students spend more time in them.
I am aware that Hoxby has been criticized for not releasing her data. Did it ever occur to you that the usual agreements and restrictions found in virtually all education research could prevent her from doing so? I would be very surprised if cooperating schools had not demanded anonymity. Given the nature of her study, that would lock the data up tightly. I don’t like that sort of restriction in education research, but it is the norm for ‘stuff’ generated by your colleagues in education.
One more point: the federal Family Educational Rights and Privacy Act (FERPA) probably gets in the way of releasing the data used by Hoxby. Hoxby’s study required access to testing results and other data for individual students. FERPA has a lot of limitations concerning who can see that. While FERPA has undergone a number of changes, it has been around for decades. FERPA recently became a major thorn in the side of people doing education research in Kentucky, by the way. Ask your buddies at the Prichard Committee about that. They were REALLY unhappy when the KDE had to pull student headcount data out of the Interim Performance Reports.
Stay tuned, because I have more to say about CREDO and Bracey.
Meanwhile, Extra Credit Test Question – Do you know where Gerald Bracey got at least some of his funding?
Richard,
Let’s talk about the late Gerald Bracey. You present his arguments against Hoxby unquestioningly, as though he had guru status. Bracey was intelligent, but he wasn’t perfect.
You lift many of your Hoxby criticisms, without qualification, from Bracey’s Huffington Post diatribe of September 30, 2009.
http://www.huffingtonpost.com/gerald-bracey/the-washington-post--unio_b_305238.html
That was published exactly one month before Bracey died, by the way, only about a week after the Hoxby study was released. Bracey obviously didn’t take much time to research and think before he put pen to paper (or finger to keyboard). That’s not a recipe for high quality work. Sadly, Bracey never got the chance to do more extensive analysis, or to defend against challenges to his allegations, either.
Consider this point: Bracey makes clear his belief that charters are primarily union-busting efforts. That betrays an awfully strong bias.
Bracey’s bias may be way off target in Hoxby’s case. As I mentioned in my earlier post, with union people on its board, it seems unlikely the National Bureau of Economic Research (NBER), which sponsored the Hoxby report, would support such anti-union activity. Bracey’s allegation seems tenuous here.
Furthermore, there is something about Bracey you apparently don’t know or choose not to discuss. Bracey had a union tie that may explain why he took the union position on charter school research. It’s not easy to find, but it is there. (Continued)
Richard,
Let’s talk some more about CREDO and the sometimes laughable status of peer reviews in education research.
The value of peer reviews where education research is involved is perhaps overblown. Very simply, what does it matter if more people who share your biases agree with your report? You need to do some remedial reading in Arthur Levine’s “Educating School Teachers” and “Educating Researchers” (Google them).
Regarding the CREDO 2009 charter school report – who did the peer review? Did those reviewers have a clue? Qualifications?
Maybe not.
Consider this: Unless they were ignored, CREDO’s peer reviewers totally missed the report’s generally very poor treatment of its most important finding and the implications of that finding for everything else in the report.
That most important CREDO finding (buried on pages 32 and 33): Once students spend three years in charters, they notably outperform their traditional public school counterparts.
The failure to spot and correctly assess this critical finding has substantial impacts on the report’s other results. The critical finding indicates there is serious bias in the data used to support all the other findings in the CREDO report. That most definitely includes the now notorious CREDO finding, which people like you and Brent McKim like to cite, that only 17% of charters outperformed traditional public schools.
Consider how CREDO discusses the potentially severe bias in findings like the 17% finding. This comment is found on page 32 of the report:
“Because the number of students attending charter schools grows each year, the experience of charter school students reflected in each state’s data is skewed toward first-year charter students. More than half of the records in this analysis capture the first year of charter school experience.”
Once CREDO learned that charters did not ‘show their stuff’ until students had enough time in them to benefit, the original researchers – and the peer reviewers – should have realized that this major finding seriously undermined the credibility of all the other findings in the report. If it takes three years for students to benefit from charters, and if the data sample used to generate the 17% statistic is loaded with students who have less time in charters, of course that statistic is dubious.
Incredibly, both the original CREDO team and the peer reviewers missed this very obvious situation. That gross oversight raises considerable concern about the quality of the CREDO peer review.
Actually, I submit that the CREDO report seriously suffers from a huge and unresolved internal inconsistency. Given the CREDO finding that charters do outperform when students spend enough time in them, the researchers should have redone their analyses of demographic breakouts (race, learning-disabled students, poor students, etc.), but only for students who had spent three years or more in charters.
The CREDO team didn’t do that, thereby confirming they didn’t have a clue about the importance of what they wrote on pages 32 and 33 of their report. And, the peer reviewers missed this, too.
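To put toy numbers on that point (the figures below are invented for illustration, not taken from CREDO's data):

# Invented numbers, for illustration only: suppose the charter advantage (in test-score
# standard deviations) only shows up after year three, while the pooled sample is
# dominated by first-year records, as the passage quoted above says.
effect_by_year = {1: -0.05, 2: 0.00, 3: 0.10, 4: 0.15}
share_by_year = {1: 0.55, 2: 0.25, 3: 0.12, 4: 0.08}

pooled_effect = sum(effect_by_year[y] * share_by_year[y] for y in effect_by_year)
print(round(pooled_effect, 3))  # close to zero: the pooled data look like "no different"
print(effect_by_year[3])        # 0.1: a comparison limited to 3+ year students looks much better

If the real advantage kicks in around year three, a pooled estimate weighted toward first-year students will understate it almost by construction.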
So much for the value of peer reviews on CREDO’s credibility.
So, I’ll reiterate. Real peer reviews can be worthwhile, but in education they too often degenerate into a ‘me too,’ amen sort of situation that does nothing to improve the quality of reporting.
Still more to come.
Richard,
Changing subjects a bit, you claim that most reports on charters show mixed results. Are you relying on others for that conclusion, or do you have a list of reports?
If you have such a list, please provide that. I would be interested in those reports that have considered students who have been in charters long enough to benefit (about 3 years according to CREDO and work from Hoxby and the Boston Foundation). Those findings would be compelling.
On the other hand, most of the reports I have seen seem to look at only one year of data and have absolutely no consideration about how long students have been in the charter schools. Since it looks like charters need several years to really perform for students, reports that don’t examine the data in that way are not compelling.
I am glad I don't give a hoot about Charter Schools
February 27, 2012 10:10 AM: touché
Richard:
I don't have time for much right now, but consider that Bloomberg did the press conference, and consider this from Matthew Di Carlo at NEPC regarding how Hoxby engineered the home-run conclusion.
"Let’s start with the study’s major conclusion, quoted above, that nine years in a NYC charter school (K through grade eight) would, on average, eliminate most of the “Scarsdale-Harlem achievement gap.” Virtually all media coverage used this finding; on occasion, you can still hear it today.
Now, it bears mentioning Hoxby doesn’t actually follow any student or group of students from kindergarten through grade eight (nine years). Actually, since her data are only for 2000-01 to 2007-08, we know for a fact that she does not have data for a single student that attended a NYC charter for nine straight years (K-8). She doesn’t report how many students in her dataset attended for eight straight years, but does note, in the technical report (released months later – see below) that only 25 percent of her sample has 6-8 years of “charter treatment.” The majority of her sample is students with 3-5 years in a charter school (or less).
So, how did Hoxby come up with the “Scarsdale-Harlem” finding? Well, her models estimate an average single-year gain for charter students (most of whom have only a few years of “treatment”). Those one-year estimates are her primary results. She ignores them completely in the executive summary (and I mean that literally – she does not report the single-year gains until page 43 of the 85-page report).
Instead, she multiplies the single year gain (for math and reading separately) by nine years to produce a sensational talking point. It’s kind of like testing a new diet pill on a group of subjects, who take the pill for anywhere between one and 9-10 months, finding that they lose an average of ten pounds per month, and then launching an advertising campaign proclaiming that the pill will make people lose 120 pounds in a year.
In fairness, months after the report’s release, Hoxby and her co-authors replicated their analysis on students with different durations of charter treatment, and found that there are still large, cumulative effects among those students who have attended charters for 6-8 years. In other words, the annual effect of attending a charter school does not necessarily depend on how long the student has been there."
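To see how Di Carlo's diet-pill point works arithmetically, here is a toy version with invented figures (these are not Hoxby's actual estimates):

# Invented figures, not Hoxby's estimates: the headline-style projection simply
# multiplies an average one-year gain by nine, as if it repeated every year from K through 8.
one_year_gain = 0.09                     # hypothetical average gain per charter year (test SDs)
projected_k8_gain = one_year_gain * 9    # assumes nine straight years of the same gain
print(round(projected_k8_gain, 2))       # 0.81: a dramatic number built from a single-year estimate

The projection leans on years of attendance that few, if any, students in the data actually had.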