Held in a windowless conference room in the massive ant farm known as the Sheraton New York, Tuesday afternoon’s session about media coverage of higher education research was in many ways a preview of the Thursday morning session I’m doing with others about education research and the Internet.
Much of the substance was familiar, if not yet widely heeded: Journalists (NPR’s Steve Drummond, USA Today’s Mary Beth Marklein, and Inside Higher Ed’s Scott Jaschik) telling academics and PR folks to send better press releases and explain their findings better. Plus the usual concerns about “real” academic research vs. “pseudo” research that’s more readily available and better translated for popular consumption.
Interesting stuff, and smart people making good points, but most of it not entirely new. The thing that jumped out at me for some reason was the idea (I forget who floated it) that coverage of education research might suffer not only for substantive and structural reasons -- we all know those -- but also for psychological ones: Journalists' tendency to dismiss or downplay ed research because of its affiliation with teacher training, education’s favorite scapegoat.
It’s an interesting thought – especially since lots of ed research comes from far outside ed schools. And it got me thinking about another possible reason that media coverage of ed research is so sparse and so critical: journalists themselves are often, underneath it all, soft-hearted liberal types who can’t add, much less comprehend stats.
But for the grace of God, they might well have been teachers, or academics, or social scientists themselves. And so they dismiss ed research not only because it's sometimes bad and associated with crunchy ed schools, but also because of self-loathing that's projected outward. Totally psychological, and completely unsupported, but it has the ring of truth -- to me at least.
The challenge in education research is that it's social science. The variables don't hold still. The instruments lack the reliability of natural science instruments - like the thermometer - which can be counted upon to produce consistent results. People differ, and they change.
If education were a natural science...it would be like meteorology. Billions of data points constantly in flux, and highly localized. What you get - whether it be sunshine or tornadoes - depends largely on where you are.
As it is, too few research dollars for too-expensive clinical trials have left the field with a host of studies that stretch the credibility of the statistics they rely upon. But that's not the worst of it.
The "pseudo"research Russo refers to is a fraudulent condition where one begins their research with the end clearly in sight. The predetermined outcome in advance of inquiry is the modus operandi of far too many (most?) think tanks and other politically (or religiously) motivated groups. Formed for a particular purpose, everything they do is aimed at proving their point - even if that means ignoring conflicting evidence. I suspect there's a nice bonus in it for the think tanker who produces the best pseudo-science. What some groups have done very well, however, is to get the occasional spun outrage into the press before any critical attention can be paid to it; thus trafficking in hyperbole.
Balanced reporting is better.
Dante once wrote about a place - I believe he called it the Eighth Circle - where conscious panderers and falsifiers are beset by disease, or forced to march for all eternity while being whipped by demons. But I don't believe in that. Do you?