This from Top Performers:
The headline in Education Week was typical: "Most States Surpass Global Average in Math, Science." The paper was referring to the results of a recently released NAEP-TIMSS linking study. That, of course, is good news and a welcome relief from a long string of not-so-good news from international comparative assessments. But how good is it?
Readers will be familiar with the findings of the OECD's PISA survey, which have been reliably dismal, year after year. A finding that we are slightly above average, on average, is not great, but the news that the average U.S. state is doing better than the average TIMSS country, and that the top states are competitive with the best countries, is certainly better than what we have been getting from the OECD over the years. But we need to look inside the findings.
First, you should know that our states don't actually have TIMSS scores. The scores behind the headlines were produced by methods the researchers used to project TIMSS scores from the NAEP scores we do have for the states. Beware. Such projections are used all the time, for a variety of purposes, but they should be used with care. I can sit a group of students down and have them all take the SAT, the ACT and the University of Cambridge International General Certificate of Secondary Education examinations. Indeed, our organization has done just that. Having done so, one can show that students with an average score of X on one of those exams get, on average, a score of Y on one of the others and Z on the third. So one can say, and people do, that these scores equal one another. Or, put another way, that students with these different scores on these different exams all perform at the same level. But when you look at what these three exams, in this hypothetical case, are actually examining, they are examining very different things. One quick look at the Cambridge exams and the SAT, for example, will show that the differences in what is actually being measured are very large. We have no way of knowing, in the case of this NAEP-TIMSS data, how large the difference is between what the TIMSS numbers are telling us and what the NAEP numbers are telling us when the researchers tell us that the scores are equivalent.
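To make the point concrete, here is a minimal sketch of one common linking approach, a simple mean-sigma (linear) projection from one exam's scale onto another's. The data and the choice of method are hypothetical illustrations, not the procedure the NAEP-TIMSS researchers actually used; the sketch only shows that such a projection forces the two score distributions to line up while saying nothing about whether the exams measure the same things.

```python
# Illustrative sketch only: a simple "mean-sigma" linear linking between two exams.
# This is NOT the actual statistical method used in the NAEP-TIMSS linking study.
# It assumes we have scores for the same group of students on both exams.

import statistics

def mean_sigma_link(scores_a, scores_b):
    """Return a function projecting a score on exam A onto exam B's scale
    by matching the mean and standard deviation of the two distributions."""
    mean_a, sd_a = statistics.mean(scores_a), statistics.stdev(scores_a)
    mean_b, sd_b = statistics.mean(scores_b), statistics.stdev(scores_b)
    slope = sd_b / sd_a
    intercept = mean_b - slope * mean_a
    return lambda score_a: slope * score_a + intercept

# Hypothetical data: NAEP-like and TIMSS-like scores for the same students.
naep_scores = [265, 278, 290, 301, 315, 322, 330]
timss_scores = [480, 497, 512, 530, 548, 556, 570]

project = mean_sigma_link(naep_scores, timss_scores)
print(round(project(300)))  # projected "TIMSS" score for a NAEP score of 300

# The projection guarantees that the averages line up, but it cannot tell you
# whether the two exams are testing the same knowledge and skills.
```

The design point is the one made above: a concordance like this can always be computed once you have matched scores, which is exactly why projected scores should be read with care.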
The second thing you should consider, and this point is closely related to the first, is that PISA does not measure what TIMSS measures, any more than TIMSS measures what NAEP measures. TIMSS tells you on its website that it measures students' mastery of a consensus curriculum devised by the countries that participate in TIMSS; it tells you how well students have learned that curriculum. The reason our Center for International Education Benchmarking uses PISA rather than TIMSS to guide our work is that PISA measures not what students know as a result of studying the curriculum, but what they can do with what they know on problems that are as close to real-world problems as possible. If you are most interested in how students do when examined on what they should have studied in school, then you should pay the most attention to TIMSS. If you are most interested in what they can do with what they know to solve real-world problems, then you should pay the most attention to the PISA results. But do not imagine, when you put the TIMSS results and the PISA results side by side, that they are measuring the same things.
Last, and not least, when you compare the TIMSS results and the PISA results, bear in mind that you are not comparing the performance of the U.S. states to the same group of countries in both cases. The most recent year for which we have PISA data is 2009 (the 2012 results will be available in early December). In 2009, the following countries were among those surveyed by PISA but not by TIMSS in its most recent report: Austria, Belgium, France, Germany, the Netherlands, Shanghai/China, Switzerland and Denmark. The following countries were among those surveyed by TIMSS for its most recent report but not by PISA in its most recent report: Algeria, Armenia, Botswana, Cyprus, Egypt, El Salvador, Ghana, Georgia, Iran, Kuwait, Lebanon, Oman, Saudi Arabia, Serbia and Syria. So it is hardly surprising that our states look better when matched against the TIMSS group of countries than when they are matched against the countries typically in the PISA group. It is nice, I suppose, to know that we are doing better than Armenia, Ghana and Syria, but it is very hard for this observer to take much comfort in that.