Where Are We in the Horse Race?

The new results of the 2007 Trends in International Mathematics and Science Study (TIMSS) seem, at first blush, very encouraging. U.S. 4th and 8th graders improved in mathematics, though their science performance remained essentially flat. In fact, U.S. 8th graders' gains in mathematics outpaced those of students in most other participating countries.

In 8th-grade science, apparently only Singapore and Taiwan outperformed Massachusetts.

Cause for celebration? Not so fast, says Mark Schneider of the American Institutes for Research. He points to yawning achievement gaps laid bare by the TIMSS results. He also notes that some high-performing OECD countries that bested the U.S. in the 2006 Programme for International Student Assessment (PISA) did not participate in TIMSS, possibly skewing the comparison.

Still, American 4th and 8th graders who took the TIMSS assessment outperformed students in countries such as Germany, Australia, Sweden, Austria, New Zealand, and Norway. All of these nations performed better than the U.S. on the 2006 PISA, which assesses 15-year-olds.

So, do American students lose that much ground between the ages of 13 and 15? Or do TIMSS and PISA measure different things? Perhaps a bit of both.

Creators of PISA purport to measure how well students can apply academic concepts. (On this point, see our interview with PISA director Andreas Schleicher.) Critics of PISA contend that the test is inferior to TIMSS because, they claim, it emphasizes lower-level skills. (Stanford mathematician Jim Milgram sent us his version of this argument a couple of months ago.) These critics point to PISA superstar Finland's relatively disappointing showing in the 1999 TIMSS as evidence of Finland's unexamined shortcomings and of PISA's weakness. (Finland has not participated in TIMSS since 1999.)

I don't have a horse in this particular race. But the dissonance between the PISA and TIMSS results should remind us to review international assessments with a critical eye before making final pronouncements on their meaning.

We report. You decide.


Nice post.

Diane Ravitch wrote something for Flypaper that caught my eye:

"I would point out that Minnesota showed dramatic gains on TIMMS not because of “new, more rigorous standards,” but because of that state’s decision to implement a coherent grade-by-grade curriculum in mathematics. William Schmidt took the lead in developing that curriculum and deserves to bask in glory for what he has done for the children of Minnesota. That is the most important lesson of 2007 TIMSS for the United States."

I'm curious to learn more about what's going right in Minnesota...

Thanks for sending along that very interesting post. TIMSS is useful because it encourages us to look for the fire behind the smoke. If we have confidence in the assessment's validity, we can try to uncover successful practices that might otherwise go unnoticed. Massachusetts also warrants a closer look.

It's also nice to see states make unmistakable progress on tests other than their own state tests. That sort of success can calm fears about teaching to a particular test.

This article gets its facts wrong. It’s also misleading and sugar-coated. Just to keep the record straight:

Average mathematics scores of eighth-grade students:
1. Chinese Taipei
2. Korea
3. Singapore
4. Hong Kong
5. Japan
6. Hungary
7. England
8. Russian Federation
9. United States

Average science scores of eighth-grade students:
1. Singapore
2. Chinese Taipei
3. Japan
4. Korea
5. England
6. Hungary
7. Czech Republic
8. Slovenia
9. Hong Kong
10. Russian Federation
11. United States

The full report on the 2007 results is available at
http://nces.ed.gov/pubs2009/2009001.pdf

Thanks for your comment.

I should acknowledge one omission in my blog posting: In 8th-grade science, only Singapore and Taiwan outperformed Massachusetts. (My original posting omitted the subject--science. It also neglected to acknowledge that Massachusetts actually tied with a number of other countries.) Apologies for the oversight--I have corrected the posting.

Still, I'm not quite sure which facts are wrong. Let's see... 4th and 8th graders improved in mathematics over the past decade, but they did not improve in science. U.S. 8th graders made greater gains in mathematics than did their peers in most other participating countries.

Closer examination of the TIMSS results does admittedly dampen some observers' enthusiasm: most U.S. gains in 8th-grade mathematics occurred between 1995 and 2003, and progress has slowed a bit since then. Also, countries such as England and Hong Kong made much bigger gains in 4th-grade science than did the U.S.

Still, I don't quite see how I "sugar coat" the TIMSS results. The posting acknowledges the lack of movement in science and the legitimate concerns raised by Mark Schneider of AIR. And I don't suggest that the differences between TIMSS and PISA offer cause for celebration, either. As Jim Milgram writes in the comment I mention, "I am far more worried by our TIMSS results than our below average performance on PISA."

Indeed, your listing of countries' performance would mislead the lay reader, who might not know that many countries perform below the U.S.

Overall, my posting is meant to highlight the differences between two assessments used in international comparisons--and the importance of understanding what each measures.

I make no claims to infallibility, so please feel free to make further corrections to the record.

Claus