Testing the Tests

It’s great news that the administration intends to improve the quality and relevance of education research. I hope it will also make good on its vow to improve the quality of assessments. After all, the two efforts are closely related.

The value of research on what works depends on the quality of assessments measuring school and student gains. Two recent items drive home the point:

First, a New York Daily News analysis questioning the steep rise in New York State test scores. After reviewing the state assessments, former Eduwonkette Jennifer Jennings determined that they had grown less challenging and more susceptible to test-prep manipulation. Critics of the New York City Department of Education point to this analysis as they accuse the Department of over-hyping the success of its reforms.

Second, a new Harvard study of charter schools examining the low cognitive demand placed on students in some high-performing charters:

The instructional emphasis frequently was on procedure, not on conceptual understanding. Students were not being asked to think for themselves, nor were they being asked to conjecture, evaluate, or assess. Why? Because the tests that hold these charter schools accountable do not measure higher-order thinking. [Hat Tip: Tom Hoffman.]

As we try to determine the effect of promising reforms, we need greater confidence in our measures. This confidence is especially important now, when ideology so often trumps integrity in education research.

Perhaps we can take a lesson from Chicago. Between 1998 and 2002, the Consortium on Chicago School Research evaluated Chicago’s reforms on the basis of assignments and student work: “Rather than relying on the results of standardized, basic skills test scores, this study sought to collect more direct evidence of the intellectual work life of students in Chicago's schools.” They found that students who received the most intellectually demanding assignments produced the most sophisticated intellectual work. Those students also posted greater gains on standardized assessments.

The Chicago Consortium’s results suggest that ruthless, mindless test prep is not the only way to high scores on standardized tests. They also suggest that we need excellent--and, unfortunately, expensive--research to distinguish the effects of test prep from those of excellent instruction. Finally, and perhaps most important, they suggest that we need better measures of what students know and can do.

Perhaps we have reason for hope. The Consortium’s former director, John Easton, was recently named director of the Institute of Education Sciences at the Education Department. This bodes well for the administration’s education research agenda.

Wishful thinking? We’ll see.


But didn't the Chicago reforms flop? How does good research get you good reforms if the reform strategy isn't the right one?

Gary--I don't mean to suggest that research alone guarantees the success of reforms. I certainly don't know everything about the Chicago Annenberg reforms, but the Consortium research gives us much more detailed information about what was going on in some of those schools. Without such research, it's difficult to judge whether reforms that raise test scores are truly successful. In fact, successful schools may remain under suspicion of employing barren test-prep strategies as long as we lack richer information about what they're doing--and how others might emulate it.
