
If You Build It, Will They Come?


The history of education reform is strewn with the wreckage of dazzling new education technologies no one ever taught teachers to use. Hayes Mizell of the National Staff Development Council (NSDC) sees history repeating itself in the Race to the Top.

Mizell suggests in a recent blog posting that RTTT's investment in powerful new data systems will founder on lack of teacher professional development:

The Department seems to have made two faulty assumptions: (a) improved data systems, in and of themselves, will result in improved instruction, and (b) educators currently have the knowledge and skills they need to use data to improve instruction. Unfortunately, the proposed requirements do not mention professional development. States applying for Race To The Top funds do not have to demonstrate that they, or their school systems, are engaging educators in professional development experiences that increase their understanding and effective use of data.

Earlier this month, the Alliance for Excellent Education expressed similar concerns about data systems. Alliance president Bob Wise argued that "merely providing teachers with more data does not ensure that they know how to use data to improve student learning."

We often become so enamored of promising new tools that we forget to build the capacity of those who use them.


Let's not confuse tools with

Let's not confuse tools with processes, Claus. Using data to inform instruction presupposes a conceptual and practical model for how children learn and how instruction can be organized, personalized, and delivered to support that learning (all of which would be aligned to agreed-upon community standards for what children should know and be able to do). This model needs to be programmed into computers in advance by smart people, seeded with high-quality and relevant data so that it can identify intervention strategies, and adjusted over time as we learn more and standards are revised.

In the best of all possible worlds, the model would also capture information not just on children's conceptual misunderstandings (or lack of recall of specific facts), but also on the specific intervention or approach likely to work best for each child that educators would not, of their own professional judgment, think to use or be able to employ. This is a vision for a standardized process of education (so that it could be programmed on a computer), and it relates to technological adoption in schools only insofar as technology enables that vision to come to be.

So, yes, of course the skills of people are important if we expect them to take best advantage of the tools they are provided. However, I think the professional development issue is frankly minor compared to the utopian (or dystopian?) vision presented by cradle-to-grave, data-driven instructional and administrative IT systems. I would never argue that data are unimportant, or that we couldn't and shouldn't be more data-driven in education; let's just be clear on where we want to head and why. This is a discussion more about professionalism and the future of the teacher workforce than about professional development...

That's a very insightful

That's a very insightful comment, Doug. To some degree at least, our comments hold true for most education technologies: Those technologies have to be deeply embedded in broader structures and practices that improve teaching and learning. In some cases, people make claims for the technologies before those structures and practices themselves have been developed or are well understood.

I agree with you that the success of data systems depends on much more than teacher professional development. Many educators and communities have ample reason to question the quality of the data those systems would produce, for example. Appropriate interventions are not always well understood. And the stakes for teacher professionalism are indeed high.

Still, I don't see that the creation of robust data systems will necessarily dictate educators' interventions or so thoroughly standardize teacher practice as to threaten professionalism--which would be a dystopian vision, indeed. In fact, job-embedded, site-based professional development should actually empower teachers to use their own professional judgment to much better effect once they receive data that actually have meaning for them. This would, of course, be a real work in progress.

Data systems without strong professional development--now that's a path towards dystopia.

There is a faulty assumption

There is a faulty assumption held among many people outside of school settings: that if you have enough data, or the right data, the solution is self-evident. In fact, there are so many factors involved that it takes very careful and thoughtful analysis to draw even tentative conclusions.

My English department recently engaged in a process of data evaluation using only one set of data: grade distributions. How many students in each of our classes earned A's, B's, C's, and so on? If two teachers teach the same course, but one teacher has 40% A's and 20% C's while the other has 20% A's and 40% C's, the uninitiated would conclude that the first teacher is the easy grader and the second the harder grader. Okay, except it turns out that if you analyze separate sections/periods of the same teacher and same course, you can find exactly the same discrepancy. So the easy grader in 2nd period is the same person as the hard grader in 6th period.

Is it something about the school schedule? Are students doing better in morning classes than in afternoon classes? Or is the math schedule affecting the English schedule? After all, Algebra 1 meets at the same time as one English class, while Algebra 1 Honors meets at the same time as the other. And it just so happens that one part-time English teacher and one part-time math teacher have concentrated all of a certain class into certain times of day, skewing the selection of students into each class. That means the part-time English teacher is going to look particularly effective or ineffective, because her students are not a random sample but a higher- or lower-achieving subgroup of that grade level.

I could go on all day with this stuff. I wish people who claim to be experts in school systems and assessment could actually spend some time among those who DO the work. It's like trying to understand the demands of a pilot's job with no understanding of cockpits, physics, or air traffic control.
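The scheduling confound described above can be sketched in a few lines of code. This is a toy illustration with invented rosters and a hypothetical teacher name, not real data: the same teacher shows the "easy grader" distribution in one period and the "hard grader" distribution in another, simply because of who the master schedule funnels into each section.

```python
from collections import Counter

# Hypothetical rosters (invented numbers). Period 2 meets opposite
# Algebra 1 Honors and so happens to enroll more higher-achieving
# students; Period 6 meets opposite regular Algebra 1.
sections = {
    ("Ms. Smith", "Period 2"): ["A"] * 8 + ["B"] * 8 + ["C"] * 4,  # 40% A, 20% C
    ("Ms. Smith", "Period 6"): ["A"] * 4 + ["B"] * 8 + ["C"] * 8,  # 20% A, 40% C
}

def grade_shares(roster):
    """Return the percentage of each grade in a roster, rounded."""
    counts = Counter(roster)
    total = len(roster)
    return {grade: round(100 * counts[grade] / total) for grade in sorted(counts)}

for (teacher, period), roster in sections.items():
    print(teacher, period, grade_shares(roster))

# Pooled across periods, the two distributions wash out -- so a raw
# grade distribution says as much about scheduling as about the teacher.
combined = [grade for roster in sections.values() for grade in roster]
print("Combined:", grade_shares(combined))
```

Judged on Period 2 alone, this teacher looks lenient; on Period 6 alone, strict. Comparing two teachers on raw grade distributions has exactly the same problem unless you first ask which students each schedule delivered.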

Thank you for your thoughtful

Thank you for your thoughtful comment, David. Your example of course grades is fascinating. I remember going through a similar exercise when I was teaching college sections a while back. Our different sections were performing very differently, and we were trying to figure out why. One of the problems was that we really didn't have a whole lot of data to go on, and no one had trained us to use the systems well.

What do you think of data systems that make good use of formative assessment to help teachers identify areas where students need most attention? The assessments have to be very well designed, as does the data system--and teachers would presumably play a central role in constructing them and then designing accompanying professional development structures.

Does that sound right?
