A Draft Bill of Research Rights for Educators

RCEd Commentary

When I talk to educators about research, their most common complaint (by a long shot) is that they are asked to implement new interventions (a curriculum, a pedagogical technique, a software product, whatever), and are offered no reason to do so other than a breezy “all the research supports it.” The phrase is used as a blunt instrument to silence questions. As a scientist I find this infuriating because it abuses what ought to be a serious claim -- research backs this -- and in so doing devalues research. It’s an ongoing problem (see Jessica & Tim Lahey’s treatment here) that’s long concerned me.

In fact, the phrase “research supports it” invites questions. It implies that we can, in a small way, predict the future. It claims “if we do X, Y will happen.” If I take this medication, my ear infection will go away. If we adopt this new curriculum, kids will be more successful in learning math. Saying “research supports it” implies that you know not only what the intervention is, but you have at least a rough idea of what outcome you expect, the likelihood that it will happen, and when it will happen.

I offer the following list of rights for educators who are asked to change what they are doing in the name of research, whether it’s a mandate handed down from administrator to teacher or from lawmaker to administrator.

  1. The right to know what is supposed to improve. What problem is being solved? For example, when I’ve been to schools or districts implementing a one-to-one tablet/laptop policy, I’ve always asked what it’s meant to do. The modal response is a blank look followed by the phrase “we don’t want our kids left behind.” Behind in what? In what way are kids elsewhere with devices zooming ahead?
  2. The right to know the means by which improvement will be measured. How will we know things are getting better? If you’re trying to improve students’ understanding of math, for example, are you confident that you have a metric that captures that construct? Are you sure scores on that metric will be comparable in the future to those you’re looking at now? How big an increase will be deemed a success?
  3. The right to know the approximate time by which this improvement is expected. A commitment to an intervention shouldn’t be open-ended. At some point we must evaluate how it’s going.
  4. The right to know what will be done if the goal is or is not met. Naturally, conditions may change, but let’s have a plan. If we don’t meet our target, will we quit? Keep trying for a while? Tweak it?
  5. The right to know what evidence exists that the intervention will work as expected. Is the evidence from actual classrooms or is it laboratory science (plus some guesswork)? If classrooms, were they like ours? In how many classrooms was it tried?
  6. The right to have your experience and expertise acknowledged. If the intervention sounds to you and your colleagues like it cannot work, that concern should be addressed in detail, not waved away with the phrase “all the research supports it.” The fact that it sounds fishy to experienced people doesn’t mean it can’t work, but whoever is pitching it should understand the mechanisms behind the intervention well enough to explain why it sounds fishy and why that’s not a problem.

This list is not meant to dictate criteria that must be met before an intervention is tried, but rather to lay out what information ought to be on the table. In other words, the information provided in each category need not unequivocally support the intervention for it to be legitimate. For example, I can imagine an administrator admitting that the research support for an intervention is absent, yet mounting a case for why it should be tried anyway.

This list should also be considered a work in progress. I invite your additions or emendations. 
