Wednesday, May 22, 2013

Evidence shows that reading informational text more frequently is correlated with lower reading scores

I have another little story about non-evidence-based BS.  I'm getting kind of tired of this topic, but I'm going to write it up anyway, just for the record, while my students are writing an in-class essay on Song of Solomon.

Is there evidence that reading more informational text is important?
Because it's being pushed by the Common Core, "informational text" is all the rage these days. Lesson plans for high school English classes are looking more and more like SAT prep--read a brief passage and answer some factual questions about it--except that the passages and questions I've seen in lesson plans have been less interesting than the ones I used to see on the SAT back when I worked as a tutor. One of the people promoting the Common Core these days is literacy titan Tim Shanahan. Some of Shanahan's work on CCSS matters is pretty good--he has a decent take on how to handle close reading in the classroom that is much better than a lot of the dreck I have seen--but like David Coleman he has, I think, too little to say about reading volume, and he has jumped on the informational text bandwagon too wholeheartedly. In his most recent blog post, Shanahan writes, "CCSS is emphasizing the reading of literary and informational text to ensure that students are proficient with a wide variety of text."

I am skeptical of this claim, since my working hypothesis is that what's really important is overall reading ability, which is increased by reading a lot of whatever kind of text interests you. So I wrote a comment on Shanahan's blog post asking if he knew of any evidence for his assertion. I wrote, "I have not seen any evidence that trying to make students read more informational text will lead to greater proficiency with informational text."

Shanahan quickly replied to my comment, saying that there was lots of evidence: "Actually there is quite a bit of research showing that if you want students to be able to read expository text, you have to have them read (or write) expository text."

I wrote back asking for specifics, which he didn't give (I understand--he's a busy guy), and then I spent a bit of time poking around.  What I found shouldn't have surprised me.  Here's the upshot: not only does there seem to be no hard evidence that reading informational text makes you a better reader of informational text, there is actually, oddly, some hard evidence that the very opposite is true: that the more regularly students read informational text, the worse they do on reading tests.

A leading scholar makes the case for informational reading, but has no evidence
Nell Duke is a Michigan professor who has spent much of her career pushing to get more informational text in U.S. classrooms; she also edits the "Research-Informed Classroom" book series. Duke has tried to make the case for more informational text in many articles over many years, and her efforts may be paying off: both of my children have been exposed to more informational text in the course of their schooling than I was. This is not necessarily bad, but it's not necessarily good, either.

For what Nell Duke has not done is provide empirical evidence that reading more informational text will make you better at reading informational text. She is upfront about this: "While there is a great deal of agreement about the necessity of substantial or ongoing experience with a genre (e.g., New London Group, 1996), there is currently no empirical research available to speak to the question of how much experience with a given form of written text is necessary for a particular level of acquisition" (Duke, "3.6 Minutes per Day," RRQ, 2000, p. 207). In other words, there is "agreement" among some researchers, but they don't have any hard evidence.

Do U.S. children "need" to read informational text?
In 2010 Nell Duke published an article in the Phi Delta Kappan called "The Real-World Reading and Writing U.S. Children Need." The article begins by citing an international test that shows U.S. children doing slightly better on standardized test questions about literary text than on questions about informational text. The article goes on to make Duke's usual argument that students need to read more informational text.

Because I am skeptical of this claim, I looked up the international test Duke mentions, the PIRLS (Progress in International Reading Literacy Study). As Duke reported, U.S. children, like those in many other countries, did a bit better on questions about literary text than on questions about informational text--but the scores were not very far apart. What Duke did not report, however, was that the 2006 PIRLS study had actually gathered some empirical data on the very question of whether more frequent reading of informational text is associated with higher reading scores.

The PIRLS study asked students how frequently they read literary texts and how frequently they read informational texts. It turns out, counterintuitively perhaps, that students who reported reading informational texts more frequently actually did worse on the reading test than students who reported reading informational texts less frequently. Here's the relevant section of the U.S. government report on the 2006 PIRLS:

"The average score on the combined reading literacy scale for U.S. students who read stories or novels every day or almost every day (558) was higher than the average score for students who read stories or novels once or twice a week (541), once or twice a month (539), and never or almost never (509). In contrast, the average score for students who read for information every day or almost every day (519) was lower than the average score for students who read for information once or twice a week (538), once or twice a month (553), and never or almost never (546).

"The higher performance of U.S. students who read for information less frequently relative to U.S. students who read for information more frequently was also observed internationally."
(http://nces.ed.gov/pubs2008/2008017.pdf, pages 16-17)
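
To lay the report's numbers out side by side (all figures are average scores on the combined reading literacy scale, taken straight from the passage above):

Reported frequency                Stories/novels    Reading for information
Every day or almost every day          558                    519
Once or twice a week                   541                    538
Once or twice a month                  539                    553
Never or almost never                  509                    546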

So, to clarify, the very study that was cited as evidence of U.S. students not reading enough informational text turns out to show that frequent reading of informational text is associated with lower reading scores.

What to conclude?
First, while those PIRLS data are weird and counterintuitive, and almost certainly don't mean that reading informational text actually harms one's reading level (perhaps weaker readers are assigned more informational reading, or perhaps students who read avidly for pleasure gravitate toward fiction), one thing is clear: this is not a study that offers any support for the idea that U.S. students "need" to read more informational text. The evidence for this assertion--like the evidence for explicit vocabulary instruction, for charter schools, for VAM teacher evaluation, for larger class sizes, and for explicitly teaching reading "strategies" rather than focusing on meaningful, content-based reading and discussion--is simply very weak, if not outright negative.

Second, we are again confronted with the spectacle of very eminent scholars (Shanahan is a real bigwig, and Duke is a well-established professor at a very good university) making strong assertions in the practical and policy realms that don't seem to be backed up by evidence in the scholarly realm. There is a striking contrast between the careful language ("may," "currently no empirical research available," etc.) used in scholarly papers and the bold, authoritative tone of articles aimed at teachers and the public about what children "need" to be doing and what practices will "ensure" a particular result.

The takeaway for me, once again, is that we simply cannot trust any assertion that we have not ourselves looked into carefully--even, or perhaps especially, if it is accompanied by the label "research-based," or, as Nell Duke's book series has it, "Research-Informed." Instead, we must rely mostly on our own common sense and our sense of humanity. At the heart of our work should be meaningful reading, meaningful writing, and meaningful discussion.

4 comments:

  1. It is hard to tell what anyone is really talking about here. I gather that a lot of the commentary about informational texts is in response to elementary literacy programs which, for reasons I cannot fathom, are pretty much 100% fiction AND which have displaced all but minimal science and social studies curricula.

    So... perhaps Shanahan et al have a strong case that reading SOME informational texts is important--and who would ever really argue otherwise--but not much of a case for any particular amount.

    In other words, as is usually the case in ed policy, swinging from one extreme to another.

    I'd also note that standards don't dictate the amount of time or volume of reading spent on each standard. The entire concept is that you spend the amount of time on each task necessary for the kids to achieve each standard. You don't count up the days and divide by the number of standards.

    Replies
    1. Thanks for your comment--but I think you're too kind to Shanahan et al. My main point is that they are making wildly inflated claims about what educational "research" shows (a common problem). You suggest that perhaps they have a "strong case," but as far as I can tell they do not have a strong case. And I think you are overly kind to the CCSS themselves. The authors express what is "required" in an introductory document (http://www.corestandards.org/ELA-Literacy/introduction/key-design-consideration). This introductory document contains a table showing that in 4th grade the literary/informational ratio on the NAEP is 50/50, while in 12th grade it is 30/70. These ratios are clearly intended to be the ratio of reading done by students. In 12th grade, the document says, "70 percent of student reading across the grade should be informational." That seems pretty clear. The authors of the document, like, as far as I can tell, the majority of its readers, seem to be envisioning a curriculum that is driven by test preparation (the ratios are drawn right from the NAEP!) and that consists almost entirely of either brief texts (the Gettysburg Address, MLK's "Letter from Birmingham Jail," etc.) or of "passages" isolated from longer, complete works. I think close reading is great, but I fear what we are going to end up with is essentially test prep, with very little of the immersive, high-volume reading without which I doubt anyone has ever become a good reader.

  2. Here's an entirely unresearched opinion: reading non-informational texts is correlated with more time spent reading (enjoyably), which is highly correlated with reading better, which is highly correlated with reading informational texts better. So there's no paradox--stay away from informational texts that put you to sleep, read gobs of YA fiction, and you'll be able to understand anything....

    Replies
    1. I find informational texts (newspapers, research papers, etc.) fairly addicting, so I'm not going to endorse every piece of that theory wholeheartedly, but it sounds pretty reasonable.
