We have some family friends whose kids used to go to a (private) Waldorf school nearby. The kids were wonderful, but I didn't know much about the school or its methods. A few days ago, when I was saying something about whether children needed to read "informational text" in the very early grades, my wife said, "That's silly. Waldorf schools don't teach ANY reading until second grade, and those kids end up just fine."
I looked this up, and, as usual, my wife was right: Waldorf schools generally don't teach reading until second grade, use a whole language approach and avoid much explicit strategy instruction when they do teach it, and their students apparently end up reading just fine. This is an important result, because it would seem to show that explicit reading instruction in Kindergarten and first grade may not be necessary, and that students certainly don't need to read much informational text to themselves in kindergarten and first grade in order to learn to read well later on.
Recent Studies
Until recently, most Waldorf schools were private, so skeptics could argue that if Waldorf students ended up being good readers, the students and families at those schools were distinctly different from the norm, so no comparison was possible. Over the past couple of decades, however, a number of public Waldorf schools have opened, most of them in California, and two recent studies in the U.S. (Oberman 2007; Larrison et al. 2012) compare the results at these schools with those at traditional schools with comparable student demographics. The two studies find the same result: when it comes to reading on their own, students in the early grades in Waldorf schools are dramatically worse than their peers in regular schools, but by the later grades, the Waldorf students have caught up with or surpassed the regular-school students.
The graphic below shows some of the results obtained in the 2012 study. The scores of the Waldorf students start well below average, then catch up by fourth grade, then seem to pull ahead.
These results are striking. When the same researchers looked only at the California Waldorf schools, so as to avoid issues with cross-state comparisons, the same pattern was seen, though with less dramatic divergence in the upper grades. When Oberman did a similar comparison on a more limited scale and with data from two years earlier, she found a somewhat similar pattern--Waldorf students starting out behind and catching up, if not pulling ahead. A New Zealand study comes to the same conclusion: Waldorf students do badly on reading tests when they are 6 and 7, but by the time they are entering adolescence, they have caught up or even pulled ahead.
Conclusions
Now, of course the students and families at these schools are self-selecting, and of course there may be other ways to explain away these results, and of course this is not a very large body of scholarly literature. Nevertheless, I can't find any studies that contradict these three, and their results are consistent with the idea that what matters is not explicit instruction in discrete reading skills, and not reading a minimum proportion of informational text--but, instead, developing students' minds by engaging their imaginations, creating a culture of engaged intellectual inquiry, reading lots of stories aloud to students, and having them sing songs and recite poems.
So these studies aren't definitive, but they are enough to call into further doubt the blithe assurances of people like Tim Shanahan and David Coleman that their preferred approach is consistent with the available empirical evidence.
Showing posts with label Tim Shanahan. Show all posts
Monday, June 17, 2013
Friday, June 14, 2013
When shown that there's no solid evidence for their side, literacy gurus demand evidence for the other
Over the past month or so, Tim Shanahan and I have been having an interesting discussion in the comments section to one of his blog posts. I've been pleased that he's taking the time to respond thoughtfully, but he's not convincing me. I'm writing this post, even more than some others, to clarify my own thinking--apologies for getting too much into the boring weeds here.
Initially I asked for evidence that reading more informational text led to better comprehension of such text. He said there was lots of evidence, I asked for specifics, and he finally admitted, after a few back-and-forths, that "You are correct that there is no study showing that increasing the amount of the reading of informational text has had a clear positive unambiguous impact on reading achievement or student knowledge."
Shanahan did not, however, address why he had written in his blog post that "CCSS is emphasizing the reading of literary and informational text to ensure that students are proficient with a wide variety of text." Nor did he address why, when I asked for evidence that reading more informational text led to greater proficiency with informational text, he responded by saying, "Actually there is quite a bit of research showing that if you want students to be able to read expository text, you have to have them read (or write) expository text."
Instead of explaining why he had made incorrect statements about the evidence for reading informational text, Shanahan asked me to show the evidence for reading literary text. He doesn't seem to get it: my whole point is that there is not strong evidence either way, and it is dishonest to pretend that there is. He, and many other scholars who engage in education discourse aimed at teachers and the general public, are continually pretending that there is strong scientific evidence for their pet curriculum ideas. Very often there is no such evidence.
When I suggested that a lot of "evidence-based" educational policies are not founded on particularly strong evidence, Shanahan made an interesting move: he essentially said that I was demanding too much. As he put it, "the basic problem here is with your understanding of research and how causal claims are put forward." He said that what he and others do is look at some available evidence and come up with a "logic model" that fits the facts. Not all research gets done, because some questions, like "Is third grade really necessary?", are never going to be studied.
So he seems to think that if you have a story that is not inconsistent with some empirically established facts, you have the right to say that "there is quite a bit of research showing" that your story is true.
Maybe. But it seems to me that if there is debate about a question--such as whether it is worthwhile to make young children read more informational text--and you say there is "quite a bit of research showing" that your side of the debate is true, then you need evidence that is not only consistent with your side of the debate but also inconsistent with the other side.
And it's not like we couldn't do some studies! Nell Duke, a prominent proponent of more informational text in the early grades, has gotten millions of dollars in grant money and has spent over a decade studying the issue of how much informational text children "are exposed to" in school. Couldn't she have taken some of that large amount of time and money and done a controlled experiment? Surely some district would have been happy to have a huge library of informational text provided to half of their K-4 schools, so that Duke could check whether students at those schools would actually do better, a few years down the line, at understanding informational texts? But she didn't do it, and Shanahan didn't do it, and now Shanahan is implicitly suggesting that such research would be as silly as a controlled experiment in which we got rid of third grade.
I'm still trying to figure out what I think about "research-based" arguments. I guess my position now is: research can be useful and informative, but it is only rarely, to use a legal term that has been cropping up a lot lately, dispositive; and we should have a lot more of it before we take the kind of authoritative tone that Tim Shanahan and a lot of educational experts take when they are writing for a popular audience. In their scholarly papers, and when pressed in debate, these experts are circumspect and honest about the limitations of their certainty; I'd like to see more of that circumspection in the advice given to us teachers and to the public.
Wednesday, May 22, 2013
Evidence shows that reading informational text more frequently is correlated with lower reading scores
I have another little story about non-evidence-based BS. I'm getting kind of tired of this topic, but I'm going to write it up anyway, just for the record, while my students are writing an in-class essay on Song of Solomon.
Is there evidence that reading more informational text is important?
Because it's being pushed by the Common Core, "informational text" is all the rage these days. Lesson plans for high school English classes are looking more and more like SAT prep--read a brief passage and answer some factual questions about it--except that the passages and questions I've seen in lesson plans have been less interesting than the ones I used to see on the SAT, back when I worked as a tutor. One of the people promoting the Common Core these days is literacy titan Tim Shanahan. Some of Shanahan's work on CCSS matters is pretty good--he has a decent take on how to handle close reading in the classroom that is much better than a lot of the dreck I have seen--but like David Coleman he has, I think, too little to say about reading volume, and he has jumped on the informational text bandwagon too wholeheartedly. In his most recent blog post, Shanahan writes, "CCSS is emphasizing the reading of literary and informational text to ensure that students are proficient with a wide variety of text."
I am skeptical of this claim, since my working hypothesis is that what's really important is overall reading ability, which is increased by reading a lot of whatever kind of text interests you. So I wrote a comment on Shanahan's blog post asking if he knew of any evidence for his assertion. I wrote, "I have not seen any evidence that trying to make students read more informational text will lead to greater proficiency with informational text."
Shanahan quickly replied to my comment, saying that there was lots of evidence: "Actually there is quite a bit of research showing that if you want students to be able to read expository text, you have to have them read (or write) expository text."
I wrote back asking for specifics, which he didn't give (I understand--he's a busy guy), and then I spent a bit of time poking around. What I found shouldn't have surprised me. Here's the upshot: not only does there seem to be no hard evidence that reading informational text makes you a better reader of informational text, there is actually, oddly, some hard evidence that the very opposite is true: that the more regularly students read informational text, the worse they do on reading tests.
A leading scholar makes the case for informational reading, but has no evidence
Nell Duke is a Michigan professor who has spent much of her career pushing to get more informational text in U.S. classrooms; she also edits the "Research-Informed Classroom" book series. Duke has tried to make the case for more informational text in many articles over many years, and her efforts may be paying off: both of my children have been exposed to more informational text in the course of their schooling than I was. This is not necessarily bad, but it's not necessarily good, either.
For what Nell Duke has not done is provide empirical evidence that reading more informational text will make you better at reading informational text. She is upfront about this: "While there is a great deal of agreement about the necessity of substantial or ongoing experience with a genre (e.g., New London Group, 1996), there is currently no empirical research available to speak to the question of how much experience with a given form of written text is necessary for a particular level of acquisition" (Duke, "3.6 Minutes per Day," RRQ, 2000, p. 207). In other words, there is "agreement" among some researchers, but they don't have any hard evidence.
Do U.S. children "need" to read informational text?
In 2010 Nell Duke published an article in the Phi Delta Kappan called "The Real-World Reading and Writing U.S. Children Need." The article begins by citing an international test that shows U.S. children doing slightly better on standardized test questions about literary text than on those about informational text. The article goes on to make Duke's usual argument that students need to read more informational text.
Because I am skeptical of this claim, I looked up the international test Duke mentions, the PIRLS. As Duke reported, U.S. children, like those in many other countries, did a bit better on questions about literary text than informational text--but the scores were not very far apart. What Duke did not report, however, was that the 2006 PIRLS study had actually done a bit of empirical research on the very question of whether more exposure to informational text is associated with higher scores on informational text.
The PIRLS study asked students how frequently they read literary texts, and how frequently they read informational texts. It turns out, counterintuitively perhaps, that students who reported reading informational texts more frequently actually did worse on the reading test than students who reported reading informational texts less frequently. Here's the relevant section of the US Government report on the 2006 PIRLS:
"The average score on the combined reading literacy scale for U.S. students who read stories or novels every day or almost every day (558) was higher than the average score for students who read stories or novels once or twice a week (541), once or twice a month (539), and never or almost never (509). In contrast, the average score for students who read for information every day or almost every day (519) was lower than the average score for students who read for information once or twice a week (538), once or twice a month (553), and never or almost never (546).
"The higher performance of U.S. students who read for information less frequently relative to U.S. students who read for information more frequently was also observed internationally." (http://nces.ed.gov/pubs2008/2008017.pdf, page 16-17))
So, to clarify, the very study that was cited as evidence of U.S. students not reading enough informational text turns out to show that frequent reading of informational text is associated with lower reading scores.
What to conclude?
First, while those PIRLS data are weird and counterintuitive, and almost certainly don't mean that reading informational text actually harms one's reading level, one thing is clear: this is not a study that offers any support for the idea that U.S. students "need" to read more informational text. The evidence for this assertion, like the evidence for explicit vocabulary instruction, for charter schools, for VAM teacher evaluation, for larger class size, for explicitly teaching reading "strategies" rather than focusing on meaningful, content-based reading and discussion--the evidence is simply very weak, if not outright negative.
Second, we are again confronted with the spectacle of very eminent scholars (Shanahan is a real bigwig, and Duke is a professor at a very good university who is quite well-established) making strong assertions in the practical and policy realms that don't seem backed up by evidence in the scholarship realm. There is a striking contrast between the careful language ("may," "currently no empirical research available," etc.) used in scholarly papers and the bold, authoritative tone of articles aimed at teachers and the public about what children "need" to be doing, and what practices will "ensure" a particular result.
The takeaway for me, once again, is that we simply cannot trust any assertion that we have not ourselves looked into carefully--even, or perhaps especially, if it is accompanied by the label "research-based", or as Nell Duke's book series has it, "Research-Informed." Instead, we must rely mostly on our own common sense and our sense of humanity. At the heart of our work should be: meaningful reading, meaningful writing, and meaningful discussion.
Friday, November 30, 2012
The Common Core is not "evidence-based"--but maybe that's okay!
My Curriculum Coordinator just got a subscription to the Marshall Memo, which may be bad for my mental health--I'll be reading a lot more Ed. research. This post came from reading an article in EdWeek that Marshall refers to in this week's memo. I'm sorry the post is so long. I think it served to clarify my thinking...
The Common Core is Not Evidence-Based
There's an article in a recent EdWeek with the remarkable headline: "New Literacy Research Infuses Common Core." The article's subtitle reads, "In the 15 years since the National Reading Panel convened, the knowledge base on literacy has grown." As far as I can tell, both the headline and the subtitle are essentially false. The Common Core standards are not really evidence-based, and the knowledge base on literacy has not grown much in the 15 years since the now-discredited National Reading Panel--except perhaps in the Socratic sense of knowing how much it doesn't know.
This is interesting only because it points up the farcical nature of so much of today's educational discourse. While most of what happens in schools today is worthwhile, the way people talk about it is just ridiculous. One of my colleagues suggested that our schools would be better off if all the Graduate Schools of Education disappeared from the face of the earth (he used stronger words), and I think he might be right. Education research is like Hollywood: NOBODY KNOWS ANYTHING. Of course, this isn't true, either of schools or of Hollywood, but it's partly true.
The article itself is somewhat better than its headlines, and if you read it carefully you can see that the Ed. professors are basically just as in the dark as we teachers are--if not more so. What is most amazing about an article that claims to be about "new literacy research" is that it describes very little actual research. The article quotes many academics, but often they say things as questionable and non-evidence-based as the following paragraph, which manages to express the same simple idea, an idea that has been a truism for many decades now, over and over:
"In our knowledge-based economy, students are not only going to have to read, but develop knowledge-based capital. We need to help children use literacy to develop critical-thinking skills, problem-solving skills, making distinctions among different types of evidence," said Susan B. Neuman, a professor in educational studies specializing in early-literacy development at the University of Michigan in Ann Arbor. "The Common Core State Standards is privileging knowledge for the first time. To ensure they are career-and-college ready, we have to see students as lifelong learners and help them develop the knowledge-gathering skills they will use for the rest of their lives. That's the reality."
Privileging knowledge is a new idea? Helping kids become lifelong learners is a new idea? Critical thinking in literacy is a new idea? What? Not only are these old ideas, there is not the slightest bit of research or data to be found in that paragraph.
The recent history of "evidence-based" BS
The EdWeek article goes on to discuss the National Reading Panel of 2000, which was a much-ballyhooed effort to establish the most advanced and scientific thinking about how children learn to read and how we can help them. The Panel's report came down decisively on the side of explicit instruction in skills: phonemic awareness, vocabulary, comprehension strategies, and so on. The panel's recommendations formed the basis of "Reading First," a $1-billion-a-year federal effort to improve reading in the early grades. Eight years later, there was a comprehensive assessment of the program to find out how much difference this explicit instruction in skills had made. The answer: zero difference.
The assessment reported three key findings: (1) Reading First did indeed result in students spending more time on reading "instruction" (phonemic awareness, vocab, etc.); (2) Reading First did indeed result in more professional development in "scientifically based reading instruction (SBRI)"; (3) however, "Reading First did not produce a statistically significant impact on student reading comprehension test scores in grades one, two or three" (page v).
Another finding, that the assessment did not consider "key," but that may have had some impact, was that the increased instructional time and the emphasis on skills did not result in any increase in students' actually reading. As the assessment puts it: "Reading First had no statistically significant impacts on student engagement with print" (page xii).
This is remarkable: in 2000, only twelve years ago, the state of the research (the "knowledge-based capital," in the vapid phrase of the Michigan professor), which the panel of eminent experts claimed to hold to the "highest standards of scientific evidence," was utterly and completely wrong.
After being so wrong, the education experts tried to reposition themselves--but not very clearly. One of them is quoted in the EdWeek article as saying that after the National Reading Panel, "comprehension became the 'next great frontier of reading research.'" This is odd, since "comprehension" was one of the central topics of the NRP itself. (1)
Reading Next:
One of the ways the experts tried to reposition themselves was in a report called "Reading Next," which according to the EdWeek article "helped spark the common core's approach. Education professor Catherine A. Snow and then-doctoral student Gina Biancarosa of the Harvard Graduate School of Education found that explicit comprehension instruction, intensive writing, and the use of texts in a wide array of difficulty levels, subjects, and disciplines all helped improve literacy for struggling adolescent readers."
Reading Next focused on an array of fifteen "powerful tools" for improving literacy. In an improvement on the NRP's exclusive focus on skills instruction, many of Reading Next's recommendations were so vague that no one could object ("Effective Instructional Principles Embedded in Content"), and many sounded fairly old-fashioned (Strategic Tutoring; Motivation and Self-Directed Learning; Extended Time for Literacy). But when you actually looked more deeply into what the specific recommendations were, it became clear that the report was, like the NRP, trying as hard as possible to avoid mentioning the very simple strategy of having students actually read.
Avoiding all mention of actual reading:
Here is the passage from the Reading Next report that discusses "Extended Time for Literacy," which I had thought from its title might mean more time for students to actually read. That may be what is meant, but the authors seem to twist themselves into jargony knots so as to avoid discussing actual "reading":
Extended Time for Literacy
None of the above-mentioned elements are likely to effect much change if instruction is limited to thirty or forty-five minutes per day. The panel strongly argued the need for two to four hours of literacy-connected learning daily. This time is to be spent with texts and a focus on reading and writing effectively. Although some of this time should be spent with a language arts teacher, instruction in science, history, and other subject areas qualifies as fulfilling the requirements of this element if the instruction is text centered and informed by instructional principles designed to convey content and also to practice and improve literacy skills.
To leverage time for increased interaction with texts across subject areas, teachers will need to reconceptualize their understanding of what it means to teach in a subject area. In other words, teachers need to realize they are not just teaching content knowledge but also ways of reading and writing specific to a subject area. This reconceptualization, in turn, will require rearticulation of standards and revision of preservice training.
This passage is amazing. Despite the fact that it seems intended to promote spending more time having students actually reading, the language in this passage and the whole report seems to avoid saying that straight out. Instead we hear about "instruction" (four times), "literacy-connected learning," "interaction with texts," and "instructional principles designed to convey content." The word "reading" appears twice, but never on its own, never with the implication that the students might be actually reading; instead, we read that time should be spent with "a focus on reading" and in "teaching... ways of reading."
This passage, like the whole report and indeed like so much of the discourse of reading experts, makes me think of Pearson and Gallagher's "Gradual Release of Responsibility Model." These experts, perhaps because they are so far removed from teaching actual children, are not willing to release responsibility...
Much of the data that does exist is obvious
Much of the "research" that the EdWeek article mentions is super-obvious. For instance, here is some expert wisdom:
"research showing that there is no bright line for when students start to read to learn"
"Kids have to read across texts, evaluate them, respond to them all at the same time. In office work of any sort, people are doing this sort of thing all the time."
"a student's depth and complexity of vocabulary knowledge predicts his or her academic achievement better than other early-reading indicators, such as phonemic awareness."
Didn't we all know these things already? But here is my personal favorite piece of obvious data:
"students who practiced reading, even when it was difficult, were significantly better 20 weeks later at reading rate, word recognition, and comprehension, in comparison with the control group."
Wow--if you read more, you get better. Who knew?!
Education is like medicine, circa 1850
I have read that at the end of John Hattie's 2009 Magnum Opus, Visible Learning (I've ordered the book, but it hasn't come yet), Hattie compares the state of research in education to the state of medical research in the nineteenth century. In other words, we teachers might be better off with home remedies or folk wisdom. And in a sense this makes me feel a bit better about the Common Core. The Common Core is in no real sense, as far as I can tell, evidence-based (saying that students will one day have to write non-fiction is not scientific evidence for making them read it a lot when they are eight), but given the state of education research, maybe that's okay. What matters is that students read a lot, think and talk about what they read, and look carefully at their own writing. We English teachers can facilitate this process, but we shouldn't worry too much about the standards, which are, as Tim Shanahan says, in an expert opinion I can agree with completely, "a little goofy."
***************************************************
Footnotes:
(1) In fact, one of my favorite passages from the NRP report is the following piece of meaningless verbiage: "Comprehension is critically important to the development of children’s reading skills and therefore to the ability to obtain an education. Indeed, reading comprehension has come to be the “essence of reading”." This is as absurd as if one were to say, "Movement is critically important to the development of children's running skills and therefore to the ability to compete in many team sports. Indeed, movement has come to be the "essence of running"."
The Common Core is Not Evidence-Based
There's an article in a recent EdWeek with the remarkable headline: "New Literacy Research Infuses Common Core." The article's subtitle reads, "In the 15 years since the National Reading Panel convened, the knowledge base on literacy has grown." As far as I can tell, both the headline and the subhead are essentially false. The Common Core standards are not really evidence-based, and the knowledge base on literacy has not grown much in the 15 years since the now-discredited National Reading Panel--except perhaps in the Socratic sense of knowing how much it doesn't know.
This is interesting only because it points up the farcical nature of so much of today's educational discourse. While most of what happens in schools today is worthwhile, the way people talk about it is just ridiculous. One of my colleagues suggested that our schools would be better off if all the Graduate Schools of Education disappeared from the face of the earth (he used stronger words), and I think he might be right. Education research is like Hollywood: NOBODY KNOWS ANYTHING. Of course, this isn't true, either of schools or of Hollywood, but it's partly true.
The article itself is somewhat better than its headlines, and if you read it carefully you can see that the Ed. professors are basically just as in the dark as we teachers are--if not more so. What is most amazing about an article that claims to be about "new literacy research" is that it describes very little actual research. The article quotes many academics, but often they say things as questionable and non-evidence-based as the following paragraph, which manages to express the same simple idea, an idea that has been a truism for many decades now, over and over:
"In our knowledge-based economy, students are not only going to have to read, but develop knowledge-based capital. We need to help children use literacy to develop critical-thinking skills, problem-solving skills, making distinctions among different types of evidence," said Susan B. Neuman, a professor in educational studies specializing in early-literacy development at the University of Michigan in Ann Arbor. "The Common Core State Standards is privileging knowledge for the first time. To ensure they are career-and-college ready, we have to see students as lifelong learners and help them develop the knowledge-gathering skills they will use for the rest of their lives. That's the reality."
Privileging knowledge is a new idea? Helping kids become lifelong learners is a new idea? Critical thinking in literacy is a new idea? What? Not only are these old ideas, but there is not the slightest bit of research or data to be found in that paragraph.
The recent history of "evidence-based" BS
The EdWeek article goes on to discuss the National Reading Panel of 2000, a much-ballyhooed effort to establish the most advanced and scientific thinking about how children learn to read and how we can help them. The Panel's report came down decisively on the side of explicit instruction in skills: phonemic awareness, vocabulary, comprehension strategies, and so on. The Panel's recommendations formed the basis of "Reading First," a $1 billion-a-year federal effort to improve reading in the early grades. Eight years later, a comprehensive assessment of the program was conducted to find out how much difference this explicit instruction in skills had made. The answer: zero difference.
The assessment reported three key findings: (1) Reading First did indeed result in students spending more time on reading "instruction" (phonemic awareness, vocab, etc.); (2) Reading First did indeed result in more professional development in "scientifically based reading instruction (SBRI)"; (3) however, "Reading First did not produce a statistically significant impact on student reading comprehension test scores in grades one, two or three" (page v).
Another finding, that the assessment did not consider "key," but that may have had some impact, was that the increased instructional time and the emphasis on skills did not result in any increase in students' actually reading. As the assessment puts it: "Reading First had no statistically significant impacts on student engagement with print" (page xii).
This is remarkable: in 2000, only twelve years ago, the state of the research (the "knowledge-based capital," in the vapid phrase of the Michigan professor), which the panel of eminent experts claimed to hold to the "highest standards of scientific evidence," was utterly and completely wrong.
After being so wrong, the education experts tried to reposition themselves--but not very clearly. One of them is quoted in the EdWeek article as saying that after the National Reading Panel, "comprehension became the 'next great frontier of reading research.'" This is odd, since "comprehension" was one of the central topics of the NRP itself. (1)
Reading Next:
One of the ways the experts tried to reposition themselves was in a report called "Reading Next," which according to the EdWeek article "helped spark the common core's approach." As the article tells it, "Education professor Catherine A. Snow and then-doctoral student Gina Biancarosa of the Harvard Graduate School of Education found that explicit comprehension instruction, intensive writing, and the use of texts in a wide array of difficulty levels, subjects, and disciplines all helped improve literacy for struggling adolescent readers."
Reading Next focused on an array of fifteen "powerful tools" for improving literacy. In an improvement on the NRP's exclusive focus on skills instruction, many of Reading Next's recommendations were so vague that no one could object ("Effective Instructional Principles Embedded in Content"), and many sounded fairly old-fashioned (Strategic Tutoring; Motivation and Self-Directed Learning; Extended Time for Literacy). But when you actually looked more deeply into what the specific recommendations were, it became clear that the report was, like the NRP, trying as hard as possible to avoid mentioning the very simple strategy of having students actually read.
Avoiding all mention of actual reading:
Here is the passage from the Reading Next report that discusses "Extended Time for Literacy," which I had thought from its title might mean more time for students to actually read. That may be what is meant, but the authors seem to twist themselves into jargony knots so as to avoid discussing actual "reading":
Extended Time for Literacy
None of the above-mentioned elements are likely to effect much change if instruction is limited to thirty or forty-five minutes per day. The panel strongly argued the need for two to four hours of literacy-connected learning daily. This time is to be spent with texts and a focus on reading and writing effectively. Although some of this time should be spent with a language arts teacher, instruction in science, history, and other subject areas qualifies as fulfilling the requirements of this element if the instruction is text centered and informed by instructional principles designed to convey content and also to practice and improve literacy skills.
To leverage time for increased interaction with texts across subject areas, teachers will need to reconceptualize their understanding of what it means to teach in a subject area. In other words, teachers need to realize they are not just teaching content knowledge but also ways of reading and writing specific to a subject area. This reconceptualization, in turn, will require rearticulation of standards and revision of preservice training.
This passage is amazing. Although it seems intended to promote students' spending more time actually reading, the language of this passage--and of the whole report--avoids saying so straight out. Instead we hear about "instruction" (four times), "literacy-connected learning," "interaction with texts," and "instructional principles designed to convey content." The word "reading" appears twice, but never on its own, never with the implication that students might actually be reading; instead, we read that time should be spent with "a focus on reading" and in "teaching... ways of reading."
This passage, like the whole report and indeed like so much of the discourse of reading experts, makes me think of Pearson and Gallagher's "Gradual Release of Responsibility Model." These experts, perhaps because they are so far removed from teaching actual children, are not willing to release responsibility...
Much of the data that does exist is obvious
Much of the "research" that the EdWeek article mentions is super-obvious. For instance, here is some expert wisdom:
"research showing that there is no bright line for when students start to read to learn"
"Kids have to read across texts, evaluate them, respond to them all at the same time. In office work of any sort, people are doing this sort of thing all the time."
"a student's depth and complexity of vocabulary knowledge predicts his or her academic achievement better than other early-reading indicators, such as phonemic awareness."
Didn't we all know these things already? But here is my personal favorite piece of obvious data:
"students who practiced reading, even when it was difficult, were significantly better 20 weeks later at reading rate, word recognition, and comprehension, in comparison with the control group."
Wow--if you read more, you get better. Who knew?!
Education is like medicine, circa 1850
I have read that at the end of John Hattie's 2009 magnum opus, Visible Learning (I've ordered the book, but it hasn't come yet), Hattie compares the state of research in education to the state of medical research in the nineteenth century. In other words, we teachers might be better off with home remedies or folk wisdom. And in a sense this makes me feel a bit better about the Common Core. The Common Core is in no real sense, as far as I can tell, evidence-based (saying that students will one day have to write non-fiction is not scientific evidence for making them read it a lot when they are eight), but given the state of education research, maybe that's okay. What matters is that students read a lot, think and talk about what they read, and look carefully at their own writing. We English teachers can facilitate this process, but we shouldn't worry too much about the standards, which are, as Tim Shanahan says, in an expert opinion I can agree with completely, "a little goofy."
***************************************************
Footnotes:
(1) In fact, one of my favorite passages from the NRP report is the following piece of meaningless verbiage: "Comprehension is critically important to the development of children's reading skills and therefore to the ability to obtain an education. Indeed, reading comprehension has come to be the 'essence of reading.'" This is as absurd as if one were to say, "Movement is critically important to the development of children's running skills and therefore to the ability to compete in many team sports. Indeed, movement has come to be the 'essence of running.'"
Friday, July 6, 2012
Two curricula: one for the elite, another for the masses
The elite are "nurtured"and "inspired" toward a "love" for reading
Like Barack Obama and Arne Duncan, Bill Gates did not go to a public high school. Instead, Gates, a scion of an elite Seattle family, went to a fancy prep school called Lakeside. Lakeside's English curriculum is quite different from the Common Core Standards that Gates paid millions to have created and is spending millions now to promote, and that Obama and Duncan are pushing as well, through their "Race to the Top" (sic) program. The Common Core standards suggest long and detailed classroom analyses of extremely difficult texts, and offer absolutely nothing in the way of requiring extensive reading or encouraging a love of reading. This curriculum is dramatically different from the ones offered at Lakeside, where Bill Gates's kids now go, but I wouldn't expect Lakeside to change its ways anytime soon.
Here are the mission statements for Lakeside's English programs at the middle school and high school levels:
"The Middle School English Department is dedicated to nurturing a lifelong love of reading and writing. We strive to create a community of readers and writers that inspires students to experiment with a variety of written forms."
"Lakeside's [High School] English Department's highest goals are to inspire in students a love of literature and to help students become great writers."
Both the middle school and high school statements use the word "love" and emphasize writing in an "authentic voice" and "artistically." The curriculum is notably literary and cultural, and not narrowly designed to ready students for the business or political world.
It's also notable that these English departments aren't afraid to talk about encouraging a love of reading. Encouraging a love for reading might seem like an obvious goal of English class, but in the Orwellian world of the Education-Industrial-Complex that goal is controversial.
The masses are given "instruction" aimed at "proficiency"
This Orwellian madness surfaced in 2006, when the new President of the International Reading Association came out against encouraging a love for reading. Professor Tim Shanahan, one of the biggest names in the reading world, had already made clear that he was against natural reading: he was a prominent member of the National Reading Panel (2000), which, after a cockeyed look at the evidence, argued at length for explicit instruction and dishonestly claimed that there was no evidence that independent silent reading was effective. In 2006, he became President of the International Reading Association, which has as one of its three stated purposes, in addition to improving reading instruction and promoting reading proficiency, to "encourage reading and an interest in reading" (Reading Today, June 2006). Shanahan's first move as President was to say that while he could support improved instruction and promoting proficiency, he was not in favor of "encouraging reading and an interest in reading." Although Shanahan can be eloquent and passionate about why reading is important, he apparently thinks it is inappropriate and dangerous to encourage interest in it.
For this, Shanahan was not laughed out of the profession; he remains one of the big shots of the reading world. This past week, the thoughtful, intelligent instructor of my PD workshop referred to Shanahan in glowing terms and gave us a couple of his articles. How could this be? How could the President of the International Reading Association argue against teachers' trying to encourage "an interest in reading"?! Bill Gates's kids have teachers who nurture a lifelong love of reading, but the rest of us can't even encourage an interest in reading? Are there different rules for private and public schools? Well, yes--according to Shanahan.
Interest in reading and "freedom of choice"
For, although his central (if insane) argument is that encouraging an interest in reading is somehow inimical to effective teaching, and that we should be "jealous of instructional time" that would apparently be wasted by encouraging student interest in our subject, Shanahan also argues at length that it is beyond a public school teacher's mandate to encourage interest in his subject. To make this argument, Shanahan shifts the terms of the debate from "interest" to "pleasure" and then to "desire" and then to "love," and suggests that as "institutional beings," teachers have no right to try to instill love or desire in anyone. A teacher's "public responsibility," according to Shanahan, does not include "encouraging reading," which is, he says, a "personal goal" that might carry "danger." What danger? Apparently encouraging reading would limit "freedom of choice."
That encouraging an interest in reading could be considered a limit on freedom of choice is obviously Orwellian. As Bill Gates found when he went from public school to private school, and as Shanahan should know, given his own explanation of why he is passionate about teaching reading, encouraging an interest in reading actually promotes freedom of choice, while merely teaching it dispassionately as a useful skill is usually a good way to limit freedom. For Shanahan, public schools, although obligated to impose explicit instruction of the kind Bill Gates found so tedious when he went to public elementary school, are not allowed to offer students encouragement and nurturing of the very practices that will allow freedom.
Conclusion: We need to create a culture of reading, even in public schools
Why is Shanahan so uncomfortable with the notion of encouraging interest in reading, even though he acknowledges that reading is important? Why does Gates spend his billions to promote increased class sizes and increased testing, even though he sends his kids to a school that brags about its average class size of 16 and that manages to have 40% of its seniors named National Merit Scholarship finalists without any of the high-stakes testing Gates is working to impose on the rest of us? The obvious answer for Shanahan is that he has spent his career promoting explicit skill instruction, and to admit that it is important to teach reading as an organic, pleasurable experience, or that reading is largely a socially mediated activity, might seem to him to call into question his life's work.
As for Gates, perhaps he doesn't know how to address the social and cultural aspects of learning, or perhaps he thinks the changes he's pushing will lead indirectly to an improved cultural and social environment in the classroom. My guess is that Gates sees public school as properly different from what he offers his own children. When Gates himself switched from public school to private school, he noticed a dramatic cultural shift. As he recalls, "it was a change at first. And the idea of just being kind of a goof-off wasn't the sort of high reward position like it had been in public schools." It seems possible that, partly based on this experience, Gates doesn't think it possible to change that culture.
But he should think so, for in the same interview quoted above, he offers an excellent example of a public institution that encourages reading. Gates remembers that when he was a kid, the library would give you a gold star if you read ten books over the summer, and two stars if you read twenty. According to Gates, he and "five or six girls" would compete to see who could read the most books. Reading is a solitary activity, but it is also a social activity, and it can be encouraged.
The first job of every high school English class should be creating a culture of reading. This is difficult to do when many of our expert authorities don't believe that interest matters, and think that human beings are mechanisms that have only to be properly programmed for "proficiency." The best way to make sure that our public schools are not like the one Bill Gates went to, where "being a goof-off was more socially rewarding," is to replace the interest in goofing off with an interest in reading and thinking, and that can only happen if we encourage that interest. We must make sure that our public schools do "encourage reading"--even inspire a love for it. Even if reading is, and has always been, strongly linked to social class, we don't have to accept the social class divisions that we are given.
Like Barack Obama and Arne Duncan, Bill Gates did not go to a public high school. Instead, Gates, a scion of an elite Seattle family, went to a fancy prep school called Lakeside. Lakeside's English curriculum is quite different from the Common Core Standards that Gates paid millions to have created and is spending millions now to promote, and that Obama and Duncan are pushing as well, through their "Race to the Top" (sic) program. The Common Core standards suggest long and detailed classroom analyses of extremely difficult texts, and offer absolutely nothing in the way of requiring extensive reading or encouraging a love of reading. This curriculum is dramatically different from the ones offered at Lakeside, where Bill Gates's kids now go, but I wouldn't expect Lakeside to change its ways anytime soon.
Here are the mission statements for Lakeside's English programs at the middle school and high school levels:
"The Middle School English Department is dedicated to nurturing a lifelong love of reading
and writing. We strive to create a community of readers and writers that inspires students to
experiment with a variety of written forms."
"Lakeside’s [High School] English Department’s highest goals are to inspire in students a
love of literature and to help students become great writers."
Both the middle school and high school statements use the word "love" and emphasize writing in an "authentic voice" and "artistically." The curriculum is notably literary and cultural, and not narrowly designed to ready students for the business or political world.
It's also notable that these English departments aren't afraid to talk about encouraging a love of reading. Encouraging a love for reading might seem like an obvious goal of English class, but in the Orwellian world of the Education-Industrial-Complex that goal is controversial.
The masses are given "instruction" aimed at "proficiency"
This Orwellian madness surfaced in 2006, when the new President of the International Reading Association came out against encouraging a love for reading. Professor Tim Shanahan, one of the biggest names int he reading world, had already made clear that he was against natural reading: he was a prominent member of the "National Reading Panel" (2000) that after a cockeyed look at the evidence, argued at length for explicit instruction and dishonestly claimed that there was no evidence that independent silent reading was effective. In 2006, he became President of the International Reading Association, which has as one of its three stated purposes, in addition to improving reading instruction and promote reading proficiency, to "encourage reading and an interest in reading" (Reading Today, June 2006). Shanahan's first move as President of the Association was to say that while he could support improved instruction and promoting proficiency, he was not in favor of "encouraging reading and an interest in reading." Although Shanahan can be eloquent and passionate about why reading is important, he apparently thinks it's inappropriate and dangerous to encourage interest in it.
For this, Shanahan was not laughed out of the profession; he remains one of the big shots of the reading world. This past week, the thoughtful, intelligent instructor of my PD workshop referred to Shanahan in glowing terms and gave us a couple of his articles. How could this be? How could the President of the International Reading Association argue against teachers' trying to encourage "an interest in reading"?! Bill Gates's kids have teachers that nurture a lifelong love of reading, but the rest of us can't even encourage an interest in reading? Are there different rules for private and public schools? Well, yes--according to Shanahan.
Interest in reading and "freedom of choice"
For, although his central (if insane) argument is that encouraging an interest in reading is somehow inimical to effective teaching, and that we should be "jealous of instructional time" which would apparently be wasted by encouraging student interest in our subject, Shanahan also argues at length that it is beyond a public school teacher's mandate to encourage interest in his subject. In order to make this argument, Shanahan shifts the terms of the debate from the words "interest" to "pleasure" and then to "desire" and then to "love", and argues suggests that as "institutional beings," teachers have no right to try to instill love or desire in anyone. A teacher's "public responsibility," according to Shanahan, does not include "encouraging reading," which is, he says, a "personal goal" that might carry "danger." What danger? Apparently encouraging reading would limit "freedom of choice."
That encouraging an interest in reading could be considered as limiting to freedom of choice is obviously Orwellian. As Bill Gates found when he went from public school to private school, and as Shanahan should know, given his explanation of why he is passionate about teaching reading, encouraging an interest in reading actually promotes freedom of choice, while merely teaching it dispassionately as a useful skill is usually a good way to limit freedom. For Shanahan public schools, although obligated to impose explicit instruction of the kind Bill Gates found so tedious when he went to public elementary school, are not allowed to offer students encouragement and nurturing of the very practices that will allow freedom.
Conclusion: We need to create a culture of reading, even in public schools
Why is Shanahan so uncomfortable with the notion of encouraging interest in reading, even though he acknowledges that reading is important? Why does Gates spend his billions to promote increased class size and increased testing, even though he sends his kids to a school that brags about its average class size of 16 and that manages to have 40% of its Seniors be National Merit Scholarship Finalists without having done any of the kind of high stakes testing Gates is working to impose on the rest of us? The obvious answer for Shanahan is that he has spent his career promoting explicit skill instruction, and for Shanahan to admit that it's important to teach reading as an organic, pleasurable experience, or to admit that reading is largely a socially mediated activity, might seem to him to call into question his life's work.
As for Gates, perhaps he doesn't know how to address the social and cultural aspects of learning, or perhaps he thinks the changes he's pushing will lead indirectly to an improved cultural and social environment in the classroom. My guess is that Gates sees public school as properly different from what he offers his own children. When Gates himself switched from public school to private school, he noticed a dramatic cultural shift. As he recalls, "it was a change at first. And the idea of just being kind of a goof-off wasn't the sort of high reward position like it had been in public schools." It seems possible that, partly based on this experience, Gates doesn't think it possible to change that culture.
But he should think so, for in the same interview I quoted before, he offers an excellent example of a public institution that encourages reading. Gates remembers that when he was a kid, the library would give you a gold star if you read ten books over the summer, and two stars if you read twenty. According to Gates, he and "five or six girls" would compete to see who could read the most books. Reading may be a solitary activity, but it is also a social one, and it can be encouraged.
The first job of every high school English class should be creating a culture of reading. This is difficult to do when many of our expert authorities don't believe that interest matters, and think that human beings are mechanisms that need only be properly programmed for "proficiency." The best way to make sure that our public schools are not like the one Bill Gates went to, where "being a goof-off was more socially rewarding," is to replace the interest in goofing off with an interest in reading and thinking, and that can only happen if we encourage that interest. We must make sure that our public schools do "encourage reading"--even inspire a love for it. Reading is, and has always been, strongly linked to social class, but we don't have to accept the social class divisions that we are given.