Friday, November 30, 2012

The Common Core is not "evidence-based"--but maybe that's okay!

My Curriculum Coordinator just got a subscription to the Marshall Memo, which may be bad for my mental health--I'll be reading a lot more Ed. research. This post came from reading an article in EdWeek that Marshall refers to in this week's memo.  I'm sorry the post is so long.  I think it served to clarify my thinking...

The Common Core is Not Evidence-Based
There's an article in a recent EdWeek with the remarkable headline: "New Literacy Research Infuses Common Core."  The article's subtitle reads, "In the 15 years since the National Reading Panel convened, the knowledge base on literacy has grown." As far as I can tell, both the headline and the subtitle are essentially false.  The Common Core standards are not really evidence-based, and the knowledge base on literacy has not grown much in the 15 years since the now-discredited National Reading Panel--except perhaps in the Socratic sense of knowing how much it doesn't know.

This is interesting only because it points up the farcical nature of so much of today's educational discourse.  While most of what happens in schools today is worthwhile, the way people talk about it is just ridiculous.  One of my colleagues suggested that our schools would be better off if all the Graduate Schools of Education disappeared from the face of the earth (he used stronger words), and I think he might be right. Education research is like Hollywood: NOBODY KNOWS ANYTHING.  Of course, this isn't true, either of schools or of Hollywood, but it's partly true. 

The article itself is somewhat better than its headlines, and if you read it carefully you can see that the Ed. professors are basically just as much in the dark as we teachers are--if not more so.  What is most amazing about an article that claims to be about "new literacy research" is that it describes very little actual research.  The article quotes many academics, but often they say things as questionable and non-evidence-based as the following paragraph, which manages to express, over and over, the same simple idea--one that has been a truism for many decades now:

"In our knowledge-based economy, students are not only going to have to read, but develop knowledge-based capital. We need to help children use literacy to develop critical-thinking skills, problem-solving skills, making distinctions among different types of evidence," said Susan B. Neuman, a professor in educational studies specializing in early-literacy development at the University of Michigan in Ann Arbor. "The Common Core State Standards is privileging knowledge for the first time. To ensure they are career-and-college ready, we have to see students as lifelong learners and help them develop the knowledge-gathering skills they will use for the rest of their lives. That's the reality."

Privileging knowledge is a new idea?  Helping kids become lifelong learners is a new idea? Critical thinking in literacy is a new idea?  What?  Not only are these old ideas, there is not the slightest bit of research or data to be found in that paragraph.

The recent history of "evidence-based" BS
The EdWeek article goes on to discuss the National Reading Panel of 2000, which was a much-ballyhooed effort to establish the most advanced and scientific thinking about how children learn to read and how we can help them.  The Panel's report came down decisively on the side of explicit instruction in skills: phonemic awareness, vocabulary, comprehension strategies, and so on.  The panel's recommendations formed the basis of "Reading First," a $1 billion-a-year federal effort to improve reading in the early grades.  Eight years later, there was a comprehensive assessment of the program, to find out how much difference this explicit instruction in skills had made.  The answer: zero difference.

The assessment reported three key findings: (1) Reading First did indeed result in students spending more time on reading "instruction" (phonemic awareness, vocab, etc.); (2) Reading First did indeed result in more professional development in "scientifically based reading instruction (SBRI)"; (3) however, "Reading First did not produce a statistically significant impact on student reading comprehension test scores in grades one, two or three" (page v).

Another finding, that the assessment did not consider "key," but that may have had some impact, was that the increased instructional time and the emphasis on skills did not result in any increase in students' actually reading.  As the assessment puts it: "Reading First had no statistically significant impacts on student engagement with print" (page xii).

This is remarkable: in 2000, only twelve years ago, the state of the research (the "knowledge-based capital," in the vapid phrase of the Michigan professor), which the panel of eminent experts claimed to hold to the "highest standards of scientific evidence," was utterly and completely wrong.

After being so wrong, the education experts tried to reposition themselves--but not very clearly.  One of them is quoted in the EdWeek article as saying that after the National Reading Panel, "comprehension became the 'next great frontier of reading research.'"  This is odd, since "comprehension" was one of the central topics of the NRP itself. (1)

Reading Next:
One of the ways the experts tried to reposition themselves was in a report called "Reading Next," which according to the EdWeek article "helped spark the common core's approach. Education professor Catherine A. Snow and then-doctoral student Gina Biancarosa of the Harvard Graduate School of Education found that explicit comprehension instruction, intensive writing, and the use of texts in a wide array of difficulty levels, subjects, and disciplines all helped improve literacy for struggling adolescent readers."

Reading Next focused on an array of fifteen "powerful tools" for improving literacy.  In an improvement on the NRP's exclusive focus on skills instruction, many of Reading Next's recommendations were so vague that no one could object to them ("Effective Instructional Principles Embedded in Content"), and many sounded fairly old-fashioned ("Strategic Tutoring," "Motivation and Self-Directed Learning," "Extended Time for Literacy").  But when you looked more deeply into the specific recommendations, it became clear that the report, like the NRP, was trying as hard as possible to avoid mentioning the very simple strategy of having students actually read.

Avoiding all mention of actual reading:
Here is the passage from the Reading Next report that discusses "Extended Time for Literacy," which I had thought from its title might mean more time for students to actually read.  That may be what is meant, but the authors seem to twist themselves into jargony knots so as to avoid discussing actual "reading":

Extended Time for Literacy
None of the above-mentioned elements are likely to effect much change if instruction is limited to thirty or forty-five minutes per day. The panel strongly argued the need for two to four hours of literacy-connected learning daily. This time is to be spent with texts and a focus on reading and writing effectively. Although some of this time should be spent with a language arts teacher, instruction in science, history, and other subject areas qualifies as fulfilling the requirements of this element if the instruction is text centered and informed by instructional principles designed to convey content and also to practice and improve literacy skills.

To leverage time for increased interaction with texts across subject areas, teachers will need to reconceptualize their understanding of what it means to teach in a subject area. In other words, teachers need to realize they are not just teaching content knowledge but also ways of reading and writing specific to a subject area. This reconceptualization, in turn, will require rearticulation of standards and revision of preservice training.

This passage is amazing.  Although it seems intended to promote having students spend more time actually reading, the language in this passage, and in the whole report, avoids saying that straight out. Instead we hear about "instruction" (four times), "literacy-connected learning," "interaction with texts," and "instructional principles designed to convey content." The word "reading" appears twice, but never on its own, and never with the implication that the students might actually be reading; instead, we read that time should be spent with "a focus on reading" and in "teaching... ways of reading."

This passage, like the whole report and indeed like so much of the discourse of reading experts, makes me think of Pearson and Gallagher's "Gradual Release of Responsibility Model."  These experts, perhaps because they are so far removed from teaching actual children, are not willing to release responsibility...

Much of the data that does exist is obvious
Much of the "research" that the EdWeek article mentions is super-obvious.  For instance, here is some expert wisdom:

"research showing that there is no bright line for when students start to read to learn"

"Kids have to read across texts, evaluate them, respond to them all at the same time. In office work of any sort, people are doing this sort of thing all the time."

"a student's depth and complexity of vocabulary knowledge predicts his or her academic achievement better than other early-reading indicators, such as phonemic awareness."

Didn't we all know these things already? But here is my personal favorite piece of obvious data:

"students who practiced reading, even when it was difficult, were significantly better 20 weeks later at reading rate, word recognition, and comprehension, in comparison with the control group."

Wow--if you read more, you get better.  Who knew?!

Education is like medicine, circa 1850
I have read that at the end of his 2009 magnum opus, Visible Learning (I've ordered the book, but it hasn't come yet), John Hattie compares the state of research in education to the state of medical research in the nineteenth century. In other words, we teachers might be better off with home remedies or folk wisdom.  And in a sense this makes me feel a bit better about the Common Core.  The Common Core is in no real sense, as far as I can tell, evidence-based (saying that students will one day have to write non-fiction is not scientific evidence for making them read it a lot when they are eight), but given the state of education research, maybe that's okay.  What matters is that students read a lot, think and talk about what they read, and look carefully at their own writing.  We English teachers can facilitate this process, but we shouldn't worry too much about the standards, which, as Tim Shanahan says in an expert opinion I can agree with completely, are "a little goofy."

***************************************************
Footnotes:
(1)  In fact, one of my favorite passages from the NRP report is the following piece of meaningless verbiage: "Comprehension is critically important to the development of children's reading skills and therefore to the ability to obtain an education. Indeed, reading comprehension has come to be the 'essence of reading.'"  This is as absurd as if one were to say, "Movement is critically important to the development of children's running skills and therefore to the ability to compete in many team sports.  Indeed, movement has come to be the 'essence of running.'"

2 comments:

  1. Good post. So... at what point are you going to become so disenchanted with ed research that you stop reading it?

    1. I should! Unfortunately, somebody just got the Marshall Memo for the department. Also, ed research looms large in all the public discourse about education right now, so it's hard to avoid--it would be like avoiding reading about religion during the Spanish Inquisition or something. And finally, there is a side of me that is really interested in science, and it drives me a little crazy to see such half-baked "science". So I've ordered John Hattie's 2009 book, Visible Learning, which supposedly offers a magisterial overview and synthesis of all the educational research ever done, or something like that. If that book's a disappointment, maybe I'll give up. I did pretty much give up on the New York Times, after years of wasting an hour or so a day on news that Thoreau had long since told me was never really new...
