The "Nation's Report Card," otherwise known as the report on the most recent National Assessment of Educational Progress, or NAEP (I always thought it rhymed with "Jeep," but it may rhyme with "tape"), has just come out. The NAEP is widely considered a good test, and since the NAEP is not high-stakes, there is little reason for schools or students to cheat, or for schools to attempt to teach directly to it. I remember Mike Dukakis telling an auditorium full of Leafstrewn students that NCLB was unnecessary because we already had the NAEP.
The NAEP is a good test, and because it is such a good test, and because reading and math abilities change so extraordinarily slowly, there aren't the kinds of variations in the NAEP that there are in, say, the MCAS. MCAS scores have gone up dramatically over the past fifteen years, while the NAEP scores are always more or less the same. As a result, the release of the NAEP scores is a little like a Rorschach test. Optimists who favor ed reform will point to little fluctuations and say, See, NAEP scores have risen over the past four years! People who are against ed reform will say, Look, 17-year-olds' reading scores are below where they were in the nineties! Nerds will rightly point out that the demographics of the students taking the test have changed significantly over the years, and that breaking out subgroups can be interesting (for instance, black kids' scores have improved much more than those of white kids). Contrarians might say, the scores haven't changed much, so school doesn't matter.
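To make the nerds' point concrete: when the mix of test-takers shifts, every subgroup's scores can rise while the overall average stays flat. Here is a minimal sketch of that composition effect (sometimes called Simpson's paradox), using invented shares and scores rather than actual NAEP figures:

```python
# Toy illustration of how every subgroup's average can rise while the
# overall average stays flat, because the mix of test-takers shifts
# toward lower-scoring groups. All numbers are invented, not NAEP data.

def overall_average(groups):
    """groups: list of (share_of_test_takers, average_score) pairs."""
    return sum(share * score for share, score in groups)

# Hypothetical early-1970s cohort: group A is 85% of test-takers.
then = [(0.85, 290), (0.15, 240)]

# Hypothetical 2012 cohort: both groups score higher than before,
# but the lower-scoring group is now a much larger share of the cohort.
now = [(0.60, 295), (0.40, 260)]

print(overall_average(then))  # 282.5
print(overall_average(now))   # 281.0 -- flat despite gains in both groups
```

So a flat overall trend line, by itself, tells you surprisingly little about whether any particular group of students is doing better or worse.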
I'd like to point out an interesting feature of the new report and suggest a possible explanation for it. I'm not sure I'm right, but my explanation goes along with some of what I have said in the past about the difference between short-term thinking and long-term thinking.
NAEP scores rise for 9- and 13-year-olds
The new "Nation's Report Card" has a very clear lede, and here it is:
Both 9- and 13-year-olds scored higher in reading and mathematics in 2012 than students their age in the early 1970s. Scores were 8 to 25 points higher in 2012 than in the first assessment year. Seventeen-year-olds, however, did not show similar gains. Average reading and mathematics scores in 2012 for 17-year-olds were not significantly different from scores in the first assessment year.
In other words, the most significant result in this year's data is that scores of nine- and thirteen-year-olds have gone up somewhat over the past forty years, while scores of seventeen-year-olds are not significantly different from what they were at the first assessment.
One way to interpret this is to say, Well, we sure are doing a better job in the elementary schools, but high schools just aren't getting better.
This view--that elementary schools are improving but high schools are not--doesn't seem totally unreasonable, but there's an interesting problem with it that I have not seen anyone point out: the result doesn't fit the standard theories of education, which would expect increased achievement by K-8 students to lead directly to increased achievement by 17-year-olds. The standard theories--that is, the views of people like Tim Shanahan, the ed reformers, and so on--see learning as essentially a step-by-step process of acquiring skills, which is why they hold it so important to teach reading early and to focus heavily on skills. But on that view, the higher scores of 9- and 13-year-olds should have produced higher scores for 17-year-olds, and so the flat scores of 17-year-olds would seem to imply not only that high schools have failed to improve but that they have actually gotten significantly worse. After all, if you took two 13-year-olds and put them through the same secondary education, you would expect the one who was the better reader at 13 to end up the better reader at 17. For a much better reader at 13 to end up level with his less-skilled peer at 17 would seem to imply a much worse secondary education.
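To make the arithmetic of that inference explicit, here is a minimal sketch of the bookkeeping the cumulative-skills model implies. The numbers are invented for illustration (the NAEP reports on a 0-500 scale, but these particular values are not real NAEP results):

```python
# A toy version of the standard, step-by-step model: what a student knows
# at 17 is what she knew at 13 plus whatever value high school added.
# All numbers below are invented for illustration; they are not NAEP data.

score_at_13_then, score_at_13_now = 255, 263   # 13-year-olds up 8 points
score_at_17_then, score_at_17_now = 285, 285   # 17-year-olds flat

value_added_then = score_at_17_then - score_at_13_then  # 30 points
value_added_now = score_at_17_now - score_at_13_now     # 22 points

# Under this model, flat scores at 17 despite better-prepared 13-year-olds
# force the conclusion that high schools now add less than they used to.
print(value_added_then, value_added_now)  # 30 22
```

On this model there is simply no way to hold scores at 17 flat while scores at 13 rise without shrinking the value added in between; that is the whole force of the objection.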
Have US high schools gotten worse over the past 40 years?
The NAEP scores are, if you follow the standard model, evidence that high schools have actually gotten worse over the past forty years. This is certainly possible. It's also possible that there are demographic issues involved (changes in dropout rates, for instance, could affect the scores of 17-year-olds). But there is another possibility: the short-term thinking that has become increasingly prevalent over the past few decades may have produced genuine short-term success, but success of a kind that does not support long-term improvement.
I can see two ways this could work. One is a direct cost: the short-term teaching could be actively bad in the long run. It might, for instance, turn kids off to learning or reading; drilling kids for tests might improve their scores while making them less creative thinkers. The other is an opportunity cost: by teaching skills or teaching to the test, you might do less of the kinds of things that prepare kids for learning later on. You might read aloud to the kids less, cut down on recess, or reduce the time allotted for free, creative play. Any of these could plausibly result in lower reading scores a decade later.
Evidence that this kind of short-term/long-term trade-off is possible can be found in the studies on Waldorf schools that I wrote about a couple of weeks ago. Waldorf schools do not do any explicit teaching of reading skills until the second grade, and their reading scores in the early grades are, not surprisingly, markedly below those of other schools.
The Waldorf schools do, however, catch up. I'd love to see data on where those students are at the age of 17. It seems very possible that by then the Waldorf kids--who, while everyone else was drilling on phonics, did a lot of listening to stories, singing songs, reciting poems, and observing nature--might be way ahead. And that may be the same dynamic we see in the NAEP scores.
If our goals are long-term, why are we all thinking short-term?
As I have repeatedly argued, getting better at reading is a very long-term process, and yet far too much of our thinking as teachers, like far too much of the discourse about education, focuses on the short term. Most discussion of lesson-planning, for instance, seems based on teaching students a discrete skill that they have never attempted before. John Hattie, in the introduction to his magnum opus, Visible Learning, offers my favorite example of this short-term thinking: he describes in loving detail the excitement of an initial lesson in rappelling down a building, and then says that this is "the heart of the model of successful teaching and learning." This is absurd--reading is neither dangerous nor novel to most of our students, and Hattie would find that teaching rappelling to people who'd already been rappelling for ten years is a very different business.

But Hattie's absurdity is just an extreme version of the kind of thinking we all do. We are, after all, called "teachers," and just as it is natural to ask ourselves, "So, what particular skill am I going to teach today?", it is also natural to want to measure students' improvement over the relatively short time spans of a unit, a semester, or a year. So, under MCAS and NCLB, we now give students high-stakes tests every year, and we are moving toward a system in which teachers are evaluated by the results of these short-term assessments.
I doubt this is wise, but in the end it may not do too much harm. The remarkable stability of the NAEP scores is a healthy reminder that changes in educational regimes in the US have not made much difference to test scores. On the other hand, the NAEP results may also mean that short-term thinking is ultimately ineffective: it may produce short-term successes, but those successes do not necessarily add up to long-term gains.