Saturday, March 30, 2013

Should We Indict Bush, Boehner, Obama and Duncan?

Today's indictment of the former Atlanta schools superintendent is an amazing turn of events, but why stop with her?  The indictment offers no evidence that ex-superintendent Beverly Hall had ever directly instructed any teachers to cheat on state tests, but bases its case on the "no excuses" culture of "constant pressure" for higher test scores.  According to the District Attorney, Dr. Hall "is a full participant in the conspiracy.  Without her, this conspiracy could not have taken place, particularly in the degree it took place."  This makes sense to me, but it is worth noting that by this logic we should also be indicting a lot of other people: the Secretary of Education, a couple of Presidents, sundry Congressmen.

There has never, as far as I know, been even a single school that managed to get very poor kids to test as well as rich kids.  The basic premise of the No Child Left Behind Act was flawed and cynical in its essence.  I hope we're starting to realize that, and I hope that what has happened in Atlanta will be seen as an indictment, not only of Dr. Hall, but of our whole misguided system of high-stakes testing and gross inequality.

Tuesday, March 26, 2013

Summer Reading Blues

A committee of teachers
I'm on a committee charged with rethinking the summer reading program at Leafstrewn.  Over the past several years there has been a whole-school summer reading book.  The title changes every year: Fahrenheit 451; Farewell, My Subaru; First Crossings; etc.  This afternoon our committee talked about changing things around--perhaps by having a book for every grade, so that maybe every year Freshmen would have to read Fahrenheit 451, but maybe Juniors would have to read Dreams From My Father--or whatever; we didn't talk about specific titles.  I advocated, predictably, for a free-choice program that would require reading at least one book but would encourage reading many more, a program in which the choice would be a central part of the conversation--but I didn't think what we ended up with was so terrible.  The department chair sent out notes about the meeting, which included the following bullet point:

•    We agreed that the goals of summer reading are  1) to promote the enjoyment of reading and 2) to give students practice in choosing books they enjoy reading.

Pleased to see these admirable goals, I went happily off to a seder.

A student's perspective
At the seder table, my cousin, a recent graduate of Leafstrewn and now an English major at a fine college, was sitting--or should I say, reclining--next to me. When I mentioned in passing that I'd been at a meeting about summer reading--I didn't say anything about what had been said--my cousin said, Oh, summer reading!  I said, Yeah--what did you think about the summer reading program?  He said, I can describe it in one word: CRUEL.  I said, Really?!  He said, Well, I don't resent it that much anymore, but if you'd asked me six years ago I would have gone on a rant.  I said, I'm surprised; I didn't think it was so terrible.  He said, Being forced to waste your time reading a terrible book OVER THE SUMMER?  The books were awful, we were forced to read them, and then we didn't even do anything interesting with them in the fall.  It was infuriating.  My friends and I all hated it.

What now?
First, we might consider surveying students to see what they think.  Second, if our goals are really to promote the enjoyment of reading and to give students practice in choosing their own books to read, then having a single assigned book (whether for the whole school or by grade) is almost certainly not the best way to accomplish them, and is possibly even counterproductive.

An admission
Although I have been a compulsive reader since the age of 5, when I was assigned a book to read over the summer before my freshman year of college (it was "The Machine in the Garden," by Leo Marx), I didn't read it.  And I am not alone.  Many people--even many teachers at Leafstrewn--just don't like assigned reading.  Designing a summer reading program that actually encourages reading may require some unnatural teaching.

Friday, March 22, 2013

The Common Core is not "research-based" and its "efficacy" is unproven

In poking around looking at PARCC (the new testing associated with the Common Core) I stumbled onto an interesting document: the "publishers' criteria" set out by the two lead authors (Coleman and Pimentel) of the ELA Common Core "State" Standards. These criteria are intended to guide publishers in their development of curricular materials for teaching under the Common Core regime. (Although I don't actually know any teachers who use a publisher's curriculum materials, many must do so, since producing these materials seems to be a huge industry.)
Here's the conclusion--again, this is Coleman and Pimentel telling publishers how to create materials for teaching to the Common Core standards:

Curriculum materials must have a clear and documented research base. The most important evidence is that the curriculum accelerates student progress toward career and college readiness. It can be surprising which questions, tasks, and instructions provoke the most productive engagement with text, accelerate student growth, and deepen instructor facility with the materials. A great deal of the material designed for the standards will by necessity be new, but as much as possible the work should be based on research and developed and refined through actual testing in classrooms. Publishers should provide a clear research plan for how the efficacy of their materials will be assessed and improved over time. Revisions should be based on evidence of actual use and results with a wide range of students, including English language learners.

This paragraph may seem fairly reasonable on its face; nevertheless, I have two thoughts about it.

I. The Common Core does not, itself, "have a clear and documented research base"

My first thought is that what Coleman and Pimentel say about publishers' Common Core-aligned materials seems quite relevant for the Common Core itself.  It is as if Coleman and Pimentel had realized all the things that should have been done, but that they didn't do, in writing up educational standards for the entire country.

The Common Core is hardly "research-based"; the research base on which it rests is incredibly flimsy.  The authors of the Common Core make their case, such as it is, in an appendix purporting to offer, among other things, "research supporting key elements of the standards."  The relevant section of this appendix is only three pages long and offers "research" worthy of an undergraduate paper or a blog post, not of a major national endeavor.  Its three pages make roughly the following case:

1) College performance correlates with the ability to read and understand difficult texts, and especially expository texts. (This is probably true.)

2) Complexity of texts assigned in high school has declined over the last 50 years, and high school students read relatively little expository text. (This may be true.)

3) Therefore, students need to be assigned more difficult texts, and more of those texts should be expository. (This third part neither follows logically nor is supported by empirical data.)

As I noted, the first part of this argument seems very likely, and the second is plausible, but the third part is very problematic, neither seeming logical nor being supported by the historical record or empirical research. The gap in logic is obvious: just because kids who are better at reading complex expository texts do better in college does not mean that most of the reading kids do in high school should be of complex expository texts, and just because the complexity of texts assigned in high school has declined somewhat over the past several decades does not mean that assigning more complex texts is the right remedy.  For these conclusions to be valid, there would have to be empirical support either of the historical or experimental variety.  There is neither.

First, the historical fallacy.  Robert Hass tells us that all poetry is about loss.  The same is true of the stories Ed reformers tell—we have lost the good old days when teachers taught more rigorously and even poor students could achieve like the rich kids.  The problem with this story is that it is by no means clear that students were more college-ready 50 years ago.  The only historical evidence the Common Core authors cite is what they call a "statistically significant" decline in adult reading proficiency.  What is the actual decline? From 15% in 1992 to 13% in 2007.  This hardly seems precipitous; it covers fifteen years, not fifty; and it is belied by (slightly) rising reading scores on the NAEP.  The historical record does not provide a clear argument for making students read more difficult texts or having more of those texts be expository--and Ed reformers, unlike poets, don't have the excuse of poetic license. (For a longer analysis of this historical fallacy in the stories Ed reformers tell, see here.)

Second, there is a shocking lack of experimental data.  As usual in writing about education, the discussion in this Common Core appendix is a mish-mash of much unfounded assertion and some offhand citations of actual empirical research; also as usual, even when there are references, the articles cited often fail to support the assertion.  I'll discuss just one example--the first specific citation I looked at--but they are almost all equally embarrassing.

     References that don't support what they are cited to support

I am skeptical of the idea that students need to read a lot of specifically expository text; my suspicion, based on my own and others' experience, is that the key thing is simply to read a lot, and that a high volume of reading even of trashy airport thrillers will lead, with only a bit of specific practice, to skillful reading of expository text.  The Common Core document states that "students need sustained exposure to expository text to develop important reading strategies." Now, I doubt this very much, but while I only have anecdotal evidence (me, Malcolm Gladwell, and everyone else of our generation, who learned to read before reading strategy instruction was current) to support my skepticism, the Common Core folks seem to have more: they cite several articles to support their statement.  I looked at the first citation (Afflerbach, Pearson and Paris, 2008), a 2008 overview from "The Reading Teacher" about the difference between strategies and skills (unsurprisingly, the article suggests that skills are unconscious, strategies are conscious, and you need both; I agree, but I'm skeptical that these things can be explicitly taught or usefully assessed)--the article does not support the assertion it is cited to support.

I have now read the article three times looking for evidence that "students need sustained exposure to expository text to develop important reading strategies," and I have so far failed to find any evidence at all.  Thinking maybe I'd missed it, I searched for "expository," for "exposure" and for "sustained".  None of those words appears in the text.  Neither does "non-fiction."  Looking through again, the closest I could come to anything supporting the claim that you need to read a lot of expository text in order to develop reading strategies is the following very general assertion about practicing strategies, which is completely untethered to any actual data:
"The scope and complexity of these strategies are large, and there is ample variety of text difficulty and genre variety to practice so that the skills become automatic. The general rule is, teach children many strategies, teach them early, reteach them often, and connect assessment with reteaching." 

First of all, we should note that this "general rule" is, like most general rules in writing about education, totally unproven and highly dubious. Many people, like me, like Ben Franklin, like Frederick Douglass, received virtually no formal instruction, and absolutely no assessment, in reading strategies, and yet learned to be highly skilled readers and writers.  Second, this brief, unfounded passage bears little relation to the Common Core's assertion that you need to read a lot of expository text in order to get better at it. If this-- "there is ample ... genre variety to practice so that the skills become automatic"--is supposed to mean this--"students need sustained exposure to expository text to develop important reading strategies"-- then either I am a bad reader, David Coleman is a bad reader, or he simply has no idea what a "clear and documented research base" would mean in a field that was, unlike education, actually scientific.

Again, the first citation offered by Coleman and Pimentel to support one of their central claims provides absolutely no support; far from reporting research or empirical data, the article never even mentions the matter at hand.  This is still amazing to me, despite my having found this to be the case with so many other supposedly "research-based" recommendations to teachers.

II. Books and (maybe) a teacher are all students really need

The other thing to notice about the Common Core’s recommendation to publishers is that although the authors say that most of the materials "will by necessity be new", this is probably untrue. In fact, developing new materials may be unnecessary, since books alone would seem to fit most of their requirements.  Books support student "readiness for college."  Books have "a clear and documented research base."  Books have a long history of "actual use" with a "wide range of students."

The best ELA program for infants is simply a lot of natural adult talking and reading aloud, and the best college ELA curriculum is simply good books and an engaging professor, but somehow schools have fallen into a muddy puddle of "instruction" and "curriculum materials." Also, one of the major stated purposes of the whole Common Core/PARCC effort is to make sure young people are prepared for college.  I wonder why, then, the best private high schools and the best private colleges aren't using publisher-created curriculum.  Is it possible that Andover and Harvard are delivering a sub-par product?  Perhaps--but it is far more likely either that their curriculum is better or that the curriculum really doesn't matter all that much.

The Common Core anticipates that the questions, tasks and instructions used with readings will be created by publishing companies.  This raises the question of what role, if any, we teachers are supposed to play.  I suppose eventually we will be replaced by computer programs.  That might be okay...


Except that it's not okay. Children need human connection, and the best thing that we do every day is provide that connection.  If scholars and bureaucrats with zero teaching experience can tell us what we can do that will help us connect better, great!  But all too often--that is, almost all the time--their recommendations are shockingly unfounded on any empirical data.

Many of the CC's specific recommendations—like the suggestion that students focus on close reading, or the observation that “it can be surprising which questions, tasks, and instructions provoke the most productive engagement with text”—seem reasonable, but the hypocrisy and hubris of the whole enterprise give off a very questionable smell.

Wednesday, March 20, 2013

Trying to peek through the PARCC gates

I have been trying to learn about the new assessments that are coming down the pike, and it has not been easy.  All I can find out for sure is that the new testing regime--known as PARCC (Partnership for Assessment of Readiness for College and Careers)--will be computer-based, instead of paper-and-pencil, that it is tied to the new Common Core Standards, and that it is supposed to be fully operational in two years, during the 2014-2015 schoolyear.  

What I suspect is that the new testing regime will end up meaning more testing than what we have now, and that it will require enormous expenditure of money and time, but despite a couple of hours of poking around the internet, I haven't been able to find any very clear discussion of what the testing will actually look like.  I have a few very basic questions, none of which have been answered by the many pages of the PARCC website and the state DOE website (which just sends us back to the PARCC website!), nor by the dozens of other articles and websites I have consulted.  Here are my questions:

1)  What will these computer-based PARCC tests mean for schools, like ours, that don't have nearly enough computers?

Our school has about 500 students in each grade, and we have nowhere near 500 computers, so giving a computer-based test would mean either buying a LOT of new machines or...

2) Could these tests be given to different students on different days?

One way to handle the issue of not having enough computers would be to do the testing over a period of several days.  This would be a logistical headache, but could work.  This would mean, however, that students who took it later might hear about the content of the exam from students who had already taken it.  With paper-based tests that is not possible, as we in Massachusetts have just learned. Whether the computer-based tests would have so many possible questions that this wouldn't be an issue is unclear; I'm skeptical.

3) Will we need to add huge amounts of bandwidth?

The answer to this seems to be yes--but I don't know how much, nor how much it will cost.  I've read a report about the technical requirements of states that have already done computer-based testing, and the report says we will need a fair amount, and that "this bandwidth requirement must be in addition to the normal day-to-day bandwidth capacity needed for teaching and learning, communications, and management/accountability systems."  In other words, whatever bandwidth we already have shouldn't be used for the testing; the bandwidth for testing will all have to be newly installed.

4) Will there be more testing than we have now?

This is perhaps the most important question, and it is striking how completely everybody seems to avoid it.  The PARCC consortium, the Common Core people, the state DOE, and so on--none of their websites address the question of how much testing will be done.  Nevertheless, the system is clearly being set up to test kids several times a year; one of the great virtues of the computer-based testing is supposedly that the results will come back more quickly.

Slide 5 of a PowerPoint presentation given by a PARCC official to a PARCC meeting in Massachusetts seems to show 5 assessments over the course of a year: Diagnostic Assessment; Midyear Assessment; ELA Speaking and Listening Assessment; Performance Based Assessment; and an End-of-Year Assessment.  If this model were to be adopted, it looks to me like kids would take 5 state tests in several subjects every single year.  That's a lot!

5) What do we take away from this?

The fact that PARCC and the DOE are so remarkably unforthcoming about their plans, and about the cost of their plans, does not strike me as auspicious.  My school is spending a lot of time and energy this week just on giving our tenth graders the English MCAS.  What I'm learning about PARCC makes me worry that much more testing, and much more expense of time and money, is on the way.

Tuesday, March 19, 2013

Why can't US experts remember that reading builds vocabulary?

An Edweek blog post (h/t SK) asks, "What are the best ways to help students -- mainstream and/or English Language Learners -- develop academic vocabulary?"  The author of the blog answers the question himself, then offers advice from a number of other experts.  In the blog post proper, five experts offer advice; not one mentions reading.

Instead, we get wisdom like "Select vocabulary strategically."  Hm.  There is also a wonderful Freudian slip in the author's step-by-step instructions on how to teach a word list.  He would have us first ask kids what they think the words mean, and then: "The following step is to illicit these student ideas and and guide them toward an accurate definition of each word, which they then write down."

Only in an appended group of reader-tweeted responses to the questions does someone finally remember that reading might be a good way to learn academic vocabulary: "Students develop their academic vocabulary best by reading academic texts on topics they are interested in."  Curious, I looked up the tweeter.  Who was this wise person who remembered reading?  Not a US expert, but "first and foremost a family person: a mother of two wonderful boys, my husband’s wife, a daughter, a sister, an aunt…" She's also, in her professional capacity, a reading teacher in a college for tourism studies in Slovenia.

Why can't US experts remember that reading builds vocabulary?  Why is it only professors of tourism studies in Slovenia?!

Who knew: Singapore is a den of Dewey-eyed hippies!

On this snow day, I'm about to get down to the business of reading what my students have written about the books they're reading, but I just spent about ten minutes psyching myself up by reading a bit of Pasi Sahlberg's book about Finland.  I haven't looked much into Finland, because it seemed to me that a lot of people who hold it up as a model are ignoring the huge differences between Finland's relatively equal society and our relatively cutthroat one, but Sahlberg's book is interesting.  One thing I learned from it has nothing to do with Finland at all, but with a country I think of as dramatically different: Singapore.

I always thought Singapore was a super-conservative city-state that kept its culture business-friendly by such illiberal practices as cracking down on freedom of speech, executing a lot of people, and caning schoolchildren.  It may be that way, but this morning I learned (maybe everyone else already knows this?) that its Ministry of Education has been, over the past fifteen years, promoting a vision of education that even A. S. Neill might have admired.  Singapore's initiative is called "Teach Less, Learn More."

That is a beautiful slogan, and it's elaborated with a full-on liberalizing zeal that must be partly necessitated by Singapore's history of cane-wielding Gradgrindian severity but was still inspiring even to me.  We in Leafstrewn are, I think, with them in theory, but not always in practice.  So during this week in which Mother Nature seems to be offering her own critique of high-stakes testing, here are some Singaporean lessons (from the Ministry of Education's website):

Remember Why We Teach -
More… For the Learner; Less… To Rush through the Syllabus
More… To Excite Passion; Less… Out of Fear of Failure
More… For Understanding; Less… To Dispense Information Only
More… For the Test of Life; Less… For a Life of Tests

Reflect on What We Teach - 
More… The Whole Child; Less… The Subject
More… Searching Questions; Less… Textbook Answers

Reconsider How We Teach - 
More… Engaged Learning; Less… Drill and Practice
More… Differentiated Teaching; Less… ‘One-size-fits-all’ Instruction
More… Guiding, Facilitating, Modelling; Less… Telling
More… Formative and Qualitative Assessing; Less… Summative and Quantitative Testing
More… Spirit of Innovation and Enterprise; Less… Set Formulae, Standard Answers

Friday, March 15, 2013

How many books have our students read this year so far?

My assumption has long been that our students are not reading enough, and that the most important thing we could do to help them improve in this essential area would be to help them read more.  I just did a quick check of their reading logs and my own records, and we put up some stickers on our chart--so I'm going to spend a quick post on how it's going.

Last Year's results
Last year I asked this same question about the students in my own classes and the students in the academic support program I was working in, and the results were interesting:

Students in the academic support program (from both Honors and Standard classes): 6 books in ten months (on average)

Students in my classes (all Standard classes): 8.5 books in ten months (on average)

Both groups were reading, on average, less than a book a month--far from ideal.  (I'd say the average should be three books a month, but I would be happy with two.  My 12-year-old reads four or five every month, and it doesn't take up much of his spare time.)

This year: slight improvement

I have made an even more concerted effort this year to get my students to read more.  We have done more independent reading, and less reading of whole-class texts.  I am pretty sure there has been a real opportunity cost to this focus, but it has at least resulted in somewhat more reading.  This week my students and I calculated their reading, and the results were:

Ninth graders in my (standard-level) classes this year so far: 8.2 books in 7 months (on average)

(This average is skewed a bit by a few kids who are reading a lot (over 20); the median is 6.)
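That kind of skew is easy to illustrate with Python's statistics module; the book counts below are invented to mirror the pattern, not my students' actual logs:

```python
from statistics import mean, median

# Hypothetical reading logs: most students cluster in the middle,
# while a couple of heavy readers (20+ books) pull the average up.
books_read = [2, 3, 5, 6, 6, 7, 8, 22, 25]

print(round(mean(books_read), 1))  # the mean is pulled up by the outliers
print(median(books_read))          # the median stays at 6
```

This is why I report the median alongside the average: a few voracious readers can make the whole class look like it's reading more than the typical kid actually is.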

This is better than last year--by the end of the schoolyear, the average should be up to 11 or 12 books--but it is still not great.  Two of my 32 ninth grade students have only read two books all year.  (On the other hand, one of those students is actually a success story, having read zero books in the first three months and two books in the second three months.)

Does more reading mean increased vocabulary and improved skills?
So, I have been somewhat successful at getting my students to read more.  What remains unclear is how much benefit my students have gained from the increased reading.  My assumption is that the increased reading volume will mean increased vocabulary, improved reading comprehension skills, and perhaps even improved writing--but I don't know if that's true, and I can't check in a very reliable way, because I really only have a good baseline for their vocabulary.

Interestingly, whether their skills are improved or not, the reading is not obviously correlated with how well the students do in school.  One of my students, who has failed ninth grade twice already and is now taking both my ninth grade English class and a tenth grade English class, has read 14 books this year.  He tells me he didn't finish any last year.  Nevertheless, he is failing both my class and the tenth grade English class--because he never does any written work outside of class.  In fact, as I write this, I am waiting for him to show up for an afterschool appointment to do some of his missing work.  He's not going to show; we need a new system to help kids like him!  Success or failure in school has less to do with skill than with being able to get the work done--we need to figure out how to help them do it.  I've figured out how to get this kid to read books; now I need to work on the writing piece.

I'll look into this again in June. Until then, I'll keep trying to get them to read, and I'll get the Dean to help me get my failing students to come after school to do their work.

Wednesday, March 13, 2013

Students who need to read in school

It works to make kids do their homework in school

Leafstrewn has just gone through a remarkable episode in which the headmaster's proposal to cut the administrator of an alternative program for kids who are having a lot of trouble in school was met with such opposition from the faculty that she changed her mind.  The episode was remarkable mostly for the teachers' support for a program serving a small fraction of our students, support that will mean losing teacher jobs, but also because the headmaster admitted, quite gracefully, that she had made a mistake.  Everyone came out looking pretty good, and I was proud of our school.

The episode also reminded me of how effective this particular alternative program has been. Five students from my ninth grade classes last year are now in the program, and four of the five have spoken to me positively about it (the fifth I haven't happened to bump into). As the potential cut to the program hung in the balance over the past week, I've been thinking a lot about what has made it so successful for my former students.  It certainly helps that the program has very good teachers and a caring and very hands-on administrator, but perhaps the key element of the program is that the students are made to do their homework in school.  As one of them told me recently, after I'd asked if he was doing the assigned reading, they have to get the work done, or they don't get to go home at the end of the day.

This is a simple but extremely important practice.  It seems indisputable that the number one proximate cause (as opposed to indirect factors like poverty) of student failure at Leafstrewn is failure to do homework.  Having an even better teacher might help, being more genetically gifted might help, and coming from a richer and more educated family certainly helps, but the line separating passing from failing is basically whether kids get their homework done. Certainly, had my five students from last year who are now in our alternative program all done their homework, they would easily have passed English for the year.  (Another program here, the "Tutorial" program, essentially gives students time, space and encouragement to do their homework in school.) This makes me wonder, again, about the value of homework, and it makes me think that Bruce Baker is right and the (modest) success of programs like KIPP is mainly due to spending more money on, among other things, a longer schoolday.  It also makes me think, again, about the value of reading in school.

A student who doesn't read

I had a chat today with a former student who isn't in this alternative program.  I'll call him "Billy."  Billy is a bright, polite, very appealing kid whose family has not had it easy, to put it mildly.  When he was in my ninth grade class two years ago, Billy told me regularly that he wanted to drop out of school so that he could get a job and help support his family.  Like everyone else, I told him that he would be able to help his family better if he stayed in school.  Billy passed my class, barely.  He read virtually none of the assigned reading or our whole-class texts.  He did, however, read four or five Alex Rider books (which are at about a fifth grade level).  Those may have been the only books he has completed in high school.

Billy told me he had failed English in 10th grade and he was in danger of failing this year as well. I asked why. He said: "Because I don't do my homework."  I said, "Do you do the reading?"  He said, "No.  Never."  I said, "But you read in my class.  I remember you reading Alex Rider."  He said, "Yeah, because we read in class. I don't read at home.  I never read at home in your class either."  I don't think that's strictly accurate--I remember him reading Alex Rider at home as well--but it gets at an important truth.

Reading is the most important academic skill.  As I wrote last year, "Our school has kids under its control for over six hours a day.  There is no good reason we can't have them sitting and silently reading books for at least an hour each day.  Nothing else we do with them is as important; nothing else would be as efficient, productive, and individualized." Or, as a Mexican novelist recently wrote in the New York Times, "One cannot help but ask the Mexican educational system, “How is it possible that I hand over a child for six hours every day, five days a week, and you give me back someone who is basically illiterate?”"

If we think reading is valuable, we should be using time in school to have them do it. That a child as bright and wonderful as Billy--a student who devoured Alex Rider books during a six-week independent reading unit--can have read fewer than ten books over his whole high school career is shameful for our school.  We have taken care of Billy in many ways, and we have always treated him with gentleness and respect, but we have mostly not done for him what we would do for our own children--given him books he can read and time to read them.

Homework in School and Social Justice

The proximate cause of these students' failure is that they don't do their homework, but they all share a deeper cause as well: Billy, the students I had last year who ended up in the alternative program, and every student I have this year who is in danger of failing, all come from families that are at the low end of the economic spectrum.  If we really want to "reform" education in a way that will help poorer children succeed, we should start by finding a way to do what our excellent alternative program here at Leafstrewn does, and give kids time to do their homework--especially their reading--in school.

Friday, March 8, 2013

Recording reading conferences

A month ago I wrote about reading conferences, and I said that I wasn't sure how to keep good records of the discussions I had with students.  Some of my brilliant colleagues, who have been thinking about these kinds of discussions too, have started recording their conferences using a voice recorder. Today I started doing this too, using my iPod.  I have the kids read to me for 30 seconds or a minute from wherever they are in the book they're reading independently, and then I ask them some questions.  I usually start with something like, "What do you think the author is doing in this passage?"  I try to follow up with "What specifics in the passage do you see that make you say that?"  And then we go from there.

A few thoughts after my first day:

1) It's a good thing we get used early to the look of ourselves in mirrors; I'm not yet used to the sound of my voice on a recording.

2) Maybe if I did this more my voice would improve?

3) All of my students have a fair amount to say about what they're reading--that's good!

4) I wonder if they would have as much to say if they were talking about a whole-class text.  I suspect that talking about the independent reading book makes the students the experts, and therefore empowers them--but I don't have a control group, so I don't really know.

5) It takes longer than I realized to get at an important question, or to get to something that makes the students stop and think.  When I wrote about this a month ago, it seemed easy and quick to get to those points; I think I had had an exceptionally good day of conferences just before I wrote that post. (I wrote, "One thing that's striking in doing these one-on-one conversations is how quickly we get to points at which the students need to stop and think before they respond.")  Listening to today's recordings, I am struck by how relatively smooth the conversations are.

6) I wonder if this is partly because in today's conversations I allowed the students to direct the conversation more than I often do.  A couple of days ago I was talking to colleagues about conferences, and we discussed using open-ended questions, so today I tended to start the same way with everyone, and I didn't have the explicit goal of coming up with a question that made the student stop and think. This way of questioning, which was partly modeled on the VTS method ("Visual Thinking Strategies"), showed me more clearly what the student was capable of on her own, but didn't lead to the "stop and think" moments that last month I was apparently so proud of facilitating.

7) Nevertheless, I do get something out of these conversations--they are possibly useful assessment tools.  From today's conversations I learned that: student A doesn't know the historical background, so can't get the humor and nuance of the conversation; student B almost never refers back to the text, even when asked repeatedly about "specifics in the passage"; student C makes great connections between this scene and other parts of the book; students B, D and F are not very fluent or accurate in their reading aloud, but B and D nevertheless seem to understand the passage perfectly; student E doesn't seem to remember anything from the book except what he's just read to me; etc.

8) It seems to me that these conversations might be pretty good for diagnosing issues--and for instruction, as I was thinking before--but again, as with all instruction, it is SO INEFFICIENT!

9) My main instructional function in these conferences seems to be to push students toward a closer attention to the text and toward deeper thinking. This is really difficult!

10) I still haven't figured out how to keep a good record of these conferences.  In one sense I have excellent records--I have audio recordings--but in another sense these records are uselessly unwieldy.  From 13 conferences I have a little over an hour of audio.  I did not try to take notes at the same time--but I probably should, so that I can later check my notes against the audio and improve my note-taking...

11) I am still haunted by the sheer slipperiness of trying to improve reading comprehension.  I do think that making kids look more closely at text is a worthwhile exercise, but I also wonder how much my own reading comprehension "skills" have improved in the years since I was 12 or 13.  I know much more (background knowledge, vocabulary, literary terms, etc.), but are my actual reading skills (questioning, inferring, making connections) any better?  I am reading an apparently excellent 2005 overview of the research on comprehension acquisition (Perfetti, Landi, Oakhill), and it seems that their recommendations for instruction center on: (1) reading more (and making sure the reading is "successful"--i.e., comprehensible input); (2) instruction that tries in various ways to get students to pay more attention to the text and to their own understanding of it ("monitoring").  I take some comfort in thinking that reading conferences are one way to try to do #2--and that while I'm conferencing, the rest of the kids are doing #1.

I look forward to recording more conversations with students about what they're reading next week, and trying to put together the data into some kind of coherent record.  For now, I'm going to just try to get used to the sound of my own voice.

"English" class and language classes

Last summer I spent a few days with a childhood friend, Becca Lynch, who's now a Spanish teacher in Maine. I told her I'd been thinking a lot about how to teach reading and worrying that kids weren't reading enough--and she said, Oh, that sounds a lot like what I've been thinking about recently with my own Spanish teaching.

Becca told me that over the past few years she had modified her teaching approach to try to focus mainly on making sure her kids heard or read as much comprehensible Spanish as possible. Instead of explicitly teaching a grammar rule, or a verb tense, or whatever, she tried to create experiences in her classroom that would allow for these aspects of the language to be heard in a meaningful and understandable context. This "comprehension-based instruction" had radically improved her teaching practice, and she was enthusiastic about the conferences, books and blogs that had helped her figure out how to do it.

Becca often used the phrase "comprehensible input", a phrase I recognized, and we soon figured out that one of the gurus of her style of language teaching, Stephen Krashen, was also one of the proponents of independent reading in English class. I knew that Krashen had done most of his work on learning foreign languages, and I had read a lot of Krashen's articles (and his book, "The Power of Reading") about reading, but I didn't know anything about how these ideas were actually put into practice in, say, a high school Spanish class. My friend sent me a couple of links the week after we saw each other, but I was pressed for time and didn't really look at them.

Last week, Becca sent along another link, this time to a blog by a Spanish teacher. When I looked at it, I thought, wow--I have to talk to the World Languages people at Leafstrewn! And it renewed my desire to make my classroom full of meaningful reading and writing--comprehensible input.

I guess my questions are: what do World Language classrooms look like at Leafstrewn? (I need to come visit some!) Are my WL colleagues focused on this "Comprehensible Input" idea, or does it seem like old hat or like a new gimmick? How does teaching reading comprehension in English class relate to teaching comprehension of Spanish in Spanish class? What about "TPRS" ("Teaching Proficiency through Reading and Storytelling")--is anybody doing that? What about the "embedded reading" idea, in which students start with an easy version of a text and then read gradually more complex versions--do any Leafstrewn WL teachers use that? Could "embedded reading" be done in an English class? And so on.

I haven't gotten any answers yet, but I need to start asking some colleagues.

Wednesday, March 6, 2013

Thousands of little countries that don't read

There's an interesting op-ed in the Times today about reading in Mexico ("The Country that Stopped Reading").  Its author, David Toscana, an eminent Mexican novelist, argues that his native country, which took next-to-last place in a Unesco survey of reading habits, ought to encourage a love of reading by providing students with enjoyable and compelling texts and giving them time to read in school:

One cannot help but ask the Mexican educational system, “How is it possible that I hand over a child for six hours every day, five days a week, and you give me back someone who is basically illiterate?”


A few years back, I spoke with the education secretary of my home state, Nuevo León, about reading in schools. He looked at me, not understanding what I wanted. “In school, children are taught to read,” he said. “Yes,” I replied, “but they don’t read.” I explained the difference between knowing how to read and actually reading, between deciphering street signs and accessing the literary canon. He wondered what the point of the students’ reading “Don Quixote” was. He said we needed to teach them to read the newspaper. 

When my daughter was 15, her literature teacher banished all fiction from her classroom. “We’re going to read history and biology textbooks,” she said, “because that way you’ll read and learn at the same time.”

The US is very different from Mexico, but I'm afraid that the Common Core and other Ed Reform efforts are taking us in the wrong direction.  We, too, can learn from Toscana's prescription: give kids enjoyable books and time to read.  Thousands of our children are not getting that, and thousands of our kids are, in themselves, little countries that don't read.