We got an email this morning: the MCAS results from last year are out, and in Leafstrewn they are good. Ninety-six percent of our 10th graders scored at either the proficient or advanced level. Ninety-six percent seems excellent, and after I read the email about it I thought, “Wow, that’s cool! Our kids are good readers and writers! We are great teachers!”
Then, while waiting in line at the Registry of Motor Vehicles, I became a little less cheery. Maybe the Registry had something to do with my mood swing, but I also happened to pick up today's Boston Globe while I was in line, and I read a story about last year's MCAS scores, in which I learned that 88% of all students statewide scored at the proficient level or above. 88% is not so different from 96%, so my students and colleagues in Leafstrewn seemed a little less special, and I even started to wonder whether the results were reliable.
Depending on how you define "proficient," it does seem possible that nearly all 15-year-olds in Massachusetts are proficient at reading and writing. But it also seems possible that the scores are getting better partly because the test is getting easier. MCAS scores have been improving pretty dramatically since the very beginning. Here are the percentages of 10th graders scoring proficient or above on the ELA test since 1998:
1998 38
1999 34
2000 36
2001 51
2002 59
2003 61
2004 62
2005 64
2006 70
2007 71
2008 75
2009 79
2010 78
2011 84
2012 88
Those numbers tell a remarkable story of dramatic, nearly continuous improvement. Can we believe it? I wonder. (1) If the story is not believable, then a lot of unpleasant questions arise.
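For anyone who wants to check the arithmetic behind that "nearly continuous" claim, here is a minimal sketch in Python (assuming the second "2002" entry in the list above was meant to be 2004):

```python
# Percentages of 10th graders scoring proficient or advanced on the
# MCAS ELA test, as listed in the post (second "2002" read as 2004).
scores = {
    1998: 38, 1999: 34, 2000: 36, 2001: 51, 2002: 59, 2003: 61,
    2004: 62, 2005: 64, 2006: 70, 2007: 71, 2008: 75, 2009: 79,
    2010: 78, 2011: 84, 2012: 88,
}

years = sorted(scores)
# Year-over-year changes, to see how steady the climb really is.
changes = [scores[b] - scores[a] for a, b in zip(years, years[1:])]
dips = sum(1 for c in changes if c < 0)
total_gain = scores[2012] - scores[1998]

print(total_gain, dips)  # 50 points gained over 14 years, only 2 down years
```

Fifty percentage points in fourteen years, with only two small dips (1999 and 2010), is the pattern the post is asking readers to either believe or doubt.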
I haven't given a lot of thought to testing or the questions around it, but as we get ready for a supposedly harder test that will replace the MCAS in a few years, and as we get ready for legally mandated use of gradewide assessments in teacher evaluations, we need to be thinking about this stuff. For now, I just want to know if I can possibly believe that uncannily steady improvement.
(1) Massachusetts NAEP proficiency percentages start at about the same level, 35%, in 1998, and they do go up over the next 14 years, but in ups and downs, and only to 50%, a very, very different level.
You are right to be somewhat skeptical about ever-rising scores. On the bright side, there's no way to construe it as BAD news.
In previous posts you have said that you don't think standardized tests are even good at ensuring a modicum of academic competency. I understand your point of view but respectfully disagree. I feel that this is exactly what MCAS has been good at. MCAS has forced both good and failing schools to focus hard on the students who have even a slim chance of reaching the "proficient" level and to work with them to make sure they achieve it. That is not nothing. Is it the best way to spend energy and time in a school? Perhaps not, but the high-stakes nature of the test has certainly forced everyone on board and put struggling students in a high-priority position on the radar across the state. So I would argue (at the risk of making every reader of this blog angry) that MCAS has been a net gain for those borderline students in almost all schools -- because people (teachers, administrators, the public) are PAYING ATTENTION to them.
As for the test getting easier, well, it has been erratic. The 2011 10th grade ELA test was actually quite difficult, suddenly. Last year's test was indeed easier than that in comparison.
My feeling is that preparing students for a particular test is like working on push-ups. If you took a group of students and had them work on push-ups for a year, I'm sure they would be able to do more push-ups and you would get better each year at honing your method of getting them to do more and more push-ups. Does the fact that they can do more push-ups mean that those students are in better shape? Not necessarily -- but they probably aren't in worse shape either. Now the new test will add jumping jacks and our students will no doubt get better at jumping jacks. And so on, forever and ever, amen.
Leave it to me to construe almost anything as bad news! Ever-rising test scores could be a problem for at least two important reasons: (1) they could set up impossible expectations; (2) they could make people think that high-stakes testing worked and should be expanded.
But really I haven't figured out how to think about testing. I am not necessarily against it. I have been using more and more assessments in my own classes, and I do believe that they measure something. However, I am not sure that they are reliable on an individual level, I don't think they are a good way to compare one teacher to another, and because the incentives are so significant I am suspicious of the ever-more-proficient progression of the MCAS scores, which look anything but "erratic" to me.
I don't have any experience with the tests, never having taught 10th grade, but I find it pretty hard to believe that the 2011 10th grade ELA test was "quite difficult, suddenly", since the percentage of kids scoring proficient or advanced was 5 points higher than it had ever been before. I know the state says it has a rigorous process through which it tries to make the scores from one year mean the same as the scores from the year before, but I'm skeptical. I think kids are better at doing pushups, but I don't think they're THAT much better. The percentage of MA kids scoring proficient or above on the NAEP in 1998 was 35%, similar to the MCAS number of 38%. The NAEP number has moved generally upward over the past 14 years, but in 2011 it was still only 50%, which is dramatically lower than the 2011 MCAS number of 84%. Why?
Yes, my skepticism about testing is related to my Eeyore-like pessimism about everything, but it also strikes me that a lot of the testing boosterism is overly optimistic, with impossible fantasies about infinite growth of the kind your last sentence half endorses and half lampoons.