Those of you who have been with me for a while may recall a post I wrote on educational research, in which I stated my overall disdain for most research. One set of data beloved of our current administration’s education minions is the NAEP, the National Assessment of Educational Progress, the “only test administered across the nation in all the states.” In fact, it bills itself as “The Nation’s Report Card.” That’s how important it is.
Back when I first became aware of the NAEP, I thought, well, at least it’s consistent, right? Only sort of no: after I thought about it, I realized that my school was not taking this test every year, and in fact I didn’t remember it ever taking the test. So where is this report card coming from?
The way it works is that schools are randomly selected from across the country each year to be the data sources. Well, that’s okay, sort of, because you can get a “scientific sample” to give good data.
Only sort of no: within each randomly selected school, a small number of students is chosen to take the test. It’s not even random, because selected students and their parents must agree to be the guinea pigs. This small number of students then takes the test, which lasts less than an hour, and it’s from this set of data that we get our Report Card.
That’s right: when you read about how “scores are up” or “down” or our children are the stupidest of all civilized nations, this is the test they’re talking about. A 50-minute test administered to a not-quite-random sample of students scattered across the nation.
It gets worse. I often wondered at the drop-off in scores between elementary and high school. You’ve read about that: how our 4th graders are up there with the rest of the world, but somehow they all get stupid by 8th grade (which makes sense, if you know middle school), and then get even stupider in high school. How does this happen?
As a principal of my acquaintance told me, when she was in charge of the test at the high school, the only students who would agree to take it were the losers who simply didn’t want to be in class. Not one of our best and brightest was included. The AP kids, the gifted kids, didn’t want to miss class.
At least at the elementary level, they work their little hearts out on the test. We tell them it’s important, and they believe us. The scabs at the high school don’t care whether it’s important or not. They just slough their way through it and kick back, enjoying their hour of freedom before sauntering back to “Life Skills Math.”
Nor does the NAEP gather any data about what might produce a school’s scores. Funding? Nope. Funding for the media center? Nope. What reading series K-3 classrooms use? Nada. They scope out the kids’ home language and whether the family owns a computer, and that’s about it. (Their database of info on schools only includes data from the 2003–2004 school year. Their population data for my school, for example, is about 200 students off. Next year, if they follow the same schedule, they’ll be about 500 off.)
So there it is. The next time you read about the requirement to use “research-based strategies” in improving student “achievement,” remember that it’s from the same people who bring you the “Nation’s Report Card,” with about the same level of rigor in their “research.”