NAEP and Reading Research

I just finished reading Gerald Bracey’s Reading Educational Research: How to Avoid Getting Statistically Snookered. It was an excellent review of my quantitative research class as well as a very real-life application of the information from that class. Bracey spends several pages discussing the National Assessment of Educational Progress, or NAEP. The test, which began in the late 1960s, was designed to be a national descriptive assessment. However, Bracey points out that in 1988, Congress changed the law under which NAEP is administered to allow state comparisons and established a governing board, whose first president added achievement levels to the test. The test became a prescriptive assessment. According to Bracey, “the achievement levels–basic, proficient, and advanced–were set ludicrously high and led to demeaning statements about the competence of students and, by implication, the competence of teachers and the quality of schools” (p. 149). Many groups have suggested that the achievement levels are “fundamentally” flawed. The problem, Bracey says, is with the word “proficient,” whose definition differs from state to state. But some folks, including Diane Ravitch (whose new book is on my shelf), would like to see a national definition and believe NAEP would be the perfect vehicle for arriving at one, something Bracey says would be a disaster.

After I closed up Bracey, I reached for this week’s Education Week. There on the front was a note about the recent results of the NAEP Economics test. The headline indicated that the results were better than expected, and the article inside reported that students had a pretty good understanding of the basics of economics. They ran into trouble only with details such as interest rates, gross domestic product, and the role of the Federal Reserve. After reading Bracey, I had to wonder how many of the students taking the test had had direct instruction in these concepts. According to the Education Week article, “87 percent of test-takers said they had received some kind of exposure to economics content during high school. Sixteen percent reported they had taken advanced economics; the greatest proportion, 49 percent, said they had taken general economics; 11 percent said they had taken a business or personal finance class; 12 percent said they had been enrolled in some kind of combined economics course; and 13 percent said they had not taken any economics course.” The article goes on to cite a federal transcript study showing that 66 percent of high school graduates had taken an economics class in 2005. So this really is an example of an invalid test, since it measures material to which the students may or may not have been exposed.

The article quotes Darvin M. Winick, the chairman of the NAEP Governing Board, who helps perpetuate the idea that our schools are failing. He says, “While there is clear room for improvement, the results are not discouraging. Given the number of students who finish high school with a limited vocabulary, not reading well, and weak in math, the results may be as good as or better than we should expect.” OUCH! There’s finally some good news about education, and this guy manages to turn it into an insult.

The issue of Education Week also has a general article about using NAEP as a way of comparing the states. The fundamental problem for now seems to be that there needs to be more information about “exclusion rates,” that is, the number of students each state excludes from taking the test or to whom it provides special help or accommodations. This information, according to the article, will be more visible in future NAEP reports. And the governing board will press for nationwide inclusion policies for the exams scheduled in 2009.

One other criticism Bracey makes of NAEP that I think is particularly important is how meaningless the test is to students. Because the results can’t be traced to an individual student, the test-takers are not held accountable, and getting kids to take the test seriously is a problem. Bracey quotes Archie Lapointe, a former executive director of NAEP: “Yes, the problem with NAEP is keeping students awake during the test” (p. 147).

The National Center for Education Statistics has a NAEP website with lots of information about the history of the test. And here’s a Bracey post from February 2007 about NAEP testing. I added his blog to my aggregator! I LOVE it when people like him use weblogs to comment on contemporary education issues.