State tests of student achievement echo state-level trends on the National Assessment of Educational Progress, considered the nation's gold standard of academic testing, more closely than has been generally recognized, according to a new study.
The study, from the Center on Education Policy, found "more agreement than is commonly acknowledged" between trends on test scores in 23 states and those on NAEP.
Sixty-seven percent of the states studied showed progress on both state tests and the national assessment in 4th grade reading between 2005 and 2009, with 76 percent showing progress in 8th grade reading, 79 percent in 4th grade mathematics, and 90 percent in 8th grade math, the report says.
The study also found, however, that while trends were positive on both types of tests, the gains were usually far larger on the state tests than on NAEP. Some experts argued that upward trends on both tests have limited significance. More telling, they said, is the fact that the rates of improvement on state tests outstrip those on "the nation's report card."
A new study compares trends on state tests and on NAEP from 2005 to 2009.
95%: Of the 21 states with sufficient state test data in grade 8 reading, 20 showed gains between 2005 and 2009 in the percentage of students reaching the proficient level on state tests.
81%: Seventeen of these 21 states showed gains during this period in the percentage of students reaching the basic level on NAEP.
SOURCE: Center on Education Policy
The CEP, a research group based in Washington, examined trends in the tests that states use for federal-accountability purposes under the No Child Left Behind Act in the 23 states that had sufficient comparable data. The study analyzes students' average scores on state tests and NAEP and compares trends in the percentage of students scoring at the "proficient" level on state tests with the percentage scoring at the "basic" level on NAEP.
The comparison of states' proficient benchmark with NAEP's basic one was informed in part by a study of 47 states last year by the U.S. Department of Education's National Center for Education Statistics, which found that most states' definitions of proficiency were closer to NAEP's basic level than to NAEP's proficient level. That study also suggested that many states might have lowered their student-proficiency bars to meet the requirements of the NCLB law. ("Test Rigor Drops Off, Study Finds," November 4, 2009.)
Upward Bound
In the CEP study, researchers found that in most states, a rise in state test scores coincided with rising NAEP scores. That was true when students' average scores were examined, as well as when the percentage scoring proficient on state tests was compared with the percentage scoring basic on NAEP.
"In nearly all cases, trends went up on both assessments," the report says. "Upward trends on both the state test and NAEP in the same state offer stronger evidence that students are mastering higher levels of knowledge and skills."
The study noted, however, that the states with the largest increases in their own test scores were often not the same states that saw the biggest gains on NAEP. Its authors say that the larger gains on state tests could be explained by score inflation driven by "teaching to the test" or by the possibility that state tests are aligned more closely to state standards. Students' motivation could also factor into the results, they say, since the national assessment is a no-stakes undertaking for students, teachers, and schools.
The authors note key differences between the state tests and the national assessment, such as their content, their proficiency cutoff scores, and what they measure. State tests reach nearly all students, for instance, while NAEP tests a representative sample of its target group.
'Lack of Agreement'
Andrew D. Ho, an assistant professor of education at Harvard University's Graduate School of Education, said he did not consider the agreement rates between state tests and NAEP to be high. By his own calculations, trends agreed 78 percent of the time across all states, subjects, and grade levels, when the agreement rate by sheer chance would be about 76 percent. It isn't meaningful, he said, to focus on agreement rates when the trends on both types of tests happen to be positive.
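To see why chance alone can produce agreement rates in that range, consider a rough illustration with hypothetical base rates (not figures from Mr. Ho's calculation): if trends rose in about 90 percent of states on the state tests and about 85 percent of states on NAEP, two unrelated trends would be expected to match roughly 0.90 × 0.85 + 0.10 × 0.15 = 0.78, or 78 percent of the time. An observed agreement rate near that level therefore says little about whether the two tests are telling the same story.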
More meaningful, Mr. Ho suggested, is the study鈥檚 finding that the rate of progress on state tests far outstrips that on NAEP.
"The take-home point isn't about agreement, but the surprising lack of agreement about the amount of progress being made," he said.
The study is notable for confirming, with NAEP trend data, the progress being made on state tests, said Jack Jennings, the president of the Center on Education Policy. But it's equally valuable for its findings on how some states' test progress diverges from that on NAEP, he said.
"What we've seen overall is an affirmation by NAEP of the general trend in state test scores. That's encouraging," he said. "But it's still worth raising questions about why states see different results [from the national assessment]."