As many experts raise questions about the future of "the nation's report card," the governing board for the assessment program is exploring changes aimed at leveraging the achievement data to better inform education policy and practice.
The core idea, outlined in a report to the board, is to expand and make far greater use of the background information collected when the National Assessment of Educational Progress is given. In doing so, the report suggests, NAEP could identify factors that may differentiate high-performing states and urban districts from low performers.
The effort, it says, would parallel the extensive reporting of background variables in global assessment systems, such as the Program for International Student Assessment, or PISA.
The report was released just weeks after the Obama administration proposed a fiscal 2013 budget that would cut NAEP by $6 million, while funding a pilot program of state participation in PISA.
"Currently, the NAEP background questions are a potentially important but largely underused national resource," says the report by a six-member expert panel commissioned by the National Assessment Governing Board, or NAGB, which sets policy for the testing program. "These data could provide rich insights into a wide range of important issues about the nature and quality of American primary and secondary education and the context for understanding achievement and its improvement."
In addition, the report says NAEP background questions could help track policy trends, such as implementation of the Common Core State Standards or new teacher-evaluation systems.
The report, presented this month to NAGB at a meeting in New Orleans, was apparently well-received by many board members, including the chairman, former Massachusetts Commissioner of Education David P. Driscoll. But some of the ideas are generating pushback from current and former federal officials.
"NAGB has a tool that they want to use for everything," said Mark S. Schneider, a former commissioner of the National Center for Education Statistics, the arm of the U.S. Department of Education that administers the test. He argues that NAEP should stick to its core strengths, namely measuring student achievement and serving as a benchmark for state assessments.
"I find this just a distraction," Mr. Schneider said of the proposed plan.
Causation vs. Correlation
Although the report emphasizes that correlations, such as between math achievement and rates of absenteeism, must not be mistaken for causation, Mr. Schneider argues that such distinctions would be lost on the public and risk damaging NAEP's reputation.
"They will make statements that will inevitably push the boundaries, and you will end up with questionable reports, in my opinion," said Mr. Schneider, who is now a vice president of the Washington-based American Institutes for Research. Other concerns raised about the proposals include the cost involved, especially given the president's proposed cut to NAEP, and what some experts say may be resistance, on privacy grounds, to the federal government's collection and reporting of more information on students.
The new report, commissioned by NAGB, notes that complementing the NAEP tests is a "rich collection" of background questions regularly asked of students, teachers, and schools. But the collection and the public reporting of such information have been significantly scaled back over the past decade, the report says.
"NAEP should restore and improve upon its earlier practice of making much greater use of background data," the report says, "but do so in a more sound and research-supported way."
It offers recommendations in four areas related to the background questions: asking "important questions," improving the accuracy of measures, strengthening sampling efficiency, and reinstituting what it calls "meaningful analysis and reporting."
It's the fourth area, analysis and reporting, that is proving especially controversial.
Marshall S. "Mike" Smith, a co-author of the report and a former U.S. undersecretary of education in the Clinton administration, notes that the report comes at a time when NAEP's long-term relevance is at issue. He cites the work to develop common assessments across states in English/language arts and mathematics, as well as the growing prominence of international exams, like PISA.
"The future of NAEP is somewhat in doubt," Mr. Smith said.
PISA's use of extensive background questions, he said, has enabled it to have wide influence.
"They've built narratives around the assessments: Why are there differences among countries" in achievement, he said. "We can't do that with NAEP. We're not able to construct plausible scenarios or narratives about why there are different achievement levels among states. And we've seen that can be a powerful mechanism for motivating reform."
Mr. Driscoll, the chairman of NAGB, said the next step is for board staff members to draft recommendations on how the proposed changes could be implemented.
"I have challenged the board to think about how NAEP and NAGB can make a difference and have an impact," he said. "There is some very valuable information that we can lay out ... that would be instructive for all of us."
The report makes clear that NAEP should not be used to assert causes for variation in student achievement, but that a series of "descriptive findings" could be illustrative and help "generate hypotheses" for further study. For example, it might highlight differences in access to 8th grade algebra courses or to a teacher who majored in math.
"A valid concern over causal interpretations has led to a serious and unjustified overreaction," the report says.
But some observers see reason for concern.
"It's a mistake to present results that are purely descriptive," said Grover J. "Russ" Whitehurst, a senior fellow at the Brookings Institution in Washington who was the director of the federal Institute of Education Sciences under President George W. Bush. "It is misleading, and it doesn't make any difference if you have a footnote saying these results should not be considered causally."
Jack Buckley, the current NCES commissioner, expressed reservations about some of the suggestions, especially in the analysis and reporting of the background data.
"The panel is looking toward PISA as an exemplar," he said. "Folks at [the Organization for Economic Cooperation and Development, which administers PISA] write these papers and get a broad audience, but it's not always clear that the data can support the conclusions they reach about what works."
Mr. Buckley said he understands NAGB's desire to be "policy-relevant," but he cautioned that "we have to carefully determine what is the best data source for measuring different things."
Mr. Driscoll said he's keenly aware of the need not to go too far in how the background data are used.
"I agree ... that we have to be careful about the causal effects," he said. "I think we've gone too far in one direction to de-emphasize the background questions, and the danger is to go too far in the other direction."