The General Accounting Office last week issued a stinging rebuke to the National Assessment Governing Board over its effort to set standards for student performance.
Representatives William D. Ford, the chairman of the House Education and Labor Committee, and Dale E. Kildee, the chairman of its subcommittee on elementary, secondary, and vocational education, had asked the agency last fall to investigate the standards-setting activity after the N.A.G.B. fired a team of researchers who prepared a critical evaluation of the effort. (See Education Week, Sept. 4, 1991.)
In an interim report, sent last week to Mr. Ford and Mr. Kildee, the G.A.O. concludes that the effort was seriously flawed, and warns that it should not be continued without substantial changes.
It also proposes that the board be redesigned to improve the quality of technical expertise available to it.
“We found problems of procedure, of reliability, of validity, and of reporting,” Eleanor Chelimsky, an assistant comptroller general, wrote to the lawmakers. “We examined plans for further work of this sort in light of these problems and concluded that commitments to the future use of levels now being set in similar fashion seem premature.”
Roy E. Truby, the executive director of the governing board, said that the panel is taking the agency’s report seriously and that it would consider additional changes.
But he noted that the N.A.G.B. has already taken steps to ensure that the achievement levels in 1992 are sound.
“We said the first effort was a trial,” he said. “I think it’s improving.”
‘Too Much Is Uncertain’
The controversial move to set standards, adopted by the board in 1990, was aimed at substantially changing the way results of the National Assessment of Educational Progress are reported.
Unlike in the past, when NAEP simply reported how students had performed on its assessments, the new method called for reporting the results against agreed-upon standards for achievement.
Under the plan--which was first used on NAEP’s 1990 mathematics assessment, the first to contain state-level data--the panel reported the proportion of students who performed at the “basic,” “proficient,” and “advanced” levels of achievement.
In its report, however, the G.A.O. found that the information the board reported was not reliable and possibly not valid.
Moreover, it states, the board’s report on the achievement levels, which was issued last September, failed to disclose such limitations.
According to the auditors, officials from the National Center for Education Statistics, which administers NAEP, “told us that the N.A.G.B. standard-setting results, which were published under N.A.G.B.’s independent authority, probably would not have passed N.C.E.S.’s pre-publication statistical-quality review.”
The G.A.O. also points out that the governing board has hired American College Testing to conduct the standards-setting process for 1992 without ensuring that the results will be valid and useful.
“Too much is uncertain, we believe,” the auditors state, “to support N.A.G.B.’s decision to organize the reporting and analysis of 1992 NAEP results around achievement levels that may or may not be workable.”
Mr. Truby responded that the board will consider whether to continue reporting NAEP results by the traditional method as well as according to the achievement levels.
“The board has already said that achievement levels will be the primary way of reporting results,” he said. “I’m sure they will want to consider, as a result of this, whether or not that should be the only way.”
Technical Expertise
But the G.A.O. also says that, even if the problems in setting the standards can be ironed out, the approach itself may not be suitable.
For one thing, it notes, the decision to use three levels of achievement may threaten NAEP’s reliability by requiring an expansion in the number of harder items to ensure that enough fall at the advanced level.
In addition, it states, the achievement-level reports provide information only on overall performance, not on performance in different areas of math content. Such reports, it says, “cannot help educators identify specific areas of weakness in student performance.”
The auditors also question the N.A.G.B.’s ability to make sound decisions about highly technical issues.
In spite of the complexity of standards-setting, the report notes, the board embarked on the effort “with the slenderest of technical resources: chiefly, one staff psychometrician and one part-time consultant.”
To strengthen the operation of NAEP, the agency proposes two alternatives.
Under one option, which the auditors suggest is more feasible, the N.A.G.B. and the N.C.E.S. could reconfigure themselves to take advantage of the strengths of each.
In such a structure, the board would identify what is to be done; the center would then develop an approach, implement it once it was found to be feasible and appropriate, and report back to the board.
As an alternative, the G.A.O. proposes strengthening the N.A.G.B.'s capacity for making technical decisions by adding experts as members or as staff members, or by using the resources available through the N.C.E.S. and its contractors.
Mr. Truby, however, rejected the suggestion that the board is incapable of making technical decisions. Although its staff is small, he noted, it has sought advice from a wide range of experts.
“The characterization of the board as isolated from the technical community is just wrong,” he said.