When the federal government released the 2005 results last month for “the nation’s report card,” a few observers detected some subtle changes in the way the scores were presented—changes they say could lead to lower expectations all around for the level of performance considered good enough on the tests.
The National Assessment of Educational Progress, a testing program mandated by Congress in 1969 to periodically take the national pulse on student achievement, sets three achievement levels against which student test scores are measured. From lowest to highest, they are basic, proficient, and advanced.
Reports on the results usually include a chart showing how states have fared over time on the percentages of students who reach or exceed the proficient level. That is the level that the National Assessment Governing Board, which sets policy for NAEP, describes as representing “solid academic performance for each grade assessed.”
But this year, in both the reading and the mathematics reports released last month, the charts showing state-by-state trends focused on results for just the basic level, which denotes what NAGB regards as “partial mastery” of the skills students should acquire at particular grade levels.
The reports also drop the customary language used in previous editions to flag “proficient” as the target states should be shooting for.
“It’s a subtle shift that could indicate a shift at NAGB or at the [U.S. Department of Education] to focus on basic rather than proficiency,” said Michael J. Petrilli, the vice president for national programs and policy at the Washington-based Thomas B. Fordham Foundation. “It’s important to remind everyone that the goal is proficient.”
Mismatch With States
Governing-board officials contacted last week insisted that the board had no plans to lower the performance bar for the tests and that proficient was still the desired achievement goal. But, board officials said, there was a deliberate attempt this year to draw more attention to the basic-level results.
“If we don’t also show basic along with proficient, you’re not going to have a good read on whether or not every state is moving upward with regard to bottom-performing students,” Charles E. Smith, the board’s executive director, said in an interview.
“We’re trying to draw attention to basic as an achievement level with some value,” John H. Stevens, the chairman of the board’s reporting and dissemination committee, said in an interview. “We think it’s been devalued and even interpreted as, perhaps, failing in the past.”
The problem, Mr. Petrilli and other education policy experts said, is that the changes come as state exams are painting a different picture of educational progress than NAEP’s proficient-level results indicate.
For example, in New York state, which fared well on the national tests, only 34 percent of 4th graders scored at or above the proficient level in reading. That’s just under half the percentage of 4th graders who passed the state’s own reading exam this year. The passing rate on that test was 70 percent, identical to the percentage of New York students who met or exceeded the basic standard on the national reading exam.
But basic is not the same as proficient, Diane Ravitch, a member of the national governing board from 1997 to 2004, wrote in a commentary on New York state’s test results published Oct. 26 in the New York Daily News. “Congress intended that the NAEP serve as a reality check on the state tests,” Ms. Ravitch, a research professor of education at New York University, said in the piece. “Most states have standards far lower than those of NAEP.”
That’s why it’s all the more important, said Mr. Petrilli, who until recently served in the Education Department, for federal officials to send the right signals. “If NAGB or NAEP isn’t going to stand up for high standards,” he said, “it doesn’t appear that anyone else is going to.”