Outside of a handful of Asian nations, the typical 8th grader in most countries would not reach the “proficient” level on U.S. tests of mathematics and science, a new reanalysis of international achievement data shows.
Then again, the study also shows, neither do most American students.
The study, which was posted April 24 on the Web site of the American Institutes for Research, a Washington-based research organization, offers fodder for national debates over whether the United States has set its achievement targets too high. The new analysis, the second in a year to draw nearly the same conclusions from reinterpreted international data, comes from the AIR’s chief scientist, Gary W. Phillips.
Mr. Phillips’ idea was to statistically “link” scores from two well-known testing programs: the Trends in International Mathematics and Science Study, or TIMSS, which is given every few years to students in more than three dozen countries, and the National Assessment of Educational Progress, a program mandated by the U.S. Congress that is known as “the nation’s report card.”
“I wanted to re-express the TIMSS results in terms of a set of standards that U.S. policymakers would be familiar with,” said Mr. Phillips, who was acting commissioner of the National Center for Education Statistics, the data-gathering arm of the U.S. Education Department, from 1999 to 2002. “It’s a lot like looking at world-poverty levels. It’s impossible to get your head around it until you get a common metric.”
Defining Proficient
[Chart] A study designed to predict how other countries would fare on U.S. mathematics and science tests found that the typical 8th grader in most places, including the United States, would score at the “basic” level or below on the National Assessment of Educational Progress math test.
Note: “Basic” denotes a minimum score of 469, “proficient” a minimum of 556, and “advanced,” 637.
SOURCE: American Institutes for Research
Although Mr. Phillips saw his study as primarily a methodological test, the results come at a time when the NAEP achievement levels, which have been debated on and off for years, are attracting renewed attention in national discussions over what it means for U.S. students to be academically “proficient” or “world class.”
The federal No Child Left Behind Act calls for all students to be proficient by the 2013-14 school year. But studies show huge gaps between NAEP’s definition of proficient and the definitions states use in their own assessment programs. (“Gaps in Proficiency Levels on State Tests And NAEP Found to Grow,” April 18, 2007.)
“The notion of all students being proficient is an oxymoron,” said Richard Rothstein, a research associate for the Economic Policy Institute, a Washington-based think tank. “You can’t have a standard that challenges students at all ends of the distribution system.”
Using a slightly different methodology, Mr. Rothstein put out a paper last fall that found results similar to Mr. Phillips’: Even in the countries that score highest on the TIMSS tests, not all students meet the NAEP definition of proficient.
“My conclusion is that we need to take a good look at where all the cut points are for defining basic skills or proficiency,” said Tom Loveless, the director of the Brown Center on Education Policy at the Brookings Institution, a Washington-based research organization, referring specifically to the NAEP tests.
However, Chester E. Finn Jr., the president of the Thomas B. Fordham Foundation, another Washington group that studies school improvement issues, said the results confirmed the current levels for the NAEP tests. “If you have normal students from normal public schools in normal countries and more than half of them are reaching proficient levels, then we have set exactly the right target that everybody should aim for,” said Mr. Finn, who once sat on the governing board that sets policy for the federal assessment program.
Mr. Phillips, who undertook the study on his own initiative, said he agreed. “A lot of people have said the NAEP achievement levels are too high,” he noted. “I think it’s pretty clear that’s not the case.”
‘A World Apart’
NAEP uses three levels to gauge student achievement: “basic,” which generally denotes partial mastery of the academic material; “proficient,” for solid mastery of challenging academic content; and “advanced,” to describe superior achievement. Students who reach none of those levels are described as performing “below basic.”
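Those labels are defined by fixed cut scores; the chart note above pegs the minimums at 469 for basic, 556 for proficient, and 637 for advanced. As a rough illustration of how any given score translates into a label, here is a minimal sketch in Python; the names CUT_SCORES and achievement_level are invented for the example and belong to neither testing program.

```python
# Illustrative sketch only: assigns an achievement-level label to a score,
# using the minimum cut scores cited in the chart note above
# (469 "basic", 556 "proficient", 637 "advanced").
CUT_SCORES = [
    (637, "advanced"),
    (556, "proficient"),
    (469, "basic"),
]

def achievement_level(score: float) -> str:
    """Return the NAEP-style achievement label for a score on the chart's scale."""
    for cut, label in CUT_SCORES:
        if score >= cut:
            return label
    return "below basic"

# Example: a score of 560 lands at "proficient"; 470 is only "basic".
print(achievement_level(560), achievement_level(470))
```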
By Mr. Phillips’ estimate, of the 8th graders from 46 nations who took TIMSS mathematics tests in 2003, only students from Hong Kong, Japan, Singapore, South Korea, and Taiwan would reach the proficient level, on average, if they were to take the 8th grade NAEP in that subject.
For example, in Singapore, the nation that topped the international tests, the analysis shows that 73 percent of 8th graders would score at or above the proficient level on NAEP, compared with 26 percent in the United States. An estimated 35 percent of Singaporean 8th graders would reach the NAEP advanced level, seven times the U.S. rate.
On the other hand, in four nations—Botswana, Ghana, Saudi Arabia, and South Africa—almost no students scored at the advanced level.
The science results suggest a similar pattern. On that test, the analysis shows, 8th graders in only two of the TIMSS countries—Singapore and Taiwan—would, on average, achieve at the proficient level. In Singapore, 55 percent of students were estimated to score at that level, compared with 31 percent in the United States.
On both the math and science tests, the average U.S. 8th grader, like those in the largest group of TIMSS countries, tends to score at the level considered basic on NAEP. On the math test, 22 of 46 countries were projected to land in that category; in science, the number was 20 of 38. Sixty-six percent of U.S. students score at that level.
However, almost as many nations, most of them in Africa and the Middle East, would perform below the basic level on the 8th grade math and science NAEP, according to the analysis, suggesting their students would not meet the U.S. definition of grade-level competency. On the math test, 19 nations fell into that category.
“When you compare the results for those countries to a place like Hong Kong, you can see that’s a world apart,” said Mr. Phillips.
Interpretations Mixed
Mr. Loveless and other experts warned about reading too much into the findings, because the two tests cover different content. But Mr. Phillips said the coverage areas, though not exact, are similar enough to make his comparisons worthwhile.
To arrive at his results, Mr. Phillips drew on earlier studies that analyzed results for two national samples of U.S. students (one that took TIMSS in 1999, the other NAEP in 2000) to estimate statistical relationships predicting how well students who took one test would have done on the other. The purpose of that earlier work was to determine how students in particular U.S. states would have performed on the international tests. Mr. Phillips used essentially the same process, extending it to estimate how students who took the international tests in 2003 would fare on the U.S. exams.
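The article does not spell out the statistical machinery behind that linking. As a rough illustration only, the sketch below shows one common approach, a mean-sigma (linear) link estimated from a sample that took both tests and then applied to groups that took only TIMSS; it is not a reconstruction of Mr. Phillips’ actual model, and every number in it is invented.

```python
import numpy as np

# Hedged sketch of mean-sigma (linear) linking, one standard way to place
# scores from two tests on a common scale. NOT Mr. Phillips' actual method;
# every number below is made up for illustration.

# Scores for a hypothetical sample of students who took BOTH assessments.
timss_scores = np.array([480.0, 505.0, 530.0, 560.0, 610.0])
naep_scores = np.array([255.0, 268.0, 281.0, 297.0, 322.0])

# Choose slope and intercept so that linked TIMSS scores reproduce the
# mean and standard deviation of the NAEP scores in the linking sample.
slope = naep_scores.std(ddof=1) / timss_scores.std(ddof=1)
intercept = naep_scores.mean() - slope * timss_scores.mean()

def timss_to_naep(timss_score: float) -> float:
    """Project a TIMSS score onto the NAEP scale with the linear link."""
    return intercept + slope * timss_score

# Apply the link to a country that took only TIMSS; the projected score
# can then be read against NAEP's published achievement-level cut points.
country_mean_timss = 575.0  # hypothetical national average
print(round(timss_to_naep(country_mean_timss), 1))
```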
He acknowledged, though, that the method may have some drawbacks, one being that the statistics underlying the calculations are based entirely on U.S. students, because only they took both NAEP and TIMSS; in the other countries included in the study, students took only the international tests.
Dennis W. Cheek, the vice president of education for the Ewing Marion Kauffman Foundation, based in Kansas City, Mo., said such caveats suggest the study should be viewed as more of a methodological exercise than a policymaking guide: “When you pile assumption on top of assumption, it’s kind of a shaky platform on which to link all these things.”