More than 20 years ago, when federal officials sought to publicize data portraying the relative quality of the states’ school systems, the best statistics they could find were scores on college-admissions tests and state-reported graduation rates.
Now that states have results from their own tests and state-level results from the National Assessment of Educational Progress—as well as a wealth of other data—the Department of Education is publishing a two-page report on each state that gives a glimpse of the quality of its K-12 schools.
The reports also should answer the public’s questions about the success of the No Child Left Behind Act in raising student achievement, said Secretary of Education Margaret Spellings.
The data reports show that gaps in achievement between minority and white students are narrowing, the secretary said. And they show the proportion of schools making their achievement targets under the NCLB law; nationally, the rate is about 70 percent.
“When they see in black and white what [NCLB] means in their own backyard, … I think it’s very useful for parents and policymakers,” Ms. Spellings said in an interview last week.
Data Quality Improves
The amount and quality of data available today represent a dramatic improvement over what was available in the so-called “wall chart,” a state-by-state compilation of resource inputs, performance outcomes, and population characteristics that the Education Department published for six years, starting in 1984 under Secretary Terrel H. Bell. (“E.D. Issues Study Ranking States On Education,” Jan. 11, 1984.)
Educators challenged the validity of those comparisons, and Mr. Bell at the time acknowledged the limitations of the data. But he defended such a “scoreboard” as a way of raising awareness of the need for school improvement.
Since 1990, NAEP has published state-by-state results for 4th and 8th graders on its reading, mathematics, science, and writing tests, which are based on a sampling of student achievement. States participated in NAEP voluntarily until 2003, when the NCLB law made participation in the reading and math assessments mandatory.
In addition, the NCLB law requires states to create their own tests measuring students’ reading and math skills in each of grades 3-8 and once in high school.
Those data are the heart of the new Education Department reports, which the agency is calling “dashboards.” Like the dashboard of a car, the reports give people “pieces of information … in a way that are quickly consumable and usable,” said Secretary Spellings, who unveiled the reports last month in a speech at the National Press Club in Washington.
A chart in each state’s report compares how its students are doing on the NAEP tests and the state’s own exams. The chart also disaggregates the data by the performance of white, African-American, Hispanic, and low-income students.
Each state’s report lists the high school graduation rate as reported by the state. It compares that rate with the so-called average freshman graduation rate, which estimates the percentage of 9th graders who earn their diplomas within four years in that state.
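As a rough illustration of how such an estimate works, the sketch below divides the diplomas a state awards in a given year by an estimate of the freshman class that entered four years earlier. The averaging of adjacent grade enrollments and all of the figures shown are assumptions for illustration, not details drawn from the federal reports.

    # Illustrative sketch only: estimates a graduation rate by comparing
    # diplomas awarded with the estimated size of the freshman class
    # four years earlier. The enrollment averaging and all figures are
    # hypothetical assumptions, not data from the state dashboards.
    def estimated_freshman_graduation_rate(diplomas, enroll_8th, enroll_9th, enroll_10th):
        # Average nearby grade enrollments to smooth out 9th-grade retention
        # (an assumption about how the freshman class is estimated).
        estimated_freshmen = (enroll_8th + enroll_9th + enroll_10th) / 3
        return 100 * diplomas / estimated_freshmen

    # Hypothetical state-level figures for illustration.
    rate = estimated_freshman_graduation_rate(
        diplomas=52_000,      # diplomas awarded in spring of year t
        enroll_8th=70_000,    # 8th-grade enrollment five years before t
        enroll_9th=74_000,    # 9th-grade enrollment four years before t
        enroll_10th=68_000,   # 10th-grade enrollment three years before t
    )
    print(f"Estimated graduation rate: {rate:.1f}%")  # about 73.6%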
While the data aren’t exhaustive, they are far more extensive than what the federal government was able to publish shortly after the 1983 report A Nation at Risk helped set off a wave of school reforms, said Chester E. Finn Jr., who was an assistant secretary of education under Mr. Bell’s successor, William J. Bennett. Lamar Alexander stopped publishing the “wall chart” shortly after he became secretary of education in 1991.
“We have had approximately two decades of movement toward state-level results data that can be compared,” said Mr. Finn, who is the president of the Thomas B. Fordham Foundation, a Washington-based think tank that supports accountability measures and school choice. “I expect we’ll see … the data will continue to get better, faster, more precise, more fine-grained, better able to be analyzed in various useful ways by various constituencies.”
Long before the department first published its dashboards, Mr. Finn said, states, nonprofits, and companies had built a variety of Web sites that allow users to find data and, often, to compare schools, districts, and states on measures such as student achievement and spending.
Giving NCLB Credit
The increase in the amount and quality of data is mostly the outgrowth of the NCLB law, said Chrys Dougherty, the research director of the National Center on Educational Accountability, an Austin, Texas-based nonprofit group that supports data-based efforts to improve schools.
“It’s something people need to look at,” Mr. Dougherty said. “It gives you an idea how tough their state test is or how high the standards are.”
But Mr. Dougherty and other advocates of such data use are laying the groundwork to help states measure whether their students graduate prepared for college or the workforce.
ACT Inc. has developed a method for reporting such readiness percentages based on students’ scores on the ACT college-admissions exam, which the Iowa City, Iowa-based nonprofit produces. States that use other high school exams can calculate similar percentages if they benchmark those exams to the ACT’s standards.
“That’s an indicator that’s coming down the pike,” Mr. Dougherty said. “But somebody has to do the data analysis.”