Three years after a federal law required states to collect a host of education data, much of that information and more will now be available in one place, giving the public a newfound resource and giving educators headaches over how schools can be compared.
On a free Web site to be launched this week, a public-private partnership will post test scores, school spending, student demographics, and other relevant data. The site will feature research tools that allow users to compare achievement across districts, track districts' and individual schools' progress in reaching student-achievement goals under the federal No Child Left Behind Act, and find schools and districts that may be outperforming others.
The site also will give people ways to measure whether school spending translates into student learning, as well as the chance to compare schools' effectiveness.
Developed by Standard & Poor's School Evaluation Services with help from the Council of Chief State School Officers, the project marks a new era of so-called transparency of school-related data, some analysts say.
"It's a significant step," said Chrys Dougherty, the research director for the National Center for Educational Accountability. The Austin, Texas-based group has worked on similar projects in the past, but is not involved in this one.
"This is taking information that states already have and making it more accessible to the public," Mr. Dougherty said.
But some educators have questioned whether the Web site provides a fair way to compare schools, especially in the section that calculates a school's "return on spending."
Standard & Poor's, a New York City-based division of the McGraw-Hill Cos. known for its research on stocks and bonds, delayed the launch of the site by two months while its staff responded to complaints from the Washington-based CCSSO that centered on the spending index. ("Albeit Late, State Data to Go Online in March," Feb. 9, 2005.)
State officials said that they would be watching how critics and supporters of public schools use the data to bolster arguments that specific schools should or shouldn't get more money.
"We still have concerns about it as a simple method of determining a school's efficiency," Lisa Y. Gross, a spokeswoman for the Kentucky Department of Education, said of the spending index. "In business, you can do that. Schools are not that simple."
Yet while they don't always think the debate over the bang for the buck is fair, state leaders also realize that it's going to occur whether or not Standard & Poor's puts those measures on the site, one state official said.
"A lot of our members understand that this was the inevitable next step in the way data is reported," said Scott S. Montgomery, the CCSSO's chief of staff.
For their part, the site's developers say it will help educators and parents find solutions to their problems by pointing to schools with similar demographics and spending patterns that are doing better at raising achievement.
"The point of this … is to figure out where they're doing something right and what can we learn from that," said Paul Gazzerro, a director of Standard & Poor's School Evaluation Services, which collected and organized the data on the site.
Latest Development
The new site builds on an existing database of states' student-achievement data by adding new information and features. That earlier database was completed last year with funding from the U.S. Department of Education and the Broad Foundation, a Los Angeles-based philanthropy that supports efforts to improve education.
Broad joined with the Bill & Melinda Gates Foundation in underwriting the new site with $45 million. That funding is to last until early 2007. After that, the funders expect states to pay for the project to continue.
The www.schoolmatters.com site was scheduled to launch on March 29 at 7 a.m. Eastern Time. All users of www.schoolresults.org were expected to be redirected to the new site.
Schoolresults.org provided data on student demographics, published scores on state tests for every school in a state, and listed whether each school was making adequate yearly progress toward achieving student proficiency in reading and mathematics under the No Child Left Behind Act. Only Nebraska refused to provide data to the site.
That earlier site offered the ability to compare a school鈥檚 achievement with that of others with similar demographics.
By comparison, the new site collects a wealth of data and offers several new tools that help users analyze the information. In addition to all the data on the previous site, it includes:
• School spending amounts, with estimates of how much a district spends on instruction;
• State and district scores on the SAT and the ACT, and participation rates on those college-admission tests;
• "School environment data," such as class sizes, pupil-teacher ratios, and student-suspension and -retention rates;
• Community information, such as income levels, property values, and educational attainment of adults in the area; and
• Teacher compensation.
New Features
Schoolmatters.com also includes new tools that help users compare district and school performance with that of others having similar backgrounds and offers several indexes that help quantify school success.
One tool identifies schools that are outperforming others with similar demographics. A new index, called RAMP, combines reading and mathematics proficiency and measures how close a school, district, or state is to meeting the No Child Left Behind law's goal of 100 percent student proficiency in those two subjects by 2014.
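Standard & Poor's has not published the RAMP formula, so the sketch below is only a rough illustration of the idea described here: averaging reading and math proficiency rates and reading the result as progress toward the 100 percent target. The function name, equal weighting, and sample figures are assumptions made for illustration, not the site's actual methodology.

```python
# Illustrative sketch only: S&P's actual RAMP methodology is not public.
# Assumes equal weighting of reading and math proficiency rates.

def ramp_index(reading_proficient_pct: float, math_proficient_pct: float) -> float:
    """Combine reading and math proficiency into a single 0-100 score,
    read as progress toward the law's 100 percent proficiency goal."""
    return (reading_proficient_pct + math_proficient_pct) / 2

# Example: a school with 72 percent proficient in reading and 64 percent in
# math would land at 68 on this simplified scale, 32 points short of the goal.
print(ramp_index(72, 64))  # 68.0
```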
The most controversial index quantifies a school's or district's "return on spending." It uses data on student achievement and school finance to calculate a figure that suggests whether a school or district is spending its money effectively.
During the development of the Web site, CCSSO members complained that the index could unfairly label schools. A low-performing school that's received an influx of resources, for example, may score low on the index even though the extra money may be helping achievement.
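The article does not describe how S&P computes the spending index, so the following hypothetical ratio of proficiency to per-pupil spending is included only to illustrate the concern above: new money can pull such a score down even while achievement rises. The function, field names, and dollar figures are all invented for the example.

```python
# Hypothetical illustration of the CCSSO concern, not S&P's actual index.
# A simple ratio of proficiency to per-pupil spending, in thousands of dollars.

def return_on_spending(proficiency_pct: float, per_pupil_spending: float) -> float:
    """Proficiency points per $1,000 spent per pupil (invented metric)."""
    return proficiency_pct / (per_pupil_spending / 1000)

# Before an influx of resources: 50 percent proficient on $8,000 per pupil.
before = return_on_spending(50, 8_000)    # 6.25
# After: spending rises to $11,000 and proficiency climbs to 58 percent.
after = return_on_spending(58, 11_000)    # about 5.27
# The school improved, yet this kind of ratio falls, which is the
# unfair-labeling scenario state officials worried about.
print(before, after)
```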
As S&P produced the site, state officials expressed concerns about the data on it and how the data were presented, Mr. Montgomery of the CCSSO said.
The research firm made more than 100 changes to the Web site from its original version, Mr. Montgomery said. Some were minor corrections to data posted on the site; others changed how the site presented the indexes to ensure it didn't conflict with previously released data.
For example, in earlier versions of the site, the RAMP index appeared when a user requested information about state student performance. Mr. Montgomery said that would confuse users because states report the test scores for individual subjects and across individual grade levels.
Now the layout separates the RAMP index from the test scores that states report on their own.
"It reshapes the look of the site so it doesn't conflict with what states have previously reported," Mr. Montgomery said.
While most state education officials endorse the resulting version, concerns persist.
"You don't have the context in there," said Andy Tompkins, the Kansas commissioner of education and a CCSSO board member. "It might not represent it in a way that we might have represented it."
Still, Mr. Gazzerro of S&P's School Evaluation Services said he expects that the site's users will investigate further when the data provide results that are either too good or too bad to be true.
"Sometimes the data tell the whole story," he said. "Sometimes you have to find out more."