The U.S. Department of Education's second annual snapshot of the controversial School Improvement Grant program paints a mixed picture, leaving open the question of whether an eye-popping infusion of federal cash ($3 billion in stimulus funding alone) and some serious federal strings had a dramatic impact on the nation's lowest-performing schools.
While more than two-thirds of schools in the first cohort (which started in 2010-11) saw gains in reading and math after two years in the program, nearly a third of the schools actually declined, despite the major federal investment, which included grants of up to $500,000 a year, the department said Thursday. And schools that entered the program in its second year (the 2011-12 school year) didn't post math and reading gains as impressive as those the first cohort saw in its first year.
With two years of data now available, it's interesting to note that rural and small-town SIG schools posted bigger gains than their city and suburban counterparts in math, and nearly as impressive gains in reading. When the SIG program was first unveiled, rural advocates worried that the remedies it specified wouldn't work for their schools.
Schools that attempted more-dramatic interventions generally saw greater gains than schools that took more flexible approaches. (More on that below.)
U.S. Secretary of Education Arne Duncan said the data show that SIG has pointed long-troubled schools in the right direction.
"The progress, while incremental, indicates that local leaders and educators are leading the way to raising standards and achievement and driving innovation over the next few years," he said in a statement.
But some researchers had a different take.
The fact that so many schools actually slid backward despite the big federal bucks is "a little bit alarming," said Robin Lake, the director of the Center on Reinventing Public Education at the University of Washington, which has studied the impact of the SIG program in the Evergreen State.
"Given the amount of money that was put in here, the return on investment looks negligible at this point," she said. "I don't know how you can interpret it any other way."
Among the highlights from the department's analysis, which you can read in full below:
• Generally, 69 percent of schools that entered the three-year program during its first year saw an increase in math after two years of participation. But 30 percent of schools saw declines, and 2 percent didn't demonstrate any change.
• The results in reading for Cohort 1 were similar: 66 percent demonstrated gains in reading, while 31 percent saw declines, and 3 percent saw no change.
• Fifty-five percent of schools in the second cohort, which had only been in the program for one year at the time the analysis was released, showed gains in math, while 38 percent saw declines, and 7 percent demonstrated no change. Reading results were similar, with 61 percent of schools showing gains, 34 percent seeing declines, and 6 percent of schools demonstrating no change.
Overall, schools in the first cohort saw a bump of 8 percentage points in math over the course of the two years, and 5 percentage points in reading. Cohort 2 schools, which were in the program for a shorter period, went up 2 points in math and just 1 point in reading.
Things unanswered: Big questions still loom, including just how large these gains really are. Last year, the Education Department identified schools whose performance had jumped by double digits. But the way much of this data is presented, it's impossible to tell whether particularly high-performing schools are pulling up the average for schools that didn't do nearly as well.
It's important to note that this data compares schools in different states, which all set different bars for what it means to be proficient. So, as the Education Department explains, the "averages" will very much be influenced by states that set either relatively high or relatively low proficiency standards. And without more specific data, it's impossible to draw more sophisticated conclusions about where these test-score gains are coming from.
And the analysis doesn't contain any information about a major chunk of schools that were in the first cohort. For instance, the summary data showing changes in math scores covers just 534 schools. But there were more than 730 schools in Cohort 1. According to the notes provided by the department, the missing schools had changes in state tests or other factors that excluded them from the analysis.
There's also no school-level data available for the second year of the program. The department released individual school data for the first year, but you have to be the ultimate Excel geek to get it.
"This is just the tip of the iceberg on the information we really need," said Diane Stark Rentner, the deputy director of the Center on Education Policy, which has done extensive research on the School Improvement Grant program. It will be difficult to draw hard-and-fast conclusions about the three-year program's effectiveness until there's a third year of data, she added.
Plus, it's unclear whether schools' gains can be traced to the program itself, or to homegrown turnaround efforts already in progress, Rentner said.
Some background: The School Improvement Grant program required schools to choose one of four turnaround models, all of which involved getting rid of the principal if that person had been on the job for more than two years.
Most schools chose the "transformation" model, considered the most flexible, which called for schools to extend learning time and gauge teacher effectiveness by gains in test scores. Another group of schools opted for "turnaround," which required replacing 50 percent of a school's staff. Other schools opted to "restart," meaning that they turned themselves over to a charter-management organization. And a very small handful of schools closed altogether.
There were 835 schools tapped in the first cohort of the program, and another 503 in the second cohort. Schools received grants of up to $500,000 a year for three years.
So which models were most successful? That's been a major question in SIG implementation. Preliminary results show that, at least for Cohort 1 (the first schools into the program), the more dramatic interventions seem to have yielded the biggest gains.
For example, Cohort 1 schools that used "transformation" improved about 6 points in math over two years, and 3 points in reading. Schools in the same cohort using the "turnaround" model jumped 11 percentage points in math and 6 in reading over the same period. And schools using the "restart" model gained 9 points in math and about 7 in reading.
Of course, it's notable that the more dramatic models were also the least frequently used. Roughly 74 percent of schools in the first cohort used "transformation," while just 16 percent used "turnaround" and 6 percent used "restart."
Political background: The SIG program is almost certain to undergo major changes if and when Congress reauthorizes the Elementary and Secondary Education Act. Congressional Republicans have moved to slash all funding for the program, and it wasn't included in a bill to reauthorize the ESEA that passed the House with only GOP support earlier this year. And a bill approved by the Senate education committee in June, with support only from Democrats, would give states new turnaround options, including allowing them to submit turnaround ideas to the U.S. Secretary of Education for approval.
Rep. John Kline, R-Minn., the chairman of the House education committee and the author of the House GOP legislation, said in a statement that the data support the House's approach of taking turnarounds out of federal hands.
"These tepid results underscore the limits of top-down mandates and the need for a new approach to education reform, one that allows state and local leaders to determine the best way to raise the bar in our schools," he said.