Clarification: An earlier version of this article gave an incorrect title for Nadya Chinoy Dabby. She is the assistant deputy secretary for the office of innovation and improvement. In addition, an earlier version of this story left out identifying information for Michele McLaughlin, the president of the Knowledge Alliance.
The federal Investing in Innovation program, or i3, test-drives and scales up promising ideas in education. Here's an introduction to the program.
What is Investing in Innovation?
The Investing in Innovation, or i3, program was created under the American Recovery and Reinvestment Act of 2009, better known as the financial stimulus. The program was intended to help test-drive, investigate, and scale up promising ideas in school districts working with their nonprofit partners. To date, i3 has received more than $1.3 billion in federal funding for 157 projects. After the initial investment of nearly $650 million, the program has received smaller sums each year and dropped to about $120 million for fiscal 2016. Recipients have ranged from well-known players, such as Teach For America and KIPP, to small school districts, such as Virginia's Albemarle County schools.
What are the grants?
There are three basic types of grants that vary in size based on how much evidence a program has to back up its approach. Some grantees, such as the Building Assets-Reducing Risks program, have changed tiers across multiple grants.
From year to year, the U.S. Department of Education also prioritized different special areas for grants, including science, technology, engineering, and math education; and teacher effectiveness.
Development: The smallest (and, at 105 to date, the most prevalent) grants test promising practices with minimal formal research evidence. Single grants have ranged from about $2 million in recent years to about $5 million when i3 was first created. These grants require the lowest level of evidence, which can be as modest as a strong correlational study.
Validation: These are the midrange grants, meant to further investigate programs backed by a moderate level of evidence. In the early rounds of i3, single grants could be as large as $30 million, but more recently, they've hovered around $12 million. To qualify, a program has to have at least one experimental or quasi-experimental study that meets the federal What Works Clearinghouse standards (although it could have had a small sample size, compared groups that were not entirely equivalent, or for some other reason could not be generalized to other schools or students) or have high-quality correlational research that controlled for selection bias and other factors. So far, there have been 43 validation grants.
Scale Up: The largest and most rigorous grants are intended to help proven programs expand quickly. As in the other grant levels, the biggest scale-up grants, as large as $50 million, came early, when the i3 program was flush with cash. Since then, scale-up awards have shrunk, with the most recent round of grants averaging $20 million. To qualify for a scale-up grant, a program has to have the highest level of research evidence, including multiple experimental or quasi-experimental studies, or one large, multisite randomized controlled trial, that meets federal What Works Clearinghouse standards. That evidence level has proven to be a high bar: There have been only nine scale-up grantees so far.
What were the initial goals of i3?
Investing in Innovation was initially designed to "create an innovation pipeline" in K-12 education, said Jim Shelton, a former deputy secretary in the Obama administration and an architect of i3. That's something other sectors, like technology, take for granted. The Education Department required grantees to match the grant with contributions from private sources.
Another "not so subtle goal," Shelton says: to fill the What Works Clearinghouse, a federal treasure trove of strategies backed by strong evidence of effectiveness.
For the Obama administration, i3 was one of the first of a group of six federal programs designed to experiment with ways to bring more evidence into social programs. Variations on the i3 evidence model are also part of the Education Department's First in the World grants to improve higher education. "Innovation looks different in different sectors and evidence looks different," said Nadya Chinoy Dabby, the assistant deputy secretary for the Education Department's office of innovation and improvement.
How has i3 changed under the Every Student Succeeds Act?
i3 is the only one of the big competitive education grants that the Obama administration launched during the stimulus to be mostly sustained in ESSA, the latest edition of the federal government's centerpiece education law, the Elementary and Secondary Education Act. Signed into law late last year, ESSA includes a "successor" to i3 called the Education Innovation and Research program.
EIR is similar to i3 in that it includes three different types of grants for proposals with successively more evidence to back them up. There are some new twists, however. For instance, under i3, school districts could partner only with nonprofit groups, but EIR will allow schools to collaborate with for-profit businesses. And under EIR, states, not just districts, will be eligible to receive the grants, said congressional aides who worked on creating the program. In setting grant priorities, EIR places a greater emphasis on the research priorities of the field, as opposed to i3's annually changing priorities. There's also a greater focus on "development" and other early-stage grants. EIR also puts more of a premium on programs proceeding through the grant phases, as BARR and North Carolina New Schools/Breakthrough Learning have in i3, although that's not required.
Grants awarded to date: 157
Initial money awarded: Nearly $650 million
Total federal money awarded: More than $1.3 billion
Total private matching contributions: More than $200 million
Source: U.S. Department of Education
What are i3's greatest successes?
i3, or the basic idea of a program aimed at exploring evidence-based practices and driving innovation, is probably one of the most politically popular of the Obama administration's K-12 initiatives. Interest in the program was high from the get-go, with nearly 1,700 applications for the first round of grants and more than 4,000 applications total.
"We've built an infrastructure that supports educators innovating, and educators are instinctively innovative and want to solve challenges. … This program at its core is about helping people learn about their work," said the Education Department's Chinoy Dabby.
The initiative also ignited a burst of private and philanthropic interest in direct partnerships with districts. A dozen large national education philanthropies developed a registry of funding opportunities worth more than a half billion dollars to help applicants find matching money, and private venture capitalists met with runners-up to the first i3 winners. In its fiscal 2016 budget request, the Education Department said the evaluations of all its validation and scale-up grants and a majority of its development grants were on track to meet What Works Clearinghouse standards, "so that the largest investments are based on the highest level of rigor regarding efficacy."
The final evaluations of the first cohort of i3 programs have started to roll out. And the Education Department expects to release a cross-cutting analysis of the findings across all the completed i3 grants by the end of 2016.
"It's really important for us as a sector to be comfortable about taking on risks," Chinoy Dabby said. On average, venture capitalists expect about 1 in 4 investments to produce significant results, she said, and, "by our counts, we are scheduled to far exceed that benchmark."
In the end, survival may be the program's most important success. While two other programs that were developed through the stimulus (the School Improvement Grant program and Race to the Top) have gone by the wayside, i3 has endured, and a version of the program has been enshrined in the Every Student Succeeds Act.
What have been the program鈥檚 biggest challenges?
In the first year, the Education Department required grantees to get a 20 percent private-sector match for their federal grants, but many winners were scrambling right up until the deadline to secure those funds.
Scoring anomalies in the program have also drawn questions: Different peer reviewers can have very different rating systems, particularly if they are examining proposals from different topic areas.
What's more, prospective applicants didn't always have a firm grasp of the evidence levels needed to apply. The department did a lot of outreach, including a series of webinars, and ultimately conducted a preliminary review of grants.
What have been some of the criticisms of i3?
When the first round of grantees was rolled out, some in the education community questioned whether i3 had actually found "innovation," since some of the biggest recipients, including the Knowledge Is Power Program charter network and Teach For America, had been around for a decade or more.
But Chinoy Dabby and Shelton said the department wanted to help the most effective programs have a bigger impact. Programs got "a lot of money for a lot of evidence, that was the whole point," Shelton said in an interview.
Some districts have complained that the application takes a lot of time and a skilled grant writer to complete. That's particularly tough on rural applicants, advocates have said. And it's unclear whether the department's efforts to alleviate this problem helped matters. At least when i3 first launched, it gave a competitive edge to rural projects. But a 2011 report from the Rural Trust, an advocacy group, noted that many of the organizations that won weren't, in its view, authentically rural, even though they served rural districts.
When i3 was first rolled out, the department accepted applications addressing a range of areas, from teacher quality to special education. But in subsequent years, the department put its own emphasis on each iteration of the program, asking for applications that addressed college readiness, for instance, or STEM education.
That made scoring easier, since peer reviewers were looking at applications along a similar theme, but it also spurred pushback from some quarters, said Michele McLaughlin, the president of the Knowledge Alliance, which advocates for education research groups. "There was this feeling that the administration started getting too heavy-handed with the priorities," she said.
What could be i3's impact going forward?
Knowledge gained from i3 could have a new life, now that ESSA is the law of the land. The new law shies away from federally dictated models for improving schools that are struggling, whether with a certain group of kids, such as students in special education, or with the student population as a whole. But states, districts, and schools must employ "evidence-based" strategies to fix persistent problems. Shelton, for one, wants to see states and districts consider using their federal funds for strategies that have a proven track record thanks to i3.
A version of i3's tiered-evidence framework also became the formal definition of evidence for programs under ESSA as a whole, and in particular for certain types of programs, such as those intended to improve teacher effectiveness or turn around the lowest-performing 5 percent of schools.