Just last month, Mark Schneider wrapped up his six-year term as the director of the Institute of Education Sciences. At IES, he was charged with overseeing the nation’s education research efforts, including such well-known programs as the National Assessment of Educational Progress and the What Works Clearinghouse. Before assuming his role at IES, Mark was a vice president at the American Institutes for Research and commissioner of the National Center for Education Statistics, and he spent many years chairing the political science department at SUNY-Stony Brook. Having known Mark for many, many years, I was interested in his reflections on his tenure at IES. Here’s what he had to say. (This is Part One of a two-part interview, the second of which is scheduled to be published on Wednesday.)
–Rick
Rick: You’ve just wrapped up your tenure as director of the Institute of Education Sciences. What’s the state of American educational research?
Mark: When I was a political science professor at Stony Brook University, fights over resources often pitted the “hard sciences” against the social sciences. At faculty meetings, I would use James March’s observation that “God gave all the easy problems to the physicists.” That would lead to plenty of eye-rolling from everyone but the social science chairs and faculty. But the fact is that humans have far more agency than electrons, and so many of our most beautiful and parsimonious models crash and burn when tested against real-world data. Our work is difficult, and we need to face the reality that most of our ideas, hypotheses, and interventions will not stand up to empirical scrutiny. Indeed, IES-supported interventions have a “hit rate” that hovers around 15 percent, meaning most things don’t work. Depressing, but that success rate is not much different from those in most other human-centered fields. By some estimates, only 10 percent of clinical trials succeed.
Over its 20-year history, IES has developed and perfected a standard “business model.” I summarize it as “Five years, five million dollars, failure,” which reflects our standard grant length, funding level, and outcome. This may sound like an indictment of how IES has supported the education research “industry,” but we need to understand the reality we face. And we need to embrace the idea that “the only failure in failure is the failure to learn.”
Rick: Given the challenges you’ve just sketched, how have you sought to shift IES efforts?
Mark: I have tried to push IES to change its standard business model. Most notably, there are now a growing number of large learning platforms, IES’s among them, that let researchers access large numbers of students, enable rapid-cycle experiments, and support more frequent replications. IES’s mission is to determine “what works for whom under what conditions.” This can only be achieved by replicating work with different audiences as a routine part of our research. Of course, other sciences have supported replications for years. Many now face a “replication crisis,” in which other research teams, or even the team that conducted the original study, cannot verify findings. We have not been pursuing replication work long enough to generate a replication crisis in our field, but I look forward to that day.
The field is also beginning to face the problem that even after we identify practices and programs that “work,” we do not scale those interventions by moving from, say, 200 students in an experiment to 2,000 students in a school district to 2 million students in the nation. Without paying far more attention to scaling, we will never affect enough students to achieve our goal of creating a strong democratic citizenry capable of earning family-sustaining wages.
Rick: As director, what’s been your biggest frustration with the education research community?
Mark: Education science, like many disciplines, is overly affected by jargon and fads. We are an applied science charged by law with helping legislators, educators, parents, and students understand what we are saying and how to join us in implementing changes. But we often fail to translate our work into plain, accessible English. I have tried to put several policies in place to make IES’s publications and reports more user-friendly, for instance, by imposing a 15-page limit on our reports. But perhaps the biggest indictment of the field is that the writing in education research reports, including IES’s own, is often amazingly poor.
“Short sentences. Strong verbs.” That’s the approach I ask people to bring to their writing, yet we rarely hit the mark. Perhaps the problem is that so many IES staff and contractors have Ph.D.s and believe that a greater number of poorly crafted reports is better than a smaller number of tightly focused ones. As a result, our journals drip with incomprehensible jargon and big words that signify little.
Rick: What kinds of changes could help on this count?
Mark: For several years, I have been pushing for the creation of a new center in IES focused on informed risk, high reward, and rapid-cycle research. Like the centers established throughout the federal government on the model of the Defense Advanced Research Projects Agency, or DARPA, the new IES center (currently called the National Center for Advanced Development in Education, or NCADE) would breathe life into a large segment of IES’s research expenditures. The creation of NCADE would require an act of Congress.
We also need to continue updating the Standards for Excellence in Education Research, or SEER. These standards lay out best practices that would guide the field to produce stronger work, plus some principles that are specific to education science research. Should the next director continue to refine these, I believe that our field will be far better off than it was earlier in IES’s life, and even more so when comparing the quality and rigor of today’s work with that of the pre-2002, pre-IES world.
Rick: You argue eloquently for new research centers within IES. But IES already has four of them. What makes you confident that a new one, NCADE, would fare any better? Put another way, why can’t the existing centers just do the research that you’d ask NCADE to do?
Mark: This is the hardest question to grapple with when considering the need for NCADE. I sometimes think that if COVID hadn’t punched a two-year hole in my tenure, I might have been able to make enough changes in the culture and practices of IES’s two existing research centers to obviate the need for NCADE. But changes in personnel, practices, and especially culture in a federal bureaucracy, even a relatively small one such as IES, are among the most persistent challenges any reform-minded leader faces.
I can document many barriers that are just hard to overcome. It is not surprising that a time-honored solution to such inertia is to create a new bureaucracy in which modern practices and new personnel can be brought in to drive new initiatives. On an abstract level, do we really need NCADE? Maybe not. But if we want to modernize and streamline education science research, NCADE is, I believe, the best and fastest way to bring about the changes that the field and the nation need.
Rick: You’ve made a priority of accelerating the speed at which we release federal education data and research. What’s the problem, and what’s holding up efforts to do better?
Mark: The lack of timeliness is the single most common complaint that IES fields year after year. Many of these complaints come directly from Congress. Indeed, when the Senate HELP Committee marked up a reauthorization of the Education Sciences Reform Act, the senators added a new paragraph about the responsibilities of each commissioner with the same title: “timeliness.” Our research centers, for instance, are largely focused on measuring how students grow and learn, which can take years. On top of that, we often fail to push for timely publications and product development once that research is complete.
Meanwhile, IES’s National Center for Education Statistics, or NCES, has other problems that slow down the release of its work. Officials at NCES argue that they simply don’t have enough staff to produce timely products. And maybe they don’t. But they also do not seem to understand their audience’s need for access to fast and reliable information.
Researchers like Marguerite Roza, Emily Oster, and Sean Reardon are helping to fill the gap by standing up work that NCES could and should do. The signature example is how slow NCES has been in releasing critically needed information on how schools and districts are spending money. There will always be a lag between when data are collected after a fiscal year ends and when they can be released, but the lag has often been years rather than months. I don’t think that’s a result of limited staffing. I think it’s a result of limited vision.
Part of my push for NCADE has been to break the mindset of too many IES staffers and to bring in new blood committed to identifying critical shortages in education research and pushing for more rapid solutions, including greater sensitivity to the needs of research consumers, which would almost always mean faster turnaround.