A new review by an independent panel of experts concludes that the U.S. Department of Education's much-criticized What Works Clearinghouse is doing a "reasonable job" of reviewing and rating the research evidence on the effectiveness of programs and practices in education.
Created in 2002 by the Institute of Education Sciences, the department's primary research arm, the clearinghouse has come under fire from policymakers, researchers, and practitioners, who question its usefulness and methods. Some have dubbed it the "nothing works" clearinghouse because of the limited number of programs and studies that meet its strict screening standards.
But in its report, which was posted online Nov. 19, the six-member panel contends that the clearinghouse's review standards are "well documented" and "reasonable." The study further characterizes the reports that the clearinghouse produces as "succinct and meaningful."
Noting that their review focused narrowly on whether the clearinghouse makes scientifically valid judgments, the panelists nonetheless call for a fuller look at the entire mission of the enterprise.
Both Grover J. "Russ" Whitehurst, the outgoing director of the IES, and Robert C. Granger, who chairs the national board that advises the research agency, called the panel's findings reassuring.
"In a marketplace that is unsophisticated with regard to research quality, there has to be an entity that uses rigorous standards to vet research on education program effectiveness for practitioners and policymakers," Mr. Whitehurst wrote in a letter accompanying the report.
Mixed Reactions
Members of the research community offered mixed reactions.
"For me, the key question is how usable is the information that the clearinghouse produces," said James W. Kohlmoos, the president of the Knowledge Alliance, a Washington group that represents research organizations. "We haven't answered that yet."
Robert E. Slavin, a researcher and co-founder of the Baltimore-based Success for All Foundation, concurred. "The panel acknowledged that it was given too little time and too narrow a mandate to adequately evaluate the WWC," Mr. Slavin, who has been an outspoken critic of the clearinghouse, said in an e-mail.
Begun in late July, the review was commissioned by the National Board for Education Sciences, the presidentially appointed panel that advises the IES.
Mr. Granger, the board鈥檚 chairman, said the House Appropriations panel that oversees education programs called for a more comprehensive investigation by the U.S. Government Accountability Office, the watchdog agency for Congress. On the Senate side, though, appropriators requested a more focused look at the scientific validity of the clearinghouse鈥檚 review procedures.
"There was just not enough time with the current board to take on a broader review," Mr. Granger said, noting that the terms of five board members, himself included, are due to end later this month. While replacement members have been named, their nominations are not likely to be approved by the Senate before the new administration takes control of the White House in January. Currently, only 11 of the 15 slots on the board are filled.
"The focus is narrow by design," added Mr. Granger, who is the president of the William T. Grant Foundation in New York City. "It is the crux of the matter. If you can't trust the information that's on the What Works site, then you can't trust anything else it does."
Changes Recommended
Mr. Granger, working with staff members at the IES, helped select the review-panel members, most of whom are experts at synthesizing research findings in fields other than education.
The panelists are: C. Hendricks Brown, a biostatistician at the University of South Florida in Tampa; David Card, an economist at the University of California, Berkeley; Kay Dickersin, an epidemiologist at Johns Hopkins University in Baltimore; Joel B. Greenhouse, a biostatistician at Carnegie Mellon University in Pittsburgh; Jeffrey R. Kling, an economist at the Brookings Institution, a think tank in Washington; and Julia H. Littell, a professor of social work and social research at Bryn Mawr College in Bryn Mawr, Pa.
Besides calling for a more comprehensive study of the clearinghouse, the panel urged the research agency to set up a process for regular reviews of the clearinghouse standards, an action that Mr. Whitehurst said the IES would undertake.
Among its other recommendations, the panel called on the clearinghouse to: review its standards regarding the attrition rates of subjects who take part in experiments; establish a formal process for tracking potential conflicts of interest in the studies it reviews, especially when program developers pay for studies of their own programs; and take another look at the standards it uses to account for cases when subjects fail to comply with the intervention being studied or when intervention practices cross over from the experimental to the control group.