
Curriculum | What the Research Says

Picking 'Evidence-Based' Programs: 5 Mistakes for Educators to Avoid

By Sarah D. Sparks | January 19, 2024 | 6 min read

With hundreds of education programs and interventions purporting to be "evidence based," and dozens of websites and clearinghouses compiling thousands of studies and evaluations, it's easy for practitioners to become overwhelmed by the sheer amount of information they must sift through to figure out what will best help their students.

It's understandable that teachers, as well as district and school leaders, get frustrated when studies and clearinghouses seem to come to different conclusions about a given program.

But hoping for a single seal of approval is "not realistic," said Jonathan Jacobson, a branch chief at the National Center for Education Evaluation and Regional Assistance.

The sometimes conflicting information is akin to the range of reviews consumers encounter for all kinds of products, Jacobson said.

"When I'm looking to buy a new car, and I read different magazines that have reviewed the car, the results or characterizations are not going to be identical, even though the vehicle might be the same," Jacobson said. "That doesn't mean that [evidence-based decision-making] isn't scientific, but it's not as simple as solving a math problem where everyone should get a single right answer."

With that in mind, here are five pitfalls for educators to avoid when using research to choose evidence-based programs.

1. Equating research quality with program quality

Research clearinghouses have become a popular way for educators and leaders to review a lot of studies on a subject quickly, but Jacobson warned, "there's a distinction between the quality of the research and the quality of the program or the practice or the intervention."

"Just because studies are high-quality, that doesn't mean they will show favorable results" for a product or program, he said.

A randomized controlled trial, considered the "gold standard" in research, might show that an intervention worked or didn't, but it won't necessarily give insight into why that happened.
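
To ground the term, here is a minimal sketch, in Python with an invented student roster, of the random assignment at the heart of a randomized controlled trial:

```python
import numpy as np

# Purely hypothetical: shuffle a roster of 100 students and split it in
# half to form treatment and control groups.
rng = np.random.default_rng(42)
students = np.arange(100)            # stand-in student IDs
shuffled = rng.permutation(students)
treatment, control = shuffled[:50], shuffled[50:]

# Random assignment makes the groups comparable on average, so a later
# difference in outcomes can be credited to the intervention itself.
# That establishes whether it worked, not why.
```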

Educators shouldn't "assum[e] that the What Works Clearinghouse endorses a particular program or product simply because we report and confirm favorable findings," Jacobson said, referring to the U.S. Education Department's education research review site. "We're not endorsing that; we are simply characterizing the strength of the research and reporting whatever findings we were able to confirm with our standards or procedures. Educators need to make their own decisions."

2. Taking 'no effect' for a conclusive answer

When a study finds that an intervention has "no effect," it means that students who used the intervention performed about as well as similar students who did not. That comparison group might be using a different intervention, or just doing whatever they normally would in school, often dubbed "business as usual."

Finding no effect "doesn't necessarily mean that is a bad finding," said Erin Pollard, an education research analyst with IES's knowledge utilization division. "If you have a very expensive program as your business-as-usual, and you're [testing] a less expensive program and there's no difference, that could be a good thing."
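
To make the statistics concrete, here is a minimal sketch, with entirely invented numbers, of what a "no effect" comparison can look like when the "business as usual" condition is an expensive program:

```python
import numpy as np
from scipy import stats

# Hypothetical scores: both programs have the same true average outcome.
rng = np.random.default_rng(0)
business_as_usual = rng.normal(loc=75, scale=10, size=200)  # costly program
cheaper_program = rng.normal(loc=75, scale=10, size=200)

t_stat, p_value = stats.ttest_ind(cheaper_program, business_as_usual)
print(f"difference = {cheaper_program.mean() - business_as_usual.mean():+.2f}, "
      f"p = {p_value:.3f}")
# A large p-value means the groups are statistically indistinguishable:
# "no effect." If the cheaper program matches the expensive one, that can
# be good news, exactly as Pollard describes.
```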

3. Looking only at the summary (or rating)

It can be tempting to just skim the abstract of an evaluation study to determine whether a program was effective, or, similarly, to look only at the overview of an intervention on a clearinghouse site. But a summary alone can give educators the wrong idea about a program, researchers said.

That's because evaluations often measure several outcomes across a variety of areas, from specific academic subjects to social-emotional outcomes like motivation. The What Works Clearinghouse alone can include more than 70 different outcomes in its reports. It's critical to dig into these findings.

"Sometimes users will take the high-level summary, like the evidence tier of the whole study and intervention, but not look at the individual outcome domains," IES research scientist Betsy Wolf said. "It might be that the study has evidence of a positive finding [in the summary], but not in the outcome domain that people really care about for that intervention."

For example, Pollard pointed to one What Works Clearinghouse review of a reading intervention. The summary shows a "promising" rating for having at least one positive result in a rigorous study.

An intervention rating in the What Works Clearinghouse includes several different measures.

"People might look at it and say, 'Well, we should do this reading intervention?'" Pollard said. "Well, you can, but actually the effects were on math."

The study had assessed the children on early word identification, but also on early number skills and on two tasks measuring self-regulation, such as a "head, shoulders, knees, and toes" game. Of all of those, students participating in the intervention performed better than the control group only in math.
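
Here is a toy sketch of why the summary can mislead, with domain labels echoing the example above and ratings invented for illustration, not taken from the actual WWC report:

```python
# Hypothetical per-domain findings behind a single summary rating.
findings = {
    "early word identification": "no discernible effect",
    "early number skills": "positive effect",
    "self-regulation": "no discernible effect",
}
summary_rating = "promising"  # driven by the one positive domain

domain_we_care_about = "early word identification"
print(f"summary: {summary_rating}")
print(f"{domain_we_care_about}: {findings[domain_we_care_about]}")
# The summary alone suggests adopting it as a reading program; the
# domain-level view shows the measured gains were in math.
```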

Even within a given area, it's important to look at the details of how an outcome is measured. Wolf noted that researchers, and the WWC, have expanded the methods they use and the ways they look at things like a program's effects on teachers.

"In the past we had teacher practice, we had teacher retention and turnover, very observable things, but we didn't have outcomes like how teachers were perceiving their jobs or their self-efficacy. So we have that now," Wolf said.

4. Focusing too much on effect size

To ensure that a study can isolate the effects of, say, a reading program, researchers try to tightly control how it is put into practice. They may give several days of teacher training or provide their own tutors. They may create and use particular lesson plans and manipulatives. All of these implementation pieces can take time and money.

"One concern is that people might pick an intervention that has the largest effect size, and not think about the resources needed for implementing that," said Liz Eisner, IES's associate commissioner for knowledge utilization.

When choosing an intervention, Eisner advised teachers and leaders to balance its benefits with the effort of implementation: the materials, staff, and training, as well as how much of the intervention students need to see results. A cheap, 5-minute intervention that boosts background knowledge before a new lesson and a yearlong, intensive tutoring program can both benefit students, but they each require very different levels of investment from schools.
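
As a rough, hypothetical illustration of that tradeoff, the sketch below computes Cohen's d, one common effect-size metric, for two invented interventions and weighs each effect against an assumed per-student cost:

```python
import numpy as np

def cohens_d(treatment, control):
    """Standardized mean difference using a pooled standard deviation."""
    n1, n2 = len(treatment), len(control)
    pooled_var = ((n1 - 1) * np.var(treatment, ddof=1)
                  + (n2 - 1) * np.var(control, ddof=1)) / (n1 + n2 - 2)
    return (np.mean(treatment) - np.mean(control)) / np.sqrt(pooled_var)

rng = np.random.default_rng(1)
control = rng.normal(70, 10, 300)
tutoring = rng.normal(76, 10, 300)   # big effect; assume ~$1,500/student
warm_up = rng.normal(71.5, 10, 300)  # small effect; assume ~$20/student

for name, scores, cost in [("tutoring", tutoring, 1500),
                           ("warm-up", warm_up, 20)]:
    d = cohens_d(scores, control)
    print(f"{name}: d = {d:.2f}, "
          f"cost per 0.1 SD gained = ${cost / (d / 0.1):,.0f}")
# The tutoring program has the larger effect size, but the cheap warm-up
# delivers far more improvement per dollar.
```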

5. Forgetting whom the program serves

The most "evidence based" program in the world won't be much help if it doesn't work for the students an educator needs to serve. Eisner said educators should remember to review the demographics, grade levels, and other characteristics of the students in a study's treatment and control groups.

"For example," she said, "if a school that is trying to improve math achievement serves mostly English-learner students, but the intervention of interest has not been studied in schools with many ELs, the staff may decide that they want to identify a different intervention that has some empirical evidence of positive outcomes with EL students."

That goes for students in the comparison group, too, Jacobson said.

"It's important to understand what was the intervention being compared against. For which population of students? For which outcomes?" he said. "It's really important, I think, for decisionmakers to be thoughtful about what applies to them."
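
In the spirit of Eisner's example, here is a small hypothetical sketch of screening candidate studies by whom they actually enrolled; every program name, field, and threshold is invented:

```python
# Candidate studies and the share of English learners in their samples.
studies = [
    {"program": "Program A", "rating": "strong",    "pct_english_learners": 0.05},
    {"program": "Program B", "rating": "promising", "pct_english_learners": 0.45},
]
school_pct_el = 0.60  # the population this school actually serves

# Prefer evidence gathered on samples that resemble the school, even if
# the headline rating is weaker.
relevant = [s for s in studies if s["pct_english_learners"] >= 0.25]
for s in relevant:
    print(f"{s['program']} ({s['rating']}): studied with "
          f"{s['pct_english_learners']:.0%} English learners")
```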
