When an ed-tech product doesn't get much traction in the school district that bought it, someone inevitably asks: What went wrong?
Last year, the EdTech Genome Project set out to tackle that problem and help educators and district administrators make better-informed decisions about education technology.
The "genome" project, a collaborative effort involving more than 100 education research and advocacy organizations, is mapping the many factors that influence how an ed-tech product performs once it is in use.
Ultimately, they identified about 80 variables, and recently winnowed the field to the 10 they deemed most worthy of further study this year. Those implementation factors are:
- Adoption plans
- Competing priorities
- Foundational resources, such as technology, financial resources, and operational tech support
- Implementation plans
- Professional development/learning and support
- School or staff culture
- Support from school and district administration
- Teacher agency or autonomy
- Teachers' beliefs about technology/self-efficacy and technological pedagogical-content knowledge
- Vision for teaching and learning with technology
Each variable is being evaluated further by a working group that will build consensus about how to define and measure those factors, said Bart Epstein, the president and CEO of the Jefferson Education Exchange, which is leading the project. JEX is a nonprofit affiliated with the University of Virginia Curry School of Education and Human Development.
These variables that make or break implementation echo comments Johns Hopkins University has heard in its longitudinal studies of school districts' ed-tech initiatives, said Steven M. Ross, a senior research scientist and professor at the Center for Research and Reform in Education there.
Although Johns Hopkins has not studied the same 10 factors, many of them have been mentioned as important by teachers and administrators in the center鈥檚 ongoing work with such districts as 111,000-student Baltimore County in Maryland and 188,000-student Fairfax County in Virginia, he said.
"All of those factors are valuable to look at," said Ross, "but, ultimately, the question is, 'What raises achievement and what doesn't?'"
What the Research Reflects
Having good implementation doesn't necessarily mean math scores will go up, he said. "That may depend on the teacher."
Epstein said the genome project identified the variables by reviewing academic research that showed they could be responsible for the success or failure of various interventions. Some studies found that it was impossible to get statistically significant data because a product or program wasn't used enough, he said.
Among the working-group members trying to get more clarity by studying specific variables are representatives from school districts, research organizations, nonprofits, and companies across the country.
The collaborating organizations plan to publish an implementation framework for educators, companies, and other interested parties late this year.
"Education is overdue to do what every other industry has done, which is to develop shared instruments of measurement and language to describe how the tools of their trade are applied," said Epstein.
The lack of a common language and means of monitoring progress has created a challenge for teachers and administrators alike, according to Epstein.
"One thing that keeps coming up over and over is how much frustration there is in education when we have to rely on anecdotes," he said. "When you go into a school and ask, 'How much agency and authority do teachers have about the decision to bring in technology?,' you're almost sure to get anecdotes."
The point of conducting this research is to "help schools understand how ready they are, or are not, for different products, programs, and policies," said Epstein. "Districts across the U.S. are spending billions of dollars on well-intentioned efforts that fail because they don't have the data to understand that their environment is not conducive to the success of what they're buying."
Did Specific Tools Work?
Ross commended the project's intention to help districts conduct self-evaluations in areas they want to improve, and, he said, the variables may be a useful checklist with "good discussion points" for educators and administrators alike. But he said they should have "reasonable expectations" about outcomes since "the jury is out as to what degree these tools and resources are actually used to make changes."
Meanwhile, plans are underway for the next phase of study on implementation.
Next year, JEX expects to develop a platform that will allow districts to document the technologies in their classrooms, how they were chosen, and what the implementation is like. "In the interim, we're collecting research manually and paying cash stipends to large numbers of educators in return for their taking the time to provide this information," Epstein said.
The teachers document their experiences and perspectives to help ed-tech decisionmakers and policymakers understand and explain why any given ed-tech tool can work in some environments but not in others.
"The goal," Epstein said, "is to avoid educators' buying things that won't work."