School districts will soon have opportunities to compare and learn from each other's methods of collecting and managing data through technology, when the lessons from one of the largest studies of district data practices are unveiled in June.
The study, which focuses on 71 districts of all sizes and demographics, is a joint project run by the Austin, Texas-based Data Quality Campaign and APQC, a nonprofit education and research group in Houston.
While some previous studies have examined technology and data management in individual school districts, this effort aims to draw universal lessons from all the districts studied, says Aimee Guidera, the director of the Data Quality Campaign.
"As we keep increasing the ability and capacity of states and districts to produce data, there's an increasing hunger surrounding what to do with that data and how to create an environment to use that data to improve school processes and student achievement," Guidera says.
In the past, there have been small studies of district data technology and what Guidera calls "anecdotal islands of excellence," in which an isolated district or school had success. But the goal of the new study, she says, is to look closely at data and technology practices and extract information that can benefit all districts.
"We want to create a model for how districts can think about using data," she says. "It's moving us from these islands of success into a way to understand how to do this and transfer it to another environment, another district, another school."
Daniel Grayson, a project manager at APQC, formerly known as the American Productivity and Quality Center, is guiding the research. Through interviews and other research, Grayson and others identified "best practice" districts that featured unique or innovative data practices. Some received visits from the research team, often joined by representatives of other districts looking to learn more for their own data efforts. The visits included conference calls and webcasts of presentations so that officials from districts that could not attend could still gather knowledge and ask questions. APQC also asked districts to fill out a comprehensive survey about their data practices.
Avoiding Pitfalls
All that information will be used to build a customized report for each participating district, designed to assess its strengths and weaknesses in collecting and using data. Districts will also be able to compare their techniques with those of their peers. Ultimately, the reports will be available online to all school districts, Grayson says.
In addition, a knowledge-transfer session is scheduled for July 9-10 in Houston, where representatives from each of the participating districts can take part in roundtable discussions that highlight tactics in data-driven decisionmaking.
"There's a tremendous amount districts can learn from each other, especially districts at different places in terms of technology options," Grayson says. "They can find huge shortcuts or pitfalls to avoid."
Mathew K. Fail is the director of quality for the 20,000-student Iredell-Statesville, N.C., school district, which was chosen as a "best practice" district. Over several years, the district has developed a vast data warehouse and worked with a vendor to customize software so the system can collect and analyze data and present it to those who need it, in a way tailored to the district and the user.
District data are presented in a way that is easy for teachers and administrators to understand and doesn't require much additional calculation. The district is working to put all the school and student data it collects into one system that can sort and analyze everything from demographic data, to the results of predictive assessments (which help determine whether students are struggling with subject matter before they are formally tested), to attendance and state-testing data.
"It's a growing journey with that particular piece of software, but it's been a wonderful tool," Fail says.
The Iredell-Statesville district also has cut the time it takes to return the results of district assessments to teachers to just a few days. Teachers can then evaluate the data and adjust their teaching styles and methods accordingly, and much faster than before.
"When we first started doing predictive assessments, two weeks later the teachers would get all the information back and have to do a lot of manipulation of the data and then analyze their strategies," Fail says. The current system, he points out, returns the data, already analyzed for the most part, within four days.
But Fail says there were steps he would have taken differently while building the system, and he hopes other districts can learn from his district's journey.
For example, he says, Iredell-Statesville learned over time that customizing software or processes was crucial; now the district has a policy that it will not work with a key vendor if the vendor is not willing to customize and work collaboratively.
"If we had set that standard up front, it would have made it much easier," Fail says. "We learned that when you're beginning to work with the vendor, we should have been demanding up front."