States scrambling to come up with more nuanced ways to measure school quality under the new federal K-12 law are running smack into an old problem: how to make sure they have the right data.
The Every Student Succeeds Act requires that states, in addition to using English-language proficiency, graduation rates, and scores on statewide achievement tests, add at least one new indicator of school quality or student success, such as school climate, chronic absenteeism, discipline, or college and career readiness.
For many states, adding that new indicator may mean spending more on data systems and collection. Or, because the needed information isn't easily available, it may mean steering away from approaches that demand too heavy a data lift and picking something off the shelf rather than crafting a more ambitious indicator.
Complicating the matter, the law requires that the data for the new school-quality indicator be valid, reliable, and comparable across districts, and that officials be able to break out the information by student demographics.
That presents a challenge for state education agencies that want to use classroom observations or teacher and parent surveys to measure schoolwide indicators, such as whether parents feel engaged or whether teachers participate in effective peer-mentoring programs.
"Here's a great opportunity for departments to innovate, and they're being placed right back in a box," said Mark Elgart, the president and chief executive officer of AdvancEd, a group that's consulted with education departments to help them create new accountability systems.
But many consultants working with state departments are advising them not to let data-collection issues impede innovation.
"If something's not feasible to collect, you have to treat it as an implementation issue," said Joanne Weiss, who was chief of staff to former U.S. Secretary of Education Arne Duncan and who now consults with state education departments. "That doesn't mean it's not an important indicator that shouldn't be included in the system."
Tough Choices
The U.S. Department of Education in June issued its proposed regulations for states putting together new accountability systems under ESSA, which is due to go fully into effect in fall 2017. The draft repeats the law's requirements for the four mandated academic indicators, as well as for the new "fifth indicator" of school quality or student success.
As states revamp their accountability systems under the Every Student Succeeds Act, some are wrestling with what it will take to collect the data needed to incorporate new indicators into those systems.
CALIFORNIA
Indicators: Considering indicators in four areas: chronic absenteeism; suspension rates; college and career readiness; school climate.
Time Frame: Data collection starts in September; system rolls out fall of 2017.
Data Specifics: Work needed to collect clear, consistent, up-to-date data across districts in certain areas.
CONNECTICUT
Indicators: Adopted five indicators: attendance/chronic absence; college and career readiness; postsecondary entrance; physical fitness; arts access.
Time Frame: The state plans to submit its new accountability system to the U.S. Department of Education after making minor tweaks to comply with the recently released regulations.
Data Specifics: State for several years has collected all the data required to measure the indicators picked for the accountability system.
SOUTH CAROLINA
Indicators: Refining indicators in four areas: elementary school readiness; middle school readiness; high school readiness; college and career readiness.
Time Frame: Indicators to be refined by the end of summer; accountability system to roll out at the start of the 2017-18 school year.
Data Specifics: New, centralized data-management system estimated to cost more than $1 million. The system will track the course work schools offer and how students perform in those courses.
Sources: California, Connecticut, South Carolina Departments of Education
In the meantime, many states already are wrestling with whether to pick a school-quality indicator that is ideal and ambitious or one that is practical and safe, with data collection and analysis a major factor in the choice.
• California's education department has pushed back against aggressive efforts by parents and advocates to measure school climate, an indicator that officials say they don't yet have enough reliable information to measure.
• Connecticut's education department rejected proposals to add civic engagement to that state's accountability system, an indicator that would require collecting new data.
• And South Carolina officials, not wanting to trample on the state's accountability task force's imagination, will spend more than $1 million to measure school and career readiness as part of its new accountability system. "We know we're going to be collecting significantly more data with this new system," said Sheila Quinn, South Carolina's deputy schools superintendent.
One big issue: whether states and districts are able to retrofit their data-collection systems to answer new and increasingly difficult questions, a potentially arduous and expensive task.
For many measures, state officials say they lack the infrastructure to collect information reliable enough to carry high stakes. Many districts' data-collection systems are scattershot and outdated. Scores of technicians responsible for processing data have been laid off in recent years amid budget cuts. And local superintendents have complained that states already require them to collect an inordinate amount of data.
The details are daunting. Scott Norton, the Council of Chief State School Officers' strategic-initiative director for standards, assessment, and accountability, said pulling all the right data together requires syncing districts' systems, then coding those systems to collect the right information.
Some data points, such as whether a student is a foster child or part of a military family, are pretty straightforward. But others, such as how students feel about a school's climate or whether teachers are receiving a certain amount of professional development, may require a bevy of surveys whose results must then be entered into the database by hand.
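To see why survey-based indicators carry a heavy data lift, consider what breaking out the information by student demographics means in practice. The sketch below is a rough illustration in Python, not any state's actual pipeline; the file layouts and field names (student_id, subgroup, climate_score) are hypothetical.

```python
# Join hand-entered climate-survey scores to a student-demographics file,
# then average the scores per demographic subgroup, the kind of
# disaggregation ESSA requires. All field names are hypothetical.
import csv
from collections import defaultdict

def subgroup_averages(survey_path: str, demographics_path: str) -> dict:
    # Map each student to a demographic subgroup.
    subgroup = {}
    with open(demographics_path, newline="") as f:
        for row in csv.DictReader(f):
            subgroup[row["student_id"]] = row["subgroup"]

    # Accumulate survey scores per subgroup: [running total, count].
    totals = defaultdict(lambda: [0.0, 0])
    with open(survey_path, newline="") as f:
        for row in csv.DictReader(f):
            group = subgroup.get(row["student_id"])
            if group is None:
                # Unmatched record: the kind of gap that undermines a
                # "valid, reliable, and comparable" indicator.
                continue
            totals[group][0] += float(row["climate_score"])
            totals[group][1] += 1

    return {g: total / count for g, (total, count) in totals.items() if count}
```

Every unmatched or mistyped record ends up dropped or miscounted, which is one reason officials hesitate to attach high stakes to survey data.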
As a result, many education departments, depending on their capacity, will consider outsourcing the work or paying millions of dollars for entirely new systems, consultants say.
Student-Level Information
There's also the sheer volume of information. School districts today collect hundreds of thousands of data points about children that are often stored in large data warehouses. Students track their academic progress in data binders, teachers tweak their curriculum based on rapid-fire online quizzes, and principals tally office referrals to craft new discipline procedures.
Against that backdrop, Brennan McMahon Parton, the Data Quality Campaign's associate director for state policy and advocacy, has traversed the country in recent months urging state education departments and lawmakers to evaluate the data they already collect before deciding to gather more as they weigh new school-quality indicators.
"Many states have meaningful and useful data in their system already," she said. "That's not to say with a push of a button, you get what you need."
In Connecticut, more than two-thirds of local superintendents said in a 2012 survey that the data the state required them to collect were duplicative, burdensome, and costly to produce. That year, Democratic Gov. Dannel Malloy signed an education bill that tasked the state department with cutting by a third the number of data forms districts fill out each year.
So when the education department formed a task force two years ago to construct a new accountability system, superintendents and the agency pledged that any new indicators would have to be based on information the department already collected.
"Oftentimes, when the state asks for new data, we tell them we already have it or we've been giving it to you in other ways," said Joseph J. Cirasuolo, the executive director of Connecticut's superintendents association. "Usually, it's not where it has to be."
In the end, Connecticut decided to add to its accountability system access to arts courses, chronic absenteeism, career readiness (based on students' performance on the state's achievement test or on SAT, ACT, Advanced Placement, or International Baccalaureate tests), schools' college-entrance rates, and three new ways to measure graduation rates.
Big Price Tag
In South Carolina, the task force designated to come up with a new accountability system decided to collect information on readiness at the elementary, middle, and high school levels, as well as college and career readiness. The state's districts all collect data using separate systems, many built by different contractors, and definitions of indicators such as chronic absenteeism, or of what qualifies as a suspension, vary widely.
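That definitional drift is the crux: a statewide rate is comparable only if every district's records are scored against the same rule. Here is a rough sketch of what imposing one definition looks like, assuming a hypothetical record layout and the commonly used threshold of missing 10 percent or more of enrolled days, not necessarily the definition South Carolina will adopt.

```python
# Apply one common definition of chronic absenteeism to records pulled
# from district systems that each define the term differently.
# The 10 percent threshold and the field names are assumptions.
def chronically_absent(days_enrolled: int, days_absent: int,
                       threshold: float = 0.10) -> bool:
    # One common definition: absent for 10 percent or more of enrolled days.
    if days_enrolled <= 0:
        raise ValueError("student has no enrolled days on record")
    return days_absent / days_enrolled >= threshold

def district_rate(records: list[dict]) -> float:
    """Share of a district's students flagged under the common definition."""
    if not records:
        return 0.0
    flagged = sum(
        chronically_absent(r["days_enrolled"], r["days_absent"])
        for r in records
    )
    return flagged / len(records)
```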
To measure the new indicators, the department will spend more than $1 million on a new collection system that pulls data points from each district's systems.
"We collect attendance, but the question is: What is the quality of the attendance data that we receive?" asked Daniel Ralyea, the director of the state education department's office of research and data analysis. "I can aggregate it at the state level, but what happens is, in practice, elementary schools may not be as concerned with recording attendance as high schools are."
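A quality check along the lines Ralyea describes could run before any statewide aggregation. The sketch below is hypothetical, with an invented data layout and coverage threshold: it flags schools whose attendance files cover too few days of the school calendar to be trusted.

```python
# Flag schools whose attendance records look incomplete, e.g., an
# elementary school that logged attendance on far fewer days than the
# school calendar contains. The field layout and the 95 percent coverage
# threshold are assumptions, not a real state rule.
def underreporting_schools(recorded_days_by_school: dict[str, set],
                           calendar_days: int,
                           min_coverage: float = 0.95) -> list[str]:
    """Return IDs of schools with attendance on file for too few days."""
    return [
        school_id
        for school_id, recorded_days in recorded_days_by_school.items()
        if len(recorded_days) / calendar_days < min_coverage
    ]
```

A screen like this is what would let a state answer Ralyea's question about the quality of the attendance data it actually receives.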
And in California, the debate over whether to use school climate as an indicator involves such factors as classroom observations and a host of student, parent, and teacher surveys.
"We think [those surveys] are used best at the local level," said Keric Ashley, the deputy superintendent of California's education department, pointing out that the data are prone to errors.