The use of analytic tools to predict student performance is exploding in higher education, and experts say the tools show even more promise for K-12 schools, in everything from teacher placement to dropout prevention.
Use of such statistical techniques is hindered in precollegiate schools, however, by a lack of researchers trained to help districts make sense of the data, according to education watchers.
Predictive analytics include an array of statistical methods, such as data mining and modeling, used to identify the factors that predict the likelihood of a specific result. They've long been a standard in the business world: both credit scores and car-insurance premiums are calculated with predictive analytic tools. Yet they have been slower to take hold in education.
"School districts are great at looking annually at things, doing summative assessments and looking back, but very few are looking forward," said Bill Erlendson, the assistant superintendent for the 32,000-student San José Unified School District in California. "Considering our economy survives on predictive analytics, it's amazing to me that predictive analytics don't drive public education. Maybe in education it's considered a luxury, but it shouldn't be; it should be a foundation for making decisions."
Experts in predictive analytics in higher education and business say education may have a long way to go to develop the data infrastructure and staff capacity to make the tools useful on a broad scale.
"Good quantitative researchers are as hard to find in academia as Farsi linguists are for the military; we do not train enough researchers to work with these methods," said Phil Ice, the vice president of research and development for the 90,000-student online American Public University System, in Manassas, Va. "There are plenty of numbers people, but they work in the corporate sector, and they don't know how to apply it to the education sector. You have to understand the pedagogy, the social issues around education, and you have to understand the numbers."
Mark D. Milliron, the deputy director for postsecondary improvement at the Seattle-based Bill & Melinda Gates Foundation, agreed. "I've heard again and again that there is a desperate need for funding data fellows and all sorts of people who can get out there in the field and do [education] research, but it's not just about doing traditional research; it's about doing more of this predictive modeling and analytics, because that's what's going to drive the scale and the implementation work to actually get people involved in the process. It's a different set of questions."
While predictive analytics cannot say definitively whether an intervention causes a particular outcome, they show potential for K-12 education because they can be faster and provide more practical results than traditional experimental methods such as randomized controlled trials, Mr. Ice said. "If you compare one class of 30 students to another class of 30 students, that's one thing, but with this you can take an entire school district, taking all the data, not just a sample of it," he said.
Predictive models might show the likelihood that a student with certain characteristics will excel in college, or that a teacher's credentials and instructional style will gel in a new school, but they aren't guarantees of specific results for individual teachers or children. Administrators have to walk a thin line between targeting support and changing expectations and opportunities based on predicted risks, according to Yeow M. Thum, a senior research fellow at the Northwest Evaluation Association, in Portland, Ore., who studies education growth modeling.
"Risk management and topics like that are really foreign to education research and management," Mr. Thum said.
Predicting Graduation
One district that is taking the plunge into using predictive analytics for policymaking is San José. The district is modeling high school graduation and college-going trends based on 15 years' worth of student academic, behavioral, social-development, and health data, as well as information on school climate from teachers, parents, and students. The district is finalizing a risk-assessment protocol that identifies the changing issues that contribute to a student's risk of dropping out at different grades. Each student has an individual profile, including the status of key indicators such as benchmark and summative tests, behavior, attendance, and health. Student data will be updated daily and monitored for accuracy, Mr. Erlendson said, but are not used for student report cards.
"It's not quite a credit score, but it could be" like that eventually, as more indicators are added to the mix, he said. "Right now we're on the cutting edge of what a student dashboard looks like. The indicators will be very clear: test scores, behavior, academics, health. The data there are going to be pretty solid. If the student is failing or is not showing up, that will be pretty obvious."
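A dashboard like the one Mr. Erlendson describes boils down to flagging each indicator against a threshold and rolling the flags up into a single risk level. The sketch below is an illustrative assumption of how such a roll-up might work; the indicator names, thresholds, and weights are invented for the example and are not San José Unified's actual protocol.

```python
# Hypothetical sketch of a student risk-dashboard roll-up.
# All thresholds and weights below are assumptions for illustration.

def risk_flags(student):
    """Return a dict mapping each indicator to True if it signals risk."""
    return {
        "attendance": student["attendance_rate"] < 0.90,  # assumed cutoff
        "behavior": student["behavior_incidents"] >= 3,   # assumed cutoff
        "academics": student["benchmark_score"] < 60,     # assumed cutoff
    }

def risk_level(student, weights=None):
    """Weighted count of triggered flags, bucketed into low/medium/high."""
    weights = weights or {"attendance": 2, "behavior": 1, "academics": 2}
    score = sum(w for k, w in weights.items() if risk_flags(student)[k])
    if score >= 3:
        return "high"
    if score >= 1:
        return "medium"
    return "low"

student = {"attendance_rate": 0.85, "behavior_incidents": 1, "benchmark_score": 72}
print(risk_level(student))  # only the attendance flag trips -> "medium"
```

Weighting attendance and academics more heavily than behavior is itself a modeling choice a district would calibrate against its own historical data.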
Similarly, in 2007, Hamilton County schools, which serve 41,000 students in and around Chattanooga, Tenn., used districtwide data on student demographics, test scores, attendance and other information, comparing the graduation and dropout rates of students with different characteristics at each grade level to develop an "on-track" predictor of each student's likelihood to graduate. Kirk Kelly, director of accountability and testing for Hamilton County schools, said individual schools have been focusing on the risk factors that carried the most weight at their schools.
For example, some elementary schools found that their students who failed early grades such as kindergarten or 1st grade ended up likelier to drop out in high school. "We found one of the factors was being over-age for their grade and separated from their peers," Mr. Kelly said. "The student fails twice in elementary school, and then they get to high school at 16. A 16-year-old is not likely to stay in high school until they are 20."
As a result, elementary schools began more-intensive monitoring and remediation for students at risk of failing an early grade, and the district started "adult" high school programs for students considered over-age for their grade.
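The core of the Hamilton County approach, as described, is comparing the historical graduation rates of students who do and do not carry a given characteristic, then flagging the factors with the largest gaps. A minimal sketch of that comparison, using invented data and a hypothetical `over_age` factor:

```python
# Illustrative sketch of an "on-track" factor analysis: how much lower
# is the graduation rate for students with a given characteristic?
# The records and factor name are invented for the example.

def graduation_rate_gap(records, factor):
    """Graduation rate of students without the factor minus the rate with it."""
    with_f = [r["graduated"] for r in records if r[factor]]
    without_f = [r["graduated"] for r in records if not r[factor]]
    rate = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return rate(without_f) - rate(with_f)

# Tiny invented history: 1 = graduated, 0 = dropped out.
history = [
    {"over_age": True,  "graduated": 0},
    {"over_age": True,  "graduated": 0},
    {"over_age": True,  "graduated": 1},
    {"over_age": False, "graduated": 1},
    {"over_age": False, "graduated": 1},
    {"over_age": False, "graduated": 0},
    {"over_age": False, "graduated": 1},
]

print(round(graduation_rate_gap(history, "over_age"), 2))  # 0.75 - 0.33 -> 0.42
```

A district would run this comparison for every candidate factor at every grade level and rank the factors by gap size; the gap is a correlation, not proof of causation, which is why the article stresses targeted intervention rather than prediction alone.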
It's been four years since the district started using the tool to target interventions, Mr. Kelly said, and "the predictors have been good in how they play out. We've seen the tool starting to bear fruit." The four-year graduation rate has increased steadily from 70.9 percent in 2008, the first school year of implementation, to 80.2 percent today, and the rate of students dropping out in each school year tumbled from 6.4 percent in 2008 to 1.8 percent today.
San José, along with the Dallas, Pittsburgh, and Philadelphia school districts and the New York-based New Visions for Public Schools charter school network, is now participating in a three-year project analyzing school feeder systems to identify the elements at each school level that predict a student's understanding of college entrance, readiness for college content, and ability to complete a degree.
Data System Demands
Yet both Mr. Erlendson and Mr. Kelly cautioned that school districts must have a well-developed data system with several years' worth of data in multiple areas to get a nuanced prediction.
"We had to look at many things that could be predictive at different times," Mr. Erlendson said. For example, he said, "in 9th grade the most predictive factor is attendance (it really jumped out at us), but in sophomore year, the most predictive factor is grades. Public education is such a fluid thing, it's like trying to lasso an amoeba."
That complexity causes problems when trying to connect a child's experiences in elementary school with success in college 15 years later, Mr. Thum said. "We know that predictions on the near term are very useful; predictions in the far term are fraught with problems," Mr. Thum said. "In any prediction you assume that the future will be like the past, and that's a very large assumption to make."
Moreover, many predictive analytics systems in the business world, such as those that produce credit scores, use the data to rank people based on the likelihood of a specific behavior, something Mr. Thum said generally is not appropriate for education.
"How people couch the accountability question will affect the results" of an analysis, he said. "Very often the folks hired into key policymaking positions probably do not have as much patience as they ought to have. Often ranking is all they are after; they're not looking at measuring student learning."
Finding Candidates
Yet sometimes, ranking can be useful, such as when a principal is trying to pick the right new teacher to fit into a school staff.
Teresa Khirallah, the senior director of Peak Preparatory, a K-12 charter school in eastern Dallas, has seen the hiring process work with and without predictive tools. When she took the administrative reins of the school four years ago, principals found new teachers by receiving personal applications from teachers or combing through the central candidate pool for the school鈥檚 parent organization, Uplift Education of Dallas.
"There was always some lag time, and you might be juggling seven to eight people. We spent a lot of time weeding people out, not really knowing from a five-minute phone conversation or email if they were the right person to continue this process," Ms. Khirallah said. "There was just a lot of time spent on individuals who by the end of the hiring process you realized did not match the mission or would not be a good fit with your kids."
Two years ago, with the 17-school network growing 25 percent or more a year, Uplift's chief executive officer, Yasmin Bhatia, overhauled the network's hiring process in advance of hiring 160 teachers for a staff of 400. Uplift, working with the Emeryville, Calif.-based 3D Group, analyzed 44 different tasks that Uplift teachers perform, using a combination of surveys, interviews, and classroom observations with the teachers considered to be exemplary based on qualifications, experience, and recommendations.
The teachers rated each task on its importance to their daily job, the equipment or materials needed, and whether a new teacher should enter able to do the task or expect to learn on the job. Teachers also related specific examples of their biggest successes and mistakes in performing different teaching tasks.
After crunching the data, Uplift had a list of 29 ranked tasks in five teaching areas, which formed the backbone of a new teacher-hiring process. A potential candidate's written email and answers to application essay questions can be rated according to those indicators, and principals follow formal interview questions, with answers also rated to make it easier to compare candidates.
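Mechanically, a rubric like Uplift's comes down to scoring each answer against a task indicator, averaging within each teaching area, and surfacing the areas that fall below a bar so the principal can weigh the trade-off Ms. Bhatia describes. The sketch below is a hypothetical rendering of that aggregation; the area names, scores, and 3.0 threshold are invented for illustration.

```python
# Illustrative sketch of rubric-based candidate scoring: 1-5 ratings per
# answer, averaged per teaching area, with weak areas flagged.
# Area names, ratings, and the threshold are assumptions, not Uplift's rubric.
from statistics import mean

def area_scores(ratings):
    """Average the 1-5 answer ratings within each teaching area."""
    return {area: round(mean(scores), 1) for area, scores in ratings.items()}

def weak_areas(ratings, threshold=3.0):
    """Return the areas averaging below the (assumed) threshold."""
    return [area for area, s in area_scores(ratings).items() if s < threshold]

candidate = {
    "classroom_management": [4, 5, 4],
    "lesson_planning": [2, 3, 2],
    "student_engagement": [4, 4, 3],
}
print(area_scores(candidate))
print(weak_areas(candidate))  # ['lesson_planning'] averages only 2.3
```

Because every candidate is scored on the same areas, the averages are directly comparable across applicants, which is exactly what makes the "2.5 out of 5" conversation in the next quote possible.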
"These are very specific situations we are asking candidates to describe for us; you cannot fluff our questions," Ms. Bhatia said. "If you see that this person is only a 2.5 out of a 5 [in one of the five areas], it forces the discussion of, 'Am I going to make the trade-off or am I going to hold off and keep looking for someone who is a better fit?'"
For principals, the change has clarified the hiring process, Ms. Khirallah said. "What I'm able to do as a principal is to spend more time talking with and training the right people, rather than spending a lot of time weeding people out."
While Uplift is still measuring the achievement effects of the teachers hired under the new system, Ms. Khirallah said she has already seen a decline in teacher turnover. "We've noticed teachers are not only staying but are stronger and higher-performing teachers, and we've got a lot of like-minded individuals who share the same philosophy of education, so you have fewer issues at the end of the year about how people fit."
Ms. Bhatia said the network has already started to use the initial task scores to tailor professional development based on a teacher's areas of strength and weakness, and said the schools hope to eventually have more fine-grained data that will help match teachers to specific subjects or grade levels within schools. The May issue of the Harvard Education Letter noted Uplift's predictive analytics show promise in helping districts match educators to the schools where they will be most effective in teaching.
Analyzing Personnel Policy
Hamilton County doesn't use predictive analytics to hire or place teachers as the Uplift charters do, but it does use the tool to analyze personnel policy. For example, Mr. Kelly said he was able to track the broader-than-expected effects of teacher absenteeism on student performance throughout a building. "If you see a teacher is out and there's no substitute, that may not just affect that teacher, but might also affect ... other teachers, because we might have to split a class and not be able to do the lessons we had planned."
As a result, the district has moved to a 95 percent teacher attendance goal and is putting into place more support structures to account for teacher absences.
Next year, the district plans to expand the system to provide online accounts at each school so that principals and teachers can run small-scale predictive studies or simply get more frequent updated reports on their students.
Higher education has started to explore the validity of using predictive analytics for long-term goals like college success. For example, the Western Interstate Commission for Higher Education has won a $1 million grant from the Seattle-based Bill & Melinda Gates Foundation to develop a Predictive Analytics Reporting Framework using data from more than 400,000 students in six college systems with online students: the American Public University System, Colorado Community College System, Rio Salado College based in Tempe, Ariz., University of Hawaii System, University of Illinois Springfield, and the University of Phoenix.
The online colleges are looking at 34 different factors that may contribute to a student's ability to enter, stay in, and succeed in college.
"The methodology we're applying could certainly be used in the face-to-face campus and in the K-12 arena as well," said Mr. Ice, the principal investigator for the study. "I think that the K-12 environment may even be a richer source of material than the university setting; in K-12, we know so much more about what our students are doing and where they are coming from," he said.
"It takes a lot of computing power and there's some pretty high-level math," Mr. Ice said, but, "it's a matter of political will more than math or statistics." With the right resources and access to the data, he predicted, widespread use of predictive analytics could become a reality in K-12 education within four years.