Students learn more from certified teachers than they do from uncertified teachers, even when the uncredentialed teachers are Teach For America recruits from some of the nation's top colleges, a Stanford University research team concludes from a study of test scores in Houston.
Findings from the study, which researchers presented here April 15 during the annual conference of the American Educational Research Association, have refueled the fierce, continuing debate in research and policy circles over programs that let new teachers into the field without traditional training.
"Our study would suggest that it does matter that you actually complete some teacher-preparation program," said Linda Darling-Hammond, the study's lead author and an education professor at Stanford. "And that seems to cut across a variety of tests and a variety of fields."
Some scholars, though, viewed the findings with skepticism and suggested that they had been released prematurely.
"I guess my bottom line on it is that it looks like it has very good data," Jane Hannaway, the director of the Education Policy Center at the Urban Institute, a Washington think tank, said of the Stanford study. "It's asking important questions. But I wouldn't have a lot of confidence in these results unless more analyses were done, and they were more clearly reported."
The Stanford team arrived at its conclusion after analyzing seven years of test-score data for 4,400 4th and 5th graders in Houston, a district that leans heavily on the Peace Corps-style Teach For America program to fill teaching positions in hard-to-staff schools.
The findings come at a time when alternative routes into the profession, such as the TFA approach, are proliferating. Policymakers, in commission reports and in legislative sessions, have asked whether requiring teachers to take the usual route to certification (generally, four years in a college- or university-based teacher education program) is keeping too many otherwise qualified prospects out of the classroom.
Evaluating Effectiveness
Since its inception in 1990, the privately organized, New York City-based Teach For America program has recruited more than 12,000 young liberal-arts graduates to teach for two years in disadvantaged rural and inner-city schools. For the upcoming 2005-06 school year, according to program materials, a record 17,000 applicants have applied for teaching assignments.
The program has become a lightning rod in the national debate on teacher certification because its recruits get minimal training, usually a five-week program that includes student teaching and intensive coursework, before they step into classrooms.
Yet two studies have suggested that Teach For America recruits may be as effective as or better than other teachers in their schools and districts. In the more recent of those reports, researchers from Mathematica Policy Research Inc., of Princeton, N.J., found that elementary students taught by TFA recruits in eight cities learned more mathematics over the course of a school year than their peers whose teachers were hired through more traditional routes. In reading, the two groups of students fared about the same. ("Study Finds Benefits In Teach For America," June 16, 2004.)
That pattern also held when the researchers compared the program participants with the small groups of regularly certified teachers in their buildings.
The Mathematica study was widely praised for the rigor of its experimental design. An earlier study, published in 2001, found similar results for participants in the TFA program. Like the new Stanford study, that one was based on Houston鈥檚 experience.
But Ms. Darling-Hammond, a longtime critic of such fast-track pathways into teaching, said the regular teachers in the schools involved in both of those studies were often as likely as the Teach For America recruits to be uncredentialed and inexperienced.
"The obvious next question is whether there are differences in the effectiveness of certified and uncertified teachers, TFA and others, in supporting student achievement," she said.
Test Data Analyzed
The Stanford researchers drew on data from three types of tests that Houston elementary students took between 1995 and 2002. They were state-required reading and math tests; a national standardized test, the Stanford Achievement Test, 9th Edition; and the Aprenda, which Spanish-speaking students take.
For at least the first three years of the study, the results from the state testing program mirrored those of the earlier Houston study: Students of Teach For America teachers did better in math and about the same in reading as their counterparts in other classrooms across the district. The edge in math evaporated, however, in later years and on some other tests.
"It looks as though the district was recruiting more teachers with certification in those later years," Ms. Darling-Hammond said.
Compared with teachers who had achieved standard certification, the TFA teachers seemed, for the most part, to have less of an impact on improving their students鈥 learning. That was true, the researchers said, even after they accounted for differences between the two groups of students, such as their prior achievement levels and their teachers鈥 years on the job.
On the state reading test, for instance, researchers estimated that the learning difference they found would put the students in Teach For America classrooms about a month behind counterparts taught by certified teachers.
But in Houston, as in many other cities where the national teaching program operates, most Teach For America teachers end up earning professional certification anyway before their two-year stints are up. To account for that reality, the Stanford researchers also compared scores for students of certified Teach For America teachers with those of other certified classroom teachers.
The certified TFA teachers鈥 students came out ahead on state tests of mathematics, but behind on the Aprenda test. On some other standardized tests, the two groups of students looked more similar, suggesting that the certified Teach For America teachers may have been doing just as well in those areas.
The Wrong Question?
However, critics said the study left out important information and analyses. They said the researchers omitted the number of teachers involved in some of the analyses, and analyzed other data in a way that might conflate teachers' experience with their certification status.
"An independent review would have revealed some of the flaws that, it appears, would undermine the study's conclusions," said Abigail Smith, the vice president for research and policy at Teach For America.
For her part, though, Ms. Darling-Hammond said she did conduct some of those alternative analyses. She did not discuss them in the report, she said, because the results were no different.
She said colleagues in the field also critiqued the study for her before the research meeting in Montreal. Neither the Mathematica study nor the earlier Houston study was published in a peer-reviewed journal before its release, she pointed out.
Even if the findings hold up, other scholars said, Ms. Darling-Hammond may be asking the wrong question.
"If I was a principal, I would be asking, 'Should I go for TFA teachers or some other teachers who may not have certification?'" said Dan D. Goldhaber, a research associate professor at the University of Washington in Seattle. "For schools hiring uncertified teachers, I would suspect they're doing so because they don't have a lot of other options."