Even as Obama administration officials have outlined plans to revamp measures of teacher-preparation quality, states and colleges of education are busy catching up to the existing requirements added less than three years ago.
Named for the section of the Higher Education Act dealing with teacher education programs, the Title II data collection now under way is the first to be based on changes instituted in the federal law's 2008 reauthorization.
That rewrite expanded reporting requirements first put into place in 1998. In general, both preparation programs and states produce "report cards" for the public based on the data, along with generating information for an annual report put out by the U.S. Department of Education.
Statewide Comparisons
To read about the current federal overhaul of HEA reporting requirements, see "Administration Pushes Teacher-Prep Accountability," March 9, 2011.
Among other factors, the HEA now requires programs and states to detail average scores on licensing tests; the length and supervision of student-teaching requirements; the integration of technology in teacher preparation; and progress in preparing teachers in high-need subjects.
The data collection is supposed to wrap up in October, and the department's first report based on the new data collection is scheduled to be released early next year.
Some of the tweaks were meant to address deficiencies in the prior iteration of the law.
In response to a widespread sense that passing rates on licensing tests, the major Title II reporting benchmark, did little to help states and the public gauge program quality, the collection now requires states to report "scaled scores" on the tests, which allow for comparisons among students statewide. Such scores will give a better sense of the range of performance relative to the state-set passing mark on the tests, officials said.
"I think the federal government, by asking for the scale-score information, is looking toward getting more tangible information about how well all the students across the country entering the teaching profession are doing on these standardized tests," said Florence M. Cucchi, the director of client services for the Princeton, N.J.-based Educational Testing Service, which produces the Praxis series tests many states use for licensing.
The ETS is providing the scaled-score information to 35 states and territories in all.
Better or Worse?
Officials at colleges of education said carrying out the new requirements hasn't been as chaotic as the rollout of the original 1998 requirements, whose implementation was complicated by missed deadlines, faulty data, and concerns about the validity of the measures. ("Ed. Schools Strain To File Report Cards," March 28, 2001.)
Nevertheless, several deans contended that the new reporting wouldn't produce substantially more useful information than the former collection. Randy A. Hitz, the dean of the school of education at Portland State University, in Oregon, pointed to ongoing differences in how states and institutions interpret terms such as "clinical experience," despite a somewhat tighter definition in the law.
"We've had to sit down and say, 'This is how we'll define it. Let's make sure to do it like that every year,'" Mr. Hitz said. "Then you multiply that process by 1,200 [teacher education] programs, and you end up with a mess. ... I don't have much faith that it will provide the [U.S.] Secretary of Education with what he wants."
Some education school officials, though, have found utility in the information.
Joyce E. Many, the executive associate dean of the school of education at Georgia State University, in Atlanta, said the law's requirements for reporting on whether candidates are taught to use technology to analyze achievement data have spurred conversations within the school. She also praised the more detailed test-score reporting.
"Any level of information programs can have beyond pass and fail rates is beneficial," she said. "It's helpful for them to look at strengths and weaknesses at the subscales."