Scoring problems are delaying the release of student test results in Connecticut, leading education officials there to warn that testing companies could be stretched thin as more states gear up to meet new federal requirements for gauging academic performance.
The written responses of some 120,000 Connecticut students are being rescored by CTB/McGraw-Hill after the state education department said the initial results didn’t jibe with outcomes from an earlier pilot of the same exams. Connecticut Commissioner of Education Betty J. Sternberg planned to meet this week with executives from the Monterey, Calif.-based company to resolve the issue.
The holdup means the results that districts typically would have gotten in January might not be available until this spring. As elsewhere, educators in Connecticut are especially anxious this year to find out where their schools stand on state exams as new accountability mechanisms kick in under the federal No Child Left Behind Act. (“Michigan May Feel Full Force of Federal Law,” see related story, Page 11.)
Noting that this is the first year the company has processed Connecticut’s tests, officials with CTB/McGraw-Hill say they’re working to fix any initial glitches. But the state education commissioner said the postponement bodes ill for the testing industry’s ability to handle the substantially increasing demand for its services across the country.
“When you consider that in the nation there are a limited number of testing companies that appear to be prepared to do this, you are creating a tremendous supply-demand issue,” Ms. Sternberg said.
Test Questions
At the center of the immediate issue are the Connecticut Mastery Tests, which the state has long given each fall in reading, writing, and mathematics in grades 4, 6, and 8. After a bidding process two years ago, Connecticut switched the vendor that develops and scores the tests from San Antonio-based Harcourt Assessment to CTB/McGraw-Hill.
Connecticut officials say they’ve run into a number of technical problems with their new testing company, but their chief concern is the scoring of open-ended responses. Unlike multiple-choice items, which are easily scanned by computer, written answers are reviewed by human evaluators trained to recognize what constitutes an adequate response.
When the state education department saw the results produced by CTB/McGraw-Hill late last fall for the open-ended questions, agency officials noticed that the scores were substantially lower than when the same testing company pilot-tested the assessments for the state the year before. The state officials say the company agreed with them that the tests should be rescored.
April Hattori, a spokeswoman for McGraw-Hill Education, the testing group’s parent company, said the problems were being addressed.
“It is a complex program, and there are transitional issues that need to be resolved,” she said last week. “I believe wholeheartedly that CTB/McGraw-Hill has the capacity to meet the state’s needs.”
In the meantime, district leaders in Connecticut say they’re feeling the fallout. Jacqueline Jacoby, the superintendent of the 6,800-student Glastonbury schools, points out that educators often adjust their teaching based on the scores.
“The later you get them,” she said, “the less likely that teacher is going to have the opportunity to use them to improve their instruction.”
Scaling Up
Although frustrated by the delay, Commissioner Sternberg said her main goal was to ensure that the results were accurate.
Her bigger worry is about the future. Although Connecticut changed vendors, last fall’s tests were essentially the same as in previous years, and were given in the same grades. What will happen, the state chief asked, as nearly all states update and scale up their testing programs in the coming years, as required by the No Child Left Behind Act? The law requires states to test all students in reading and mathematics in grades 3-8 and once in high school.
“I think this will be an issue, over time, for every state,” Ms. Sternberg said.
Researchers at Boston College say it’s a legitimate concern. The National Board on Educational Testing and Public Policy, based at the college, issued a report last spring contending that test-scoring problems were on the rise. (“More Errors Are Seen in the Scoring of Tests, Boston Researchers Say,” June 18, 2003.)
“The incidence of human error in standardized testing has increased dramatically over the last eight years,” said Kathleen Rhoades, an author of the report. “So there is a huge increase already prior to NCLB, and it only stands to reason that it will grow even more.”