Field-testing of two multistate online assessments is going more smoothly than many educators had expected, despite technological glitches in the coast-to-coast experiment. And even though the exams, aligned with the Common Core State Standards, are still in the tryout phase, they are proving tougher than the ones students are used to taking.
Those are the two major themes that emerged from an Education Week reporting project that examined how school districts across nearly half the states have fared so far in the trial run of tests designed by the Smarter Balanced Assessment Consortium and the Partnership for Assessment of Readiness for College and Careers, or PARCC.
The field-testing, designed to find problems with the assessments before their designs become final, began in late March and runs through early June. Participation varies widely; a few states are giving the exams to nearly all students, while in most places, students in only some classrooms, at some grade levels, are involved.
Although potential technological problems with the move to large-scale online testing have topped educators' list of concerns, district and school leaders reached for this article, with only one exception, reported relatively minor problems.
"We had our reservations going in, but it's not the nightmare that some people predicted it would be. So far, there have been no big problems," said Paul Richter, the director of assessment in Nevada's Washoe County district, which is field-testing the Smarter Balanced exams.
The reporting, conducted in late April by Education Week, is not based on a nationally representative sample; rather, it reflects a snapshot of experiences drawn from teachers, principals, and district assessment and curriculum officials in 29 school districts in 24 of the 36 states participating in the field-testing.
It shows key themes that are likely to shape the tests' final designs and inform education leaders as they make decisions about which tests they'll use to gauge learning under the common core. The new standards, in English/language arts and mathematics, were unveiled in 2010 and are being put into practice in all but five states.
Not surprisingly, familiarity with online testing tended to make the field-testing run more smoothly.
The Coeur d'Alene district in Idaho had built up a solid infrastructure because its current state exam, the Idaho Standards Achievement Test, or ISAT, is computer-based, and the Smarter Balanced field-testing went well, said Mike Nelson, the district's director of curriculum and assessment.
Even so, there were hitches. Computer audio functions for the ISAT conflicted with the audio needed for the field tests, so the computers had to be restarted, Mr. Nelson said.
Delaware also already gives its tests online, so students are accustomed to that approach. But some struggled as they learned to use the Smarter Balanced test's new tools, such as highlighting and on-screen calculators, said Travis Moorman, the director of teaching and learning for the Milford school system.
From Paper to Computer
Districts less familiar with online testing had a steeper learning curve.
Vermont, which has been giving the paper-and-pencil New England Common Assessment Program, or NECAP, plans to move to Smarter Balanced. The Blue Mountain Union School, a K-12 building of 445 students in rural Wells River, 70 miles south of the Canadian border, got a rough taste of the difference.
Principal Emilie Knisley said that when members of her staff tried to log on to the assessment, their screens repeatedly displayed messages that the site was down for maintenance. Students who managed to start the test often found their sessions abruptly terminated. As a result, 40 percent or more of her students had to repeat the test another day, Ms. Knisley said. Those issues did improve, though, as the days passed, she said.
Administrators in many districts reported that many students, particularly those from homes without computers, were not prepared for the level of technological savvy the tests demanded.
"Our students don't necessarily have a lot of experience navigating different frames or highlighting things on the computer, so this has been an eye-opening experience for our teachers, to see what we need to be preparing students for," said Deborah Warr, the principal of Knollwood Heights Elementary School in Rapid City, S.D., where nine in 10 students qualify for subsidized meals. "Starting in kindergarten, we need to do a lot more work just navigating the computer."
Those kinds of challenges raised big questions for David Estrop, the superintendent of the Springfield, Ohio, system, which is field-testing the PARCC exams. He said his students struggled less with the tests' content than with the technological features, such as clicking and dragging items on the computer screen. He worries about next year, when the tests will be given in final form.
"Are we testing [students'] knowledge and skill of the content, or their knowledge and skill in using the device?" he said.
Many students apparently enjoyed the online experience.
It "took a little time to get used to taking the test on computers, [but] students liked being able to scroll up and down during the test and see the materials side by side," said Kay Dugan, the assistant superintendent for learning in Bensenville School District 2, a K-8 district in Illinois.
Officials in the Reeths-Puffer district in Muskegon, Mich., surveyed their high school students after the field-testing, and the students reported enjoying the test's accessibility tools, such as highlighting and striking out text, said Terri Portice, the director of teaching and learning.
Gearing Up
Many districts went into heavy preparation mode, readying students, teachers, and technology for the field-testing, and reported that those measures paid off.
In the Millville, N.J., district, which serves a large share of students living in poverty, students used PARCC's practice tests, and children as young as kindergartners have been working on their typing skills, said Harry Drew, the principal of R.D. Wood Elementary School.
The Springfield City Schools in Ohio spent several days training teachers in test-administration procedures, Mr. Estrop said. That effort, combined with a $3 million investment in wired and wireless technology, minimized problems with the tests, he said.
The Pulaski County Special School District in Little Rock, Ark., credited a bandwidth upgrade and lots of upfront technology work with an uneventful PARCC field-testing experience. Chief Technology Officer Will Reid said his staff loaded computers with Java and other software updates and tested every computer鈥檚 functionality ahead of time.
Districts that didn't think ahead about software updates reported problems. Computers at Cibola High School in Albuquerque, N.M., are set to update automatically during the school day, said Ryan Kettler, the assistant principal for 11th grade, so "that would come up and boot the kids off."
Mr. Reid, from Pulaski County, said the field-testing "went much smoother than we anticipated from a technological perspective." But the trial run involved a very small number of the district's students, and he thinks it will be "an extreme challenge" when the test goes districtwide next year.
"It takes such a coordinated effort between IT, teachers, administrators, and students for this to go off successfully," he said.
The most common problems at call centers staffed by PARCC and Smarter Balanced involved difficulty logging on to the system, especially when administrators had forgotten their passwords, officials of the testing consortia said.
The consortia also got feedback that some of the instructions for administrators, including instructions they read aloud to students, were confusing or too lengthy, so those will be revised, spokeswomen from both groups said.
It鈥檚 too soon, they said, for feedback on how well the test items themselves worked. An analysis that will focus on specific questions, or types of questions, that caused problems will begin when field-testing concludes, the spokeswomen said.
Computer-system capacity was an issue for some schools and districts. Connecticut, one of a few states that are involving nearly all their students in field-testing, saw more than its share of slowdowns and access issues because so many students sat for the test right at the beginning of the testing window, said Smarter Balanced spokeswoman Jacqueline King.
"They took the lion's share of Day One growing pains," but those problems eased as the days went by, she said.
That wasn't exactly the experience in New Jersey's Millville district, where R.D. Wood Elementary School saw problems worsen as more schools around the state joined the field-testing. Principal Drew said that "could cause problems down the road" when the real PARCC tests come online.
Tough Questions
Based on field-testing experiences, computer hardware looms as a potential problem in operational testing as well.
The schools in central Wisconsin鈥檚 Weyauwega-Fremont district rotated 400 students through the desktops in their computer labs, but officials there said they see adding another computer lab as crucial for a successful administration of the real test next year.
Educators have worried that the new tests will be harder for students, and that concern appeared to be well founded. Most teachers and administrators reported that their students did find the tests more difficult than their states' current tests.

"The quote I heard over and over again was, 'Wow, that was hard,'" said Scott Moran, the director of secondary school improvement for the Denison district in western Iowa, which administered the Smarter Balanced exam to 3rd and 5th graders.
In Delaware, the Milford district surveyed students and teachers, who reported that the Smarter Balanced assessments were "a complete and total shift in thinking, teaching, and assessment," said Mr. Moorman. The longer, more complicated performance tasks, however (the piece teachers were most worried about), were what drew the warmest comments from students.
"They enjoyed those the most because they were taking academics and applying them to current situations, things they were connected with," Mr. Moorman said.
The performance tasks proved tough for many test-takers in Michigan's Reeths-Puffer district. Their performance on those items showed district leaders that more classroom work is needed on "that [kind of] deep-level, multiple-step, multiple-day project," Ms. Portice said. Some high school students, surveyed after the field tests, said they missed the state's current test "because it was easier," she said.
Students in various places complained of having to read long text passages for the English/language arts portion of the tests, and educators reported that many students took a long time to finish.
It took some students 90 minutes to complete the four required essays, said Kandi Martin, the curriculum director in Wisconsin's Weyauwega-Fremont district.
Vicky Lynch, who supervises accountability and testing in the Bossier Parish district in Benton, La., said teachers and students reported that the test questions were difficult. Some of that difficulty might have been attributable to the online testing experience, or lack of familiarity with PARCC-type items, she said, but some was due to the rigor of the questions.
"They had expected the different format," she said, "but even some of my high-performing kids would call the teacher over and say, 'This is confusing.' They definitely thought the questions were difficult."
Children in Vermont's Blue Mountain Union School told Principal Knisley that they wrestled with the math portion of the test because it made them blend skills together to solve a problem, instead of regurgitating facts.
"Our math curriculum is 20 years old; you learn multiplication and you practice it," she said.
Accessibility Tools
How well the online accommodations would work was another area of worry for school personnel. Midway through field-testing, they reported a variety of good and not-so-good experiences with them.
Officials in Iowa's Denison district found that the text-to-speech function on the Smarter Balanced test didn't always work smoothly, said Mr. Moran.
In Ohio, educators in the Springfield system couldn't figure out how to make that tool work, and discovered that there were accommodations tools they didn't use because they didn't know about them, said Crystal Aker, the district's coordinator of testing, accountability, and research.
School leaders in Colorado's Adams 12 Five Star Schools were disappointed to find they couldn't test all students together, said David Bahna, the director of assessment and accountability. The district had to create specific testing sessions for students with special needs, he said.
Other districts, meanwhile, found that those accommodations could be loaded onto computers ahead of time so students who needed them did not have to be tested separately.
"To have your test customized for you is huge," said Ms. Portice of the Reeths-Puffer district in Michigan.
Officials in the Hartford, Conn., system noted that it was easy to overlook that important step, so administrators had to be mindful to preload those accommodations, said Michelle Puhlick, the executive director of curriculum and instruction.
Rob Watson, the superintendent of the Bozeman, Mont., schools, said he had worried that his students might have trouble manipulating various tools on the computer screen; that turned out not to be an issue at all. He theorized that time spent on practice tests, and on showing children how to navigate back and forth between windows, paid off.
The accommodations have "tremendous" promise for students with disabilities, said Mr. Richter of Nevada's Washoe County system. He singled out the read-aloud tool, which allows all children to hear questions read in the exact same way, minimizing variation from reader to reader.
A number of district leaders said that a hefty dose of staff training was important in preparing for the field-testing. The Milford schools in Delaware, for instance, got a grant to train staff members with online modules outside the school day. But many administrators said a particular challenge was finding the time for such sessions.
"Training, training, and training helped to alleviate the panic and frustration," said Razak Garoui, the executive director of accountability, research, and assessment for the Kent district in Washington state. "The first day, [the] second day, people are panicking, but as soon as they start the test, everything goes smoothly."
There have been few, if any, reports so far of what would be a significant hurdle in common-core testing: that the exams don't faithfully reflect the standards on which they're based.
"Teachers felt the [PARCC] test was aligned to what the common-core standards are," said Ms. Dugan, of Illinois' Bensenville District 2, "and they felt good about that."