The architects of one of the most highly regarded gauges of student achievement, the National Assessment of Educational Progress, are preparing for a dramatic expansion of technology-based assessment, while relying on a strikingly different approach from the one that will be used to give online common-core exams in the states.
Federal officials say the plan for administering NAEP, often called the "nation's report card," is for the government to rent tablet computers from a private contractor for the test, distribute them to the sample of participating schools, and retrieve those devices after the exam.
That strategy has been used in NAEP's earlier, less ambitious forays into computer-based testing, with the goal of ensuring that the tests are delivered securely, reliably, and consistently. It stands in contrast to the method that will soon become familiar in schools nationwide: About three dozen states have agreed to give tests aligned to the Common Core State Standards, exams that are scheduled to begin a year from now. Those assessments will be given using the eclectic array of computing devices currently in place in schools, expected to include a mix of desktop and laptop computers and tablets with different features and operating systems, as long as they are compatible with test requirements.
Testing officials familiar with the two assessment programs say the contrasting strategies are a reflection of the tests' very different purposes and needs.
The common-core tests will become a core part of state accountability systems. As such, they will be taken by almost all students in participating states, and the results will carry potentially high stakes for schools.
The NAEP, by contrast, is given to a much smaller sample of schools and students, one designed to produce a nationally representative set of results. Since 1969, its role has been to provide a common benchmark for gauging student performance over time, at the national level and across states and subgroups of students.
Inevitable Shift
Renting and temporarily distributing the devices makes more sense than buying them, given the relatively small number of schools and students involved and the brief testing window, federal officials say. They also believe that using the same stock of devices, rather than the varied ones in place in K-12 districts, will help ensure that students have a standardized testing experience.
The officials who direct and oversee the NAEP have been planning for full-scale technology-based testing for years. They say plans for computer-based expansion represent a dramatic, and in some ways inevitable, shift for an exam that seeks to serve as a model in the testing field.
[Timeline: Key dates in the National Assessment of Educational Progress' move to technology-based testing; future developments are subject to change. SOURCE: National Center for Education Statistics]
"It's a sea change in terms of the face of the NAEP and what it's going to be able to do," said Peggy G. Carr, an associate commissioner for the National Center for Education Statistics, which administers the NAEP. "We need to be a leader, an innovator, in this area."
But some crucial unknowns surround the new approach, such as the cost. Neither NCES nor the contractor assigned to help distribute the computing devices would provide Education Week with an estimate of the total projected price tag of using the tablets for testing, saying that those costs could involve proprietary information from individual contractors.
The NAEP鈥檚 evolution into technology-based testing is part of a broader, nationwide shift from paper-and-pencil exams to assessments delivered with computing devices, one that is playing out at the state level and in individual classrooms.
A growing number of states have begun using computer-based assessments in recent years; about two-thirds of them conduct some form of online testing, according to the Assessment Solutions Group, a Danville, Calif.-based company that works with states and districts.
But the plans to deliver tests aligned to the common-core standards via computing devices represent a major acceleration of the online-testing movement. Over the next few months, the two consortia of states designing exams tied to the standards, the Partnership for Assessment of Readiness for College and Careers and the Smarter Balanced Assessment Consortium, are staging field tests of those assessments. When that process ends, more than four million students will have taken near-final versions of the tests in English/language arts and mathematics.
NAEP tests are given to a nationally representative sample of schools and students, designed to ensure that the assessment reflects the student population. They produce a portrait of achievement at many levels, including across the nation and states, and among subgroups.
For instance, in an average state, about 2,500 students in approximately 100 public schools are assessed per grade, per subject, on NAEP tests to gauge state performance, according to recent federal estimates.
By comparison, individual states each typically test a much larger number of students (several hundred thousand) as part of their mandatory assessments.
Over the past few years, computer-based testing has been integrated into a number of NAEP tests, including in science and writing. But those tests were relatively small-scale compared with what's ahead, noted Ms. Carr.
Phasing In All Subjects
After two years of pilot testing, tablets will be used by 2017 for all reading, math, and writing tests on the "main" NAEP, which provides state-by-state comparisons of scores and national results. And by 2020, federal officials want to have all NAEP tests in all subjects delivered with computing devices, she said.
The current plan is for NCES to rent tablets from Westat, the Rockville, Md.-based sampling and data-collection contractor for the NAEP. Westat will bring the devices to the schools for the testing periods, as it has with paper-and-pencil and early computer-based tests, arrange for trained staff members to oversee that process, and then take back the devices when the testing ends. In 2017, each technology-based field team is likely to bring about 26 tablets to each school, Ms. Carr explained.
According to the plan, student responses will be sent to, and stored on, a test administrator's computing device provided to each data-collection team, Ms. Carr said. Results will then be transmitted electronically and securely to another contractor, Pearson, which is charged with scoring the tests, she said.
Federal officials plan to use Microsoft Surface Pro 2 tablets, with attachable keyboards, for the testing, Ms. Carr added, though it's possible the device could change after the pilot assessments. Microsoft currently lists the base cost of the Pro 2 tablets at $899 apiece.
Dianne Walsh, a vice president for Westat, referred questions about the project's total cost to NCES, which declined to provide it. But she said distributing computing devices on a temporary basis makes sense because it will spare schools from having to free up their own devices during NAEP testing and will provide greater assurance of a consistent test experience across schools. Westat's early experience distributing laptops to schools for NAEP testing has shown the model will work, Ms. Walsh said.
Moving from paper-and-pencil to technology-based tests brings myriad benefits, many testing experts say. Among the ones cited most often: scoring the tests is faster; students take tests using devices similar to those they're likely to be using in their daily classes and at home; and, potentially, tests can be designed to collect more sophisticated information about students' knowledge, in some cases by tailoring questions based on students' earlier responses.
Rich Results
Those potential payoffs were evident in NAEP's early attempts at computer-based testing, such as in science, in which students were led through interactive tasks via computer, recalled Mary Crovo, the deputy executive director for the National Assessment Governing Board, which sets policy for the NAEP.
"Students were extremely engaged, and the information collected was rich and useful," Ms. Crovo said. Students' interest in the test questions appeared to be strong, even among 12th graders, a group that federal officials have sometimes struggled to convince to take the NAEP seriously.
The goal is not to take paper-and-pencil questions and "throw them on a computer screen," Ms. Crovo said, but to craft questions that produce more information about how students think and solve problems.
Gary Phillips, a vice president for the American Institutes for Research, a Washington-based research and evaluation organization, said NAEP's move to online testing is "a big change and a necessary change," given the broader shift to online testing and the benefits it brings.
But distributing tablets is not the "most efficient" way to administer the tests, said Mr. Phillips, who favors using schools' computing devices, the strategy being used by the two main state testing consortia. (AIR has contracts with Smarter Balanced to help develop and administer its tests.)
That said, Mr. Phillips predicted that many students would be receptive to using tablets and would respond more positively to them on test day than they would to laptops or desktops.
While NAEP's turn to technology-based assessment is receiving much less attention than the two consortia's plans for online testing, it's crucial for the national assessment to make that switch and deliver the tests effectively, said Mr. Phillips, a former acting commissioner of NCES.
Soon, some states will be judging their performance on tests designed by Smarter Balanced, while others will be using PARCC-designed tests, and still others that rejected both consortia will be producing their own assessments. Policymakers and the public will need an objective way of evaluating student performance that rises above those different approaches, Mr. Phillips argued.
"The role of the NAEP is even more important than it has been in the past," he insisted. "NAEP is the only source of comparable data. When things are in flux, it's important that you have an independent barometer of student performance."