A national inventory of educational technology is evolving as school districts try to determine what digital tools they have, and what they’ll need, to deploy online testing for all students on common academic standards just a few years from now.
A new tool released by the two coalitions helping to develop those online assessments is intended to aid states and districts in taking a snapshot of their current rosters of laptops, netbooks, and other mobile devices, as well as their overall technological bandwidth. It then will highlight where districts are lacking in their capability to assess students under the Common Core State Standards by 2014-15, when such testing is set to be introduced.
The free, Web-based Technology Readiness Tool is kicking up myriad concerns among educators, who worry that there’s little new money to bring their technology capabilities up to the level needed, that such testing could overwhelm district infrastructure, and that assessments could end up evaluating students’ technology skills more than their mastery of common-core material.
But the readiness tool is the first step toward addressing those concerns, and some ed-tech leaders hope it can provide the leverage needed to encourage state lawmakers to add funding to bring lagging districts up to speed, says Douglas Levin, the executive director of the State Educational Technology Directors Association. The Glen Burnie, Md.-based group is helping state education agencies deploy the tool.
The publishing and educational technology company Pearson, based in London, which developed the readiness tool, is seeking to develop assessments for the common core.
“We haven’t done an inventory like this [on a nationwide scale] ever,” says Levin. “People are viewing this as an assessment issue, but it’s also a large-scale technology project. At the end of the day, the test can’t work if the technology doesn’t work.”
What Districts Need
Forty-five states and the District of Columbia have adopted the common standards in both English/language arts and math, and a 46th state, Minnesota, has adopted just the English/language arts standards. The standards were unveiled in 2010 under an initiative led by the nation’s governors and state schools chiefs and are now moving into the implementation phase. However, the challenge of implementing the common core has some districts and education experts concerned about finances and logistics.
The intention is to use “next generation” assessments to determine how well students have grasped instruction based on those standards.
Most states are choosing to back assessments being developed by one of two nonprofit coalitions, the Smarter Balanced Assessment Consortium or the Partnership for Assessment of Readiness for College and Careers (PARCC), although some states have joined both. Assessments from both consortia will be administered using technology, and both will make use of new testing options such as simulations, video, and audio.
The main difference between the consortia is that the assessments created by Smarter Balanced will be adaptive, meaning the level of difficulty changes based on how well students are answering questions. As a result, students being tested on the same material could end up taking exams that differ significantly from those their classmates take.
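The mechanics of adaptive selection can be sketched in a few lines. This is a minimal, hypothetical illustration of the general idea the article describes; the function name and the fixed step size are assumptions for clarity, not the actual algorithm used by Smarter Balanced.

```python
# Hypothetical sketch of adaptive item selection: difficulty rises after a
# correct answer and falls after a miss. The step size of 0.5 is an
# illustrative assumption, not part of any real assessment engine.

def next_difficulty(current: float, last_correct: bool, step: float = 0.5) -> float:
    """Return the difficulty level for the next test item."""
    return current + step if last_correct else current - step

# Simulated run: a student answers correctly twice, then misses once.
difficulty = 0.0
for correct in (True, True, False):
    difficulty = next_difficulty(difficulty, correct)
# difficulty ends at 0.0 + 0.5 + 0.5 - 0.5 = 0.5
```

Because each student’s answer history steers the sequence, two students who start from the same material can quickly diverge onto different sets of items, which is the scenario described above.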
Some states are already doing many or all of their student assessments online. They include Delaware, Indiana, and Oregon, which have adopted the new standards, as well as Virginia, which hasn’t. But most states haven’t moved in that direction in a big way, and the idea of doing common-core assessments online by 2014-15 is daunting.
“There’s a big concern that school districts won’t have the capacity to do that,” says Daniel A. Domenech, the executive director of the American Association of School Administrators, based in Alexandria, Va. “The tool is a great idea to give us a factual definition of where school districts are and sound the alert that resources are going to be needed.”
The readiness tool, which was released to states in March and is just starting to reach school districts, allows schools and district technology leaders to log in and register how many and what types of computers and other devices they have. There will be at least two rounds of data collection, to be analyzed by the assessment coalitions. The first window of data collection ran from March 20 through June 14, says Chad Colby, a spokesman for PARCC.
The coalitions are seeking information on school district operating systems, the types of technological devices they have, the ratio of students to those devices, available bandwidth, wireless access, network speed, and other categories.
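The categories above amount to a simple inventory record per district. The sketch below is purely illustrative: the field names are assumptions based on the categories the article lists, not the actual schema of the Technology Readiness Tool.

```python
# Illustrative data model for a district technology inventory, assuming
# fields based on the categories described (operating systems, devices,
# student-to-device ratio, bandwidth, wireless coverage).

from dataclasses import dataclass

@dataclass
class DistrictInventory:
    district_name: str
    operating_systems: dict[str, int]  # e.g. {"Windows XP": 120, "Windows 7": 200}
    device_counts: dict[str, int]      # laptops, netbooks, tablets, ...
    student_count: int
    bandwidth_mbps: float
    wireless_coverage_pct: float

    def students_per_device(self) -> float:
        """Ratio of students to available devices; a key readiness metric."""
        total = sum(self.device_counts.values())
        return self.student_count / total if total else float("inf")

inv = DistrictInventory(
    district_name="Example Unified School District",  # hypothetical district
    operating_systems={"Windows XP": 120, "Windows 7": 200},
    device_counts={"laptop": 150, "netbook": 50},
    student_count=1200,
    bandwidth_mbps=100.0,
    wireless_coverage_pct=80.0,
)
# inv.students_per_device() returns 6.0 (1,200 students, 200 devices)
```

Aggregating records like this across schools is what lets the consortia flag, for example, how many districts still run an operating system slated to lose vendor support.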
Raymond Reitz, the chief technology officer for the 12,000-student Chapel Hill-Carrboro school system in North Carolina, says he’s just starting to work with the readiness tool and is concerned about his district’s ability to be ready for online assessments by the 2014-15 deadline.
He says his district will, at the very least, have to increase its number of mobile devices and its wireless network capacity. That will require additional funding, he says, but no extra money is in sight.
Reitz hopes the data gathered by the readiness tool will “somehow paint a picture and communicate to the legislature and get them concerned about what it’s going to take to make this possible.”
Other states have made it work.
Michael Stetter, the director of accountability resources for the Delaware Department of Education, says that over the past two years, Delaware has implemented computer-based assessments to track student growth in reading and math several times a year.
To get school districts ready, the state had them do their own technology inventories, and lawmakers allocated money for 10,000 additional netbooks, Stetter says.
“States are worried right now because they’re doing paper-and-pencil [tests] and can’t imagine having all the computers to get this done,” he says.
But the Delaware inventory showed that districts, in some cases, had more resources than they initially thought. For example, Stetter says, different departments in a school might each buy computers for one dedicated purpose but not share them, even when those machines sat idle much of the time.
Domenech, of the AASA, says he, too, hopes the information gathered by the tool can be used as leverage. “There’s no question it will definitely flag the need for greater investment in technology,” he says. “But because we’re still seeing states cutting back on educational dollars, we’re wondering where that money will come from.”
Testing Content Knowledge
The information gathered by the technology-readiness tool will have an added benefit, says Colby, of PARCC. It will help the two coalitions creating the assessments ensure the tests, at least in part, work with the technology districts already have, rather than what they might acquire.
“We want to know what devices are already being used, and the assessments should follow using that infrastructure,” says Colby. “We don’t want to create a scenario where the assessment is driving the purchasing.”
But the information collected by the readiness tool will likely drive some purchasing decisions, says Wes Bruce, the chief assessment officer for the Indiana Department of Education and the chairman of PARCC鈥檚 technology operational working group.
For example, Microsoft has said it will end support for its Windows XP operating system by the 2014-15 school year, when the online assessments are set to launch. But if enough schools and districts report they’re still using XP, the consortia must make sure the assessments will work with it.
In addition, says Levin, of the State Educational Technology Directors Association, the hope is that once schools and districts focus on their technology needs, they’ll get up to speed enough to give students a chance to try out the technology before the real assessments take place.
“We’re critically aware that the test itself should not be the first time the student is exposed to this technology,” he says. “We want to assess their content knowledge, not their technology skills.”