It's one thing for all but a few states to agree on one shared set of academic standards. It's quite another for them to agree on when students are "college ready" and to set that test score at a dauntingly high level. Yet that's what two state assessment groups are doing.
The two common-assessment consortia are taking early steps to align the "college readiness" achievement levels on their tests with the rigorous proficiency standard of the National Assessment of Educational Progress, a move that is expected to set many states up for a steep drop in scores.
After all, fewer than four in 10 children reached the "proficient" level on the 2013 NAEP in reading and math.
Facing that music on yet another round of tests will be politically painful for the many states planning to use the common-core exams being developed by the Smarter Balanced Assessment Consortium and the Partnership for Assessment of Readiness for College and Careers, or PARCC, in 2015.
But it appears unlikely that students themselves will feel the sting right away. That's because states are taking a wait-and-see approach to attaching individual-level student stakes, such as high school graduation or grade-to-grade promotion, to those high cutoff scores.
Cautious Approach
Massachusetts, for instance, recently approved a two-year transition plan that will keep its statewide test, the Massachusetts Comprehensive Assessment System, or MCAS, as a high school graduation requirement through 2016 while it decides whether to use the exams developed by the 19-member PARCC organization for that purpose instead.
Education Commissioner Mitchell D. Chester said it doesn't make sense to expect teenagers to suddenly meet a college-ready bar in order to graduate, when four in 10 students who pass the MCAS have to take at least one remedial class in state colleges or universities.
"Our system isn't ready to deliver a college-ready education to all our students off the bat," he said. "I don't want to get there by having students punished by not meeting that bar."
Even in Massachusetts, a state that consistently outperforms other states, and many countries, on tests of scholastic prowess, high school diplomas often don't signify adequate preparation for college. That reality is driving a shift in education leaders' thinking and rhetoric about common-core tests. In meeting rooms across the country, the 2015 tests are slowly being recast as a humbling new starting gate, rather than an ax ready to fall on schools.
The tests provided by the 25-state Smarter Balanced consortium will have four achievement levels, connoting "thorough," "adequate," "partial," or "minimal" understanding of math and English/language arts content. PARCC's will have five levels, describing students' mastery level as "distinguished," "strong," "moderate," "partial," or "minimal." The college-ready point, at which public colleges and universities have agreed to consider students exempt from remedial work, is at level 4 on each test.
The U.S. Department of Education, which is funding the tests' development as part of its Race to the Top program, issued rules in 2010 that require states to report their students' performance on those tests for federal accountability. But the rules do not require states to use the threshold score for any achievement level, or any specified scale score, for policy decisions such as teacher evaluations, grade-to-grade promotion, high school graduation, or schools' ratings under state accountability plans. Those decisions are up to each state.
A Gradual Ascent?
As states make transition plans to common-core-aligned tests, they are showing an inclination to shield students from the consequences of lackluster school performance on new tests with a higher bar.
If Massachusetts decides to replace the MCAS with PARCC, Mr. Chester said, he is considering using, at least for a while, a lower cutoff score for high school graduation than the one PARCC states will collectively choose as the college-ready point.
Louisiana's plan would require schools to produce average scores at the college-ready level on PARCC tests in order to earn an A in its accountability system, but the state will give schools a decade to reach that point. The plan exempts high schools from using PARCC in 2015, opting instead to stick with its current end-of-course tests as a graduation requirement. It eases the state's traditional promotion gates at the 4th and 8th grades, too: If 4th graders fall short of mastery on the new tests, they could proceed to 5th grade by a district waiver if they show their proficiency in other ways. Instead of being held back for poor performance on the test, 8th graders could move on to 9th grade and receive remediation there.
Arizona and Rhode Island were two of the states that said in a 2012 survey that they planned to use a consortium test as a graduation requirement. Arizona has at least temporarily stepped back from that plan, choosing instead to shop for possible alternatives to the PARCC exam. Rhode Island plans to use the PARCC test as a graduation requirement in 2015, but officials haven't yet decided what score, or level, students must reach.
In New Jersey, the class of 2015 will be the last required to take its High School Proficiency Assessment to earn a diploma. Students in the next three graduating classes will be expected to take the PARCC test, but won't have to reach a specific score to graduate, said Assistant Commissioner Bari A. Erlichson. That would be too big a leap to expect, when the current test reflects 8th-to-9th-grade skills, she said.
"Moving all the way to capturing [an] 11th grade skill set seemed too drastic," Ms. Erlichson said. "It's not a reasonable action when you're talking about student-level decisions."
Requiring states to report their consortium test scores publicly, without demanding that they use specific cutoff scores for policy decisions, reflects the federal Education Department's priorities for the PARCC and Smarter Balanced tests.
Joanne S. Weiss, who oversaw the Race to the Top Assessment competition that provided $360 million for the two consortia in 2010, said the key is for states in each consortium to report their students' achievement on a shared scale. Unlike current practice, which allows each state to define "proficiency" as it likes, each consortium's states (42 states in all) will agree on a common level of mastery and show the public where their students line up relative to it.
"Everyone using the same cut score for what it means to be proficient is a huge deal," said Ms. Weiss, who is now an education consultant.
How the two testing groups will do that is still a work in progress. The Smarter Balanced coalition plans to embed NAEP items in its spring 2014 field test, so it can begin to envision how students' performance on those items compares with their performance on consortium-designed items, said SBAC Executive Director Joe Willhoft. That information will be brought in as "impact data" as panels of experts discuss where to set the cutoff scores for the four achievement levels of the test, he said. Smarter Balanced will do the same thing with items from the Program for International Student Assessment, or PISA, a high-profile global assessment, Mr. Willhoft said.
PARCC does not plan to include NAEP items in its 2014 field test but is considering other ways to use NAEP data to inform the setting of its own cutoff scores, said Jeffrey Nellhaus, the consortium's assessment director. Since PARCC modeled its descriptions of achievement at each test level after NAEP's, and will use the national assessment's data to help it set threshold scores, "it should follow that PARCC's cut scores would be in the same ballpark as NAEP's in terms of rigor," he said.
Ripple Effects
The two testing groups' work to set a higher level for mastery on their tests is praised in some quarters as a welcome boost in expectations. But in others, it's decried as a setup for student and school failure, and a welcome mat for companies looking to profit from remediation, school turnaround, professional development, and other types of responses to lagging performance on tests.
"It's all very convenient for the testing companies, and the textbook companies, that districts are desperate to make sure their schools aren't seen as failing," said Leonie Haimson, a New York City education activist.
In setting the pace for higher academic expectations, ripple effects at many levels must be considered, activists and experts say.
Sandy Kress, who helped craft the No Child Left Behind Act when he was an education adviser to President George W. Bush, has watched with dismay as policymakers ramp up expectations quickly only to see them backfire. In Texas, where he practices law, Mr. Kress points to the recent domino effect of a precipitous drop in scores on a new, more rigorous test. State lawmakers responded by easing up on testing and curriculum requirements.
What's needed, he said, is a reasonable middle ground, in which states boost expectations enough to "feel the pinch," but do it gradually enough, with enough support, that the effort doesn't go down in flames.
When using the mastery levels of PARCC and Smarter Balanced tests in policymaking, states would be wise to "set goals that are stretch-but-not-break," said Kati Haycock, the president of the Washington-based Education Trust, which advocates policies that help disadvantaged students.
"When you ask yourself how many kids are hitting that [NAEP] level of proficiency now, and ask yourself how fast we can move systems and kids," she said, "there is no way this is anything less than a 10-year transition."