Corrected: This article has been updated to reflect the current name for the WIDA consortium.
The latest data on how English-learners around the country fared in their language development during the pandemic are out. But there’s a major catch: About 30 percent fewer students took a widely used standardized assessment in 2021 than did two years prior, making the data difficult to analyze over time.
Students also took the test at different points in the school year, with some taking it up to 15 months later than their peers in other places, according to a new report from the WIDA consortium, which offers the online assessment. Thirty-six states, several United States territories, and agencies such as the Bureau of Indian Education use WIDA assessments, with nearly half of all English-learners nationally taking those tests.
While it’s tempting to look at the aggregate results, which show overall declines, particularly in speaking skills, and indicate that English-learners are worse off than they were before the pandemic, experts caution that the test results are skewed and have limited uses. The WIDA data problems are indicative of a pervasive challenge educators and researchers are facing right now in trying to pin down student achievement across subjects: The COVID-19 pandemic disrupted assessments, leaving major asterisks on most data collection.
English-proficiency tests, offered by WIDA and others, are required by state and federal law for ELL students in kindergarten through 12th grade. They measure students’ English proficiency in four domains—listening, speaking, reading, and writing—and are used to determine if a student will remain in an English-learner program.
Because the WIDA test is generally given in the spring, schools were largely able to administer the 2020 assessment prior to shutdowns that year.
WIDA, one of the nation’s largest English-learner testing programs, required that its 2021 assessment be taken in person—even in places where students were still learning remotely. That move, stemming from concerns about the validity of a take-home test and students’ inability to access the internet, caused some uproar. Advocates claimed it was unfair and possibly dangerous to ask these students—most of whom are Latino, Asian, and Black—to go into school buildings with COVID-19 raging in their communities.
While the 2021 WIDA data show where test-takers were in terms of their language skills at the time of the test, experts say using the data to make assumptions about English-learners’ proficiency and growth as a population is problematic.
“I would not use [2021] test data to say, ‘Oh, this is where we’re going as a nation,’ other than to see if it aligns with something that we have some evidence of locally,” said Kristina Robertson, the English-language-development program administrator at Roseville Area Schools in Minnesota.
Instead, the main value in the latest WIDA scores, experts say, is in considering them a starting point in assessing English-learners’ language proficiency and growth during the pandemic.
There are major caveats in reviewing results
The 30 percent decline in English-learners taking the language assessment in 2021 raises a number of questions, both about the students who were tested and about those whose scores are missing.
For instance, the results may look rosier or bleaker than reality, depending on whether the students who happened to be tested had higher or lower levels of language proficiency than those who weren’t, said Gary Cook, senior director of assessment at WIDA. That prevents an apples-to-apples comparison with both past and future aggregate results.
There are even concerns among state education agencies that the most vulnerable English-learners weren’t tested, said Amaya Garcia, deputy director of preK-12 education at the think tank New America.
In the Roseville district there was a 17 percent drop in the number of English-learners taking the 2021 WIDA test from the year before, and that’s despite outreach efforts such as providing transportation to school for the test during their remote learning year, said Robertson.
While she can’t definitively say why these students didn’t participate, she identified possible disconnects.
“It’s not a test that they need to graduate. And parents are sort of, like, why am I sending my kid to take this test when they’re not even coming to school to learn?” she said.
English-learner advocates in some states last year were pushing for parents to be able to opt out of the in-person test, or for tests to be delayed due to COVID safety concerns. The federal government encouraged this testing to proceed in 2021, even if it meant extending the testing window.
School closures forced some schools to administer the test well into the current academic year as opposed to the traditional spring testing window, Cook said, possibly giving some students more time than others to develop their language skills before being assessed.
Robertson said bringing students into school buildings may also have affected their performance. Many students who had been stuck at home for remote learning were seeing their peers for the first time in a while on test day, which was likely exciting and distracting.
Even with the caveats in mind, the 2021 test results do have one potential use, said Derek Briggs, a University of Colorado Boulder professor and president of the National Council on Measurement in Education, whose members design and study K-12 assessments: They’re a tool for helping figure out where individual students are with English proficiency, and what their needs are moving forward.
But because of the major changes in who got tested overall, you can’t make inferences about English-learners as a population, even at a district level, he added.
What do we know from the results?
Amid all the caveats, aggregate student data from 2021 revealed an overall downward trend in language proficiency and growth compared to 2020 and 2019.
The pandemic’s impact as seen in the 2021 scores varied by grade and each of the four language domains. There were relatively larger declines in speaking, according to WIDA’s report, and in 1st and 6th grades.
Garcia, with New America, found it concerning that language growth declined more in the younger grades, since the hope is that children will become proficient early enough to exit an English-learner program by 4th grade, avoiding the risk of becoming long-term English-learners, who may stagnate academically.
And while Garcia was surprised that the decline in reading wasn’t as big as in the other domains, which raises the question of how much reading virtual learning actually involved, the decline in speaking wasn’t a shock.
“Virtual learning didn’t necessarily lend itself to a lot of speaking and practicing talking,” Garcia said.
While the return to in-person school itself can address the need for more opportunities to develop speaking skills, the decline is of note considering how key speaking is to learning a language overall, she added.
Educators should look at multiple factors
As Robertson’s team in Minnesota reviews local English-learner assessment results, they’re keeping in mind that students can do more in a classroom setting than they can in a standardized testing situation.
Standardized tests can’t, for instance, track advances students make in their home languages, Robertson said. And the test results alone shouldn’t be used to make assumptions on student progress.
“I just think there has to be a broader conversation about triangulation of data, how we look at the multiple factors that go into this,” she said.
Robertson is also mindful of the continued outreach staff within her district are doing to determine how the pandemic impacted students beyond academics. She’s aware, for instance, that there were several COVID-related deaths in their English-learner community.