
The 'Nation's Report Card' Is Getting an Overhaul: 5 Things to Know

By Stephen Sawchuk | March 22, 2022 | 9 min read

A more flexible test, given on the devices schools and students are already using, that quickly produces actionable information for educators and policymakers: That's the vision going forward for the test known as the Nation's Report Card.

The National Assessment of Educational Progress, or NAEP, is the only national, comparative gauge of K-12 student achievement. The pandemic, as it did with so many other fields, utterly upended things, resulting in the disappointing cancellation of its 2021 administration. Now its leaders say they've taken what they've learned to heart and are devising plans for a more resilient, purposeful exam.

In a lengthy blog post last week, Peggy Carr, a longtime civil servant who was named the commissioner of the National Center for Education Statistics (NCES) in August 2021, and Lesley Muldoon, the executive director of the National Assessment Governing Board (NAGB), outlined these priorities. (The two agencies, both within the U.S. Department of Education, share responsibility for the exam: NAGB develops the test frameworks and policies, while NCES analyzes the numbers and reports out the results.)

In interviews with Education Week, the leaders explained more about what these priorities will mean for NAEP.

Their blueprint is expected to be bolstered by a forthcoming report, one of a series commissioned in 2018 to study the NCES as part of its 150th anniversary.

Here's a rundown of what's to come for NAEP.

1. Soon, NAEP will be given on different devices.

It wasn't that long ago, in 2017, that the venerable exam began to be given on devices rather than paper-and-pencil, fill-in-the-bubble forms. Testing agents went out to schools with special laptops to administer the exams, eliminating the need for all those pesky scoring sheets.

But that wasn't enough to keep NAEP going during the pandemic. Schools were operating in widely different modes (some in-person, some hybrid, some virtual only), and test contractors couldn't access all of them. It all threatened to skew the results so badly that the data wouldn't have been usable. The agencies had no choice but to push back the test.

Like many other fields, the large-scale assessment world was caught unprepared. "When it hit we were kind of caught off guard, flat-footed in a way. We could not reach these students," said Carr. "And I think there was this awakening of the large-scale assessment community and stakeholders that we were not prepared to do what they needed us to do when the chips were down. Our infrastructure was not ready."

That's the impetus for plans to design a way for NAEP to be taken on different kinds of devices, like Chromebooks or school-issued laptops: whatever's in use where students are taking the exam.

It will take until 2026 for this to be completely up and running, but when it is, NAEP will be in much better shape to weather another massive disruption to schools. Fewer contractors will be needed to make the testing happen. And over time, this could also potentially produce more accurate results. Here's why.

A growing number of students are enrolled in some kind of online learning program. There used to be no way to capture these students because NAEP was only administered at physical school buildings. After this switch, though, these students could possibly be included, and that would help maintain an accurate picture of achievement as more students enroll in virtual offerings.

Making NAEP "device agnostic" does pose some interesting technical challenges for the agencies. They'll have to ensure kids don't have an unfair advantage from using one kind of device instead of another. (In the early days of online testing, researchers found a "mode effect" that produced higher scores for students tested using paper-and-pencil forms vs. online tests; NAEP will need to be sure some devices don't produce their own mode effect.) This will require slow, steady work and pilot studies to perfect.
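
To make the mode-effect concern concrete, here is a minimal, purely illustrative sketch of how a pilot study might compare scores from students tested on two different devices. The data, the device groups, and the method (a simple Welch's t statistic) are hypothetical assumptions for illustration; they are not how NAEP's psychometricians actually evaluate device comparability.

```python
# Illustrative only: a toy comparability check between two hypothetical device
# groups, not how NAEP's psychometricians actually study mode or device effects.
from statistics import mean, stdev
import math

def welch_t(scores_a: list[float], scores_b: list[float]) -> float:
    """Welch's t statistic comparing the mean scores of two groups."""
    na, nb = len(scores_a), len(scores_b)
    var_a, var_b = stdev(scores_a) ** 2, stdev(scores_b) ** 2
    return (mean(scores_a) - mean(scores_b)) / math.sqrt(var_a / na + var_b / nb)

# Hypothetical pilot data: scale scores from students tested on two device types.
tablet_scores = [212.0, 220.0, 208.0, 215.0, 224.0, 210.0, 218.0]
laptop_scores = [221.0, 230.0, 225.0, 219.0, 228.0, 233.0, 224.0]

diff = mean(laptop_scores) - mean(tablet_scores)
print(f"Mean difference: {diff:.1f} points, Welch's t = {welch_t(laptop_scores, tablet_scores):.2f}")
```

A large, consistent gap between device groups in a pilot like this would be the kind of signal that prompts further study before mixing devices in an operational administration.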

2. NAEP will experiment with adaptive testing and other innovations.

When we think of a test, we think of every student getting the same set of questions. Computer-adaptive testing is different. This kind of test varies the questions students get as they answer: Miss the first few and a student is given easier questions; get them right and they'll get more difficult ones. The benefit, in theory, is getting better information about either very high- or low-performing students. (On a traditional exam, most questions are in the middle range, not at the very easy or hard levels.)
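
As a rough illustration of that idea, here is a small, hypothetical sketch of an adaptive question selector that steps difficulty up after a correct answer and down after a miss. It is only a toy: real adaptive tests, including anything NAEP might build, estimate student ability with item response theory models rather than simple difficulty steps.

```python
# Illustrative only: a toy adaptive-question selector, not NAEP's actual method.
# Real adaptive tests estimate student ability with item response theory (IRT);
# this sketch simply steps difficulty up or down after each answer.
from dataclasses import dataclass
import random

@dataclass
class Item:
    prompt: str
    difficulty: int  # 1 = easiest, 5 = hardest

def next_item(pool: list[Item], current_difficulty: int, last_correct: bool) -> Item:
    """Pick the next question: harder after a correct answer, easier after a miss."""
    target = min(5, current_difficulty + 1) if last_correct else max(1, current_difficulty - 1)
    candidates = [item for item in pool if item.difficulty == target] or pool
    return random.choice(candidates)

# Hypothetical usage: start in the middle of the difficulty range and adapt.
pool = [Item(f"Question {i}", d) for i, d in enumerate([1, 2, 2, 3, 3, 4, 4, 5], start=1)]
difficulty = 3
for answered_correctly in (True, True, False):
    item = next_item(pool, difficulty, answered_correctly)
    difficulty = item.difficulty
    print(f"{item.prompt} (difficulty {difficulty})")
```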

The approach is used by the Smarter Balanced series of K-12 state exams, as well as the GRE, a popular graduate school entrance exam.

Now, NAEP will investigate using computer-adaptive technology, too. This is a bit of a challenge because unlike state tests or the GRE, NAEP doesn't measure any one individual student's outcomes. The results we see are a composite score of lots of students who all took different segments of the exam.


Still, Carr said, it's possible to use the technology within the discrete block of questions each student takes. And if it's successful, it should help to generate more fine-grained information on what students who are scoring at NAEP's lowest achievement levels are having the most difficulty with, and similarly what sets apart top performers. (That's important because of a disturbing recent trend, both on NAEP and international tests, of these two groups' performance moving in opposite directions.)

NAEP also wants to experiment with artificial intelligence to help it write new exam questions and to help score open-ended questions, both technically tricky ideas that could offer significant cost savings.

And it wants to help teachers, policymakers, and others use the findings as they come out.

"How can we speed up the return of results and get them back in people's hands faster? How can we help researchers dig into the raw data of NAEP more quickly so they can answer questions that, as federal agencies, are a bridge too far for us?" said Muldoon, ticking off some of the driving questions she, Carr, and their teams will consider. "How do we translate the results into language real people can understand? How do we modernize the infrastructure [to help] with things like speeding up results? We want to explore those kinds of ideas and utilities so NAEP is as relevant as it can be."

3. An important measure of the pandemic's impact on learning is on the runway.

In December 2021, NAGB decided to administer the long-term trend exam to 13-year-olds in addition to 9-year-olds. That work is beginning now.

The long-term trend exams are the only continuous measure of student achievement, dating from the 1970s. (By contrast, the trends for the main NAEP, which produces state-by-state results, get reset each time NAGB updates the testing blueprint.)

This is a bit of balm after the disappointing delay of the main NAEP in late 2020. And it will offer a tight pre- and post-pandemic gauge of learning, because the long-term trend exam for those two age groups was also the final exam given before nearly every school shuttered in spring 2020. (EdWeek's Sarah Sparks took a look at the results from the last long-term trends test.)

Results for 9-year-olds are just finishing up now, and they'll be completed for 13-year-olds this fall, alongside the regular NAEP exams. When the results are released, they will be the only national measure of the pandemic's impact on learning.

The NAEP folks do face a small interpretive challenge in releasing these results. The long-term trend exams haven't changed significantly since they were created, and they tend to measure foundational knowledge and skills rather than higher-order ones. This means that expected pandemic-related declines on this measure might not show up as steeply as they do on other measures, especially if basic content is what teachers have prioritized the last few years.

4. The NAEP experts will spotlight equity.

Via better yardsticks for poverty and more context in their reporting, the agencies want to add clarity to discussions of achievement patterns on NAEP.

For example, NAEP reports often talk about gaps in student performance. That's important, but without context, such findings risk fueling a narrative that somehow students are to blame for these disparities, rather than their varied experiences and uneven access to well-funded schools and good teaching. (Some K-12 researchers and media organizations, including Education Week, now generally prefer to call them "opportunity" rather than achievement gaps.)

And analyses of how students perform often counter stereotypes. Carr pointed out, for example, significant progress in the proportion of Black high school students who took calculus, according to the NCES' most recent report on high school transcripts; such a picture is one of resiliency and improvement, she noted.

Equity is important, if somewhat politically touchy, territory for the organizations. The term "equity" has become a lightning rod in discussions about race and schooling, and even NAEP has been no exception. NAGB faced some internecine drama last year over equity when it was finalizing a new reading framework, though most of it ultimately centered on disagreements about how best to assess lower-performing students fairly.

Muldoon of NAGB said the organization is also commissioning studies about how new test frameworks, like its upcoming science revision, can continue to embrace equity and give all students a shot to show what they know, while maintaining technical quality.

NAEP will also continue to work to get a better indicator for students' socioeconomic status. The usual measure, eligibility for free and reduced-price lunch, is increasingly problematic because of policy shifts that permit more students to receive those services regardless of income level.

5. NAEP's architecture will continue to support new research.

During the pandemic, the Biden administration issued an executive order requiring the Education Department to track the pandemic's impact on schools, which led to surveys showing the proportion of schools using different modes of learning. To pull this off, NCES used the NAEP architecture to get the surveys out quickly. After all, NAEP testing relies on a nationally representative sample of schools, and that happens to be just what researchers need for surveys.

In fall 2021, the NCES extended that approach for its new "pulse" surveys, designed to provide additional quick-turnaround survey research. And the agency has set itself the goal of returning results more quickly, while also slimming down how long the surveys take to fill out and number-crunch. (NCES' other major collections, on principals, teachers, school finance, and scores of other indicators, typically take a few years to complete.)

"It was an example of how nimble and flexible NAEP can be," Carr said. "We need to take advantage of this infrastructure to help us quickly go in, ask a few questions of schools (thousands of schools) and gather the information that's needed."

A version of this article appeared in the April 06, 2022 edition of Education Week as The 'Nation's Report Card' Is Getting an Overhaul: 5 Things to Know
