

New Uses Explored for ‘Value Added’ Data

By Debra Viadero — May 28, 2008 6 min read

With “value added” methods of measuring student-learning gains continuing to grow in popularity, policymakers and researchers met here last week to explore possible new ways of using the sometimes controversial approaches and to debate their pluses and pitfalls.

The May 23 conference at the Urban Institute, a think tank based here in the nation’s capital, examined the policy implications of value-added models, which typically measure students’ learning gains from one year to the next. Such methods have been spreading since the early 1990s.

While value-added designs are still imperfect technically, various speakers at the gathering said, they can provide new information to help identify ineffective teaching and the impact of certain programs and practices, for example. The data they provide can help educators reflect on their own practices, give administrators grounds for denying tenure to poorly performing teachers, or be used by states to calculate whether districts are making adequate yearly progress under the federal No Child Left Behind Act.

And value-added models can answer such important research questions as what makes a good teacher and whether problems in retaining early-career teachers actually harm or help schools, speakers said.

Yet when it comes to high-stakes decisions, supporters, critics, and scholars of value-added research models seemed to agree on one point: Value-added calculations, if they’re used at all, should be one among several measures used in judging the quality of schools or teachers.

“Assessment results are one critically important measure,” said Ross Wiener, the vice president for programs and policy at the Education Trust, a Washington-based research and advocacy group that focuses on educational inequities. “There are other things that teachers do that are important.”

Last week’s Urban Institute event piggybacked on an April conference at the University of Wisconsin-Madison, where researchers aired technical cautions about value-added research methodology and shared some other research supporting its usefulness. (“Scrutiny Heightens for ‘Value Added’ Research Methods,” May 7, 2008.)

An organizer of the Wisconsin meeting said at the Washington event that the limitations of value-added designs should be kept in perspective. Both the Washington conference and the Wisconsin gathering that preceded it were sponsored jointly by the Carnegie Corporation of New York, the Joyce Foundation, and the Spencer Foundation. (All three philanthropies underwrite coverage in Education Week.)

“I ask you not to lose sight of what I think is the main message,” said Adam Gamoran, the director of the Madison-based Wisconsin Center for Education Research, “which is that value-added models are better than the alternatives.”

Measuring Change

When it comes to accountability efforts, the alternatives for most education systems are techniques that rely on snapshots of student achievement at a single time, such as percentages of students who meet state academic targets.

The theoretical appeal of value-added accountability systems, which measure learning gains from one year to the next, is that educators would get credit only for the progress students made in their classrooms and not get penalized for the learning deficiencies that students brought with them to school.
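To make that distinction concrete, the sketch below is a purely illustrative, simplified calculation, not any of the models discussed at the conference. It contrasts a status snapshot (the percent of one classroom’s students above a fixed cut score) with a toy value-added estimate that regresses this year’s scores on last year’s across a hypothetical district and then averages one classroom’s residuals. All numbers, classroom sizes, and effect sizes are invented for illustration.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical district: 10 classrooms of 25 students each; classroom 0 gets an
# extra boost to stand in for unusually effective teaching. All figures invented.
n_classrooms, class_size = 10, 25
classroom = np.repeat(np.arange(n_classrooms), class_size)
prior = rng.normal(500, 50, n_classrooms * class_size)        # last year's scores
boost = np.where(classroom == 0, 15.0, 0.0)                   # extra growth in room 0
current = prior + 20 + boost + rng.normal(0, 25, prior.size)  # this year's scores

# Status snapshot: percent of classroom 0's students above a fixed cut score.
cut = 520
snapshot = np.mean(current[classroom == 0] >= cut) * 100

# Toy value-added estimate: fit a district-wide regression of current scores on
# prior scores, then average classroom 0's residuals (actual minus predicted).
slope, intercept = np.polyfit(prior, current, 1)
residuals = current - (intercept + slope * prior)
value_added = residuals[classroom == 0].mean()

print(f"Classroom 0, percent proficient (snapshot): {snapshot:.1f}%")
print(f"Classroom 0, mean residual gain (toy value-added): {value_added:.1f} points")

Real value-added systems, including those debated at the conference, layer on multiple years of data, adjustments for measurement error, and controls for how students are assigned to classrooms, which is precisely where much of the technical and political controversy arises.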

In practice, though, various value-added models are proving controversial. A case in point is the New York City school system’s efforts to use such techniques to rate schools and evaluate teachers’ job performance, noted Leo E. Casey, the vice president for academic high schools for the 200,000-member United Federation of Teachers, the local teachers’ union.

At the conference, Mr. Casey faulted the school system’s teacher-evaluation project for relying on scores from tests taken by students in January, for failing to take into account the fact that students are not randomly assigned to classes, and for employing statistical calculations that he said are unintelligible to nonstatisticians.

“It’s really important that teachers, students, and parents believe the system on which they are being graded is a fair system,” he told conference-goers.

Opposition from his group, which is an affiliate of the American Federation of Teachers, and other teachers’ unions led New York state lawmakers in April to legislate a two-year moratorium on any efforts by districts to link student-performance data to teacher-tenure decisions. In the meantime, a state task force will be formed to study the issue.

Teacher Characteristics

Studies that try to identify which characteristics of teachers are linked to students’ learning gains are another, less controversial use of value-added methodology. Do veteran teachers do a better job, for example, than novices?

Studies examining such questions have shown that, while experience has proved to be important in some ways, possession of other credentials, such as a master’s degree, seems to have no impact on student performance, according to Douglas N. Harris, an assistant professor of educational policy studies at the Wisconsin research center.

Given the cost of a master’s degree—about $80,000, by his calculations—value-added methods might be a less expensive way to reward good teachers and signal which ones a school system ought to hire, Mr. Harris suggested.

“But we still need a path to improvement, and existing credentials might serve that function,” he said.

Value-added research models can also provide more information than experimental studies about the long-term effectiveness of particular programs or interventions in schools, said Anthony S. Bryk, a Stanford University scholar who is the incoming president of the Carnegie Foundation for the Advancement of Teaching, based in Stanford, Calif.

Mr. Bryk is currently using the statistical technique to track the progress of a professional-development program known as the Literacy Collaborative in 750 schools. He said that, while randomized studies are considered the gold standard for research on effectiveness, they can’t provide information about the different contexts in which a particular program works, the range of effect sizes that are possible, or whether the improvements change over time.

“You can only get so far by weighing and measuring,” he said. “What I’m arguing for is the use of value-added models toward building a science of improvement.”

From Data to Decisions

Whether schools will know how to make use of data collected through value-added statistical techniques is an open question, however.

Daniel F. McCaffrey, a senior statistician in the Pittsburgh office of the Santa Monica, Calif.-based RAND Corp., studied 32 Pennsylvania school districts taking part in the first wave of a state pilot program aimed at providing districts with value-added student-achievement data in mathematics.

He and his research colleagues surveyed principals, other administrators, teachers, and parents in the districts involved in the program and compared their responses with those from other districts having similar demographic characteristics.

“We found it was really having no effect relative to the comparison districts,” Mr. McCaffrey said.

Even though educators, for instance, seemed to like the data they were getting and viewed the information as useful, few were doing anything with the results, he said. Twenty percent of the principals didn’t know they were participating in the study, Mr. McCaffrey said, noting also that the program was still young at that point in the evaluation process.

Despite such challenges, other speakers at the conference argued that the use of value-added methodology should become more widespread. Said Robert Gordon, a senior fellow at the Center for American Progress, a Washington think tank: “The way we will learn about implementation problems, I think, is to implement.”
