Test Industry Split Over ‘Formative’ Assessment

By Scott J. Cech — September 16, 2008 7 min read
Testing expert Richard J. Stiggins says he has stopped using the term “formative assessment.”

There’s a war of sorts going on within the normally staid assessment industry, and it’s a war over the definition of a type of assessment that many educators understand in only the sketchiest fashion.

Formative assessments, also known as “classroom assessments,” are in some ways easier to define by what they are not. They’re not like the long, year-end, state-administered, standardized, No Child Left Behind Act-required exams that testing professionals call “summative.” Nor are they like the shorter, middle-of-the-year assessments referred to as “benchmark” or “interim” assessments.

Or they shouldn’t be, at least according to experts inside and outside the testing industry, who believe that truly “formative” assessments must blend seamlessly into classroom instruction itself.

“It makes me want to scream and run out of the room,” said Ray Wilson, the executive director of assessment and accountability for the 33,000-student Poway Unified School District in Poway, Calif., referring to off-the-shelf commercial products labeled “formative assessment” that major test-makers try to sell him. “I still contend that so long as a teacher doesn’t have primary control [over assessment content],” he added, “you will never have truly formative assessment.”

“I had not heard, frankly, that there was some resistance to companies selling formative assessment,” said Robert Block, the associate vice president for K-12 learning and development at the Princeton, N.J.-based Educational Testing Service, which sells test questions in bulk to schools as “formative assessment item banks.”

But he pointed out that school districts’ requests for proposals often ask specifically for formative assessments.

“It has become the standard,” he said of the testing industry’s practice of labeling some assessment products as “formative.”

“I’m not sure if it’s good or bad—it’s just what the market is looking for.”

‘AssessMints’

The schism was in full view at the Council of Chief State School Officers’ National Conference on Student Assessment in Orlando last June. In the main exhibit hall, lined with competing assessment-makers, Peter D. Hofman, the vice president of marketing and business development at the Dover, N.H.-based testing company Measured Progress, handed out boxes of mints labeled “AssessMints” to strolling psychometricians and state-assessment directors.

“These are the only legitimate formative-assessment products,” declared Mr. Hofman, whose company does not sell formative-assessment materials.

He left unsaid his implicit message: The room’s rival exam firms—like almost every other testing company in the country—sell something that’s not what it purports to be.

It’s not just semantic consistency that’s at stake: Formative assessments are a more than half-billion-dollar business in the United States, according to the latest analysis by Outsell Inc., a Burlingame, Calif.-based research and advisory firm for the information, education, and publishing industries.

That total, including tests, test delivery, scoring, scoring analysis, professional development, and other services, but not such bundled products as textbooks, accounted for about 30 percent of the $2.1 billion in overall assessment revenue generated in the United States in the 2006-07 academic year—the most recent year for which statistics are available.

What’s more, said Laurence Bloom, an affiliate analyst for Outsell, the latest estimates project that formative-assessment revenue will climb at a rate of 10 percent to 20 percent per year through 2010, making it the fastest-growing segment in the assessment market.

That’s a lot of money being spent on something that experts say can’t really be sold—only practiced.

More Than Word Choice

Measured Progress is one of the few large testing companies taking a pass on the lucrative market. That’s mostly because Stuart R. Kahl, the company’s president and chief executive officer, takes his cues, as many other testing experts do, from a 1998 research-literature review by Paul Black and Dylan Wiliam, then at King’s College, University of London. The review, which predated the formative-assessment market, concluded that the research to date showed achievement gains using formative-assessment strategies that were “among the largest ever reported for educational interventions.”

“We use the general term ‘assessment’ to refer to all those activities undertaken by teachers—and by their students in assessing themselves—that provide information to be used as feedback to modify teaching and learning activities,” they wrote. “Such assessment becomes formative assessment when the evidence is actually used to adapt the teaching to meet student needs.”

Product or Approach?

“What they’re talking about is not what people are selling; it’s what teachers are doing day in and day out,” said Mr. Kahl, whose company produces state standardized tests used by Kentucky, Massachusetts, Montana, New Hampshire, Rhode Island, and Vermont.

“People say, ‘You’re just being picky about a word choice.’ I think it’s more serious than that,” Mr. Kahl said. “The consumers out there ... are being told [in advertisements and pamphlets] that these [products] satisfy their formative needs. They believe that and purchase it and don’t think they have to do anything more for formative assessment.”

With so much money at stake for products sold under the label “formative,” said W. James Popham, a professor emeritus of education at the University of California, Los Angeles, and the author of the 2008 book Transformative Assessment, “They’d call it ‘Clytemnestra’ if they thought it would sell,” referring to the mythological Greek figure best known for helping kill her husband.

When test companies sell things and call them formative, he said, “these vendors are being disingenuous—we used to call it lying.”

Calls seeking comment from the Maple Grove, Minn.-based Data Recognition Corp. and from Pearson Education, a division of the global media giant with headquarters in London and New York City, were not returned.

In response to criticism of commercial “formative assessment” products, Hillary Michaels, a senior research scientist for Monterey, Calif.-based CTB/McGraw-Hill, said at a National Conference on Student Assessment session that “when it comes to formative assessment, we need to rethink some of the ways we’re accustomed to thinking of testing.”

Avoiding the Term

Other experts acknowledge that the lack of widespread understanding about the nature of formative assessment makes the term vulnerable to appropriation, but don’t necessarily agree that testing companies are deliberately obfuscating the distinction.

“I would say that the interests of any business in maintaining their revenues is such that when the world around them changes, they redescribe what they have and repurpose what they have,” said Neal M. Kingston, an associate professor in the Program in Research, Evaluation, Measurement, and Statistics at the University of Kansas in Lawrence, who has held high-ranking positions at Measured Progress, CTB/McGraw-Hill, and the ETS. He now helps create Kansas’ annual standardized state exam, the Kansas Assessment Program.

“It’s not that there’s complete agreement on what makes something ‘formative’—it’s all how you’re using it,” Mr. Kingston said.

Richard J. Stiggins, the executive director of the Portland, Ore.-based Assessment Training Institute, mostly stays away from the term. “There’s not universal agreement on what it means—I’ve actually stopped using the word,” said Mr. Stiggins. He said that he now usually refers to formative assessment as “assessment for learning” to cut down on ambiguity.

Mr. Stiggins said he doesn’t believe testing companies are shading the truth, but he added, “Merely selling people a test on the assumption that it will be used formatively won’t cut it.”

In-House Differences

No one whom Education Week asked would identify a specific product or company that he or she thought was misusing the term. Many experts were leery of pointing to a company, in part because there are often differences of opinion within a company itself.

“It’s really hard to do that because even within companies, there’s a wide range,” said Mr. Kingston. His own university, for example, uses the word formative when referring to an assessment its staff created.

Another example is Mr. Stiggins. Although he has made the assertion “formative assessment isn’t something you buy—it’s something you practice” something of a mantra in the many talks he gives to educators around the country, his company is owned by the ETS, which sells item banks under the label “formative.”

Within any testing company, said Mr. Kingston, “you have your curriculum experts, you have your psychometric experts, you have your marketing and business people, and they all approach the needs that are out there in different ways.”

Mr. Kahl, the CEO of Measured Progress, said he sympathizes—to a point. “I know of individuals within those companies that feel [ambivalent about some products],” he said. “On the other hand, you look at the [product] catalogs and you see things in there labeled ‘formative assessment,’ and you’ve got to wonder who’s driving the truck in those big companies.”

Coverage of new schooling arrangements and classroom improvement efforts is supported by a grant from the Annenberg Foundation.
A version of this article appeared in the September 17, 2008 edition of Education Week as Test Industry Split Over ‘Formative’ Assessment
