
Special Report
Teaching

Performance Assessment: 4 Best Practices

By Stephen Sawchuk | February 05, 2019 | Corrected: February 06, 2019 | 8 min read
Griffin Walsh plays Kindville at Newnan Crossing Elementary School in Newnan, Ga. Some schools in the state, including Newnan Crossing, are piloting Kindville, a new formative-assessment program that looks, and plays, just like a video game but will eventually spit out qualitative math and reading scores.

Corrected: An earlier version of this story contained an incorrect job description for Paul Leather. He oversees state and local partnerships for the Center for Innovation and Education.

Let's get this out of the way first: Performance assessment, the idea of measuring what students can do, not merely what they know, is not a new idea in K-12 education.

Teachers have been told to engage students in projects at least since the days of John Dewey, and probably long before that. (The famous Socratic method, after all, requires students to advance and sustain their positions in an argument, not repeat back knowledge.)

Nevertheless, performance assessment has a somewhat checkered history in the United States. In the 1990s, the last major period of experimentation, it was tried at scale and then abandoned in Kentucky, Maryland, and Vermont.

The challenges begin with a definitional problem: Does an essay test count as a performance assessment? What about a short response on an otherwise multiple-choice test? Experts disagree, and such quibbles have fueled confusion over how to measure "authentic" student performance.

Education Week's special report sets out to inject some clarity into the debate. In this report, you'll find a glossary, examples of districts slowly expanding their use of capstone projects for graduation, states encouraging the use of tests that feel more like games, and colleges exploring whether demonstrations of competency can supplement traditional application credentials like seat time and transcripts.

The landscape of performance assessment remains hard to parse. Although most testing experts agree that it's trending again, it's unclear how widespread performance assessment is. Just ask Jenny Poon, a fellow at the Center for Innovation in Education, which advises four states on the development and use of the tests. She's worked to create a continuously updated, comprehensive map of state action.

The problem is that, even in states that have policies supporting performance testing on paper, districts vary greatly in how rigorously they implement those ideas.

Still, Poon said a few trends are helping to raise interest in measuring student performance in richer ways than multiple-choice questions allow. About 20 states now use the Next Generation Science Standards, she points out, which specifically require students to engage in scientific practices, such as generating hypotheses and recording data from experiments.

A second thrust is states' interest in better gauging what high school graduates know and can do, as evidenced by the spread of state and local adoptions of diploma seals and capstone projects, presumably a firmer indication of student ability than credit hours.

Experts also know more about performance assessment after years of experimentation. So as you read the report, keep in mind the four big lessons they've offered up, distilled here for you.

1. Decide on goals first.

First and foremost, the experts say: Know why you want the assessment and what benefits you expect to achieve by investing in it.

"There's no point in teaching someone to write an article for a newspaper and giving them a multiple-choice test to see if they're able to do that," said Scott Marion, the executive director of the Center for Assessment, which advises states on testing. "Performance assessment is made for those situations. But if you're filling in grammar rules, then maybe multiple choice is fine."

A related issue concerns how the results will be used. Performance assessments are generally more difficult to standardize and less likely to produce comparable results for individual students. That's probably OK if the test is being used mainly to supplement curriculum or for classroom grading. But it's a bigger problem if you want to use it for making decisions about whether a student should graduate from high school or for school ratings.

One well-known mishap occurred in Vermont in the early 1990s, when the state's portfolio-assessment program rolled out. The program used teachers to score collections of students' best math and writing work. Early results showed that the degree of agreement among teachers' scores, known as rater reliability, was fairly low. In retrospect, RAND Corp. researcher Brian Stecher, who helped evaluate the program back then, wonders whether leaders there got the focus wrong.
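
For readers unfamiliar with the term, rater reliability is usually summarized with a statistic such as raw percent agreement or Cohen's kappa, which discounts matches that would happen by chance. The short sketch below is not drawn from the Vermont program; it uses made-up scores from two hypothetical teachers rating the same ten portfolios on a 1-to-4 rubric, purely to show how such a statistic is computed.

    # Hypothetical data: two teachers independently score the same ten portfolios on a 1-4 rubric.
    from collections import Counter

    rater_a = [3, 2, 4, 1, 3, 2, 2, 4, 3, 1]  # made-up scores from teacher A
    rater_b = [3, 3, 4, 2, 2, 2, 1, 4, 3, 1]  # made-up scores from teacher B
    n = len(rater_a)

    # Raw agreement: the share of portfolios on which the two scores match.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Agreement expected by chance, based on each rater's overall score distribution.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum((freq_a[s] / n) * (freq_b[s] / n) for s in set(rater_a) | set(rater_b))

    # Cohen's kappa: the share of possible above-chance agreement actually achieved.
    kappa = (observed - expected) / (1 - expected)
    print(f"raw agreement: {observed:.2f}, Cohen's kappa: {kappa:.2f}")

On these made-up numbers the teachers match on six of ten portfolios, but kappa comes out lower, around 0.46, because some of those matches would be expected by chance.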

"I think what was really beneficial in Vermont was the fact that this broadened to some extent how teachers were teaching mathematics, instead of a reductive 'I do, we do, you do,'" Stecher said, referring to a common teaching method taught during teacher preparation. "That seems like a good thing to me and valuable in its own right, and might have been a better use of this unstructured portfolio than trying to have it be the basis for a standardized judgment."

2. Keep costs in mind.

Coming up with good performance tasks can be expensive as well as time-consuming. In short, it's hard to do performance assessment on the fly or on the cheap. That's especially the case if what's valued is the comparability and reliability of scores, which requires creating and field-testing many tasks.

"When you open up assessments to getting students a wide range of response possibilities in terms of format, length, and activities, then it just becomes very hard to manage the time, and materials, and scheduling. It becomes hard to incorporate it into a structured system of assessments, and it also becomes more expensive," Stecher said.

That's one reason so few states have adopted performance assessment at scale under federal annual-testing requirements. New Hampshire, the sole exception for now, is using some traditional exams in the years it doesn't administer its locally developed performance measures.

Finally, even if a performance exam is only used locally or for classroom purposes, teachers must invest time and energy to familiarize themselves with its scoring frameworks to make sure they're grading fairly. Many districts with expertise in performance assessment, in fact, use blind scoring or double reviews of student work, and all that takes time.

And while teachers are generally more knowledgeable about scoring frameworks, or rubrics as they're called in the field, than they were 20 years ago, there's still often an expertise gap for teachers who are used to fill-in-the-blank and true-false questions, said Steve Ferrara, who oversaw Maryland's now-defunct performance-assessment program in the 1990s. (He's now a senior adviser at Measured Progress, a testing company.)

3. Prioritize teaching and learning, not just testing.

Performance assessment in education should be part and parcel of reforms to teaching and learning.

Much of the criticism of multiple-choice tests is that they encourage teachers to focus on low-level, easily measured skills. The inverse should be true, too: Give students rich assessment tasks worth teaching to, and support educators in redesigning their instruction to boost development of skills like analysis and inference.

In fact, studies from the 1990s on the Maryland School Performance Assessment Program found that under it, teachers had higher expectations for their students' learning, and principals had higher expectations for what teachers would do. Schools with a high degree of curriculum alignment to the tests showed the most improvement, Ferrara said.

In other words, performance assessment truly requires system change.

"If you don't include at least parallel reforms in teaching and learning, an assessment isn't enough," Marion warns. "You have to improve the meaningfulness of the content, instructional quality, and improve student engagement, too. If you're not doing those three things, then you're just rearranging the deck chairs."

There are also technical reasons why the mirroring of testing and instruction is desirable: Performance assessment hinges on students having had enough exposure to the content and skills needed to complete the task. Otherwise, the assessment might measure generic problem-solving intelligence, rather than how well students grasp and apply what they've learned, noted Sean P. "Jack" Buckley, the head of the U.S. Department of Education's statistical wing from 2010 to 2013, a period during which he oversaw the development of the agency's first performance tasks for exams administered as part of "the nation's report card."

"This was always something we worried about," he said. "It is way easier to make a hard test that smart people can do well on than one that shows growth tied to teaching and learning."

4. Plan for scaling up the exams and communicating the results.

Parents and teachers can be a performance assessment's biggest boosters or its toughest foes, which means it's key to keep them apprised of the assessment program and the logic behind it as it's piloted, rolled out, and scored.

Teachers in particular, the experts say, should be intimately involved in test design and communications.

"It takes time to build the capacity to build quality assessments; it's almost an apprenticeship approach," said Paul Leather, who helped get New Hampshire's performance-assessment system off the ground and now oversees state and local partnerships for the Center for Innovation and Education, a research and consulting group.

"As we built our common tasks, we selected content teacher-leaders who led development of the content and the common tasks," he said. "Over time, they start to lead the entire system because assessment literacy has reached such a high level, and we believe that actually has to happen for this kind of system to scale. You essentially create a way in which expertise is not just shared as a product, but something that helps others to gain that expertise over time."

Even when teachers are involved in task design, they can feel left behind without the right training and supports, Ferrara cautioned. "It took so much effort in the first few years [of MSPAP] to get the program up and running that all the investment went into the assessment program and not into" professional development, he said. In fact, he recalled, missing materials and a lack of training in the 1992 assessment administration raised teacher ire and got the test slammed in newspapers as the "MSPAP Mishap."

Finally, as performance assessments yield more-nuanced information on students' abilities, there's a related challenge of communicating those results. For six years, Maine required high schools to prepare students to demonstrate competency in eight subjects to earn a diploma. But the experiment faltered in part because districts struggled to communicate what the new grades, often issued on a 1-to-4 scale, meant, and how they'd affect students' chances of getting into college, according to news reports on the system. By 2018, the pressure caused state lawmakers to roll back the requirements, giving districts the option to return to traditional diplomas.

A version of this article appeared in the February 06, 2019 edition of Education Week under the headline "Four Lessons Learned When Teachers Went Beyond Bubble Tests."
