
Assessment

Value Lessons

By Lynn Olson • May 05, 2004 • 19 min read
British educators have been making use of 'value added' data for two decades. Still, the task remains challenging.

"That is out of order. That's not good enough," pronounces Denise Davies, pointing to a group of triangles in the lower right-hand corner of a graph.

Each tiny triangle represents the performance of a single girl at St. Martin-in-the-Fields High School, an all-girls comprehensive school here in the southern part of the city. Davies, a deputy head teacher at the school, is worried about a handful of teenagers who performed relatively well on national-curriculum tests at age 14. But, as the graph starkly illustrates, they have since slipped behind the average progress of their peers nationally who started out with similar performances. "So we use that," Davies says, pointing to the graph again with exasperation. "That's no good."

As the former physical education teacher rattles off the myriad ways in which the school mines information about test-score gains to critique and improve its practice, it's hard not to be impressed.

See Also...

View the accompanying chart, "Tracking Growth."

The senior management uses such data to evaluate the whole school. The heads of subject departments meet annually with the school's board of governors to discuss their results and explain what they're doing to improve or maintain their standards. They also use the data within their departments to identify individual teachers or courses that are doing exceptionally well or might need shoring up. And across the school, such information is used to support, exhort, and push students to do better.

Carlene McKissack, a slim 16-year-old, rolls her eyes. "Until you've got it done," she says, "they will not leave you alone."

In the United States, interest is mounting in tracking the academic progress of individual students over time to determine how much value schools add to their learning. Here in England, a growing number of educators have been plumbing such data since the early 1980s. Just this past year, the national government began producing "value added" information for every publicly financed primary and secondary school, as well as raw test results.

"I think the crucial thing about value-added is there's no question that it's improved the capacity of the system to judge itself at every level: classroom, school, individual teacher," says Judy Sebba, a professor of education at the University of Sussex.

Still, as the English have learned, getting such growth-oriented systems up and running is no easy task. Nor is deciding the best way to do value-added analyses.

'Significant Improvement'

The history of value-added methods in Britain dates back at least to 1982. That's when a handful of secondary schools in northeast England agreed to share data, on a confidential basis, about how much progress they were making in getting students to pass the A-level examinations required to enter universities.

Carol Taylor Fitz-Gibbon, now a professor emerita at the University of Durham, conducted the study, and subsequently founded the Curriculum, Evaluation, and Management Centre. It has grown into one of the larger educational research units at a British university, dealing with information from more than a million students ages 3 to 18 every year.

The center provides confidential value-added and other analyses under contract to about one in three secondary schools and one in five primary schools in England, as well as to schools in other countries.

Other groups also provide English schools with a wealth of value-added information about their students. The use of such methods is likely to accelerate as a result of the government's investment in a massive pupil database. Unique pupil-identification numbers track students from age 3 to age 16 and enable the government to match test results to an annual student census that provides background information on every youngster in a publicly financed school. The government also has set up a new computer system to make such longitudinal data easily accessible to teachers and head teachers.

"I think, nationally, data development and the use of data has been a significant improvement in probably the last four or five years," says Chris Ashton, the assistant director for school improvement and development in the Lambeth Education Authority, where St. Martin's is located.

Data use nationwide still varies enormously, says Simon Bird, a policy officer with the Education Network, a membership organization of local education authorities, or LEAs. But the changes since the early 1990s have been nothing short of revolutionary, says Bird, who recently co-wrote a report that highlights innovative LEA data practices in Lambeth and elsewhere.

At least three methods of value-added analysis are now used in England. The simplest, and the one now employed by the national government, measures the progress of individual students based solely on their prior attainment.

Other measures control for prior attainment in addition to a range of school-level factors that might affect students' progress but are outside a school's control, such as the proportion of boys in a school or the percent of students eligible for free school meals.

The most complicated methods, often known as multilevel modeling, take into account both school-level characteristics and those of individual classrooms and pupils that may also impinge on academic growth, such as the teacher's years of experience or the number of days a pupil has missed school.
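The simplest of these methods can be sketched in a few lines of code: predict each pupil's score from prior attainment alone, and call the residual "value added." The sketch below uses synthetic data and invented scales, not the education department's actual model.

```python
import numpy as np

# Synthetic cohort of 500 pupils; numbers are illustrative only,
# not real national-test scales.
rng = np.random.default_rng(0)
prior = rng.normal(50.0, 10.0, size=500)                      # e.g., age-11 results
outcome = 8.0 + 0.9 * prior + rng.normal(0.0, 5.0, size=500)  # e.g., age-14 results

# Ordinary least squares: outcome ~ intercept + prior attainment.
X = np.column_stack([np.ones_like(prior), prior])
coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)

# A pupil's "value added" is the residual: actual minus expected score.
value_added = outcome - X @ coef

# A school's figure is the mean residual of its own pupils; here a
# hypothetical 30-pupil school is simply the first 30 rows.
school_score = value_added[:30].mean()
```

Because the regression includes an intercept, the residuals average out to zero across the whole cohort, so a school's mean residual directly measures progress above or below the national trend.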

Among local authorities that have a robust history of using value-added analyses, two of the best examples are Lambeth, in inner-city London, and the Hampshire Education Authority, in the far south of England, about an hour's journey from London by train.

Lambeth, in the neighborhood around Brixton, is one of the most ethnically, linguistically, and culturally diverse boroughs in the nation. About 73 percent of its 29,000 or so students are from black or other minority groups, including the largest proportion of black Caribbean children of any local education authority in England.

In contrast, Hampshire is one of England's largest nonmetropolitan or "shire" counties. With more than 1.2 million residents, it serves about 2.5 percent of all children in England and encompasses an area that ranges from the ancient city of Winchester to the extensive woodlands of the New Forest. The vast majority of its students are white, and a greater proportion come from affluent backgrounds than in most other parts of the country.

Making Figures Speak

It would be hard to find someone who feels more passionate about the use of data than Feyisa Demie, the director of the research and statistics unit for Lambeth. An economist and statistician by training, Demie joined the LEA seven years ago. Since then, he and his team have been providing schools such as St. Martin's with a steady flow of information designed, in his words, to "make figures speak for themselves."

"We really draw for each school a graphical presentation," he notes. "Automatically, they can see where they stand. They don't need to read it."

In addition to providing each school with a student-level value-added analysis, Lambeth produces a customized "school profile." The 42-page document compares a school's performance on a range of indicators with that of the LEA average, the national average, and a group of similar schools within the authority, based on student mobility, eligibility for free meals, and English fluency. A separate set of analyses provides secondary schools with data on how their students scored on national exams compared with other students in the LEA, broken down by gender, ethnicity, eligibility for free meals, and other characteristics. The reports also include five-year trend data. Finally, researchers provide each school with information about variations in performance across subjects, to identify both strong and weak departments.

St. Martin's devours such data. The school, founded 305 years ago as a parish school for 12 impoverished girls, now occupies a graceful building with a multistory, domed foyer. Today, its student population reflects the changes in the community. Nearly 90 percent of the school's approximately 700 girls, ages 11 to 18, come from minority ethnic backgrounds. More than one-fourth qualify for free meals.

Yet in 2003, 58 percent of its students earned five or more grades of C or higher on the General Certificate of Secondary Education exams, the national school-leaving exams, up from 35 percent in 1997. That focus on achievement shines through on a quick tour of the building. One display case posts the national test results in mathematics for St. Martin's compared with the LEA and all of England. Another lists the top-scoring girls in science on this year's GCSEs.

St. Martin's also does its own projections of student performance, and it encourages teachers to plot their students' results and set challenging, but realistic, targets. The data are shared with the school's board of governors, who help determine the school's budget and priorities. One weekend a year, the senior management team sifts through the data to pick out the highlights. Then every department spends a day writing a departmental action plan, for which the department analyzes the results for every cohort of students.

"So each teacher has to plot their own results," says deputy head Davies, "which can be pretty hairy, if they're the only person in the department who's not making progress.

"But it's private within that department," she adds. "The audit has to come back to me, but it doesn't name names. But the department has got to deal with it.

"We've worked quite hard within the school to make it as nonthreatening as possible," Davies continues, "while also making it very clear that if it's not as good as it should be, then that's not good enough for us."

If, for example, staff members find a teacher whose class has not made the expected rates of growth, it's up to the department head to help that teacher, perhaps through peer observation and support.

What may be most striking, however, is that the school uses the data on students' past and projected performance to identify bright underachievers, whom it targets for special mentoring, field trips, and encouragement. It has also formed a literacy group for incoming students who need extra help. Starting this year, student report cards include information for parents about their children's target performance for the upcoming set of national-curriculum tests, and whether they are on track, exceeding those expectations, or falling behind.

"As a parent, I want to know," explains Davies. "One of the biggest underachieving groups in this country is of black-Caribbean heritage, and we have got a high percentage. So if we are not driving up their results and saying, 'You can and you will,' then they're not going to be making the progress they can, and they won't have the opportunities they should. And that's not fair."

That message is pounded home to the girls through individual, 15-minute meetings twice a year to review their performance and devise individual action plans that spell out what they need to do to achieve their targets. Problem areas are highlighted in red so students can easily view trouble spots. By the end of the meeting, student and adviser have to agree on three action points, which are then recorded in their planners and taken home and signed by the girls' parents.

"I think it helps," says 16-year-old Mimi Michaels. "If they told you you're meant to be getting an A or an A*, it makes you want to get that mark because your teacher knows what you can get. If you get less than that, you feel like you've failed yourself, and you feel like other people know you've failed yourself."

McKissack says of an earlier meeting with the head teacher, "I was quite pleased with what she said. Until then, I didn't actually know if the head teacher dealt with that side of things or looked at my results. And when we pulled my sheet, it made me actually think that, yeah, I should work to that."

If teachers predict that a student will earn D's, St. Martin's students say, it's upsetting, but it also makes them want to work harder to prove themselves.

Lambeth's elementary schools make similar use of data to identify underachieving students and subjects. At Sudbourne Primary School, head teacher Susan Scarsbrook lays an enormous spreadsheet across her lap that tracks the progress of a cohort of the school's 600-some pupils from the time they enter the school until they leave at age 11.

"We do it for each year group," the blue-eyed, white-haired head teacher explains, "and we update it every year." The information is used, in part, to identify children who are not making expected progress. Based on those analyses, the school provides extra English or math instruction.

"When I was a young teacher, if children failed a test, that was the child's fault," says Scarsbrook, a congenial woman now in her late 50s. These days, she says, teachers in her school and others are poring over test results. "They're not saying these children are stupid. They're saying, 'How is our teaching not being effective?' And that's such a big jump professionally."

Even so, Scarsbrook frets: "It's worrying, statistics." The actual strength of Sudbourne, she maintains, lies not in an obsession with numbers but in its caring ethos and its emphasis on creativity. The data are only a tool.

'Like With Like'

Demie, Lambeth's research director, contends that for statistics to be useful, the measures must be straightforward and easily interpretable by head teachers, parents, and policymakers. The value-added method he uses, therefore, employs a simple regression model that predicts each student's expected performance based on his or her prior attainment, without taking account of any other factors.

In contrast, the Hampshire LEA, working with Harvey Goldstein, an education professor at the University of London, pioneered the use of multilevel modeling with the authority's primary schools. And it remains one of the few LEAs to use such complicated analyses.

The model controls for such individual characteristics as sex, previous test results at age 7, number of absences, eligibility for free school meals, length of time in the school, and special education status. It also accounts for such school-level factors as the percent of students receiving subsidized meals and the number of children enrolled in the tested grade.
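A rough sense of what such a model adds can be conveyed with a two-step sketch: regress pupil scores on pupil-level predictors, then shrink each school's mean residual toward zero according to how statistically reliable it is. A real multilevel model estimates everything jointly; the simplified empirical-Bayes version below, on synthetic data with prior attainment as the only predictor, is only a stand-in for the idea.

```python
import numpy as np

# Synthetic data: 40 schools of 25 pupils each, with a genuine
# school-level effect buried under pupil-level noise.
rng = np.random.default_rng(1)
n_schools, n_pupils = 40, 25
school = np.repeat(np.arange(n_schools), n_pupils)
effect = rng.normal(0.0, 2.0, n_schools)            # true school effects
prior = rng.normal(50.0, 10.0, n_schools * n_pupils)
score = 0.9 * prior + effect[school] + rng.normal(0.0, 5.0, prior.size)

# Step 1: pupil-level regression (a fuller model would also include
# sex, absences, free-meal eligibility, and so on).
X = np.column_stack([np.ones_like(prior), prior])
coef, *_ = np.linalg.lstsq(X, score, rcond=None)
resid = score - X @ coef

# Step 2: empirical-Bayes shrinkage of each school's mean residual,
# weighted by reliability: n / (n + within-variance / between-variance).
raw = np.array([resid[school == s].mean() for s in range(n_schools)])
within = np.mean([resid[school == s].var(ddof=1) for s in range(n_schools)])
between = max(raw.var(ddof=1) - within / n_pupils, 1e-9)
weight = n_pupils / (n_pupils + within / between)   # reliability in (0, 1)
shrunk = weight * raw     # noisy estimates pull toward the average
```

The shrinkage step is why multilevel estimates are more cautious than raw school means: a school's apparent effect is discounted in proportion to how much of it could plausibly be noise.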

"The national value-added methodology is fairly weak, really, in terms of its contextual information. It doesn't do any at all," contends Nigel Hill, the director of information and research for the Hampshire County Council. "If you don't take account of those factors, then you're not comparing like with like."

The school system provides each of its 437 primary schools with value-added results for every pupil. But even with 71 secondary schools, it does not have enough data on students in classrooms to do multilevel modeling for them. Instead, it supplies secondary schools with value-added analyses prepared nationally by the Fischer Family Trust, which takes account of such school-level factors as gender and the percent of students eligible for free school meals.

All Hampshire schools also receive a snapshot of how their students have performed on national tests in each of the last five years compared with a group of 14 demographically similar schools in the county. An easy-to-digest summary sheet breaks the results out by subject and gender, with a check if the school's performance is in the top quartile for its "self-reference" group, an X if it's in the bottom quartile, and a dot if it's in the middle two quartiles. The same information is presented in bar graphs for those who prefer a more visual presentation.
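The check/X/dot flag amounts to a simple quartile rule. A sketch, with hypothetical function and data names (the article does not publish Hampshire's actual computation):

```python
from statistics import quantiles

def quartile_flag(school_score, group_scores):
    """Mark a school against its 'self-reference' group of similar
    schools: top quartile, bottom quartile, or the middle two."""
    q1, _, q3 = quantiles(group_scores, n=4)
    if school_score >= q3:
        return "check"   # top quartile for the group
    if school_score <= q1:
        return "X"       # bottom quartile
    return "dot"         # middle two quartiles
```

For a group of 15 scores running 1 through 15, a score of 15 earns the check, 1 the X, and 8 the dot.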

Says Caroline Carter, the head teacher of Braishfield Primary School, a tiny, red-brick structure located on a quiet country road, "For a primary school, it's very useful because you can actually benchmark your children at regular points during their progress.

"Because our baseline is quite high," Carter adds, "our value-added is of particular concern to us. It means that I can identify children who are coming in at a high baseline, and I can look at any sorts of procedures I need to put into place, either support or challenge, to make sure they achieve what they should be achieving and beyond."

This year, for instance, the 99-student school decided to have its exceptionally strong 11-year-old math students work with bright pupils at age 9, so that the teacher was free to focus on the middle group of youngsters, and the classroom assistant could work with the lowest performers.

Carter, who still teaches math to the school's oldest children, says, "Because I was mathematically inclined, I love statistics anyway. So I will sit and pore over these. Here, my staff suffer from the fact that I think it's important, and it's interesting, and they have to get involved because I dedicate staff meetings to look at it.

"What I always say to staff is, 'Don't worry too much about the data,'" she adds, "because all it does is make sure we're asking questions." Particularly in a small school, she observes, the data tend to confirm what teachers already know about students.

What's essential, argues Rob Sykes, the head teacher of the 1,330-student Thornden Secondary School, also in Hampshire, is to use the data only in a positive way.

"You've got to use it to enable things to happen," he says. "It's too often used as a stick to hit people with, and that isn't sensible if what you're trying to do is boost achievement."

Contentious Measures

That's one of the concerns now that the national government is publishing value-added information for every school in England. Most educators view the change as positive and preferable to the use of raw test results alone to judge schools. But some people have significant reservations about how the government computes and presents the figures.

Back in the mid-1990s, the government commissioned Fitz-Gibbon to do a feasibility study for a national system of school value-added indicators. The Department for Education and Skills began to set up such a system in 1998. But such measures require the results of pupils' performance over time. And students take national tests only at key transition points, usually when they are 7, 11, 14, and 16. So it wasn't until last year that the government was able to publish such information for every primary and secondary school in England.

From the start, says Cathy Christieson, the project manager for performance tables at the education department, it was agreed that the government's value-added method "had to be simple and transparent."

"This is why it has been based just on pupils' attainment in national tests. And I firmly believe that was the right decision," she adds. "One of the reasons that the whole country now has a conversation about value-added is because they now understand the very basic concept. Thanks to that, we can now begin a conversation about going down more complex routes."

But some of those decisions have proved controversial. For example, because primary schools in England tend to be small, the government reports results for as few as 11 pupils. A technical note that accompanies the tables cautions against overinterpreting such findings and provides guidelines for when to consider differences between schools statistically significant. But the warnings rarely make it into the newspapers. And critics charge the government hasn't done enough to highlight measurement uncertainties, particularly when year-to-year changes in value-added results can be so volatile.

"It's buried in the fine print," says Goldstein of the University of London. "One of [the government's] responses is this is a very difficult idea to get across, and people won't understand it." But, he adds, "that doesn't mean you can ignore it." To do so, Goldstein says, implies significant differences between schools where often none exist.

To avoid giving a school a negative value-added score when its students fail to make as much progress as others with the same prior performance, the government also decided to add 100 to the final results, rather than centering the scores around zero.

"The rationale is to avoid negative numbers because people will imagine that pupils have gone backwards," says Ian Shagen, the head of statistics at the National Foundation for Educational Research, in Slough. But he argues: "Adding 100 to the final result makes it more obscure. A figure of, say, 94.6 means nothing: 94.6 units of what?"
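The convention Shagen objects to is easy to state in code: take the school's mean residual and add 100, so that exactly average progress reads as 100. A minimal sketch, assuming the published figure is just an offset mean (the department's actual scaling may differ):

```python
def reported_value_added(residuals):
    """Convert pupil residuals (actual minus expected test score) to
    the government's 100-centered scale: 100 means average progress."""
    return 100.0 + sum(residuals) / len(residuals)

# A school whose pupils fall 5.4 points short of expectation on average
# is published as 94.6 -- the kind of figure Shagen calls obscure.
published = round(reported_value_added([-5.4, -6.0, -4.8]), 1)
```

A school exactly on the national trend would report 100.0; the offset hides the sign but not the arithmetic.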

Critics also charge that the methods used by the education department disadvantage schools with more high-ability pupils; ignore the variation in performance across subjects, particularly at the secondary level; and fail to account for student mobility, which can be a significant problem.

Further consternation has been caused by the fact that, while proponents assumed value-added methods would level the playing field, selective grammar schools continue to come out ahead on some value-added tables, but not on others. Moreover, at the secondary level, the same school can find itself making good progress between ages 14 and 16, but not between 11 and 14, leading to confusion about whether the school is effective or not.

And as David Miliband, the minister of state for school standards, pointed out in a Jan. 8 speech, "There is a flourishing debate as to whether we should take account of more than prior attainment when we calculate the value-added by schools. Over the coming months, we shall be consulting widely as we move towards a model of value-added which commands the confidence of all."

Parent Choices

To its credit, the government has formed a working group that includes some of the sharpest critics of its value-added analysis to help refine its methods. But Trevor Knight, an education department statistician who was involved in developing those methods, says it's not easy reaching agreement about which changes are appropriate.

"We have always chosen a simple value-added model precisely because we have difficulty, nationally, agreeing on what a more complex set of measures should actually be," he says. "In part, the government has been concerned about sending any signals that would suggest it expects children from poor backgrounds or from certain ethnic groups to do less well than their peers.

"But we now have some basic information on the key characteristics of pupils in most state schools," Knight says, "and this will allow us to test nationally how we take on board other factors that influence performance. ... We shall need to think very carefully how we add to what we've done in a way which commands wide support."

The government also is committed to measuring the progress of all students with special education needs who work below the national-curriculum levels.

"It's dead-easy to design research studies using value-added," sighs David Hopkins, the head of the standards and effectiveness unit in the department. "It's another thing to design it for national consumption."

How much attention parents pay to the information in judging or selecting schools is open to question.

"It has been regarded as a serious policy option, but it hasn't impacted on the public debate in any meaningful sense, in my judgment," says Harry Torrance, a professor of education at Manchester Metropolitan University. "Parents, when it comes to the crunch, are still going to make judgments about schools based on raw test results. I'm pretty convinced of that."

Coverage of cultural understanding and international issues in education is supported in part by the Atlantic Philanthropies.
