On a brisk fall morning, 11 adults pair up with students at John F. Deering Middle School and fan out into the building. The adults and their young guides will stick with each other for the rest of the day, absorbing the rhythms of the school. In one room, 6th grade sleuths conduct an acid test on cups of Diet Coke to determine who poisoned a fictitious security guard. In an industrial arts class, an 8th grade girl pounds a nail into a wooden robot she’s building. Upstairs, groups of 6th graders solve mathematical word problems. There’s a class where 8th grade “authors” present their work to a community of their peers, and a room full of 7th graders correcting sentences on the overhead projector. For four days, the team of 11 outsiders--six of them teachers--will camp out in this school. They’ll observe classes and talk to students, teachers, parents, and administrators. They’ll sift through documents and examine samples of students’ work.
Like detectives in a game of Clue, the team members will piece together evidence about teaching and learning, then head back into the hallways to refine or reject their conclusions. At the end of the week, the group will write a report that assesses the school’s strengths and weaknesses and recommends ways to improve. The four-day site visit is at the heart of SALT--or School Accountability for Learning and Teaching--Rhode Island’s state accountability system.
Like most states, Rhode Island is gathering large amounts of test data to help measure the performance of its individual schools. But here, test scores are only a starting point.
The state also has surveyed teachers, students, parents, and administrators about the learning environment in each school. It is requiring detailed financial information about how schools spend their money. And, through the SALT visits, the state is introducing a human component into the judging of teaching and learning.
Rhode Island’s multifaceted strategy shows that there are more expansive ways to think about accountability than just test scores.
States, if they choose, can have a variety of tools at their command, including measures that a school selects itself. By enacting charter school laws and school choice plans that enable families to vote with their feet, states can allow parents to exercise accountability directly. And states can permit exceptions to their basic accountability systems for schools that do not fit the mold.
But in deciding how broadly to cast the net, states face a conflict: On the one hand, they need common measures of performance to hold all schools accountable. On the other hand, they need to respect the unique nature of individual schools to promote autonomy and design solutions that address a particular school’s needs. Measures such as test scores and attendance rates also may not provide schools with enough information about what they can do differently on a day-to-day basis.
Here’s how three states--Rhode Island, Massachusetts, and New York--are coping with these tensions.
The most notable thing about a SALT visit is its unconcealed reliance on professional judgment.
True, the visiting team looks at test scores and other data to help understand a school. But the real focus is on spending time in classrooms, hallways, and lunchrooms to get a sense of how the school works on a daily basis.
“It’s a way in which a group of professionals can bring to bear professional judgment about a school. We are not shying away from that,” says Ken Fish, the director of school improvement and accountability for the Rhode Island Department of Education. “It’s an attempt to have a well-balanced accountability system that deals with both quantitative and qualitative information.”
The school visit plays both a monitoring and a supporting role. Its purpose is to generate information that will help the school improve and inform accountability decisions. It is not used to evaluate individual teachers.
Rhode Island’s site visits build on a long history of school inspections that began in Britain more than 150 years ago. Under the British system, full-time inspectors evaluate schools throughout England. Often described as “faithful witnesses,” they spend the majority of their time observing in classrooms.
In the United States, regional accreditation agencies, such as the New England Association of Schools and Colleges, also use site visits as an accountability tool, although traditionally these have not focused as heavily on teaching and learning.
New York state began experimenting with its own version of the British inspection system, known as “school quality reviews,” in 1992. In New York, teams of educators and other citizens descend upon a school for about a week. But schools in New York volunteer for the reviews, and they have never become a central feature of the state’s accountability system. By the end of this school year, teams in Illinois will have visited about 200 of the state’s more than 4,000 public schools as part of its “quality assurance” program.
School reform networks, such as the Bay Area School Reform Collaborative in San Francisco and the Maine Educational Partnership, also have employed such visits as a school improvement tool.
But only three states now include some type of onsite, qualitative review before identifying low-performing schools.
Such reviews have not become commonplace, experts say, partly because of their expense and partly because Americans don’t trust teachers’ judgments enough.
Here in Rhode Island, SALT calls for a visiting team composed primarily of teachers to visit a school once every four or five years. The size of the team depends on the size of the school.
For the 882-student Deering Middle School in West Warwick, the 11-member team includes six teachers, a school board member, a parent, and a vice principal from other districts, as well as a consultant and an official of the Rhode Island education department.
Before their arrival, team members review documents about the school, including a study the school has produced about itself.
On the first day, each team member joins a student to view the school from a child’s perspective. Team members also meet with the self-study committee. The second day, each team member follows a teacher or a cadre of teachers to see how the school functions from the faculty’s viewpoint.
Team members look at homework. They interview administrators. And they scrutinize test scores and other data. The goal is to find the gaps and identify ways to help the school close them, both by increasing the total proportion of students who meet state standards and by reducing variations in performance based on race, ethnic group, class, or gender.
The third day, the team focuses on the school as an organization. Team members interview students, parents, school and district administrators, and other employees. Throughout the visit, the team focuses on three broad questions: What evidence is there that students are learning to high standards? What do teaching and learning practices look like in this school, and how do they relate to what students are learning? How does the school function as a learning community?
Each day, the team members meet to compare notes. On the last day of the visit, they write their report. Each conclusion requires unanimous consent of the team and must be supported by at least two specific pieces of evidence. That might mean observations, statements drawn from school documents, conversations with people at the school, or test scores and other data. But all team members must agree that the evidence is accurate and the conclusions make sense.
The report includes no more than five key recommendations for how the school can improve and five commendations for what it already does well. It does not offer detailed prescriptions. “The intent of this visit is to move the school ahead,” explains Susan Rotblat-Walker, who helps oversee the SALT program for the state. “It’s not a ‘gotcha.’”
A school’s faculty receives the report both orally and in writing. Then, school officials, the local superintendent, and the education department sign an agreement spelling out what each will do to help the school make specific changes. The agreement includes a schedule for when such changes will occur. The principal also releases the report to the public.
“We want to unleash competence,” Peter McWalters, the state commissioner of education, says, “not monitor incompetence.”
Schools that helped field-test the Rhode Island program in 1997-98 describe the site visits as a bracing, but useful, experience.
“I’ve been here 11 years, and for me to look objectively at the school is almost impossible,” says Patrick Hannigan, the principal of Ponaganset Middle School in the Foster-Glocester district. “To have other eyes look at the school in a very objective way was a great experience. It ended up re-energizing the faculty.”
Vincent Giuliano, the principal of Joseph H. Gaudet Middle School in Middletown, acknowledges that his school’s report came as something of a surprise. “We were anticipating, ‘Real, real good,’” he says. “What we got was, ‘Yeah, you’re good, but you’ve got a lot of room to grow here.’ We swallowed hard and realized with all the initiatives that we’re involved in, we needed to prioritize.”
SALT teams have criticized schools for such shortcomings as failing to use state assessment results, not challenging their students, relying too heavily on worksheets, lacking specific outcomes for each grade and subject, using a too-narrow range of teaching practices, and lacking consistent standards for student behavior.
Debourah Raleigh, a member of Deering Middle School’s visiting team and a 7th grade teacher at Kickemuit Middle School in Warren, says the SALT reports carry a weight that test scores alone cannot.
“If you see the test scores, as a teacher, you can say, ‘That’s not me,’” she says. “This is more like a true picture, an accurate picture. They were right in our classrooms, and they spent a huge amount of time there.”
Before writing its report, the Deering team observed 167 classes and visited nearly every teacher.
The assumption in Rhode Island is that nobody is better equipped to judge schools and educators than fellow teachers. In the Ocean State’s lexicon, accountability does not just mean better test scores; it means ensuring better professional practice on a daily basis.
But in a country where few teachers even visit one another’s classrooms, asking educators to sit in judgment on other schools and to agree on common standards of practice is an alien experience.
“It was a little daunting going into it,” says Paul Bovenzi, a 6th grade math and science teacher at Deering. “I wasn’t sure what to expect--whether the people would come in and sit in the back of the classroom frowning at us. It’s been a lot more pleasant than I thought it was going to be.”
To help structure the site visits, states like Rhode Island have developed protocols that spell out everything from how to shadow a student to how to write a report. “It really is a method of knowing schools,” says Thomas A. Wilson, the primary consultant for SALT in Rhode Island. “The structural pieces are important. You make sure those are as solid as possible.”
Rhode Island also provides training to team members before they visit their first school. But no one knows how much preparation is enough. The team that visited Deering Middle School received just one day of training beforehand. Issues like training and the other costs associated with site visits become critical as states try to gear up such programs to cover all their schools.
Rhode Island plans to visit about 20 schools this school year, 40 in 1999-2000, and 60 a year in 2000-01 and beyond. “We’re really trying to do this on the cheap,” Fish of the state education department says, “because we know if it’s too expensive, we would never be able to scale it up.”
For now, team members volunteer, without pay. And the state doesn’t reimburse their home districts for the time they spend away from their classrooms. Each site visit costs about $4,000; in Illinois, the cost runs between $3,000 and $4,000 per school.
Scale is also an issue in Boston. School leaders there plan to expand a model for reviewing schools that is now used only in the city’s “pilot” schools--a form of district-approved charter school--to other public schools beginning this year. The plan is to visit one-quarter of the city’s 128 schools each year.
“I worry immediately, how do they staff it? How do they train these people?” says Larry Myatt, the director of Fenway Middle College High School, one of the pilot schools. “We always go back to these size and scale issues. So many good ideas are compromised by immediate implementation for everyone.”
Some fear that as the number of site visits increases, they will become less rigorous--and teachers will be too easy on each other. “If the methodology is not rigorous,” cautions Wilson, the SALT consultant, “accuracy is compromised in saying supportive things to teachers.”
Ultimately, the value of a site visit depends on what schools do with the information. In Rhode Island, it’s not clear what happens to schools that fail to act on a report’s recommendations.
Commissioner McWalters envisions a “progressive-intervention strategy,” in which the state assumes greater control if a school’s performance does not improve over time. By law, state officials have the authority to “reconstitute” a school, but they have not yet defined that term.
“Philosophically, I would hope we never find out,” Fish says. “From a design standpoint, if this system works well, we’ll never get to that point. We would have intervened early enough, provided enough support, that a takeover will never be an issue.”
In contrast, the consequences for charter schools in Massachusetts that don’t meet their performance agreements are clear: The state can yank a charter, forcing the school to close. Parents and students also can choose a different school.
In fact, from almost any standpoint, the publicly funded but largely independent charter schools are the most accountable public schools in the Bay State right now.
They must participate in the same statewide tests as other schools. They must use other “credible” assessment tools on a yearly basis. And they must set measurable performance objectives and demonstrate progress on them.
All this information goes into an annual progress report, which includes detailed financial information and is made available to the public.
Beginning in their second year, charter schools also receive an annual one-day site visit from a team led by a state official and made up of a small group of Massachusetts educators and citizens.
At the end of the five-year charter, the state weighs all of that information to decide whether to renew a school’s charter.
So far, the state has put three of its 30 charter schools on probation, one step away from losing their charters. Of those, one improved enough to come off the list, the second is still struggling, and the third closed voluntarily. The clarity of such ultimate consequences is central to school improvement, says Scott Hamilton, the state’s associate commissioner for charter schools.
“You’ve got to be willing to pull the plug,” he says. “I think a lot of charter school teachers in Massachusetts truly believe that if they can’t justify their existence by the time renewal comes around, they won’t have a job.”
As in Rhode Island’s SALT program, state test scores aren’t the only basis on which Massachusetts judges a charter school’s success. Because charter schools are supposed to be different, the state also allows them to set their own performance goals and measures.
“We wanted to create a system of accountability that somehow recognizes the individual goals, mission, and student population of each school,” Hamilton explains, “so that it doesn’t trample those things and backhandedly push these schools toward uniformity.”
“We don’t view standardized tests or state test results with the same sort of derogatory attitudes that some people do,” he adds, “but we do recognize the limits of those tests.”
The accountability plan for the Benjamin Franklin Classical Charter School in Franklin, Mass., states that 4th graders will achieve a median score on the reading and mathematics portions of the state test that is 10 percent above that of their peers in the surrounding district. But it doesn’t stop there. The school pledges, for example, that 4th graders will be able to prepare, present, and defend a science research project, and that a jury of teachers and outside scientists will give the projects a median score of “competent.” Students also will be able to identify the salient features of the art and architecture of medieval cultures, as shown by their recording a median score of 75 percent on an in-house test developed by the school’s art faculty.
Such publicly embraced benchmarks are far more detailed than those set by most public schools. “We set very specific targets for ourselves because the state has insisted on that,” says James M. Bower, the school’s headmaster.
“I wish somebody asked me to do this 20 years ago,” he adds. “It gives us actual targets to shoot for.” Hamilton says schools may choose to use portfolios, juried exhibits, and other less traditional methods of assessing students’ work as long as they’re credible. “If a school is using portfolios,” he adds, but hasn’t set up the proper structure to assess them, “then it’s simply an internal measure that doesn’t have a lot of credibility for us.”
Not everyone in Massachusetts’ charter schools is happy about the amount of time that now goes into writing and compiling information for the annual reports.
“There’s an upside and a downside,” says Deborah Springpeace, the principal of the K-8, 665-student Seven Hills Charter School in Worcester. “The upside is that it forces you to look long and hard at multiple aspects of children’s learning, and to do so in a collaborative way with your entire staff. The downside is that the effort that goes into just producing the annual report is superhuman.”
Seven Hills’ progress report, for example, exceeded 50 pages last school year and took about 80 hours to complete.
“This particular level of accountability just strikes me as too much, too soon,” Springpeace says. “It’s a problem.”
Hamilton, the associate commissioner, also says it would be hard for the state to exercise the same level of scrutiny for all its public schools. His staff of three now does most of the work, with volunteers recruited to participate in the site visits.
Massachusetts requires charter schools to meet state testing requirements as well as their own, internally developed measures of performance.
But at least some schools in New York state think their students should have alternatives to the state tests.
In June, 11th graders will take a six-hour, two-day language arts exam. The test is the first in a revised battery of state regents’ assessments that all students eventually will have to pass to graduate. Until now, fewer than half the Empire State’s students--primarily those bound for college--have taken the coursework required to prepare for the regents’ exams.
Many people believe that requiring all youngsters to pass the tests in English, math, global history, U.S. history, and science will significantly raise expectations for students. But some educators question whether the new regents’ exams are appropriate, given their own schools’ educational philosophies and curricula. Some of those schools have banded together to form the New York Performance Standards Consortium, a group of more than 30 schools in New York City that are working on an alternative to the regents’ tests.
“If schools want to use the regents’ [exams], that’s fine,” says Ann Cook, a co-director of the 111-student Urban Academy in New York City. “But if schools have shown that they can develop an alternative--by using performance assessments, for example--that should be perfectly acceptable, as long as they can produce evidence that they have met high standards.”
At the Urban Academy, located in Manhattan, students demonstrate their ability to analyze literature by meeting for about an hour with an external examiner--typically, a university professor--and discussing passages in the books they have read. To prove that they understand the concept of “volume,” they go out into the city with a protractor, a ruler, and graphing paper and calculate the volume of the landmark Flatiron Building, then write up their conclusions in a lab report.
“Basically, the problem for small, progressive schools is that we’ve adopted a very different instructional model,” says Eric Nadelstern, the principal of International High School at LaGuardia Community College. “You can’t cover the breadth of material expected in a regents’-type assessment and, at the same time, engage kids in sufficient depth to satisfy the kinds of requirements that we subject them to.”
New York state has appointed a 15-member panel to review any proposed alternatives to the regents’ exams and make recommendations to the state education commissioner.
But the alternatives must meet three criteria: They must be aligned with the state’s standards and be at least as rigorous as the regents’ tests. They must meet technical criteria for validity, reliability, and freedom from bias. And they must be “externally” developed and administered under “secure conditions.” That means the tests cannot be created exclusively by teachers in the school, and students cannot have prior knowledge of specific questions. The College Board has submitted some of its Advanced Placement exams and achievement tests for consideration, as has the International Baccalaureate program.
But the schools that belong to the New York Performance Standards Consortium are proposing something different. They have identified six tasks that all their students would complete in common: a research paper, a literary essay, a scientific experiment, a real-life application of high-level mathematics, a creative work of art, and a written self-evaluation.
New York City’s Center for Collaborative Education, a network of reform-minded high schools, also is working on a system of graduation by portfolio that it hopes to have approved by the state.
The New York City school system has supported the work of these groups by providing money for 31 of the schools to work on their performance-based assessments this past year. By working across schools, members of the collaboratives hope they can meet the state’s requirement for an externally developed assessment.
“We deeply believe that we have an obligation to show that these are valid and reliable assessments,” says Peter Steinberg, the coordinator of the performance-standards consortium and a staff member at New Visions for Public Schools, a nonprofit group that has sponsored many small public schools throughout the city. “What we’re hoping is that we have an opportunity to prove it.”
But time is running out. For the past three years, many of the schools that belong to New Visions or the CCE have had an exemption from the state that freed their students from state testing requirements while they work on a performance-based system.
The exemption, at least in English, expires in June when the state gives the first revised regents’ exam in language arts. Schools worry they’ll put their youngsters at a disadvantage if they don’t prepare juniors to pass the tests and the state then rejects an alternative proposal. So far, the state is offering no assurances.
“If you were to look at the assessments that [the New Visions schools] do day in and day out, they probably cover the standards more than anyone else,” says Jim Kadamus, the deputy commissioner for the New York State Department of Education. “But they have virtually no data on the reliability of the tests: the scoring, whether it was done based on homework assignments, whether it really represents the kids’ work.”
“Give us your technical data that prove that you’ve met these criteria,” he says. “If you can’t do that, then we’re probably not going to approve it.”