Parents in Virginia will find a surprise in their mailboxes this month: detailed report cards telling them how their children's schools are doing. In addition to results from state tests, the report cards will include statistics on school crime, the number of students who are taking accelerated classes, and the proportion of students who attend school regularly. Ohio will distribute its second trial round of report cards this spring. South Carolina will send home report cards beginning in 2001.
As states zero in on the performance of individual schools, report cards are one of their most popular tools for communicating information to parents, other taxpayers, and educators.
Thirty-six states will publish annual report cards on individual schools this year or require schools or districts to do so. Another four will start publishing them next year, and one more will join the club in 2001.
Report cards are considered a central feature of state accountability systems. The assumption is that they will improve education by providing the public with better information--spurring low performers into action and inspiring parents to become more involved. They also serve a marketing function, helping parents choose schools and assuring taxpayers that their money is well spent.
But across the 50 states, school report cards vary tremendously. No two states report exactly the same information. Some are just several pages of statistics, with no explanatory text. Others run to a dozen pages, with sample test questions and detailed descriptions of what constitutes exemplary performance.
Despite the tens of thousands of dollars states spend on report cards, it's often hard to tell who the primary audience is or what purpose the reports were designed to serve. And there's been little research on their content, format, or usefulness.
To take a look at this prime accountability tool, Education Week launched a special project on report cards for Quality Counts '99 in conjunction with A-Plus Communications, an Arlington, Va.-based communications firm, and two opinion-research groups.
The project was supported by the Pew Charitable Trusts.
The project examined the reporting requirements across the 50 states and analyzed the content of existing report cards. It also featured a series of focus groups with parents, other taxpayers, and educators to find out what they thought about the current editions of school report cards and what they would like to see changed. In Baltimore and in Austin, Texas, separate small focus groups were held with parents and other taxpayers, and transcripts of those discussions were produced. In three larger focus groups--in Charlotte, N.C.; Colorado Springs, Colo.; and Worcester, Mass.--the participants used electronic dials to answer questions and react to what they were hearing.
While the responses do not reflect a nationally representative sample, they do provide insights into what the various constituents think about the information they're now getting on their schools.
In general, the analysis suggests that school report cards are not living up to their potential, either as a way to communicate with the public or as an accountability tool. Both parents and taxpayers believe they could help improve education if they had the right information, but neither group thinks it is getting that information now.
Moreover, educators and the taxpaying public tend to have different priorities for the kinds of information they want about schools and how it should be used. Taxpayers, and to a somewhat lesser degree parents of enrolled students, are more results-oriented and more likely to support real rewards and consequences for school performance. Educators want more information about the context in which schools operate, including funding. They generally resist attaching high-stakes decisions to school outcomes.
Such differences illustrate how states, as they develop and refine their report cards, must strike a balance. They must keep them short, simple, and focused on results if parents and other citizens are to see them as useful, and they must provide comprehensive information if educators are to view them as legitimate.
One of the most striking findings is that very few people have actually seen a report card on their neighborhood schools.
More than six in 10 parents--and seven in 10 taxpayers--in the electronic focus groups had never seen a report card on individual public schools in their areas. Nearly half the educators had never seen one.
Such findings roughly parallel those of a nationally representative survey conducted by Public Agenda in connection with Quality Counts. In that survey, 52 percent of teachers reported that they had seen school report cards in their communities, compared with 31 percent of parents and 39 percent of local employers.
Participants in the electronic focus groups generally did not believe, or were unsure, that they had the information needed to hold schools accountable. Yet most parents and taxpayers in the small focus groups said they had not sought out such data, relying instead on test scores in the local newspaper or on discussions with other parents, friends, and neighbors.
In fact, most participants in the small focus groups thought that obtaining information on their local schools would take a great deal of time and effort.
"If you want information on a specific school, you have to do some heavy digging," said Charles Neal, a taxpayer in the Baltimore area. "You can't just put the name of the school up on the Internet and they'll give you everything you want."
One problem may be that very few states require report cards to be given directly to all parents. Although 26 states will make them available on the World Wide Web by the end of the year, only 13 require that parents receive them at home.
Virginia, for example, gives a postage allowance to each of its 132 school districts to mail the report cards home. Delaware doesn't require that school report cards be sent home, but the state spent about $40,000 last year printing enough profiles so that schools could send one to every family and have extras.
Most participants in the electronic focus groups believed that widely publicized ratings on schools would motivate teachers to work harder and improve student performance. But taxpayers were notably less convinced: only about six in 10 believed the reports would have such an effect. The Public Agenda survey found that most parents and teachers thought ratings motivate teachers and principals to work harder, even though over half of teachers also said that such ratings tend to give people an unfair and inaccurate portrayal of their schools.
Even when parents and taxpayers are presented with the type of information that might be included on school report cards, many are unsure how to use it.
Short of deciding where to live--or, in some states, choosing a public school in which to enroll their children--many of the participants in the small focus groups said they felt a sense of powerlessness to change their schools.
"It's very difficult for one or two or 20 parents to change something," said Jodie Epstein, a mother in suburban Baltimore County, Md. "You can complain, but I don't know where it would get you."
She added that such information would help her choose what school her child attends. "If I was in a school district that did not score well in testing and I heard not-good things about it, I would do whatever I had to do--lie, cheat, steal--to get my child in a better school."
Participants in the electronic focus groups were more likely to feel they had the power to change schools. Given a choice, most taxpayers and educators said they would use the report cards to insist on systemwide improvements. But nearly four in 10 parents said they would use the data to switch schools rather than fight for change.
One state, Ohio, includes suggestions on its report cards for how parents and taxpayers can follow up on the information provided, as well as questions they might ask their neighborhood schools. Given the findings of the Education Week project, such suggestions might prove useful.
One problem with report cards is that much of the information on them may have little bearing on what schools can do to improve achievement.
More than half the report cards produced by states include data on such outcomes as state test scores, attendance, and dropout rates. Fourteen states also include information on the results from Advanced Placement tests or participation in Advanced Placement courses.
But the report cards are less likely to include data about the school climate, course-taking patterns, levels of parent involvement, and the proportion of teachers with a college major in the subjects they teach. Research has linked those factors--things schools can do something about--with improvements in test scores.
Russell L. French, a professor of education at the University of Tennessee, Knoxville, has done a series of studies examining the relationship between the data in school report cards and student achievement.
"One of the things that struck us most was that so many of the things that are reported have so little to do with student outcomes," he says. In many cases, what's reported is simply what's available or what's required by state law.
The participants in the electronic focus groups were asked to rate the importance of 21 items that they might personally want in order to judge schools or hold them accountable. While the participants thought test scores were important, they did not rate them as the most important measure.
Participants rated school safety and teacher qualifications among their top two or three concerns, both for evaluating schools and holding them accountable. Parents and educators also included class size at the top of their lists, an issue that was of less importance to taxpayers who do not have children in school.
Despite overwhelming interest in school safety, only 17 states now offer such information on their report cards. And only 16 include information about teacher qualifications.
Other items that participants rated highly for holding schools accountable were graduation rates, dropout rates, test scores on statewide tests, test scores on college-admission tests, attendance rates, and parental-satisfaction survey data.
In general, educators rated achievement indicators, such as statewide test scores, lower than parents and taxpayers did.
And educators gave higher rankings to such "input" indicators as per-pupil spending.
None of the groups viewed students' grades as a particularly reliable source of information about schools. Many people believe that the requirements for earning an "A" in one school or district can be very different from those in another.
All participants wanted both qualitative and quantitative information on schools. In general, parents wanted more information than other groups. In response to an open-ended question, parents in the small focus groups said they wanted information about the quality of life in the school, school leadership, different program offerings, parent and student satisfaction rates, and the levels of parent involvement, among other concerns. One or two pieces of data would not paint a complete picture for them.
Only about one-third of participants in the electronic focus groups believed test scores should be used as the main measure for holding schools accountable. While half of taxpayers supported that view, only 25 percent of educators and 36 percent of parents did so.
"I think it's pretty important," said Epstein, the Baltimore County parent. "I think it tells you the level and ability of the teachers. I think children only learn as much as they're taught, and if they're doing very well on these tests, they're obviously being taught." But parents and other taxpayers also cited a number of drawbacks to standardized testing, including distrust of the results, concerns that not all children test well, and fears that teachers spend too much time teaching to the tests.
"These numbers that you see published, the statistical numbers, don't really say what's going on in this classroom, what they are doing with these kids," said Robert McDonough, a Baltimore taxpayer. "Those things are hard to quantify, let alone hard for us outside the system to try to get a handle on."
In the early 1990s, Richard M. Jaeger, a professor of educational research and methodology at the University of North Carolina at Greensboro, coordinated a study of what consumers want from school report cards.
"One finding that sticks in my mind is that standardized-achievement-test scores were not the thing they most desired," Jaeger recalls. "They were rated by school superintendents as being most important, but not by parents. Parents were very concerned about issues like safety and access in schools. They were concerned about climate, about the availability of extracurricular programs, the availability of a broad curriculum."
But while test scores appear to take a back seat to safety and teacher qualifications, their importance should not be underestimated. When participants in the electronic focus groups viewed a prototype report card designed by Education Week and A-Plus Communications, the performance section received the highest ratings. And whenever the topic of performance came up, the participants responded favorably.
In short, parents and taxpayers view school safety and the presence of qualified teachers as two essential ingredients of education. Once those basic conditions are met, they want results.
Most participants also wanted the ability to compare their schools and students with other schools and other children.
But the public divided fairly evenly over the relative importance of measuring students against each other or against a fixed standard. Given a choice, they'd prefer to do both.
"Children will be competing against each other when they reach college," said Bob Moss, a Baltimore taxpayer, "so you have to get some sort of accountability between the different districts and different schools with regard to what they do."
The Education Week analysis of state report cards found that 20 allow a school's performance--at least on test scores--to be compared with a district average, 25 with a state average, and 17 with a national average. But many school report cards make it difficult to compare individual schools within a district by name.
Report cards in nine states also show how schools are doing compared with those with similar student populations--for example, a similar proportion of students who are poor or who have limited English skills.
Participants in the focus groups were divided about whether such comparisons were a good idea. While some thought this method might be fairer, others thought that children ultimately would have to compete against students from all over the nation. In contrast, a majority of educators thought making comparisons only between similar schools was a "good idea."
The response of parents and taxpayers suggests that accountability systems based only on similar-school comparisons may lack credibility with the public. At a minimum, if states use such comparisons, they need to do a better job of explaining them.
In general, participants reacted negatively to including the ethnic and economic breakdown of students on school report cards. Demographics appeared at the very bottom of the list when they were asked to rate the importance of 21 items. Some worried that reporting demographic data would be divisive or would prejudice people鈥檚 expectations about the students in a school. Others simply perceived the information as irrelevant when it comes to holding schools accountable.
Educators were the most likely to want such data. Some teachers, for example, argued that it is unfair to compare a school that enrolls a high percentage of poor students with one whose students come from more affluent families.
Federal law now requires that states tabulate test scores by students' race and gender as part of the Title I aid program for poor students. But Texas is the only state that makes such information a central feature of its report cards and holds schools accountable for ensuring a minimum level of performance by each subgroup.
In both the small focus groups and the larger electronic groups, participants reacted favorably to the idea of trend data that would show how a school鈥檚 performance has changed over time. Twenty-five states now include at least one year of past data on their report cards.
Many states also assign an overall rating to a school's performance, based largely on test scores. Schools are then labeled with such terms as "acceptable," "unacceptable," or "exemplary." But educators--in particular--are opposed to such labels. A majority of educators in the electronic focus groups thought using either labels or letter grades to describe how well a school is doing was a "bad idea." That view seems to reflect their general distrust of test scores and other statistical measures for judging schools. In contrast, while large percentages of parents and taxpayers didn't like labels, a majority did favor letter grades.
Only nine states now include a school鈥檚 overall rating or evaluation on its report card. New York state plans to add a separate page about its accountability system to school report cards beginning this year.
In the small focus groups, participants liked the Connecticut report card because it does not focus solely on test scores and outcome measures. Separate sections describe the needs students bring to school, the resources at the school's disposal, how the school spends its time, and the school's performance. A space on the back allows the school to describe its goals, any special programs or accomplishments, and plans for improvement. Participants seemed to feel that such a framework, and the short narrative explanations that go with it, gives a more complete picture of the school.
States such as Delaware have also tried to offer a richer mix of quantitative and qualitative information on their report cards.
The Delaware report cards, for example, are almost evenly split between comparable, statistical indicators that the state collects and qualitative information. "We didn't want to make a profile that was just a bunch of numbers," explains Chester Freed, a staff member with the Delaware Department of Education. "We wanted the school to be able to talk about itself and what makes it unique. That's why we came up with that mix."
In Providence, R.I., the GTECH Corp. and a citizens' advocacy group known as PROBE, or the Providence Blueprint for Education, have worked with about a dozen schools to produce detailed annual progress reports for citizens. The reports, which can run more than 30 pages, include an overview of the school, its guiding principles and achievements, any special areas of concern, descriptions of classroom activities and educational programs, and ways in which parents and the community are involved, in addition to student outcomes.
"Ours is, I think, a good attempt at communicating with parents and families not only about test scores, but also about the life of the school--what goes on in their child's school," says Dan Challener, the director of PROBE. "It allows a school to tell its story completely."
The reports cost about $7,000 to $8,000 per school. The district is committed to expanding the report cards to 20 schools this school year. "That's a lot of money," Challener admits. "On the other hand, if that's how you communicate with parents, it's probably money very well spent."
Costs and logistics clearly influence the information that appears on school report cards. In addition, states must decide how much information the public can digest.
When faced with a list of potential indicators, the public often wants it all. But studies also suggest that people want report cards that are short, simple, and, above all, understandable.
For example, participants in both the small and electronic focus groups reacted favorably to the New York state report card until they realized its length--about a dozen pages. Then their approval dropped off precipitously. New York publishes one of the longest report cards on individual schools.
A good solution, according to eight in 10 participants, would be to get a shorter report, but also have a longer version available for those who wanted more information.
"The trick is figuring out on school report cards how much data to share and what not to share," says Robert F. Sexton, the director of the Prichard Committee for Academic Excellence, a citizens' advocacy group in Kentucky.
Both parents and educators also may need more help interpreting the statistics on report cards than is often assumed. "Most people don't get the patterns in the numbers," says Steve Rees, the editor and publisher of School Wise Press, a San Francisco-based company that publishes its own report cards on schools across the state. "So we figured out that we would have to use words to describe the patterns. One of the things they do is help somebody who is not a numbers person read the text and get the meaning from the words."
In Rhode Island, two-page snapshots on schools include short descriptions below each table or chart that explain "what you're looking at" and "what you're looking for" to help people interpret the information.
The state doesn't send the information home to parents. Instead, it encourages each school to hold a "school report night" to explain the data and what the school is doing about it. State officials have also conducted workshops for reporters on how to interpret and present the information.
Connecticut produces a slim report card for parents in the form of a brochure. But it also has five years' worth of school profiles on a CD-ROM that is available in the state's public schools and libraries. The interactive disk allows people to compare the performance of individual schools or districts over time on key variables and to download customized reports. "It's a way to go into the database with a very easy point-and-click methodology and get you the information you want," says Douglas Rindone, the chief of the bureau of research, evaluation, and student assessment in the state education department.
When Connecticut began producing school profiles in 1992, they were about eight pages long. To reduce the reporting burden and increase the dissemination to parents and community members, the state scaled back the report cards to a shorter, brochure format in 1995. A survey of superintendents, board members, principals, teachers, and parents in 1997 found that respondents unequivocally favored the shorter format. But the report cards will increase to four pages again this school year to include new information required by state law.
At least one study has found, based on prototypes, that people are more likely to identify the quality of a school correctly if the report cards are at least four pages long and include some narrative information.
To probe people's thinking about report cards further, the Education Week project developed a prototype for a fictitious "Jefferson Elementary School" to present to the electronic focus groups. The prototype was based in part on the Connecticut report card, but added other elements that parents and taxpayers said they wanted. It has six sections: what we look like, how we spend our money, how our students perform, our school's environment, how we spend our time, and what we are doing to improve. Participants rated all sections of the brochure highly on a scale of zero to 10. And nearly seven in 10 thought its brief, brochure-like format was "about right." The section on performance--which includes statewide test results, trend data, and attendance and promotion rates--received the highest rating. Based on the prototype, it appears that the essential ingredients for a report card to be well-received are: how students are performing, what they are doing, whether they are safe, and how money is being spent.
Participants were also asked to rate the credibility of various sources for producing school report cards. Local newspapers received the lowest ratings from all groups. Nonprofit watchdog groups received the highest overall rating. State departments of education and local school districts ranked second, but with considerably lower credibility. Educators trusted their local districts or principals as a source of information more than parents or taxpayers did.
Even if states get report cards right, they shouldn't assume that simply publishing data will get schools started on the road to improvement.
"Schools need help in seeing connections between data reported, what is happening in the school, and what should be done about it," notes a report by the Southern Regional Education Board.
Just what effect school report cards are having on school performance is hard to measure. Some critics contend that the report cards are often a waste of time and money because they sit on a shelf and gather dust. "It might be a reasonable first step in accountability: that is, let's get the information out there that's available," French of the University of Tennessee says. "But there's no follow-up."
Others maintain that they are producing changes, albeit slowly. "We think that over time--and the key word is time--schools and their communities, as they gain a greater understanding of what is in the data, will have a more informed local effort in terms of improving their schools," says Dennis W. Cheek, the director of information services and research for the Rhode Island Department of Education.
Since Georgia first published its report cards in 1996, the number of schools developing local school improvement plans has increased by more than 300 percent, says Pat Sandor, the communications director for the state education department. "It's just a real strong tool."
In the Providence schools that have published GTECH Progress Reports, focus groups suggest that parents are more informed, ask better questions about what's happening in their schools, and are more engaged in their children's learning. And teachers at the schools are spending more time looking at data, identifying their schools' goals, and defining next steps.
"I think the myth is that the state issues a report card and parents immediately band together to achieve something," Challener of PROBE says. "But that's not to say the state report cards don't serve a purpose."
"But I also think what's needed is a wider, broader, deeper picture of the life of a school," he adds, "and I don't know how states produce and communicate that. It may be that states are simply doing what they should do, and schools need to do the next piece."