The video posted on Facebook was provocative: Grainy footage showed scenes of people cramming papers into boxes; a voiceover claimed it showed ballot-stuffing in the 2016 presidential election. "Have you ever noticed that the ONLY people caught committing voter fraud are Democrats?" text at the bottom of the post prompted.
Asked whether the video constituted strong evidence of voter fraud, many U.S. high school students concluded that it did.
As it turns out, the video showed scenes from Russia, not the United States. And even among those students who saw through the ruse, a quarter couldn't articulate why the video was suspect. Only three of the more than 3,000 responses tracked down the source of the original video.
Students' weak performance on that task and the others constitutes alarming evidence that a large majority of students are not well prepared to investigate sources of information for their accuracy, relevance, and quality. And despite more than a decade's worth of policy chatter about media literacy, whatever schools have been doing doesn't appear to have been enough to inoculate students against "fake" news.
"Reliable information is to civic health what proper sanitation and potable water are to public health," wrote the researchers, part of the Stanford History Education Group at Stanford University. "We need high-quality digital literacy curricula, validated by rigorous research, to guarantee the vitality of American democracy."
The study is based on results from nearly 3,450 9th to 12th grade students who took a series of six media literacy exercises, including examining the Russian video. They were also asked to distinguish between news and ads on Slate's website, to examine whether a nonprofit's page on global warming was reliable, to compare two webpages, and to analyze a tweet from an advocacy group.
The study sample, pulled from 16 districts across 14 states, generally mirrored the demographic composition of U.S. schools, though it didn't entirely match it and is not nationally representative. Still, the results are sobering: Students did very poorly overall.
For each of the six tasks, at least two-thirds of students were rated "beginning," rather than "emerging" or "mastery." Students performed especially poorly on the global-warming nonprofit task: The website had been set up by a fossil fuel group, but most students focused on superficial features like its presentation, nonprofit status, or ".org" URL. All but 3 percent of students got the lowest rating on that task.
Here鈥檚 a closer look at the findings and what they mean.
There are significant gaps between groups of students.
Mirroring other academic indicators, students qualifying for free and reduced-price lunches, a proxy for poverty, did significantly less well than other students. The results also showed that black students scored significantly lower than white students; that students whose mothers had more formal education did better than those whose mothers had less; and that scores generally improved in the upper secondary grades compared with 9th grade.
Students living in urban locales also did better than students in rural locales, an intriguing finding that could point to more attention to media literacy in urban centers, or simply to better funding and better-equipped teachers.
Media literacy efforts haven鈥檛 been enough.
The clear implication is that, despite an explosion in media literacy awareness and even professional development, schools haven't cracked the media literacy nut yet.
This isn't to criticize the hard work of a lot of people. A number of groups offer training and support, and media literacy is by now woven into all kinds of policy documents. Aspects of media literacy appear in the information literacy standards promulgated by library-science professionals and in the ISTE standards for teachers and students, created by a group that supports effective instructional-technology use in schools.
And the Common Core State Standards, still the backbone of K-12 academic expectations in most states, require students to "gather relevant information from multiple print and digital sources [and] assess the credibility and accuracy of each source." (Groups like the National Association for Media Literacy Education have also highlighted such efforts.)
It is to suggest, though, that such efforts don't yet appear to have translated into widespread understanding.
The report criticizes commonly taught evaluation checklists because they tend to focus students' attention on a single website or news source. In fact, as prior research by SHEG demonstrates, expert evaluators tend to read "laterally," opening many browser tabs and cross-checking the original source against many others.
There are many civics education implications here.
This is probably the most important takeaway. When students can't identify the quality of a source, how can they make good decisions about how to vote, or form their own opinions on important topics like health care, housing, and economic policy?
It's doubly concerning given that there are now entire cable-news networks and partisan news sites built around presenting a skewed accounting of facts. Social media hasn't helped: The atomized nature of online interactions makes it easy to share doctored or fake information. And increasingly, as another recent report coined it, we live in an era of "truth decay": Americans can't even agree on a set of basic facts to underpin their arguments or conclusions anymore.
A second thing to worry about is the gap in media literacy skills between low-income and black students and their peers. To my mind, this bolsters an emerging discourse among civics education advocates who are worried about a "civics gap": a phenomenon in which underserved groups have the least access to effective civics preparation. It's not hard to understand why this is a problem. Communities need to be able to advocate effectively with lawmakers for policies and legislation that improve their members' lives, and to do that you need to be able to marshal facts and make good arguments.