It doesn’t take long to learn how to spot misinformation online, Stanford study finds

Photo: Students in a classroom with laptops. Credit: Pressmaster / Shutterstock


Research from the Stanford History Education Group finds that less than six hours of instruction helps students learn to spot dubious sources online.

There may be new hope for helping young people – and anyone else – learn to navigate the torrents of misinformation and propaganda online.

A new study by researchers at the Stanford Graduate School of Education (GSE) found that high school students who received just six 50-minute lessons in digital literacy were twice as likely to spot questionable websites as they had been before the instruction. The lessons were based on a free curriculum developed by the Stanford History Education Group (SHEG) in 2019.

“Most young people are much more gullible than they think they are when it comes to websites with hidden or even malicious agendas,” said Sam Wineburg, the Margaret Jacks Professor Emeritus of Education at Stanford, founder of SHEG and the study’s lead author. “This research shows that a modest investment of time can help students avoid many of the traps.”

The study, published online by the Journal of Educational Psychology on April 19, is the first randomized experiment in an urban school district to investigate the effect of the new curriculum.

Lateral reading and ‘critically ignoring’

The curriculum, called Civic Online Reasoning (COR), draws on previous studies of how students and even highly educated adults can be fooled by polished websites with hidden agendas. 

Earlier research by SHEG, for example, has found that both professional historians and college undergraduates were far more easily duped by suspect websites than professional fact checkers. The big difference: Fact checkers read “laterally,” leaving a document and opening new tabs to run quick background checks on a source’s reputation, organizational ties and claims.

The Stanford researchers tested the novel curriculum with about 500 high school students in an urban midwestern school district. All of the students were enrolled in a semester-long, district-required government course offered at each of the district’s six high schools in the spring of 2019.

Half the students were taught the usual curriculum, which included topics in political science and analyses of current events. The other half were taught the COR curriculum, and their teachers received professional development to prepare them to deliver the curriculum. (The teachers of the control group also attended a daylong workshop on COR after the study, to engage them in the curriculum and provide equitable access to the resources.)

The COR curriculum focused on teaching students how to read laterally, more like fact checkers.

Along the way, students learned how to scan and quickly assess the credibility of search results, and to “critically ignore” low-quality sites, said Wineburg.

“What the bad actors want is your attention,” he said. “They believe that the longer they can get you to stay on the page, the easier it is to suck you into their vortex.”

Instead of clicking on the first few websites that pop up, students were taught to scan through the search results and look for clues to a site’s credibility. They learned about search engine optimization and how it can distort results: A site that looks reputable and appears at the top of the list of results, for example, can still peddle junk science.

Another misleading clue, the researchers said, is a dot-org address. Many digital literacy guides, including ones that appear on college library websites, suggest this address is an indicator of credibility because such sites appear to be nonprofit organizations. However, the dot-org domain, like dot-com, is an “open” domain: Anyone who pays a fee can register a dot-org site.

The researchers note, for instance, that the International Life Sciences Institute describes itself on its dot-org website as a “nonprofit global organization whose mission is to provide science that improves health.” But a few seconds of additional searching shows that the group is backed by giant food corporations and has long been a platform for fighting regulation.

Unlearning what schools teach about reading

By the end of the semester, the students who took the curriculum did twice as well as they had before on a test to spot dubious websites. Even then, however, they earned only half of the possible points. 

“There’s undoubtedly more work to be done, but our study provides clear evidence that students can improve their ability to sort fact from fiction online,” said Joel Breakstone, director of SHEG and a co-author of the study. “Policy makers have focused on how tech platforms should address this problem. Yes, but: This is an educational issue as well. We can’t assume that students can discern the information that streams across their screens just because they grew up with iPhones. They need to be taught.”

A broader challenge, the researchers noted, is that the internet demands that students learn a new way of reading. Schools teach students to carefully read and absorb material from beginning to end – but textbooks and other school materials are heavily vetted, and the underlying assumption is that the content is valid and worth learning.

That’s exactly the wrong assumption and the wrong strategy for dealing with the internet, the researchers said.

“School is an analog institution, but the internet is digital,” said Wineburg. “The lesson here is that the way to judge credibility is not by reading every word, which eats up our time and our energy. Instead, we need to use the power of an electronically linked internet to make fast and frugal decisions about what to believe.”

The lesson isn’t just one for high school students, the researchers added, and it goes beyond blatant falsehoods generated by Russia or conspiracy groups like QAnon.

“Every single public policy issue has a spate of websites that claim to offer objective, unbiased information, and they are often sponsored by corporate interests,” said Breakstone.  “Unfortunately, this is how influence is transacted in modern society. And it’s not just students who get misled. We are all at risk.”

In addition to Wineburg and Breakstone, the study’s co-authors include Mark Smith, director of research at SHEG; Teresa Ortega, associate director of SHEG; and Sarah McGrew, an assistant professor at the University of Maryland. 

The study was funded by Google.org, the charitable arm of Google.


 

