From Digital Native to Digital Expert

People of all ages struggle to evaluate the integrity of the digital information that rains down with every web search and social media scroll. When the Stanford History Education Group released findings showing that most students couldn’t tell sponsored ads from real articles, among other miscues, it intensified the scramble for tools and strategies to help students evaluate online sources more effectively.

But a more recent study by Stanford’s Sam Wineburg and Sarah McGrew suggests that many of the techniques that students and teachers employ — which include checklists and other practices most recommended for digital literacy — are often misleading.

A better solution for navigating our cluttered online environment, they say, can be found in the practices of professional fact-checkers. Their approach, which harnesses the power of the web to determine trustworthiness, is more likely to expose dubious information.

The following guidelines for interrogating online information, inspired by the fact-checkers’ techniques, will increase students’ odds of detecting unreliable sources (and consuming reliable ones).

Read Laterally, Not Vertically
Wineburg and McGrew followed three groups of readers as they evaluated digital sources provided in the study: historians, Stanford undergraduates, and professional fact-checkers. They found that the fact-checkers were fastest and most accurate in vetting information, while the historians and students were easily deceived.

The student participants did something they (and all of us) do often: they scrolled and read down the page. But their close reading of the very sources they were tasked to interrogate did little to advance their credibility assessment. Instead, it misled them.

“The close reading of a digital source, when one doesn’t yet know if the source can be trusted,” write Wineburg and McGrew, “proves a colossal waste of time.”

As students meandered and fluttered across their screens, they were drawn to websites’ most easily manipulated features — like scientific-sounding language or the presence of an “About Us” page. Their grounds for inferring trustworthiness were largely centered on these incomplete evaluations, and they frequently misjudged websites’ origins and reliability.

Unlike the student participants in the study, the professional fact-checkers began their evaluations by opening new tabs in their browser. They conducted refined searches, and consulted other sources with well-established credentials, to judge the integrity of the original website. This inclination to take bearings and gain a sense of direction fed fact-checkers’ success in the study. They often needed less prompting than historians and students, and learned far more by reading less.

Takeaway: Encourage students to take the indirect route and begin their investigation of unfamiliar digital sources by leaving them. When students read laterally, they will avoid diving too deep into the actual content of the website in question and gain a wider, more impartial view of its credibility.

Don’t Fall for Appearances
Students’ more superficial evaluations of digital sources are evidence of what Wineburg and McGrew call the “representativeness heuristic” — the tendency to evaluate probabilities by the degree to which A resembles B. It’s easy for cognitive bias to take over in such scenarios.

For the great majority of the study’s student participants, this reliance on appearance determined their perception of given sources and created a “false sense of security.” They were drawn to website layouts, abstracts, references, and, in one case, a .org domain — all elements that may easily meet the requirements of a checklist approach to verifying a digital source.

“[Fact-checkers] understood the web as a maze filled with trap doors and blind alleys, where things are not always as they seem,” Wineburg and McGrew write. “Their stance toward the unfamiliar was cautious: while things may be as they seem, in the words of Checker D, ‘I always want to make sure.’”

Takeaway: Communicate to students that more thorough evaluations, like those lateral reading allows, are crucial to establishing the trustworthiness of digital information.

Practice “Click Restraint”
While engaging in lateral reading, fact-checkers also exercised what Wineburg and McGrew call “click restraint.” They took more time than historians and students to sort through search results and, though slower to reach their conclusions, were the most selective and most accurate in assessing the integrity of sources.

“Fact-checkers possessed knowledge of online structures,” write the researchers. “They knew that the first result was not necessarily the most authoritative, and they spent time scrolling through results.”

Scanning through Google snippets, fact-checkers were able to bypass massive amounts of material and focus on credible information from news organizations like the New York Times and the Washington Post. Students, on the other hand, were far less strategic and “meandered to different parts of the site [itself], making decisions about where to click based on aspects that struck their fancy.”

Takeaway: When you encourage students to read laterally, you should also remind them to exercise restraint and avoid promiscuous clicking. Speed shouldn’t come at the expense of careful verification — but efficient, lateral reading will make the few minutes most students spend searching truly count.