Digital fraudsters know humans are terrible at checking facts. A psychologist and her fellow researchers explain how better fact-checking may help foil cybercriminals' phishing attempts.
"Research from the field of cognitive psychology indicates people are naturally poor fact-checkers," writes Lisa Fazio, assistant professor of psychology at Vanderbilt University, in this The Conversation article. "It is very difficult for us to compare things we read or hear to what we already know about a topic."
SEE: Incident response policy (Tech Pro Research)
It's a safe bet that cybercriminals--in particular, those who spear-phish--understand and use the research described by Fazio to improve their success rate. Besides relying on poor fact-checking, digital fraudsters place a great deal of importance on crafting official-looking, malicious emails and websites. "In phishing attacks, cybercriminals utilize manipulation and deception to trick users into providing the requested information (i.e., social engineering)," write authors Ina Wanca and Ashley Cannon in their paper How human behavior and decision making expose users to phishing attacks (PDF).
How many animals of each kind did Moses take on the Ark?
The above question has been part of surveys used by psychologists since the 1980s, and most participants miss that Noah, not Moses, was on the Ark. The Moses Illusion, an example of what psychologists call knowledge neglect, occurs when relevant information is available but not used in the decision-making process--a human trait that cybercriminals count on, unfortunately.
One reason the Moses Illusion works so well is that people typically spend more time and effort understanding what they hear or read than determining whether it is true.
Truth bias is another reason why the illusion works. People tend to believe what they hear or read is true regardless of the source or any prior knowledge they may have about the subject. In other words, people expect information they receive to be correct.
The illusory-truth effect
Another psychological phenomenon cybercriminals exploit to deceive their targets is called the illusory-truth effect. In their research paper Knowledge Does Not Protect Against Illusory Truth (PDF), Fazio, along with Nadia M. Brashier (Duke University), B. Keith Payne (University of North Carolina at Chapel Hill), and Elizabeth J. Marsh (Duke University), suggests it is human nature to attach more validity to information that has been repeated multiple times.
"Research on the illusory-truth effect demonstrates that repeated statements are easier to process, and subsequently perceived to be more truthful than new statements," write the coauthors. "Contrary to prior suppositions, illusory-truth effects occur even when participants know better."
Prior knowledge helps, but only so much
Prior knowledge does help, but not as much as previously thought. "Expertise did not eliminate the illusion, even when errors were bolded and underlined, meaning that it was unlikely that people simply skipped over errors," write Allison D. Cantor and Elizabeth J. Marsh in their paper Expertise effects in the Moses illusion: detecting contradictions with stored knowledge. "The results support claims that people often use heuristics to judge truth, as opposed to directly retrieving information from memory, likely because such heuristics are adaptive and often lead to the correct answer."
The authors add, "Even experts sometimes use such shortcuts, suggesting that overlearned and accessible knowledge does not guarantee retrieval of that information."
Is there a solution?
Fazio and her colleagues tried several methods to improve fact-checking ability--most failed, with some making the situation worse. Fazio offers an example of a situation made worse:
"We tried highlighting the critical information in a red font. We told readers to pay particular attention to the information presented in red with the hope that paying special attention to the incorrect information would help them notice and avoid the errors. Instead, they paid additional attention to the errors and were thus more likely to repeat them on the later test."
There is good news: Survey participants were better at avoiding misinformation--the kind phishers rely on--when asked to act like fact-checkers, either editing a story to highlight inaccurate statements or reading the story sentence by sentence and deciding whether each sentence contained an error.
It is far from a perfect solution. "It's important to note that even 'fact-checking' readers miss many of the errors and retain false information from the stories," adds Fazio. "For example, in the sentence-by-sentence detection task participants caught about 30 percent of the errors. But given their prior knowledge they should have been able to detect at least 70 percent."
How to act like a fact-checker
If acting like a fact-checker helps, it might be useful to look at how the professionals do it. Alexios Mantzarlis, director of the International Fact-Checking Network at the Poynter Institute, helped develop a Fact-Checkers' Code of Principles. These are several of the concepts Mantzarlis feels strongly about that might improve our ability to distinguish truth from misinformation:
- Follow the same methodology for every fact-check and let the evidence dictate conclusions;
- Be concerned if sources are not transparent, paying particular attention to funding sources; and
- Have a willingness to correct perceptions when fact-checking provides a different answer.
With both fake news and spear-phishing attacks trending high on the internet, it might behoove each of us to start wearing our fact-checking hat.
Also see
- Too smart to fall for a spear-phishing message? Think again (TechRepublic)
- Want to improve cybersecurity? Try phishing your own employees (TechRepublic)
- Extra, extra! That fake news story might come with malware (TechRepublic)
- How blockchain technology could prevent fake news from spreading (TechRepublic)
- Beware of the bots: How they're created and why they matter (TechRepublic)
- Google Alphabet's Schmidt: Here's why we can't keep fake news out of search results (ZDNet)