By Jennifer (Jennie) Latson
Why Is a Claim So Convincing When It’s Repeated Again and Again — Even If It’s Patently Untrue?
How do you use Head On? Apply directly to the forehead. How do you know? Because you’ve heard it a million times.
A decade ago, before viral videos were even a thing, Head On was making a mint using a marketing technique as old as advertising: repetition. More than six million tubes of the headache balm sold in less than a year (despite reports that it might not actually work), thanks to a 2006 ad campaign that simply repeated the phrase “Head On. Apply directly to the forehead,” over and over.
The marketing term “effective frequency” refers to the idea that a consumer has to see or hear an ad a number of times before its message hits home. Essentially, the more you say something, the more it sticks in — and possibly on — people’s heads. It doesn’t even have to be true — and that’s the problem. What advertisers call “effective frequency,” psychologists call the “illusory truth effect”: the more you hear something, the easier it is for your brain to process, which makes it feel true, regardless of its basis in fact.
“Each time, it takes fewer resources to understand,” says Lisa Fazio, a psychology professor at Vanderbilt University. “That ease of processing gives it the weight of a gut feeling.”
That feeling of truth allows misconceptions to sneak into our knowledge base, where they masquerade as facts, Fazio and her colleagues write in a 2015 journal article. (One example they give is the belief that vitamin C can prevent colds, blowing the minds of those of us who’ve taken this as fact our entire lives, which is about how long we’ve heard it repeated.)
Even in the absence of endless repetition, we’re more likely to believe what we hear than to question it objectively, thanks to yet another psychological principle: confirmation bias.
“In general, human beings, after hearing any claim, behave like naive scientists and tend to look for information that confirms the initial conjecture,” says Ajay Kalra, a marketing professor at Rice’s Jones Graduate School of Business. “In an interesting experiment, a group of consumers were told a leather jacket (Brand A) was very good. When they later examined several brands, they tended to spend more time looking at Brand A and evaluating it more highly than other brands.”
The same principle applies to a coffee company’s claim that its coffee is the “richest” in the world, Kalra says: It’s hard to find contradictory evidence for a statement so vague. “Confirmation bias typically applies to situations where information is ambiguous and hard to refute,” he explains. “The more often you hear a message, the more the confirmatory bias likely comes into play.”
So it’s no wonder that many of us fall for false claims on social media, especially when we see them tweeted and retweeted again and again. And if it feels like we’re seeing more falsehoods repeated more frequently these days, we are — especially from America’s top elected official, according to the Washington Post’s Fact Checker team.
The social implications are huge. For example: the fear that immigration drives crime, which Trump recently stoked on Twitter. “Crime in Germany is way up,” he tweeted June 18. “Big mistake made all over Europe in allowing millions of people in who have so strongly and violently changed their culture!”
Fact checkers quickly noted that German crime rates are at their lowest level since 1992, but Trump repeated the claim the following day. “Crime in Germany is up 10% plus (officials do not want to report these crimes) since migrants were accepted. Others countries are even worse. Be smart America!” he tweeted.
What happens when a powerful person makes — and repeats — a false claim? In this case, the danger is a backlash against immigrants. But the cumulative effect of constantly repeated falsehoods is even more insidious: it undermines truth altogether, leaving public discourse unmoored from fact.
“The constant repetition of the lie is the way to make truth meaningless,” Timothy Egan writes in a New York Times op-ed. “After a while, people come to ‘believe everything and nothing, think that everything was possible and that nothing was true,’ wrote Hannah Arendt, the German-born philosopher, in describing how truth lost its way in her native land.”
So how can we fight back? Inoculating ourselves against the power of repetition is harder than you’d think. Common sense tells us that knowing the truth should be the antidote — but that’s not enough, as Fazio and her colleagues demonstrate.
“The prevailing assumption in the literature has been that knowledge constrains this effect (i.e., repeating the statement ‘The Atlantic Ocean is the largest ocean on Earth’ will not make you believe it),” Fazio and her team wrote. “[However,] illusory truth effects occurred even when participants knew better.”
Janet Moore, director of the MBA communications program at Rice Business, agrees that inoculation may be impossible — but there are ways to lessen the influence of repeated claims, she says. One of the best: don’t rely on a single source for information. Read stories from multiple news outlets and listen to a variety of opinions. Commit to staying open-minded, and consult with friends and colleagues whose perspectives differ.
“Especially if you have trusted friends with different viewpoints, openly discuss the repeated story,” she says. “Explore whether it’s really worth repeating.”
And Moore, who began her career as a lawyer, says it couldn’t hurt to think like one. “Try to examine every assertion ‘on the merits,’ as is done in the legal profession,” she suggests.
Fazio’s research backs this up. Just taking a second to consider how you know something is true can stymie the effects of repetition, she’s found. “It’s a matter of getting people to consult something other than that gut feeling,” she says. “It’s a great thing to do on social media: before you share something, take that second and pause.” Otherwise, you risk becoming part of the echo chamber that keeps falsehoods circulating.
Of course, our tendency to assume that people are telling the truth is not a bad thing in and of itself, Fazio points out. “If you had to constantly verify and second-guess others, you wouldn’t get very far in terms of relationships and social order,” she says.
Until recently, American society ranked relatively high in measures of trust, Fazio says: We’ve tended to believe what we hear from institutions and the media. That trust seems to be eroding. But even a newfound skepticism of the government and the press won’t change our basic cognitive processes.
“It’s still going to be harder to notice errors in things we hear over and over,” she says. That’s a universal human trait. Which also means it’s bipartisan — and that may at least level the playing field when it comes to fake news.
A recent study by Yale researchers finds that “[the] ‘illusory truth effect’ for fake news headlines occurs despite a low level of overall believability, and even when the stories are labeled as contested by fact checkers or are inconsistent with the reader’s political ideology.”
According to the study, even if a headline goes against your political leanings, you’re more likely to find it believable after seeing it multiple times. If nothing else, this finding may offer some consolation when we fall for falsehoods: It happens to the best of us, against both our better judgment and our own interests.
“What the research shows is that this isn’t something that just happens to stupid people,” Fazio says. “It’s part of how our brain functions. And it happens to everyone.”
Jennifer Latson is a writer and editor at Rice Business and the author of The Boy Who Loved Too Much, a nonfiction book about a rare disorder sometimes called the opposite of autism.
This article also appeared in the Houston Chronicle's Gray Matters as "How can we separate the truth from a lie?"