Deny All You Want, They'll Still Believe
Why Public Denials May Only Fuel Conspiracy Theories
By JOHN ALLEN PAULOS - 11/4/2007
Iraq and 9/11, sex trafficking, flu vaccines, widespread autism. Cognitive biases color our view of these and other issues and can affect our policy choices.
Because they are well, but not widely, understood, I'd like to briefly mention three of the most common ones and some related new and troubling research about denials.
First, the biases.
Three Common Psychological Biases
1. The "availability heuristic" is the pronounced tendency of people to view any story through the lens of a superficially similar story that comes easily to mind or is psychologically available. For this reason, much of politics revolves around strengthening this tendency by keeping a preferred narrative uppermost in people's minds. It doesn't take too keen a political instinct, for example, to realize that some politicians' incessant invoking of 9/11 is an effort to keep it psychologically available, to help it color every aspect of the political agenda.
2. Another common psychological failing is called the "anchoring effect" and refers to our tendency to credit and easily become attached to the first number we hear about a particular phenomenon. If, say, an organization announces that there are X thousand new sufferers from some condition each year, the number may, even if grossly false, take a long time to debunk. Various groups, for example, announced a few years ago that 50,000 sex slaves were trafficked into the United States annually, and this very disturbing claim was taken up by The New York Times and other media. Nevertheless, although dozens of task forces and $150 million have been devoted to the problem since then, fewer than 1,500 cases have even been registered during the last seven years of effort. (A quick calculation after this list puts that gap in perspective.)
3. "Confirmation bias" refers to the way we check a hypothesis by looking for occurrences that confirm it (as well as our amazing perspicacity) and ignoring those that do not. We "just know" something is true because it's the only possibility we consider as we search mightily for whatever might confirm our beliefs and pay scant attention to whatever might disconfirm them. Consider the Iraq war again, dot-com stocks or many medical snap judgments.
New Research on Clarification and Myth-Busting
These and other psychological foibles have been well known, especially since cognitive psychologists Amos Tversky, Daniel Kahneman and many other researchers began describing and cataloging them more than 30 years ago. Despite this, a recent study by University of Michigan psychologist Norbert Schwarz is still rather surprising.
As indicated in his article in Advances in Experimental Social Psychology and reported on by Shankar Vedantam in The Washington Post (where I first learned of it), Schwarz copied a flier put out by the Centers for Disease Control and Prevention intended to combat various myths about the flu vaccine. It listed a number of common beliefs about the vaccine and indicated whether each was true or false. He then asked volunteers, some of them old and some young, to read the flier. Shortly thereafter, he found that many of the older people mistakenly remembered almost a third of the false statements as being true, and after a few days young and old alike misclassified 40 percent of the myths as factual.
Even worse was that people now attributed these false beliefs to the CDC itself! In an effort to dispel misconceptions about the vaccine, the CDC had inadvertently lent its prestige to them. In many cases, truth and elucidation can actually strengthen misconceptions and make them more psychologically available.
Related studies by Kimberlee Weaver of Virginia Polytechnic Institute and others have shown that being repeatedly exposed to information from a single source is often tantamount to hearing it from many sources. People simply forget where they heard something after a while, and the repetition makes it more psychologically available and hence credible.
In one simple experiment, groups of students were given an e-mail to read, some groups receiving a version that seemed to contain a software bug, since the same crucial paragraph was reprinted two more times at the bottom of the e-mail. The students with the repeated information were more persuaded by the e-mail, and overestimated the information's general appeal, compared with students who had read the non-repetitive version.
Denying and Dispelling Disinformation Is Dangerous
Trying to dispel myths can backfire. It's not just confirmation bias that helped sell people on the Iraq war. When war opponents denied the Bush administration's repeated assertions that Saddam Hussein was behind the 9/11 attacks, they were probably strengthening, at least in the minds of some, the belief that he was indeed involved. Likewise, for many in the Arab world, denials that the hijackers were really Westerners, that Jews were warned to stay home, et cetera, may also have served to strengthen that flapdoodle.
The difficulty in processing denials is probably part of the reason for their frequent ineffectiveness. Complexity and logical connectives get lost in transmission. (Quick, what does the following sentence mean? "It's not the case that Waldo denied that evidence was lacking that he did in fact fail to do X.") In fact, the new research suggests that people quite often mentally transmute a denial into an assertion. They hear "X is not this bad thing Y," and soon enough what remains with them is the link between X and Y, which eventually becomes "X is Y."
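For the record, here is one way to unpack that sentence, working from the inside out. "Fail to do X" means Waldo didn't do X; "evidence was lacking that ..." means there is no evidence he didn't do it; "Waldo denied that ..." means he claimed such evidence does exist; and the outermost "it's not the case that ..." takes that claim back. After four negation-bearing words ("not," "denied," "lacking," "fail"), the sentence says only that Waldo doesn't dispute the absence of evidence that he failed to do X. Small wonder that listeners drop the connectives and retain just the juxtaposition of "Waldo" and "fail."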
These foibles, it should be repeated, affect our beliefs about many disparate phenomena, not just political issues. Health scares are a rich source for them. Sticking with the vaccine theme, I note that the denial by medical researchers and scientists of a connection between mercury in childhood vaccines and the increased incidence of autism (defined now to include Asperger's syndrome and other "autism spectrum disorders") has not quelled the controversy over the vaccines or weakened the belief that mercury was the culprit. The number of cases cited for all such disorders also serves to anchor our estimate of the prevalence of autism proper.
Being aware of these biases will presumably help us minimize their effects. One specific lesson from this research seems clear. Denials of assertions should in general not repeat the assertions. It's better to say "X is this good thing Z" rather than "X is not this bad thing Y."
One Last Issue
Of course, another way to use this research is to exploit our misguided tendencies. Not just politics, but advertising does this all the time. With regard to denials, an applied bit of marketing jujitsu suggests one might even sometimes deny what one wants people to believe. I will thus take this opportunity to modestly deny that my forthcoming book, Irreligion: A Mathematician Explains Why the Arguments for God Just Don't Add Up, has already been mentioned as a possible Pulitzer Prize winner.
John Allen Paulos, a professor of mathematics at Temple University, is the author of the best-sellers Innumeracy and A Mathematician Reads the Newspaper, as well as of the forthcoming (in December) Irreligion. His Who's Counting? column on ABCNEWS.com appears the first weekend of every month.