As the Conservatives lead us down the man-made climate change path to extinction, an article by Chris Mooney at Motherjones.com tells us, "The Science of Why We Don't Believe Science."
"Festinger and several of his colleagues had infiltrated the Seekers, a small Chicago-area cult whose members thought they were communicating with aliens—including one, 'Sananda,' who they believed was the astral incarnation of Jesus Christ. The group was led by Dorothy Martin, a Dianetics devotee who transcribed the interstellar messages through automatic writing.
"Through her, the aliens had given the precise date of an Earth-rending cataclysm: December 21, 1954. Some of Martin's followers quit their jobs and sold their property, expecting to be rescued by a flying saucer when the continent split asunder and a new sea swallowed much of the United States. The disciples even went so far as to remove brassieres and rip zippers out of their trousers—the metal, they believed, would pose a danger on the spacecraft.
"Festinger and his team were with the cult when the prophecy failed. First, the 'boys upstairs' (as the aliens were sometimes called) did not show up and rescue the Seekers. Then December 21 arrived without incident. It was the moment Festinger had been waiting for: How would people so emotionally invested in a belief system react, now that it had been soundly refuted?
"From that day forward, the Seekers, previously shy of the press and indifferent toward evangelizing, began to proselytize. 'Their sense of urgency was enormous,' wrote Festinger. The devastation of all they had believed had made them even more certain of their beliefs.
"In the annals of denial, it doesn't get much more extreme than the Seekers. They lost their jobs, the press mocked them, and there were efforts to keep them away from impressionable young minds. But while Martin's space cult might lie on the far end of the spectrum of human self-delusion, there's plenty to go around. And since Festinger's day, an array of new discoveries in psychology and neuroscience has further demonstrated how our preexisting beliefs, far more than any new facts, can skew our thoughts and even color what we consider our most dispassionate and logical conclusions. This tendency toward so-called 'motivated reasoning' helps explain why we find groups so polarized over matters where the evidence is so unequivocal: climate change, vaccines, 'death panels,' the birthplace and religion of the president (PDF), and much else. It would seem that expecting people to be convinced by the facts flies in the face of, you know, the facts.
"The theory of motivated reasoning builds on a key insight of modern neuroscience (PDF): Reasoning is actually suffused with emotion (or what researchers often call 'affect'). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds—fast enough to detect with an EEG device, but long before we're aware of it. That shouldn't be surprising: Evolution required us to react very quickly to stimuli in our environment. It's a 'basic human survival skill,' explains political scientist Arthur Lupia of the University of Michigan. We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself.
"We're not driven only by emotions, of course—we also reason, deliberate. But reasoning comes later, works slower—and even then, it doesn't take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that's highly biased, especially on topics we care a great deal about.
"Consider a person who has heard about a scientific discovery that deeply challenges her belief in divine creation—a new hominid, say, that confirms our evolutionary origins. What happens next, explains political scientist Charles Taber of Stony Brook University, is a subconscious negative response to the new information—and that response, in turn, guides the type of memories and associations formed in the conscious mind. 'They retrieve thoughts that are consistent with their previous beliefs,' says Taber, 'and that will lead them to build an argument and challenge what they're hearing.'
"In other words, when we think we're reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt: We may think we're being scientists, but we're actually being lawyers (PDF). Our 'reasoning' is a means to a predetermined end—winning our 'case'—and is shot through with biases. They include 'confirmation bias,' in which we give greater heed to evidence and arguments that bolster our beliefs, and 'disconfirmation bias,' in which we expend disproportionate energy trying to debunk or refute views and arguments that we find uncongenial.
"That's a lot of jargon, but we all understand these mechanisms when it comes to interpersonal relationships. If I don't want to believe that my spouse is being unfaithful, or that my child is a bully, I can go to great lengths to explain away behavior that seems obvious to everybody else—everybody who isn't too emotionally invested to accept it, anyway. That's not to suggest that we aren't also motivated to perceive the world accurately—we are. Or that we never change our minds—we do. It's just that we have other important goals besides accuracy—including identity affirmation and protecting one's sense of self—and often those make us highly resistant to changing our beliefs when the facts say we should.
"Modern science originated from an attempt to weed out such subjective lapses—what that great 17th century theorist of the scientific method, Francis Bacon, dubbed the 'idols of the mind.' Even if individual researchers are prone to falling in love with their own theories, the broader processes of peer review and institutionalized skepticism are designed to ensure that, eventually, the best ideas prevail.
"Our individual responses to the conclusions that science reaches, however, are quite another matter. Ironically, in part because researchers employ so much nuance and strive to disclose all remaining sources of uncertainty, scientific evidence is highly susceptible to selective reading and misinterpretation. Giving ideologues or partisans scientific data that's relevant to their beliefs is like unleashing them in the motivated-reasoning equivalent of a candy store.
"Sure enough, a large number of psychological studies have shown that people respond to scientific or technical evidence in ways that justify their preexisting beliefs. In a classic 1979 experiment (PDF), pro- and anti-death penalty advocates were exposed to descriptions of two fake scientific studies: one supporting and one undermining the notion that capital punishment deters violent crime and, in particular, murder. They were also shown detailed methodological critiques of the fake studies—and in a scientific sense, neither study was stronger than the other. Yet in each case, advocates more heavily criticized the study whose conclusions disagreed with their own, while describing the study that was more ideologically congenial as more 'convincing.'
"Since then, similar results have been found for how people respond to 'evidence' about affirmative action, gun control, the accuracy of gay stereotypes, and much else. Even when study subjects are explicitly instructed to be unbiased and even-handed about the evidence, they often fail.
"And it's not just that people twist or selectively read scientific evidence to support their preexisting views. According to research by Yale Law School professor Dan Kahan and his colleagues, people's deep-seated views about morality, and about the way society should be ordered, strongly predict whom they consider to be a legitimate scientific expert in the first place—and thus where they consider "scientific consensus" to lie on contested issues.
"In Kahan's research (PDF), individuals are classified, based on their cultural values, as either 'individualists' or 'communitarians,' and as either 'hierarchical' or 'egalitarian' in outlook. (Somewhat oversimplifying, you can think of hierarchical individualists as akin to conservative Republicans, and egalitarian communitarians as liberal Democrats.) In one study, subjects in the different groups were asked to help a close friend determine the risks associated with climate change, sequestering nuclear waste, or concealed carry laws: 'The friend tells you that he or she is planning to read a book about the issue but would like to get your opinion on whether the author seems like a knowledgeable and trustworthy expert.' A subject was then presented with the résumé of a fake expert 'depicted as a member of the National Academy of Sciences who had earned a Ph.D. in a pertinent field from one elite university and who was now on the faculty of another.' The subject was then shown a book excerpt by that 'expert,' in which the risk of the issue at hand was portrayed as high or low, well-founded or speculative. The results were stark: When the scientist's position stated that global warming is real and human-caused, for instance, only 23 percent of hierarchical individualists agreed the person was a 'trustworthy and knowledgeable expert.' Yet 88 percent of egalitarian communitarians accepted the same scientist's expertise. Similar divides were observed on whether nuclear waste can be safely stored underground and whether letting people carry guns deters crime. (The alliances did not always hold. In another study (PDF), hierarchs and communitarians were in favor of laws that would compel the mentally ill to accept treatment, whereas individualists and egalitarians were opposed.)
"In other words, people rejected the validity of a scientific source because its conclusion contradicted their deeply held views—and thus the relative risks inherent in each scenario. A hierarchical individualist finds it difficult to believe that the things he prizes (commerce, industry, a man's freedom to possess a gun to defend his family) (PDF) could lead to outcomes deleterious to society. Whereas egalitarian communitarians tend to think that the free market causes harm, that patriarchal families mess up kids, and that people can't handle their guns. The study subjects weren't 'anti-science'—not in their own minds, anyway. It's just that 'science' was whatever they wanted it to be. 'We've come to a misadventure, a bad situation where diverse citizens, who rely on diverse systems of cultural certification, are in conflict,' says Kahan.
"And that undercuts the standard notion that the way to persuade people is via evidence and argument. In fact, head-on attempts to persuade can sometimes trigger a backfire effect, where people not only fail to change their minds when confronted with the facts—they may hold their wrong views more tenaciously than ever.
"Take, for instance, the question of whether Saddam Hussein possessed hidden weapons of mass destruction just before the US invasion of Iraq in 2003. When political scientists Brendan Nyhan and Jason Reifler showed subjects fake newspaper articles (PDF) in which this was first suggested (in a 2004 quote from President Bush) and then refuted (with the findings of the Bush-commissioned Iraq Survey Group report, which found no evidence of active WMD programs in pre-invasion Iraq), they found that conservatives were more likely than before to believe the claim. (The researchers also tested how liberals responded when shown that Bush did not actually 'ban' embryonic stem-cell research. Liberals weren't particularly amenable to persuasion, either, but no backfire effect was observed.)"
CHRIS MOONEY Correspondent
Chris Mooney is a science and political journalist, podcaster, and the host of Climate Desk Live. He is the author of four books, including the New York Times bestselling The Republican War on Science.
While Conservative Sheeplets and Tea Baggers "think (they're) reasoning, (they) may instead be rationalizing," and truthiness then erupts in their heads.
And when the hierarchical meets the egalitarian, no amount of logic will turn on the light of intelligence or discovery in the Sheeplet's brain; when they are told that climate change is man-made, their propagandists inform them that climate change doesn't exist!
The sad thing is that the Sheeplets are teaching their kids to be anti-science -- with tinfoil hat attached. With liberals the "backfire" reaction is nonexistent; with Sheeplets and Tea Baggers, it's a way of life, and the Conservative leadership has no qualms in instructing the Sheeplets to stay stupid.
It oughta be a crime.
"Carbon dioxide is portrayed as harmful. But there isn't even one study that can be produced that shows that carbon dioxide is a harmful gas."
GOP Rep. Michele Bachmann.