Some of you may have noticed that I don't really visit P&N anymore. This probably makes most of you very happy, as my approach here was never popular (except, perhaps, with the largely silent minority). I attempted, however unsuccessfully, to bring facts and logic to the argument. In this way, I changed my own mind much more often than I influenced anyone else's opinions. People on both sides of the aisle hate me because I don't have a side. I could never figure out why anyone would align themselves with one side or the other, as neither side was aligned with reality. A friend of mine (also an engineer) who is working on his MBA told me that the hardest thing about managing people as an engineer is that he's used to working with other engineers, who consider things logically and reasonably. With most people, that's simply not the case. Things started to click into place.
After years of beating my head against the wall that is the vocal majority of P&N posters, I finally found an article that explained what I already knew: most people simply accept tidbits which agree with their predisposed positions while summarily rejecting all other information as bogus. When confronted with facts, most of you will actually cling tighter to your position which is in direct opposition to those facts rather than adapting your position to bring it into line with reality. This is why conservatives prefer Fox News, why Rand Paul makes liberals so uncomfortable when he says that the poor here don't really have it so bad, and why most of you have never and will never change your opinions on any political issues.
In any case, here are some key points from the somewhat lengthy (4 page) article that about three of us will read.
The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.
"The general idea is that it's absolutely threatening to admit you're wrong," says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon, known as "backfire," is a natural defense mechanism to avoid that cognitive dissonance.
In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept.
There is a substantial body of psychological research showing that people tend to interpret information with an eye toward reinforcing their preexisting views. If we believe something about the world, we are more likely to passively accept as truth any information that confirms our beliefs, and actively dismiss information that doesn't. This is known as "motivated reasoning." Whether or not the consistent information is accurate, we might accept it as fact, as confirmation of our beliefs. This makes us more confident in said beliefs, and even less likely to entertain facts that contradict them.
-----------------------------
New research, published in the journal Political Behavior last month, suggests that once those facts (or "facts") are internalized, they are very difficult to budge. In 2005, amid the strident calls for better media fact-checking in the wake of the Iraq war, Michigan's Nyhan and a colleague devised an experiment in which participants were given mock news stories, each of which contained a provably false, though nonetheless widespread, claim made by a political figure: that there were WMDs found in Iraq (there weren't), that the Bush tax cuts increased government revenues (revenues actually fell), and that the Bush administration imposed a total ban on stem cell research (only certain federal funding was restricted). Nyhan inserted a clear, direct correction after each piece of misinformation, and then measured the study participants to see if the correction took.
For the most part, it didn't. The participants who self-identified as conservative believed the misinformation on WMD and taxes even more strongly after being given the correction. With those two issues, the more strongly the participant cared about the topic (a factor known as "salience"), the stronger the backfire. The effect was slightly different on self-identified liberals: When they read corrected stories about stem cells, the corrections didn't backfire, but the readers did still ignore the inconvenient fact that the Bush administration's restrictions weren't total.
It's unclear what is driving the behavior (it could range from simple defensiveness, to people working harder to defend their initial beliefs), but as Nyhan dryly put it, "It's hard to be optimistic about the effectiveness of fact-checking."
Source: http://www.boston.com/bostonglobe/ideas/articles/2010/07/11/how_facts_backfire/?page=1