Status
Not open for further replies.
Here's why:

http://scienceblogs.com/cortex/2010/07/political_dissonance.php
Joe Keohane has a fascinating summary of our political biases in the Boston Globe Ideas section this weekend. It's probably not surprising that voters aren't rational agents, but it's always a little depressing to realize just how irrational we are. (And it's worth pointing out that this irrationality applies to both sides of the political spectrum.) We cling to mistaken beliefs and ignore salient facts. We cherry-pick our information and vote for people based on an inexplicable stew of superficial hunches, stubborn ideologies and cultural trends. From the perspective of the human brain, it's a miracle that democracy works at all. Here's Keohane:

A striking recent example is an influential experiment from 2000 led by James Kuklinski of the University of Illinois at Urbana-Champaign, in which more than 1,000 Illinois residents were asked questions about welfare -- the percentage of the federal budget spent on welfare, the number of people enrolled in the program, the percentage of enrollees who are black, and the average payout. More than half indicated that they were confident their answers were correct -- but in fact only 3 percent of the people got more than half of the questions right. Perhaps more disturbingly, the ones who were most confident they were right were by and large the ones who knew the least about the topic. (Most of these participants expressed views that suggested a strong antiwelfare bias.)

Studies by other researchers have observed similar phenomena when addressing education, health care reform, immigration, affirmative action, gun control, and other issues that tend to attract strong partisan opinion. Kuklinski calls this sort of response the "I know I'm right" syndrome, and considers it a "potentially formidable problem" in a democratic system. "It implies not only that most people will resist correcting their factual beliefs," he wrote, "but also that the very people who most need to correct them will be least likely to do so."

In How We Decide, I discuss the mental mechanisms behind these flaws, which are ultimately rooted in cognitive dissonance:

Partisan voters are convinced that they're rational -- only the other side is irrational -- but we're actually rationalizers. The Princeton political scientist Larry Bartels analyzed survey data from the 1990s to prove this point. During the first term of Bill Clinton's presidency, the budget deficit declined by more than 90 percent. However, when Republican voters were asked in 1996 what happened to the deficit under Clinton, more than 55 percent said that it had increased. What's interesting about this data is that so-called "high-information" voters -- these are the Republicans who read the newspaper, watch cable news and can identify their representatives in Congress -- weren't better informed than "low-information" voters. According to Bartels, the reason knowing more about politics doesn't erase partisan bias is that voters tend to assimilate only those facts that confirm what they already believe. If a piece of information doesn't follow Republican talking points -- and Clinton's deficit reduction didn't fit the "tax and spend liberal" stereotype -- then the information is conveniently ignored. "Voters think that they're thinking," Bartels says, "but what they're really doing is inventing facts or ignoring facts so that they can rationalize decisions they've already made." Once we identify with a political party, the world is edited so that it fits with our ideology.

At such moments, rationality actually becomes a liability, since it allows us to justify practically any belief. We use our fancy brains as an information filter, a way to block out disagreeable points of view. Consider this experiment, done in the late 1960s by the cognitive psychologists Timothy Brock and Joe Balloun. They played a group of people a tape-recorded message attacking Christianity. Half of the subjects were regular churchgoers while the other half were committed atheists. To make the experiment more interesting, Brock and Balloun added an annoying amount of static -- a crackle of white noise -- to the recording. However, they allowed listeners to reduce the static by pressing a button, so that the message suddenly became easier to understand. Their results were utterly predictable and rather depressing: the nonbelievers always tried to remove the static, while the religious subjects actually preferred the message that was harder to hear. Later experiments by Brock and Balloun demonstrated a similar effect with smokers listening to a speech on the link between smoking and cancer. We silence the cognitive dissonance through self-imposed ignorance.

There is no cure for this ideological irrationality - it's simply the way we're built. Nevertheless, I think a few simple fixes could dramatically improve our political culture. We should begin by minimizing our exposure to political pundits. The problem with pundits is best illustrated by the classic work of Philip Tetlock, a psychologist at UC-Berkeley. (I've written about this before on this blog.) Starting in the early 1980s, Tetlock picked two hundred and eighty-four people who made their living "commenting or offering advice on political and economic trends" and began asking them to make predictions about future events. He had a long list of questions. Would George Bush be re-elected? Would there be a peaceful end to apartheid in South Africa? Would Quebec secede from Canada? Would the dot-com bubble burst? In each case, the pundits were asked to rate the probability of several possible outcomes. Tetlock then interrogated the pundits about their thought process, so that he could better understand how they made up their minds. By the end of the study, Tetlock had quantified 82,361 different predictions.

After Tetlock tallied up the data, the predictive failures of the pundits became obvious. Although they were paid for their keen insights into world affairs, they tended to perform worse than random chance. Most of Tetlock's questions had three possible answers; the pundits, on average, selected the right answer less than 33 percent of the time. In other words, a dart-throwing chimp would have beaten the vast majority of professionals.

So those talking heads on television are full of shit. Probably not surprising. What's much more troubling, however, is that they've become our model of political discourse. We now associate political interest with partisan blowhards on cable TV, these pundits and consultants and former politicians who trade facile talking points. Instead of engaging with contrary facts, the discourse has become one big study in cognitive dissonance. And this is why the predictions of pundits are so consistently inaccurate. Unless we engage with those uncomfortable data points, those stats which suggest that George W. Bush wasn't all bad, or that Obama isn't such a leftist radical, then our beliefs will never improve. (It doesn't help, of course, that our news sources are increasingly segregated along ideological lines.) So here's my theorem: The value of a political pundit is directly correlated with his or her willingness to admit past error. And when was the last time you heard Karl Rove admit that he was wrong?

I thought it might be beneficial to get to the root of all the issues partisan types stay in denial about, instead of trying to tackle them one by one. You obviously can't use a reasoned, factual argument to sway most folks on an issue they've already made up their minds about.
 

Moonbeam

Elite Member
I guess the reason I am right about everything is that I don't know anything and admit I couldn't possibly know the right answer.
 

bfdd

Lifer

I pretty much ignore people who identify with a group when talking about their personal beliefs. Odds are they're just parroting what someone else told them.
 

Binarycow

Golden Member

So who else is left for you to talk to? Sad isn't it, that a good portion of the people who are eligible to vote in this country are practically imbeciles?
 

Hayabusa Rider

Admin Emeritus & Elite Member
Of course I'm wrong. Being human, I'm not perfect, but my general worldview is that I'm probably incorrect and that I need to validate my beliefs before accepting them. That's why I can't identify with ideologies: they're preconceived notions of reality that are then used as a lens to view the "truth."

I prefer whenever possible to remove those filters. That doesn't mean we can avoid bias, because we are all products of our environment, but it doesn't justify not trying.
 

piasabird

Lifer
So this article points out the reduction of the deficit under Bill Clinton. He did two important things. He drastically reduced the size of the military and its ability to defend the United States, and he signed into law a very large tax increase. He is famous for doing nothing. What this really proves is that when the government does nothing, the people will thrive. At least he had a fairly stable government. I guess this proves doing nothing works.
 

woolfe9999

Diamond Member
The findings described in the article accord with my own anecdotal observations of fellow human beings. The ones who are most certain they are correct are most likely to be wrong. In fact, they are almost always the stupidest person in the room.

- wolf
 

werepossum

Elite Member
I think we need a study on this amazing coincidence of studies about how voters are stupid always coming out when Democrats are losing. Scary how that happens.
 

tweaker2

Lifer
I vaguely recall a study done at an Ivy League college which found that students who did well and got good grades had a habit of underestimating themselves, while students who didn't do well mostly overestimated their abilities.

Makes sense that this trait would carry over to many other aspects of a person's life.
 