Opinions based on beliefs

Here is a very disturbing article on the relationship of facts to beliefs.  Turns out, according to a study discussed in the article, that for certain people awareness of facts can have a negative correlation with the accuracy of their beliefs.  Studies such as these at first blush seem to portend the death of the traditional fact-driven conception of belief.  But then there's this:

Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.

This bodes ill for a democracy, because most voters — the people making decisions about how the country runs — aren’t blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.

“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon — known as “backfire” — is “a natural defense mechanism to avoid that cognitive dissonance.”

These findings open a long-running argument about the political ignorance of American citizens to broader questions about the interplay between the nature of human intelligence and our democratic ideals. Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we’re right, and even less likely to listen to any new information. And then we vote.

Pardon the irony, but I'm not sure if this changes anything.  I think we're still talking about the traditional fact-driven conception of belief–that is, you believe what you take to be facts, and disbelieve what you don't.  It's just that you don't believe that the facts alleged are facts.  The problem is that, for whatever reason, too many people kind of suck at distinguishing the true from the false.

18 thoughts on “Opinions based on beliefs”

  1. One thing that might help clear up this confusion is to ask the people presented with corrected facts their reason for rejecting these facts and sticking to their opinions. I think we might find that many of these people simply do not trust the source of these facts. In fact, I think trust, or lack thereof, of sources is the primary reason that facts often do not sway people. Unfortunately, most of our knowledge is taken on trust — as it must be in a highly specialized and diversified epistemic society.

  2. I think that's right.  And this brings us back to the basic, and sadly traditional point: people support false beliefs with defective reasoning, what Aristotle might have called "sophistry."

  3. This article threatens my deeply held belief that democracy is the best form of government, therefore I shall refuse to accept it as being true.

  4. Someone might say that they ought not to let everyone speak on equal terms and serve on the council, but rather just the cleverest and finest. Yet their policy is also excellent in this very point of allowing even the worst people to speak. For if the good men were to speak and make policy, it would be splendid for the likes of themselves but not so for the men of the people. But, as things are, any wretch who wants to can stand up and obtain what is good for him and the likes of himself. Someone might say, “What good would such a man propose for himself and the people?” But they know that this man's ignorance, baseness, and favour are more profitable than the good man's virtue, wisdom, and ill will. A city would not be the best on the basis of such a way of life, but the democracy would be best preserved that way. For the people do not want a good government under which they themselves are slaves; they want to be free and to rule. Bad government is of little concern to them. What you consider bad government is the very source of the people's strength and freedom. If it is good government you seek, you will first observe the cleverest men establishing the laws in their own interest. Then the good men will punish the bad; they will make policy for the city and not allow madmen to participate or to speak their minds or to meet in assembly. As a result of these excellent measures the people would swiftly fall into slavery.

  5. So, why did they not believe the facts? Was it poor reasoning? Was it the unreliability of the facts' source? Were they forcing themselves to believe in spite of the clear facts?
    Firstly, why do we expect people to believe once we present them with facts? I think the underlying assumption is that the human mind is completely ruled by reason. In my opinion, this assumption is false.
    The way I see it, people are not as open-minded as one might think; they are not in a "neutral" state where they can listen to the facts, reason and believe. In fact, I think, people always face a choice: reason/beliefs vs moods/feelings/prejudices/desires/pride/popularity/…
    To put it simply, I think the test of any belief should be: would I be willing to accept where the evidence leads me? And I think most people are not. As a result, it's not that they deny the facts completely; they just ignore them or fail to see the pursuit of the facts through to the end.

  6. I am currently attempting to conduct an exorcism for a friend who is devoted to Fox News and Mr. Beck. Thus far his responses (to include the 360 degree head turn and projectile vomit) have been exactly as described by the researchers. I, for one, would like to find a source like yours that uses logic to disconnect the dots on Beck's chalkboard. Thanks for your work. The reasonable middle of the road is out there somewhere.

  7. Doug–good luck.  My guess is that your experience will be just what the article predicts and your friend won't be convinced.

    BN, I don't know if anyone can seriously maintain there is some kind of perfect neutral state.  The question, as you correctly point out, is whether we have the right kind of attitude towards our beliefs.  One part of that attitude is an openness to correction.  If I run into things that contradict my beliefs, am I willing to revise them?  Beneath that, however, one has to have a certain set of skills–e.g., an ability to recognize sophistries when they come up, a capacity to distinguish empirical claims from logical ones.  In addition to this, one must have some knowledge of facts–or perhaps better–knowledge of what you don't know (call it the STFU knowledge–if you're not an expert, then admit it).

    So in the end I think the article brings us back to where we are.  But maybe I'm wrong.  

  8. I think the underlying assumption is that the human mind is completely ruled by reason. In my opinion, this assumption is false.

    The problem is that we have to take human rationality on faith. Tricky, isn't it?

  9. Doug, I second John's prediction.
    John, I think you nailed it: humility is the key. "Sometimes our humble hearts can help us more than our proud minds."
    Grace, I couldn't agree more: credo ut intelligam.

  10. I thought this might be relevant:
    "The morbid logician seeks to make everything lucid, and succeeds in making everything mysterious. The mystic allows one thing to be mysterious, and everything else becomes lucid."  — Chesterton —

  11. John, I think Grace brought up the point, but here's Chesterton's take on it: "Reason is itself a matter of faith. It is an act of faith to assert that our thoughts have any relation to reality at all. If you are merely a skeptic, you must sooner or later ask yourself the question, “Why should ANYTHING go right; even observation and deduction? Why should not good logic be as misleading as bad logic? They are both movements in the brain of a bewildered ape?” The young skeptic says, “I have a right to think for myself.” But the old skeptic, the complete skeptic, says, “I have no right to think for myself. I have no right to think at all.”"

  12. I recently attended a lecture by the cognitive linguist George Lakoff. He made a rather compelling case that it is the repetition of rhetoric that establishes familiar neural resonance in the partisan's brain.
    If the Faux News folks writ large are good at anything, it's repeating the same phrases (death panels, govt. takeover, war on terrah, etc.). When the facts are presented, it's usually in tandem with the well-crafted language of the fear-monger. Thus the listener, says Lakoff, hears and leaves with the familiar phrase that resonates with them and the facts that countermand the framed talking-points are left behind or quickly forgotten.
    Take this with a grain of salt as it's anecdotal, but he advised Obama to use the "pound home the message by repetition" method (hope, change, etc.), which Obama adopted with great success. He advised H. Clinton to do the same, but she went the fact-vs.-rhetoric route instead. The results are in the White House.

  13. "Facts don’t necessarily have the power to change our minds. In fact, quite the opposite."

    How can one believe that if true?
    I used to believe that there were three kinds of people in the world: those that can count and those that can't. But now I am supposed to believe that there are two kinds of people, those that believe in facts and those that don't?
    Seriously though, it is hard to chip away at someone's learned from childhood belief system (such as religion or conservatism) that is not based on facts.
