The Godwinator

Fig. 1: Obamacare analogy

George Will, whose pseudo-logical musings at the Washington Post inspired our work here so many years ago, has moved from ABC to Fox News.  In keeping with the tone of his new employer, he waxes historical about the legality of Obamacare (via Talking Points Memo):

In an interview with NPR’s “Morning Edition,” host Steve Inskeep asked Will about President Barack Obama’s argument that Republicans are short-circuiting the system by using government funding and the debt ceiling as leverage to dismantle Obamacare, rather than repealing the law outright.

“How does this short-circuit the system?” Will said. “I hear Democrats say, ‘The Affordable Care Act is the law,’ as though we’re supposed to genuflect at that sunburst of insight and move on. Well, the Fugitive Slave Act was the law, separate but equal was the law, lots of things are the law and then we change them.”

Many here are familiar with Godwin’s law: as a discussion grows longer, the probability of a Hitler analogy approaches 1.  We might now offer two variations on it.  First: given any possible disagreement, the probability of a completely inept Hitler analogy is initially 1.  The second variation is implied in the first: Hitler is a mere stylistic choice; the invoker can select any other moral abomination according to need.

One further rule: some iron-manner will come to the defense of the Godwinator:

I generally agree with TPM, but this headline is an outrageous distortion of what GW said.

His view is that Obamacare law is wrong, which is a legitimate view (not  mine).  He then points out that we have rescinded laws that we all regard as wrong.  He was speaking to the process, not the content.

Nah.  That isn’t his view, and it ignores the inappropriateness of the analogy.  Looking past these kinds of rhetorical outrages keeps them alive.

I’m already plenty good at logic

Charles S. Peirce opens the “Fixation of Belief” with the observation:

Few persons care to study logic, because everybody conceives himself to be proficient enough in the art of reasoning already.

Many NS readers are familiar with the Dunning-Kruger effect.  In short: the less you know, the less likely you are to recognize that you don’t know.  Poor performers regularly overestimate their abilities.  Dunning and Kruger bear this out with grammar tests, logic tests, humor-recognition tests, and a variety of other measures (e.g., tests taken by medical students, engineers, and so on).  The trouble is that self-monitoring can’t give reliable feedback when you don’t have the proper criteria for evaluation.

So the question is: how do people perform when, after having done badly but evaluated themselves well, we give them clear criteria for evaluation?  How do they rate their performances then?  For sure, without this intervention, the D-K Effect becomes the D-K Cycle.  But can intervention as education work?

So after having bombed the last logic test, the subjects are brought back in and given a mini-lecture on how to solve the logic problems they were tested on last time.  They then have the criteria for discriminating good from bad performances.  The result?  They judge their prior performances quite harshly and come to see that they have a long way to go before they are good at the tasks.  That’s pretty great news for logic teachers.

Ah, but there’s an interesting wrinkle about the need for this intervention.  Dunning and Kruger gave the same test to two different groups, but told one group it was a logic test and the other that it was a computer test.  Here’s the big difference: people who took the logic test were systematically more confident (and thereby overrated their poor performances) than those who took the computer test.  Same test, different name.  Why?  The thought is that we all think we’re already pretty good at reasoning.  We do it every day, so we must have some skills.  And it’s right there that logic teachers cringe.



Judged by your fans

Pope Francis has criticized corporate greed and capitalism’s systematic failure to ensure that people are not exploited.  Despite the communists’ longstanding critical attitude toward the Catholic Church, Mark Gruenberg at The People’s World has applauded the new pope’s statements.  (More on the pope’s views regarding the church’s “worldliness” here.)

When communists agree with the Pope, it’s time for conservatives to get antsy.  Especially conservative Catholics.  Cue Paul Kengor at AmSpec.  Kengor is careful to note first that:

The article quoted the pontiff several times. To be sure, few of us would disagree with any of the quotes.

So it’s not what the Pope says that’s the problem.  It’s that communists agree with what the Pope says.  That’s the problem.

Communists, of all people, finally believe they have a pope who agrees with them, that they like, that they can embrace, that they can encourage. I knew that Francis’ controversial interview on abortion, contraception, and gay marriage had thrilled liberals, liberal Catholics, dissident Catholics, secular progressives, agnostics, atheists, and socialists. You can read their websites. They love this guy. But communists?

Oh, yeah, I hear you.  When I find out that I endorse views held by a group I hold in contempt, I never take that as evidence that I may not have an accurate representation of that group.  I always take it that their agreement with me (or with the things said by another person that I agree with) is either strategic or based on their misunderstandings.  Never ever should, say, a Catholic think that Luke’s social justice doctrines have any resonance with concerns about capitalism.  Kengor’s clear about it:

It seems to me that this is not the kind of praise that the pope should want.

Of course, the problem is that if Kengor thinks that few people would disagree with what Pope Francis said, then aren’t there many, many others whose agreement would be trouble, too?  For sure, politics makes strange bedfellows.  But why is your credibility in question when there are many who take you as credible?



Fig. 1: hypocrite

As we’ve argued here many times before, not all charges of hypocrisy are logically vicious.  Someone’s hypocrisy might be evidence that her view is too difficult to enact (like Newt Gingrich’s conception of traditional marriage) or, more importantly, that she’s logically incompetent.  Here is an example (from Talking Points Memo):

Rep. Renee Ellmers (R-NC) told a local television station that she would not be deferring her pay during the government shutdown, as some other members have done.

“I need my paycheck. That’s the bottom line,” Ellmers told WTVD in Raleigh, N.C. “I understand that there may be some other members who are deferring their paychecks, and I think that’s admirable. I’m not in that position.”

According to Ellmers’s official website, she was a registered nurse for 21 years before being elected to Congress. Her husband Brent, the website says, is a general surgeon.

Democratic Rep. G.K. Butterfield also told WTVD that he wouldn’t be deferring his pay. “I don’t think there should be a shutdown,” he said. “I didn’t create the shutdown.”

Other federal employees, some of whom continue to work, also need their paychecks.  That you cannot sustain the very thing you advocate is evidence that the thing you advocate is unsustainable.

Dumb and dumber

The other day I linked to an article about an unfortunate consequence uncovered by some research in cognitive psychology: some argumentation, however good (conclusive or decisive), reinforces the ignorance of the person who is wrong.  Arguing with someone who is wrong, but steadfast in her wrongness, just makes everything worse.

This Onion article makes a related, and equally chilling point:

SEATTLE—As debate continues in Washington over the funding of President Obama’s health care initiative, sources confirmed Thursday that 39-year-old Daniel Seaver, a man who understands a total of 8 percent of the Affordable Care Act, offered a vehement defense of the legislation to 41-year-old Alex Crawford, who understands 5 percent of it.

The conclusion:

“Hold on, Alex, let’s go back to the premiums for a second, because I feel like I need to drive this point home for you: they’ll get lower for most people,” said Seaver, straining the very limits of his 8 percent comprehension of the bill to the point of utter collapse. “Lower premiums, lower deductibles, and no denial of coverage to people with preexisting conditions.”

“Way lower premiums,” Seaver added.

At press time, both men’s understanding of Obamacare had dropped to 3 percent as a result of the debate.

I think this problem is undertheorized, but then again I don’t really know this literature.

When argument doesn’t work, try argument

Fig. 1: arguing badly by going for the jugular

Courtesy of a former student, here’s an interesting read from Pacific Standard about the effectiveness of counter arguments and contrary information on people’s attitudes towards their own beliefs.  TL;DR: counter information makes people more likely to persist in their false beliefs:

Research by Nyhan and Reifler on what they’ve termed the “backfire effect” also suggests that the more a piece of information lowers self-worth, the less likely it is to have the desired impact. Specifically, they have found that when people are presented with corrective information that runs counter to their ideology, those who most strongly identify with the ideology will intensify their incorrect beliefs.

When conservatives read that the CBO claimed the Bush tax cuts did not increase government revenue, for example, they became more likely to believe that the tax cuts had indeed increased revenue (PDF).

In another study by Nyhan, Reifler, and Peter Ubel, politically knowledgeable Sarah Palin supporters became more likely to believe that death panels were real when they were presented with information demonstrating that death panels were a myth. The researchers’ favored explanation is that the information is so threatening it causes people to create counterarguments, even to the point that they overcompensate and become more convinced of their original view. The overall story is the same as in the self-affirmation research: When information presents a greater threat, it’s less likely to have an impact.

This naturally raises the question: are we doomed?  Part of the problem, I think, is that people generally argue very badly.  This is part of the point of Scott and Rob’s book, Why We Argue.  See here for a post from the other day.  Take a look, for instance, at the following claim:

This plays out over and over in politics. The arguments that are most threatening to opponents are viewed as the strongest and cited most often. Liberals are baby-killers while conservatives won’t let women control their own body. Gun control is against the constitution, but a lack of gun control leads to innocent deaths. Each argument is game-set-match for those already partial to it, but too threatening to those who aren’t. We argue like boxers wildly throwing powerful haymakers that have no chance of landing. What if instead we threw carefully planned jabs that were weaker but stood a good chance of connecting?

I don’t have any issues with this advice.  Indeed, I don’t think it shows that argument of the basic logical variety we endorse here doesn’t work.  On the contrary, argument works really well; this is just how you do it.

To rephrase the author’s advice: you’ve been arguing badly all along.  Constantly going for the knockout argument is a bad strategy primarily because it’s bad argumentation.  Such moves are very likely to distort the views of the person you’re trying to convince and thereby alienate them.  What’s better is the slow accumulation of evidence and the careful demonstration of the truth or acceptability of your beliefs.