Straw man factory

Quote of the day:

In a heated and sometimes vitriolic debate Monday night, Rep. Anthony Weiner (D-NY) repeatedly called out former Lt. Gov. Betsy McCaughey for lying about health care reform. He said debating her was like "debating a pyromaniac in a straw man factory," prompting intense and immediate reaction from the audience.

I'm not sure what that means, but you can watch the video here.

Personal pronouns

George Will has written some pretty jerky things in the time we've been reading him–usually straw men or just plain lies.  This time he gets really personal with Obama.  Here's a taste:

Both Obamas gave heartfelt speeches about . . . themselves. Although the working of the committee's mind is murky, it could reasonably have rejected Chicago's bid for the 2016 Games on aesthetic grounds — unless narcissism has suddenly become an Olympic sport.

In the 41 sentences of her remarks, Michelle Obama used some form of the personal pronouns "I" or "me" 44 times. Her husband was, comparatively, a shrinking violet, using those pronouns only 26 times in 48 sentences. Still, 70 times in 89 sentences conveyed the message that somehow their fascinating selves were what made, or should have made, Chicago's case compelling.

I actually found myself downtown for the announcement: lots of emblematic scenes of Chicago 2016 signs on the ground or in the trash.  Then of course the strange cheering from people that the Olympics were not awarded to Chicago–such is their dislike of Obama.

Imagine for a moment "Crawford, Texas 2016."  I think you'd hear a little autobiography from our former President–even though he didn't grow up there and he doesn't live there anymore.  So Chicago, my adopted home town, is a great place, even if I still root for the Tigers, Lions, and Red Wings.

The President, who still lives here (sort of), and the First Lady, who was born and raised here, are naturally going to make a personal pitch.  I don't think, considering their relationship to this place, there was really any other choice.

Ice age

Here's a video which discusses, among other things, George Will's oft-repeated claim that scientists predicted a new ice age in the 1970s.  Hate to ruin it, but it turns out they didn't, and Will, according to the video, seems to have made up, that is to say fabricated, evidence that they did.

Brain death

A guest op-eder in the Washington Post asks: "Is Conservatism Brain-Dead?"  My immediate response is–so what if it is?–it must be kept alive by heroic measures.  To be honest, my immediate response was: "Does that hyphen go there?  I think not."  In any case, upon reading the article, I'm struck by the standard employed to determine brain death:

The best-selling conservative books these days tend to be red-meat titles such as Michelle Malkin's "Culture of Corruption," Glenn Beck's new "Arguing with Idiots" and all of Ann Coulter's well-calculated provocations that the left falls for like Pavlov's dogs. There is nothing intrinsically wrong with these books. Politics is not conducted by Socratic seminar, and Henry Adams's dictum that politics is the systematic organization of hatreds should remind us that partisan passions are an essential and necessary function of democratic life. The right has always produced, and always will produce, potboilers.

Conspicuously missing, however, are the intellectual works. The bestseller list used to be crowded with the likes of Friedman's "Free to Choose," George Gilder's "Wealth and Poverty," Paul Johnson's "Modern Times," Allan Bloom's "The Closing of the American Mind," Charles Murray's "Losing Ground" and "The Bell Curve," and Francis Fukuyama's "The End of History and the Last Man." There are still conservative intellectuals attempting to produce important work, but some publishers have been cutting back on serious conservative titles because they don't sell. (I have my own entry in the list: a two-volume political history titled "The Age of Reagan." But I never expected the books to sell well; at 750 pages each, you can hurt yourself picking them up.)

About the only recent successful title that harkens back to the older intellectual style is Jonah Goldberg's "Liberal Fascism," which argues that modern liberalism has much more in common with European fascism than conservatism has ever had. But because it deployed the incendiary f-word, the book was perceived as a mood-of-the-moment populist work, even though I predict that it will have a long shelf life as a serious work. Had Goldberg called the book "Aspects of Illiberal Policymaking: 1914 to the Present," it might have been received differently by its critics. And sold about 200 copies.

Jonah Goldberg?  Really?

Those Hollywood Liberals

Eugene Robinson, columnist for the Washington Post, complains today that maybe the conservative culture warrior types have a point about those Hollywood liberals.  Some of them, it seems, have come to the defense of Roman Polanski, the Polish director (of The Pianist, among other films) whose wife, Sharon Tate, was murdered by the Manson gang and who some years later pleaded guilty to sex with a drugged and drunken 13-year-old.  Prior to sentencing, Polanski fled the country, and has since been living in Europe (pretty well, by all accounts).  Unfortunately for him, this week he was picked up by the Swiss police.

Robinson, I think, ought to look closer to home for people with lax morals.  Here is his own colleague Richard Cohen, on the Polanski case:

It ought not to matter that Polanski is a Holocaust survivor. (His mother died at Auschwitz.) After all, countless others survived the Holocaust without committing crimes of any sort, especially ones involving moral depravity.

It ought not to matter, either, that in 1969 Polanski’s wife, the actress Sharon Tate, was horrifically murdered by the Manson family when she was eight months pregnant. This, too, does not excuse moral depravity, although it gives one pause. It ought to give one pause. (Polanski underwent a 42-day psychiatric examination following his 1977 arrest.)

And it ought not to matter that Polanski is a gifted artist. In fact, it ought to be held against him. He seduced — if that can possibly be the word — the 13-year-old Samantha Geimer with all the power and authority of a 44-year-old movie director who could make her famous. If this did not impress the girl, it must have impressed her mother. She permitted what was supposed to be a photo shoot.

There are two extenuating circumstances in Polanski’s case. The first is time. It has, after all, been over 30 years and Polanski, now 76, has been clean all that time — no crimes alleged, no crimes convicted. More importantly, his victim pleads his case. Geimer says, more or less, enough is enough. She does not excuse what Polanski did and does not forgive what he has done, but it is time for us all to move on. “He made a terrible mistake, but he’s paid for it,” she said some years back.

Time does not minimize the crime, which in its details is creepy, but jail would no longer serve a purpose. The victim and the victimizer are united — they both want clemency. The girl is now a woman, and the man is old, spending his dotage making fools of his champions, who cannot distinguish between sexual freedom and sexual assault. Let Polanski go — but first let me at him.

He forgot to mention the "booze" and the "drugs."  And here's another Post columnist Anne Applebaum:

Here are some of the facts: Polanski's crime — statutory rape of a 13-year-old girl — was committed in 1977. The girl, now 45, has said more than once that she forgives him, that she can live with the memory, that she does not want him to be put back in court or in jail, and that a new trial will hurt her husband and children. There is evidence of judicial misconduct in the original trial. There is evidence that Polanski did not know her real age. Polanski, who panicked and fled the U.S. during that trial, has been pursued by this case for 30 years, during which time he has never returned to America, has never returned to the United Kingdom, has avoided many other countries, and has never been convicted of anything else. He did commit a crime, but he has paid for the crime in many, many ways: In notoriety, in lawyers' fees, in professional stigma. He could not return to Los Angeles to receive his recent Oscar. He cannot visit Hollywood to direct or cast a film.

"Professional stigma" is but a few short words away from "Oscar."  My question at this point is whether there is some kind of prohibition keeping one Post writer from criticizing another.  One would expect, after all, that friends and colleagues (Hollywood liberals!) would rally around Polanski; they run in the same circles, have worked with him and known him.  Their defense of him ought to be seen through that lens.  Justifying such behavior, however, as a newspaper columnist seems rather more worthy of condemnation.

Argumentum ad Novi Eboraci Tempora

That would be "ad New York Times" I suppose.  I take as a matter of religious faith that global warming is a scientific issue, and that arguments concerning its reality or unreality should start and end there.  So when one frames the argument about global warming either in response to a Newsweek headline many years ago, or a New York Times article quoted out of context, I think that person is either not particularly well informed about how scientists work (they don't publish their work in the newspaper) or is just plain dishonest.  So George Will today frames his argument against the existence of a well-supported phenomenon by attacking the New York Times, as well as various context-free quotes, meant–the quotes–to set up a pretty silly ad hominem.

He writes:

Plateau in Temperatures

Adds Difficulty to Task

Of Reaching a Solution

— New York Times, Sept. 23


In this headline on a New York Times story about the difficulties confronting people alarmed about global warming, note the word "plateau." It dismisses the unpleasant — to some people — fact that global warming is maddeningly (to the same people) slow to vindicate their apocalyptic warnings about it.

The "difficulty" — the "intricate challenge," the Times says — is "building momentum" for carbon reduction "when global temperatures have been relatively stable for a decade and may even drop in the next few years." That was in the Times's first paragraph.

Whenever this guy quotes stuff, you'd better go read the original.  Here's what it says:

The plateau in temperatures has been seized upon by skeptics as evidence that the threat of global warming is overblown. And some climate experts worry that it could hamper treaty negotiations and slow the progress of legislation to curb carbon dioxide emissions in the United States.

Scientists say the pattern of the last decade — after a precipitous rise in average global temperatures in the 1990s — is a result of cyclical variations in ocean conditions and has no bearing on the long-term warming effects of greenhouse gases building up in the atmosphere.

The part about the scientists is where the argument ought to be.  Will instead insists that the real discussion is the political question of how to keep non-scientists from wrongly concluding, as Will has in this very piece, that the leveling off of temperatures means it's all a crock.  That's the point of the argument.  Will cites this piece extensively, and he seems to have no notion of what it's about.  Here's what he says:

The Times reported that "scientists" — all of them? — say the 11 years of temperature stability has "no bearing," none, on long-term warming. Some scientists say "cool stretches are inevitable." Others say there may be growth of Arctic sea ice, but the growth will be "temporary." According to the Times, however, "scientists" say that "trying to communicate such scientific nuances to the public — and to policymakers — can be frustrating." 

The quoted bits give the impression of some kind of fudging on the Times' part (like the black and white and weird voice in political commercials).  In any case, as I understand it, the basic point is this: The globe has heated up seriously for quite a while.  Recently it has leveled off, but it still remains much hotter, so to speak, than before.  This is not unlike a guy with a really bad fever experiencing a bit of a dip, say to 102.  He's still got a fever.

Anyway, now for the ad hominem part:

The Times says "a short-term trend gives ammunition to skeptics of climate change." Actually, what makes skeptics skeptical is the accumulating evidence that theories predicting catastrophe from man-made climate change are impervious to evidence. The theories are unfalsifiable, at least in the "short run." And the "short run" is defined as however many decades must pass until the evidence begins to fit the hypotheses.

The Post recently reported the theory of a University of Virginia professor emeritus who thinks that, many millennia ago, primitive agriculture — burning forests, creating methane-emitting rice paddies, etc. — produced enough greenhouse gases to warm the planet at least a degree. The theory is interesting. Even more interesting is the reaction to it by people such as the Columbia University professor who says it makes him "really upset" because it might encourage opponents of legislation combating global warming.

This professor emeritus fellow is the only scientist Will cites in favor of his skeptical stance.  Nonetheless, the worry among scientists, justifiable as this piece indicates, is that people with no expertise will misunderstand the significance of the data.


Your opining career

This is how one gets that job:

Here’s your chance to put your opinions to the test — and win the opportunity to write a weekly column and a launching pad for your opinionating career!

Start making your case.
Use the entry form to send us a short opinion essay (400 words or less) pegged to a topic in the news and an additional paragraph (100 words or less) on yourself and why you should win. Entries will be judged on the basis of style, intelligence and freshness of argument, but not on whether Post editors agree or disagree with your point of view. Entry deadline: Oct. 21, 2009 at 11:59 p.m. ET.

Then get ready for the great debate.
Beginning on or about Oct. 30, ten prospective pundits will get to compete for the title of America’s Next Great Pundit, facing off in challenges that test the skills a modern pundit must possess. They’ll have to write on deadline, hold their own on video and field questions from Post readers. (Contestants won’t have to quit their day jobs, but they should be prepared to put in about eight hours a week for three weeks.) After each round, a panel of Post personalities will offer kudos and catcalls, and reader votes will help to determine who gets another chance at a byline and who has to shut down their laptop.

Eyes on the prize.
The ultimate winner will get the opportunity to write a weekly column that may appear in the print and/or online editions of The Washington Post, paid at a rate of $200 per column, for a total of 13 weeks and $2,600. Our Opinions lineup includes a dozen Pulitzer Prize winners, regulars on the national political talk shows and some of the most influential players inside the Beltway. We’ll set our promising pundit on a path to become the next byline in demand, the talking head every show wants to book, the voice that helps the country figure out what’s really going on.

So what are you waiting for?

"Your opinionating career"?  Another great day for the Post.

RIP William Safire

William Safire, former conservative columnist for the New York Times, has died.  From the obituary:

[F]rom 1973 to 2005, Mr. Safire wrote his twice-weekly “Essay” for the Op-Ed page of The Times, a forceful conservative voice in the liberal chorus. Unlike most Washington columnists who offer judgments with Olympian detachment, Mr. Safire was a pugnacious contrarian who did much of his own reporting, called people liars in print and laced his opinions with outrageous wordplay.

I'll always remember him for his column on language.  May he rest in peace.

A vast social justice empire

This Kathleen Parker op-ed is a masterwork in insinuation.  The topic is ACORN, of course.  She has found a way to make ACORN the reason to be afraid of health care reform by linking them to a union, of all things.  

She writes:

You also don't talk about either organization without mention of Wade Rathke, co-founder of ACORN and founder of SEIU Local 100 in New Orleans. Rathke, who resigned from ACORN last year as "chief organizer" after it became known that his brother embezzled almost $1 million from the association, continues to run Local 100, as well as ACORN International, recently renamed Community Organizations International.

Rathke's social justice empire is so vast that he is more hydra than man. Nine heads are surely better than one when you're organizing communities in at least 12 countries. While Rathke and ACORN undoubtedly have done much good for impoverished people here and abroad, it appears likely that American taxpayers indirectly have been helping to underwrite unionizing activities and advance political goals through the commingling of Rathke's various interests.

A "social justice empire"?  Ponder that phrase for a moment.

The eternal present of the New York Times

Punditry is an accountability-free occupation.

In today's New York Times, the grizzled warrior David Brooks performs a chest-beating war dance over Afghanistan of the type he and his tough guy comrades perfected in the run-up to the Iraq War.  It's filled with self-glorifying "war-is-hell" neocon platitudes that make the speaker feel tough and strong.  No more hiding like cowards in our bases.  It's time to send "small groups of American men and women [] outside the wire in dangerous places."  Those opposing escalation are succumbing to the "illusion of the easy path."  Chomping on a cigar in his war room, he roars:  "all out or all in."  The central question: will we "surrender the place to the Taliban?," etc. etc. 

Needless to say, Brooks was writing all the same things in late 2002 and early 2003 about Iraq — though, back then, he did so from the pages of Rupert Murdoch and Bill Kristol's The Weekly Standard.  When I went back to read some of that this morning, I was — as always — struck by how extreme and noxious it all was:  the snide, hubristic superiority combined with absolute wrongness about everything.  What people like David Brooks were saying back then was so severe — so severely wrong, pompous, blind, warmongering and, as it turns out, destructive — that no matter how many times one reviews the record of the leading opinion-makers of that era, one will never be inured to how poisonous they are.

All of this would be a fascinating study for historians if the people responsible were figures of the past.  But they're not.  They're the opposite.  The same people shaping our debates now are the same ones who did all of that, and they haven't changed at all.  They're doing the same things now that they did then.  When you go read what they said back then, that's what makes it so remarkable and noteworthy.  David Brooks got promoted within our establishment commentariat to The New York Times after (one might say:  because of) the ignorant bile and amoral idiocy he continuously spewed while at The Weekly Standard.  According to National Journal's recently convened "panel of Congressional and Political Insiders," Brooks is now the commentator "who most help[s] to shape their own opinion or worldview" — second only to Tom "Suck On This" Friedman.  Charles Krauthammer came in third.  Ponder that for a minute.

Read the rest.  The truly odd thing about all of this, as a friend of ours suggested, is that these people operate as if no one has access to their past writings on these matters.  Odder than that is the fact that people do, and yet there they are.

Your argument is invalid