Saturday, December 31, 2011

About the Upcoming New Thomas Frank Book (2)

I quickly read and finished the book today. Let it be noted that this is the last book I read in 2011.

Thursday, December 29, 2011

Robert Reich on the Breakup of the Republican Party

Why the Republican Crackup is Bad For America
Tuesday, December 20, 2011
Two weeks before the Iowa caucuses, the Republican crackup threatens the future of the Grand Old Party more profoundly than at any time since the GOP’s eclipse in 1932. That’s bad for America.

The crackup isn’t just Romney the smooth versus Gingrich the bomb-thrower.

Not just House Republicans who just scotched the deal to continue payroll tax relief and extended unemployment insurance benefits beyond the end of the year, versus Senate Republicans who voted overwhelmingly for it.

Not just Speaker John Boehner, who keeps making agreements he can’t keep, versus Majority Leader Eric Cantor, who keeps making trouble he can’t control.

And not just venerable Republican senators like Indiana’s Richard Lugar, a giant of foreign policy for more than three decades, versus primary challenger state treasurer Richard Mourdock, who apparently misplaced and then rediscovered $320 million in state tax revenues.

Some describe the underlying conflict as Tea Partiers versus the Republican establishment. But this just begs the question of who the Tea Partiers really are and where they came from.

The underlying conflict lies deep in the nature and structure of the Republican Party. And its roots are very old.

As Michael Lind has noted, today’s Tea Party is less an ideological movement than the latest incarnation of an angry white minority – predominantly Southern, and mainly rural – that has repeatedly attacked American democracy in order to get its way.

It’s no mere coincidence that the states responsible for putting the most Tea Party representatives in the House are all former members of the Confederacy. Of the Tea Party caucus, twelve hail from Texas, seven from Florida, five from Louisiana, five from Georgia, and three each from South Carolina, Tennessee, and border-state Missouri.

Others are from border states with significant Southern populations and Southern ties. The four Californians in the caucus are from the inland part of the state or Orange County, whose political culture was shaped by Oklahomans and Southerners who migrated there during the Great Depression.

This isn’t to say all Tea Partiers are white, Southern or rural Republicans – only that these characteristics define the epicenter of Tea Party Land.

And the views separating these Republicans from Republicans elsewhere mirror the split between self-described Tea Partiers and other Republicans.

In a poll of Republicans conducted for CNN last September, nearly six in ten who identified themselves with the Tea Party say global warming isn’t a proven fact; most other Republicans say it is.

Six in ten Tea Partiers say evolution is wrong; other Republicans are split on the issue. Tea Party Republicans are twice as likely as other Republicans to say abortion should be illegal in all circumstances, and half as likely to support gay marriage.

Tea Partiers are more vehement advocates of states’ rights than other Republicans. Six in ten Tea Partiers want to abolish the Department of Education; only one in five other Republicans do. And Tea Party Republicans worry more about the federal deficit than jobs, while other Republicans say reducing unemployment is more important than reducing the deficit.

In other words, the radical right wing of today’s GOP isn’t that much different from the social conservatives who began asserting themselves in the Party during the 1990s, and, before them, the “Willie Horton” conservatives of the 1980s, and, before them, Richard Nixon’s “silent majority.”

Through most of these years, though, the GOP managed to contain these white, mainly rural and mostly Southern, radicals. After all, many of them were still Democrats. The conservative mantle of the GOP remained in the West and Midwest – with the libertarian legacies of Ohio Senator Robert A. Taft and Barry Goldwater, neither of whom was a barn-burner – while the epicenter of the Party remained in New York and the East.

But after the Civil Rights Act of 1964, as the South began its long shift toward the Republican Party and New York and the East became ever more solidly Democratic, it was only a matter of time. The GOP’s dominant coalition of big business, Wall Street, and Midwest and Western libertarians was losing its grip.

The watershed event was Newt Gingrich’s takeover of the House, in 1995. Suddenly, it seemed, the GOP had a personality transplant. The gentlemanly conservatism of House Minority Leader Bob Michel was replaced by the bomb-throwing antics of Gingrich, Dick Armey, and Tom DeLay.

Almost overnight Washington was transformed from a place where legislators tried to find common ground to a war zone. Compromise was replaced by brinkmanship, bargaining by obstructionism, normal legislative maneuvering by threats to close down government – which occurred at the end of 1995.

Before then, when I’d testified on the Hill as Secretary of Labor, I had come in for tough questioning from Republican senators and representatives – which was their job. After January 1995, I was verbally assaulted. “Mr. Secretary, are you a socialist?” I recall one of them asking.

But the first concrete sign that white, Southern radicals might take over the Republican Party came in the vote to impeach Bill Clinton, when two-thirds of senators from the South voted for impeachment. (A majority of the Senate, you may recall, voted to acquit.)

America has had a long history of white Southern radicals who will stop at nothing to get their way – seceding from the Union in 1861, refusing to obey Civil Rights legislation in the 1960s, shutting the government in 1995, and risking the full faith and credit of the United States in 2011.

Newt Gingrich’s recent assertion that public officials aren’t bound to follow the decisions of federal courts derives from the same tradition.

This stop-at-nothing radicalism is dangerous for the GOP because most Americans recoil from it. Gingrich himself became an object of ridicule in the late 1990s, and many Republicans today worry that if he heads the ticket the Party will suffer large losses.

It’s also dangerous for America. We need two political parties solidly grounded in the realities of governing. Our democracy can’t work any other way.

Wednesday, December 28, 2011

About the Upcoming New Thomas Frank Book

The Tea Party’s “utopian market populism”
Tom Frank on the dream that fueled the right wing's improbable comeback
By Jefferson Morley

In his new book, “Pity the Billionaire,” Tom Frank turns his mordant eye on the unlikeliest political development of the Obama presidency: how the crash of 2008 served to strengthen the political right. The deregulation of Wall Street, championed for 30 years by right-wing leaders, had led to an economic catastrophe so frightening that the country elected a liberal Democrat to the presidency. Yet two years later, the most conservative faction of the Republican Party, the Tea Party, had taken effective control of the House of Representatives, the regulation of Wall Street had stalled, and the champions of economic deregulation in Washington had emerged stronger than ever.

Frank, author of the bestselling book “What’s the Matter With Kansas?” provides a pithy and nuanced explanation of what he calls the “hard-times swindle.” He spoke with Salon from his father’s home in Kansas City, Mo.

Early in the book, you describe the moment in the spring of 2009 when free-market economics had been so thoroughly discredited that Newsweek could run a cover story proclaiming, “We’re all socialists now.” What happened? Why did that moment dissipate?

I saw that cover so many times [at Tea Party events]. For these people, that rang the alarm bell. I think the AIG moment [when the bailed-out insurance behemoth used taxpayer relief to dole out huge bonuses to its executives] was in some ways the high point of the crisis, when [the politics] could have gone either way. There was this amazing public outrage, and that for me was the turning point. Newsweek had another cover, “Thinking Man’s Guide to Populism,” and I remember this feeling around the country, that people were just furious. Somehow the right captured the sense of anger. They completely captured it. You could say they had no right to it, but they did. And one of the reasons they were able to do it was because the liberals were not interested in that anger.

I’m speaking here of the liberal culture in Washington, D.C. There was no Occupy Wall Street movement [at that time] and there was only people like me on the fringes talking about it. The liberals had their leader in Barack Obama … they had their various people in Congress. But these people are completely unfamiliar with populist anger. It’s an alien thing to them. They don’t trust it, and they have trouble speaking to it. I like Barack Obama, but at the end of the day he’s a very professorial kind of guy. The liberals totally missed the opportunity, and the right was able to grab it.

Looking back on it, I feel like people like myself were part of the problem. We sort of assumed with the Democrats in power, the system would correct itself.

One of the problems with liberalism in this country is that it’s headquartered in Washington and its leaders are a very comfortable class of people. Washington is one of the richest cities in the country, maybe the richest. It’s not a place that feels the crisis, that feels the economic downturn. By and large, the real estate market stayed OK. The city continued to boom. The contracts continued to flow. What we’re talking about here is the failure of modern liberalism. At one time it was a movement of working-class people. The idea that liberals wouldn’t feel economic pain was ridiculous. That’s who liberals were. No more.

You write that after Obama took office, “market populism was the only utopian scheme available to disgruntled Americans.” There was no liberal utopian scheme that said, “Here’s how we get out of this.”

There wasn’t even a Rooseveltian scheme, which was not utopian but very practical. Just to talk about Roosevelt would have been fantastic. One of the research points in the book that I thought was really interesting … was the history of the bailouts in 1932 and 1933 — when the Hoover administration did a lot of bailouts. We don’t remember that. [These bailouts] were massively unpopular for the same reason they were unpopular this time around: really blatant cronyism. We don’t remember that a big part of Franklin Roosevelt’s campaign [in 1932] was to be against these bailouts. There were maybe five newspaper articles in 2008 that mentioned this pre-history of the bailouts. It just never came up.

It was like the party’s muscle memory of the New Deal was lost. With Obama the muscle memory of the Democratic Party is the Clintonian technocracy of the 1990s.

That’s exactly right. Their message was: The technocratic way is going to solve our problems. Just leave it up to the experts who are going to figure a way out. [Obama and the Democrats] seemed to think they didn’t need to dirty their hands by making a populist appeal. They did a lot of good things — the stimulus package of 2008 was a good thing — but they didn’t realize you have to sell something like that. They were like, “We know what the answer is: Keynesian stimulus. So let’s just do it.” They didn’t understand that this nation only adopted Keynesian stimulus spending back in the 1930s amidst this terrible wrenching experience, the Depression, and an enormous campaign [by FDR] to tell the nation why this was necessary.

If you don’t sell it — if you just do this spending — well, people have a lot of suspicion of government handouts. Government debt bothers people for very obvious reasons. [Obama] didn’t make any effort to make the argument. It was just “listen to the experts.” I have a quote from [Obama economic advisor] Christy Romer where she says, “Things would be better if we listened to the experts.” And she’s one of the good guys, one of the best people in the Obama administration. That’s their view.

You have an interesting discussion about how the Tea Party movement mimics what was once the left-wing style. This seems to be the dominant mode: The right is saying, “We’re the revolutionaries. We’re taking on the powers that be.”

At first I thought it was a peculiarity of Glenn Beck, and then I noticed it across the board. They picked up this 1930s style and language, complete with utopianism, with this intense faith in an economic system that will solve all your problems and that represents you perfectly, this miraculous economic system … So [the right] is constantly talking about this infernal elite that controls government, controls corporations, and controls the academy, and that we have to wrench ourselves free.

So maybe it is true that the Obama technocracy is the infernal elite. Maybe not in the hellish way it is portrayed on the right, but in the sense that these are the defenders of bailouts, the defenders of the system.

I wouldn’t go too far with that, because I don’t think that is a way of understanding our modern world that can bear a lot of the weight. It is true that the Democrats completely imagine themselves as being the party of the professional class, and that is an elite. It’s not the elite, but it is an elite. The Democrats very definitely identify with academia. That’s the home of the professions, where they come from.

Still, I think that the conservative idea of revolting against the ruling class by holding up the market as an ideal is completely backwards. There is a ruling class in this country. But the notion that the free market is an act of rebellion against it seems pretty fanciful. I can say it stronger than that. It is absolutely preposterous.

At one point you talk about “a cognitive withdrawal from the shared world.” It seems like the modern digital communication revolution encourages this. “A cognitive withdrawal from the shared world” — that sounds like a description of the Internet.

This is where we’re going. You can now believe things that are demonstrably false and never be challenged, directly or indirectly. You can withdraw. That’s the end that the Internet is constantly pushing us toward. That’s what modern marketing is all about.

So it’s a technological phenomenon, but it’s also an ideological phenomenon, a product of the times we’re in. You saw this in the ’30s, especially on the left. People would be so committed to this economic utopia [communism]. They believed in it, and their faith in it was so great … It is a product of economic collapse. People are desperate. They think their entire way of life is crumbling around them, and they reach for … a utopian system where everything is explained.

This is the genius of Fox News. It is fun to watch, and if you agree with them, it’s very gratifying to watch — and on a level deeper than most TV entertainment. The message is “You’ve worked really hard. You played by the rules and now they’re disrespecting you. They won’t let you say the word ‘Christmas.’”

You finished this book around the time Occupy Wall Street started. Were you surprised by the emergence of the movement?

I was surprised. I thought the left’s moment had passed. That was almost exactly three years after the crash of September 2008, and it seemed like the expiration date had come and gone. I’m very pleased, but in a lot of ways the horse has already left the barn.

Is it possible for the Occupy movement to reverse the gains of the right?

I hope so, but I honestly don’t know. At the end of the day I doubt it. My liberal friends have been doubting the right for decades. They’re always saying, “There’s no way these guys can recover now after this screw-up. People will never come back to them after this.” But they keep coming back.

The right does seem to be a little bit on the defensive at the moment. The dominant narrative of last summer — government spending is the problem — has been lost. Occupy Wall Street has injected a change in discourse. People aren’t defensive when they talk about inequality.

Tell me about it. When I started writing about inequality 10 years ago, it … was not something for NPR Book Talk. It was not quite within the bounds of the acceptable. Now it is. And that’s a huge change.

The main thing that has to change is that Democrats and liberals have to be able to speak to the outrage, and that requires a complete change in the way they look at the world. The problem is that they’ve been going the other direction for 30 years. Ever since the right-wing backlash began, liberals have been making their own move to professionalism. [To voice outrage] would require them to reverse course. I would like to see that happen, but I don’t know how it’s going to happen.

I think one thing has happened: Middle-class or upper-middle-class liberals in Washington, all of a sudden we realize we are insecure. The system is not just screwed up for people out there who we sympathize with. It’s screwed up for us. Economic insecurity is now pervasive even in the professional class.

Professionals are feeling the heat. They’re not insulated from market forces, like the way the professions are supposed to be. The system is not working like that anymore. Maybe that is where the change will come from.

Another thing that I think changed things was the debt-ceiling debacle last summer. That scared everybody, and it was so patently the doing of the Tea Party Republicans in the House. That was a huge turning point.

The hostage taking?

They were holding a gun to the head of the nation’s economy. They did ruin the nation’s bond rating. And you could have had an unthinkable catastrophe if they had done what they were threatening to do. Things like that should be off the table in our politics. That was an outrage in its own way as great as the bailouts. It was a shocking moment.

Yet at the end of the book, you contemplate the right wing in power, and you suggest they might do exactly that — take actions they know would ruin the economy.

They might. They see the financial crisis as something we deserve, that we’ve spent beyond our means, and now we have to pay. In their minds, we need a recession to get back on track. They think we’re due for something like the 1930s, so why not make it happen?

You conclude by saying that the problems that editorialists fret about — inequality, global warming and financial bubbles — will endure, but so will this utopian market populism, which you describe as chasing “the dream more vivid than life itself.” You have shown how entrenched those impulses are in American politics. Maybe part of the American pursuit of happiness is to “chase the dream more vivid than life itself.”

The right has discovered the magic against relativism. They have long inveighed against relativism, but now they’ve discovered they can say anything. They can endlessly withdraw into this world of utopian fiction and everything can be explained away. It’s like some kid discovering a new video game. It’s so awesome.

So what gives you hope?

I was out in Wisconsin earlier this year, when you had thousands of people surrounding the capitol every day. There were big days when they had a hundred thousand people, and then there were the off days where you had “only” a couple of thousand — and this went for day after day. That was really a hopeful moment for me. It was the predecessor to the Occupy movement. Let’s see if they can make a comeback when it gets warm again. I would love to see it happen.

Wayne Flynt - Keeping the Faith

A few weeks ago I finished this memoir by Wayne Flynt, the retired Auburn history professor (he retired in 2005) and liberal voice of Alabama politics. I enjoyed it as much as any book I've read recently.

Wayne was born in Mississippi but spent most of his childhood and all of his adulthood in Alabama. Growing up, he lived mostly in Pinson and Anniston. He did his undergraduate work at Samford and his Ph.D. at Florida State. Southern history is his specialty. He spent about 20 years at Auburn (moving to Auburn from the Samford faculty) and in the book writes about the ups and downs of the various administration battles during his time on the Plains. It seems that he was always in the middle of all the Auburn politics.

I wanted to quote his reference to my favorite Auburn professor, Dr. Joe Harrison, who was on the Auburn history faculty for many years and under whom I had three courses. Dr. Harrison was legendary for his incredible memory. He never lectured from notes. It was all in his head. I chuckled when I read this:

"Like the graduate students, I found myself seduced by the coffee club. These professors were some of the state's brightest people and best storytellers. Some (notably Joe Harrison, Robert R. Rea, and Bill Maehl) were brilliant. Joe possessed total recall,lectured without notes, and seemingly never forgot a date, statistic, or story. No one at Auburn matched him as a raconteur. Bob Rea---distant, acerbic, sometimes terrifying to graduate students---was a fine writer and a generous writer. Bill Maehl, our German historian, looked and acted the part of a Prussian field officer." P. 126

Tuesday, December 27, 2011

The Influence of Ron Paul

Peter Beinart: How Ron Paul Will Change the GOP in 2012

Dec 27, 2011 4:45 AM EST The libertarian upstart isn’t just stirring controversy; he’s threatening to expose profound divisions within the GOP. Peter Beinart on how Paul will change the Republican Party in 2012.

We haven’t even said goodbye to 2011, but I want to be first in line with my person of the year prediction for 2012: Ron Paul. I don’t think Paul is going to win the presidency, or even win the Republican nomination. But he’s going to come close enough to change the GOP forever.

Washington Republicans and political pundits keep depicting Paul as some kind of ideological mutation, the conservative equivalent of a black swan. They’re wrong. Ask any historically-minded conservative who the most conservative president of the 20th Century was, and they’ll likely say Calvin Coolidge. No president tried as hard to make the federal government irrelevant. It’s said that Coolidge was so terrified of actually doing something as president that he tried his best not even to speak. But in 1925, Silent Cal did open his mouth long enough to spell out his foreign policy vision, and what he said could be emblazoned on a Ron Paul for President poster: “The people have had all the war, all the taxation, and all the military service they want.”

Small government conservatism, the kind to which today’s Republicans swear fealty, was born in the 1920s not only in reaction to the progressive movement’s efforts to use government to regulate business, but in reaction to World War I, which conservatives rightly saw as a crucial element of the government expansion they feared. To be a small government conservative in the 1920s and 1930s was, for the most part, to vehemently oppose military spending while insisting that the US never, ever get mired in another European war.

Even after World War II, Mr. Republican—Robert Taft—opposed the creation of NATO and called the Korean War unconstitutional. Dwight Eisenhower worked feverishly to scale back the Truman-era defense spending that he feared would bankrupt America and rob it of its civil liberties. Even conservative luminaries like William F. Buckley and Barry Goldwater who embraced the global anti-communist struggle made it clear that they were doing so with a heavy heart. Global military commitments, they explained, represented a tragic departure from small government conservatism, a departure justified only by the uniquely satanic nature of the Soviet threat.

The cold war lasted half a century, but isolationism never left the conservative DNA. And when the Soviet Union collapsed, some of America’s most prominent conservative intellectuals—people like Irving Kristol, Jeane Kirkpatrick and Pat Buchanan—argued that the GOP should become the party of Coolidge and Taft once again. The Republican Congress of the 1990s bitterly opposed Bill Clinton’s wars in the Balkans, and Buchanan, running on an isolationist platform, briefly led the GOP presidential field in 1996. Even the pre-9/11 Bush administration was so hostile to increased military spending that the Weekly Standard called on Defense Secretary Donald Rumsfeld to resign.

Given this history, it’s entirely predictable that in the wake of two disillusioning wars, a diminishing al Qaeda threat and mounting debt, someone like Ron Paul would come along. In Washington, Republican elites are enmeshed in a defense-industrial complex with a commercial interest in America’s global military footprint. But listen to Bill O’Reilly or Rush Limbaugh and see how often you hear them demanding that America keep fighting in Afghanistan, or even attack Iran. According to a November CBS News poll, as many Republicans said the U.S. should decrease its troop presence in Afghanistan as said America should increase it or keep it the same. In the same survey, only 22 percent of Republicans called Iran’s nuclear program “a threat that requires military action now” compared to more than fifty percent who said it “can be contained with diplomacy.” Almost three-quarters of Republicans said the U.S. should not try to change dictatorships to democracies.

There are certainly Republicans out there who support the Bush-Cheney neo-imperialist foreign policy vision. But they’re split among the top tier presidential candidates. Paul has the isolationists all to himself. Moreover, his two top opponents—Mitt Romney and Newt Gingrich—not only back a big-government foreign policy agenda, but have periodically backed a big-government domestic agenda as well. In other words, they personify the argument at the heart of Paul’s campaign: that if you love a powerful Pentagon, you’ll end up loving other parts of the government bureaucracy as well.

Since the Iowa caucuses generally reward organization and passion, I suspect Paul will win them easily. That would likely propel him to a strong showing in libertarian New Hampshire. Somehow, I think Romney and the Republican establishment will find a way to defeat him in the vicious and expensive struggle that follows. But the dominant storyline at the Republican convention will be figuring out how to appease Paul sufficiently to ensure that he doesn’t launch a third party bid. And in so doing, the GOP will legitimize its isolationist wing in a way it hasn’t since 9/11.

In truth, the modern Republican Party has always been a house divided, pulled between its desire to crusade against evil abroad and its fear that that crusade will empower the evil of big government at home. In 2012, I suspect, Ron Paul will expose that division in a way it has not been exposed in a long time. And Republicans will not soon paper it over again.

Michael Connelly - The Drop

Michael Connelly seems to be one of our most successful crime writers. I usually read a crime novel or two per year. It turns out that I've read several Connelly books this year. His two continuing protagonists are Mickey Haller, the Lincoln lawyer, and Harry Bosch, the detective. This is the latest Bosch novel.

It's good. It's entertaining, mostly for the character of Bosch, a widower with a 15-year-old daughter, who is in the last five years of his law enforcement career. He continues his job with passion, although he occasionally looks forward to quitting.

The plot is fun but the end is dull. The novel ends with a thud rather than a flash, but that's OK. I don't expect to go back and read the earlier Bosch novels, but I am likely to read new ones going forward.

Science vs. Anti-Science

The Journal of the American Enterprise Institute
Science and the Chattering Classes
By Daniel Akst
Friday, December 16, 2011

Filed under: Science & Technology

To speak out against the anti-scientific orthodoxy that prevails among large segments of the educated class is to make yourself the skunk at the garden party.

Imagine yourself at one of those fashionable dinner parties you go to now and then—you know, the kind where everybody has retro-chic eyeglasses and au courant haircuts, and the food isn’t just vegetarian but organic.

You make the mistake of mentioning your headache and the woman on your left offers you some capsules from the health food store. Here is your side of the ensuing conversation:

“Oh, thanks, but you know I only take medications that have been subjected to rigorous double-blind testing... Really? Well, maybe, but I still kind of prefer science… Yup, I know. But hey, maybe all those chemicals are somehow good for us—maybe that’s why life expectancy goes up every year! Ha ha. Oh, gee, sorry. My wife thought it was funny, and she actually had cancer… What? Sure, some things are sacred, but… Gosh, I’m not sure I ever feel ‘spiritual.’ How will I know it when I do? Is it like sneezing?”

Pity the poor rationalist in polite company. Inevitably, diet has come up, and if the party is in Southern California, chances are somebody was “detoxifying.” But to speak out against the anti-scientific orthodoxy that prevails among large segments of the educated class is to paint a stripe down your back and make yourself the skunk at the garden party.

Food is at the center of elites’ anxieties about science and modernity, yet the truth is that it has become a scapegoat, or perhaps I should say scapetofu, for a host of imaginary sins we associate with technology. The timing of this obsession is no surprise; never before has such complex technology occupied such a central place in the economy, to say nothing of daily life. Yet by and large, when we chew on the fruits of science, they are sweet. Thanks to science—not so much medical as industrial—life expectancy increases every year, mostly as a function of affluence. So why is science—to say nothing of the very idea of progress—so unfashionable?

One obvious reason is that, among the chattering classes, hardly anybody knows anything about it. Today’s children of the native-born bourgeoisie study cinema or gender studies or even marketing, but not so much physics or chemistry, at least in my experience. It's indicative, perhaps, that in 2006 (the most recent year for which I could find data), foreign students earned nearly two-thirds of the U.S. doctorate degrees in engineering and computer sciences, while snaring about half of those in the physical sciences and math. Mercifully, many of these foreign students stay.

But for too many Americans, science is something alien and abstract. Max Weber observed nearly a century ago that, by explaining so many natural phenomena, science has led to the “disenchantment” of the modern world. What a difference 100 years makes! Nowadays we’re surrounded by products of technology (from gelcaps to smartphones) whose essential workings are unintelligible to all but a specialized few. The result is that technology and ignorance have succeeded where religion has failed: in draping the world in a cloak of mystery, but one we find more threatening than enchanting.

With its great stress on specialization, capitalism has eroded the kind of homely technological skills Americans typically possessed a generation ago. Most of us no longer work on our own cars, for instance, and given electronic fuel injection and other newfangled features, we probably couldn’t even if we wanted to. Heck, a lot of us can’t even cook our own food.

In our system, it pays for people to develop knowledge that is deep but narrow, with the consequence that more and more of what goes on around us is shrouded in a fog of intimidating complexity. Newspaper readership, that traditional barometer of the well-informed public, is on the decline, and newspapers that used to have staff expertise in science have cut it back drastically. Broadcast media are especially inept—or uninterested—in reporting on science and technology, except of course when it’s supposedly killing us.

Another reason for people’s discomfort with the products of science might be our sense of the fragility of technological life, which is underscored whenever we read about shadowy Chinese hackers or the rising threat of global warming. Sooner or later, many of us are convinced, we are destined to be hoist by our own petard, victims of the false god of technology. In our high-tech vehicles and air-conditioned homes bristling with microchips, we yearn for some mythical pre-technological innocence the way some Chekhov characters yearn for Moscow. At least until we visit the doctor, at which point we want all the technology in the world brought to bear.

Of course, suspicion of science and technology goes way back. Forbidden knowledge was at the root of our troubles back in the Garden of Eden, and England's Luddites later attacked the newfangled looms that were about to make clothes more affordable for everyone. Mary Shelley, with Frankenstein, and Robert Louis Stevenson, with Jekyll and Hyde, were just two of the most prominent writers who warned against the hazards of invention.

Science itself, or perhaps its acolytes, has given us ample reason for suspicion, too, although humanity is an awfully fickle lover. We loved science, for instance, for giving us the atom bomb and nuclear power when those things seemed essential and good; only later did we decide they were evil. Asbestos was at one time a wonder product.

In our country, progress sometimes seems a victim of its own success. If most kids are vaccinated, after all, why not exempt your own children from the infinitesimal risks associated with inoculation, in effect free-riding on the willingness of everyone else to undergo them? You're still relatively safe from disease. Vaccination fears seem to be most prevalent among the young, educated families who ought to be most receptive to the facts—and who in every other way have the most collective outlook on life.

The challenge for business, whose products will contain more and more technology as time goes on, is to increase the general level of comfort in science without making people feel they’re being taken for a ride. More and better science in the schools would be a great start. And of course, somebody needs to help the media distinguish between bogus risks (non-organic produce) and real risks (eating too few fruits and vegetables).

Food irradiation is a great example of a safe, effective technology that could save lives, if only people could get over their terror of it. It may not be true, in this arena, that the only thing we have to fear is fear itself. But it’s close.

Daniel Akst is a columnist and editorial writer for Newsday.

Monday, December 26, 2011

The Reagan Soviet Union Myth

by Leslie Gelb

Dec 23, 2011 11:00 PM EST This month is the 20th anniversary of the Cold War’s end, but few remember how it dominated our lives. What does stick in people's heads, writes Leslie H. Gelb, is wrong—that Reagan won the war with big military spending and toughness.

As the Soviet flag was lowered from atop the Kremlin 20 years ago this month for the last time, it marked the first time in modern history that major powers ended their struggles without war. After threatening American values and interests at home and abroad for nearly half a century, the Soviet Union and the Cold War simply vanished. The perilous contest died, but a dangerous myth lived and thrived—that President Ronald Reagan won the day with unmatchable hikes in military spending and by being tough and uncompromising. Today, that myth tugs daily in wrong directions regarding the most momentous U.S. policy decisions in hotspots like Iran, North Korea, Afghanistan, and China.

The myth that military power and true grit conquer all locks every major dispute into a test of wills. It blocks the full deployment of American powers. It does no justice to the sophisticated diplomacy employed by Reagan and his successor, George H.W. Bush. Above all, it keeps today’s policymakers from seeing clearly what actually won the Cold War and what matters most in 21st-century global affairs—the strength of the U.S. economy.

Without doubt, superior arms and determination were essential in thwarting the Soviet Union. But diplomacy was at least as critical, especially as the Soviet demise neared, when the Cold War could have concluded not with a whimper, but a bang. At that moment and during the critical period that followed, Ronald Reagan and George H.W. Bush used diplomacy not to make unrealistic demands upon Soviet leader Mikhail Gorbachev, but to help him do what he wanted to do—dismantle the Soviet empire to save the Soviet Union itself and reform the Soviet political and economic system. Gorby was on the ropes. The Berlin Wall had fallen, Soviet troops limped out of Afghanistan, and the Soviet economy was in tatters. Almost everything Gorby wanted to do would reduce the threat to America and its allies. The trick was to help him do it, and that’s what Reagan and Bush did. Pure toughness without sensitive diplomacy would have weakened Gorby in the Kremlin and could have led to more Cold War standoffs, or to war. Toughness tendered with diplomatic compromises led to winning without war.

But the driving force underpinning U.S. military power, grit, and diplomacy was the comparative power of the U.S. economy: the Soviet economy was crashing after many decades of communist corruption, gross military spending, and over-planning, while America’s was still sparking.

Aside from Senator Daniel Patrick Moynihan, some CIA analysts, and Reagan himself, few saw what was really happening in the early 1980s. Most startlingly, the Soviets’ chief of the general staff, Marshal Nikolai Ogarkov, saw the writing on the wall more clearly than anyone, and saw it long before anyone could claim that Reagan’s military buildup had brought the Soviets to their knees. Here’s what he told me in a mostly off-the-record conversation in March 1983 at a huge conference table in the Soviet Defense Ministry’s conference room ringed by triumphal Soviet divisional flags: “The Cold War is over, and you have won.”

This shocking talk began with my attacking the recent buildup of Soviet missiles in Europe. “Stop the baloney, Leslie,” said the tall, red-haired general famed for his hawkishness. “You know your country has military superiority over my country, and that your superiority is growing.”

“All modern military capability is based on economic innovation, technology, and economic strength,” he continued. “And military technology is based on computers. You are far, far ahead of us with computers.” Now waving his arms, “I will take you around this ministry and you will see that even many offices here don’t have computers. In your country, every little child has a computer from age 5.”

“We are so far behind because our political leadership is afraid of computers. The political leadership in my country sees the free use of computers as fatal to their control of information and their power. So, we are far behind you today, and will be more so tomorrow.”

Ogarkov surely shared his revolutionary views with his fellow Politburo members, who fired him the year following our talk, and then consigned him to a frivolous non-post in Eastern Europe. He died in obscurity in 1994.

Reagan and Ogarkov’s insights notwithstanding, the myth prevailed: Moscow tried and failed to match the Reagan military increases, over-stretched, caved economically, then politically. But here’s the reality as seen by almost all scholars who’ve read the Soviet archives and the CIA experts: far from trying to match Reagan’s massive military buildup, the Soviets held their military expenditures roughly constant throughout the 1980s as a percent of their GDP. As for the claim that Moscow also tried and failed to match Reagan’s Star Wars missile-defense system, that’s also false. They were not starstruck by Star Wars and always felt their huge missile force could overwhelm the Star Wars shield, even in the unlikely event that it worked.

Since those years, America’s foreign policy has paid mostly lip service to the centrality of U.S. economic strength. And it hasn’t pursued politically unpopular diplomacy with bad guys with Reagan-like determination. Reagan, Bush I, Harry Truman, Richard Nixon, Henry Kissinger, James Baker, Brent Scowcroft, George Shultz—the great American diplomats—did not fear to negotiate, to compromise strategically, and to persist. They had the inner confidence that America’s power allowed for compromises, and that even after compromising with bad guys, the United States had the advantages to land on top. Reagan was right about Moscow being “The Evil Empire.” But that didn’t stop him or our other great statesmen from pursuing sensible compromises in America’s interests.

Truman and Eisenhower knew the power of economics. They made enormous economic sacrifices to help Germany and Japan after World War II. There were no guarantees these investments would pay off. And yet, for decades now, these two nations have been among America’s closest and most potent allies. Fidel Castro’s Cuba would have succumbed decades ago had Washington not feared to open up the economic spigots. Fear of being snared by American goodies was the real reason Castro himself didn’t want those spigots opened.

Castro knew the U.S. game was to use economic ties to overthrow him. Bad guys today in Tehran and Pyongyang are making the same calculus, and they aren’t wrong. To be blunt, don’t expect them to abandon their nuclear weapons or nuclear programs in return for our economic lures. They have to believe they can survive without their nukes. Convincing them will take brilliant diplomacy and compromise. Remember, the alternative is war—and perhaps after 9/11, Iraq, and Afghanistan, Americans can now glimpse its prohibitive costs and limitations.

As for China, no sane American military expert would advocate a land war in Asia. The competition and the cooperation will continue mainly along economic lines. For sure, military competition will grow as well. But which country prevails in the burgeoning global tug of war will turn directly on which economy works best.

Now, 20 years since the Cold War’s end, both Russia and the United States remain cursed—Moscow by the distracting memory of a greatness that it can never recover, and Washington by its inability to shake the myth that military power and true grit conquer all. Russia’s good fortune rests on its leaders getting over the lust for new global power and getting on with genuine democracy at home. Moscow still has a long way to go, judging by recent protests in major Russian cities over manipulated parliamentary elections. America’s good future rests in good measure on setting aside its triumphal myths and getting down to the hard business of rejuvenating its diplomacy and its economy.

About that Cabin

DREW GILPIN FAUST on UNCLE TOM'S CABIN

Much, But Not Everything
Drew Gilpin Faust | December 25, 2011
Mightier Than the Sword: Uncle Tom’s Cabin and the Battle for America
by David Reynolds
W.W. Norton & Company, 351 pp., $27.95

AS THE OBSOLESCENCE and even the demise of the book are widely foretold, it is all the more important—and comforting—to recognize how a book can change the world. It is hard to think of many that have done so more emphatically than Uncle Tom’s Cabin. Lincoln is famously said to have greeted its author, Harriet Beecher Stowe, in 1862 by inquiring, “So this is the little lady who started this great war?” And whether he actually ever made the remark or not, the very fact of her visit to the White House and the emergence of the legend of his respectful, if somewhat patronizing, salutation are sufficient evidence of the remarkable influence that Stowe’s words claimed in mid-nineteenth-century America.

Uncle Tom’s Cabin was at once a novel and an “event,” as Theodore Parker proclaimed soon after it appeared. Today its publication is appropriately included—along with such occurrences as the Dred Scott decision and John Brown’s raid—on timelines of incidents that propelled the nation towards civil war. In the mounting sectional conflict, words assumed the power of deeds, acts of political as well as social transformation. Originating as a serial, Uncle Tom’s Cabin appeared in book form in the spring of 1852. By mid-October, 120,000 copies had been sold; by the following spring, 310,000. In England it was even more successful, with sales of a million within a year. Michael Winship has called it “the world’s first true blockbuster.” It may also have been the first bestseller to produce spin-offs—which came to be known as “Tomitudes”: engravings, games, puzzles, songs and sheet music, dramatizations—in Europe as well as the United States. The book was a phenomenon, in its popularity and its influence.

Yet by the early twentieth century it was out of print and would remain so for decades. “Uncle Tom” became an epithet, representing not the admirable saintliness and sacrifice with which Stowe had sought to imbue her protagonist, but—in the eyes of African Americans such as W.E.B. DuBois and James Baldwin—an embarrassing embodiment of black obsequiousness and self-loathing. In the white segregated South, scorn for Stowe’s book claimed different origins: it was seen as part of a long tradition of Northern meddling in Southern racial arrangements. In South Carolina in 1900, a teacher might well make his students raise their right hands and swear never to read Uncle Tom’s Cabin—an unwitting nod to the book’s power as well as an affirmation of the white South’s racial solidarity. Uncle Tom’s Cabin was certainly never taught as literature in the North or the South, because it was seen by critics and scholars as sentimental and overwrought—less art than propaganda. Hawthorne dismissed Stowe as one of his era’s “scribbling women.”

But Stowe and Uncle Tom’s Cabin never did disappear entirely. Perhaps the first modern appreciation of her and her masterwork came from Edmund Wilson, not the easiest or the most gentle of critics. His great book Patriotic Gore: Studies in the Literature of the American Civil War, which appeared in 1962 at the very outset of the conflict’s centennial, opens with a lengthy chapter on Stowe. Wilson emerges from his consideration a grudging admirer, acknowledging the prejudices he brought to the text, but demonstrating a thorough conversion. “To expose oneself in maturity to Uncle Tom,” he confessed, was “a startling experience.” He admitted that “it is a much more impressive work than one has ever been allowed to suspect.” Wilson hailed the “vitality” of its characters, the book’s “eruptive force,” the clear evidence of the author’s “critical mind.” Comparing her favorably with Dickens and Gogol, he concluded she was “no contemptible novelist.” He became a fan in spite of himself.

The Civil Rights movement of the 1950s and ’60s drew more attention to Uncle Tom’s Cabin as a vehicle of scorn than to either the literary power or the abolitionist sympathies of the novel. It was the emergence of Second Wave feminism and the resultant growth of interest in women’s history that ultimately led to a systematic rehabilitation of the book as an essential example of the moral authority and reach of nineteenth-century American women. The cult of domesticity, the centrality of evangelical religion, the influence of social reform, and the impact of the female pen shaped mid-century society and culture in ways that reached well beyond the home. The era’s “scribbling women”—with Stowe the most successful among them—were both the cause and the result of this transformation.

The past quarter century has witnessed sustained interest in Uncle Tom’s Cabin and its author. The book’s original popularity derived in no small part from its invocation of so many of the critical concerns of nineteenth-century American culture. As a result it can serve as an almost unsurpassed point of entry into the assumptions of that historical moment. It is a marvelous book to teach—as I have done with undergraduates, graduate students, and summer seminars of high school teachers. It is a document that captures the sensibilities of people both like and unlike ourselves, and it describes a past world with voices and characters that speak to us across the barriers of space and time—Tom, Topsy, Eva, Cassy, Mrs. Bird, St. Clare, Ophelia—even Simon Legree. That Stowe achieved such influence in a period when American feminism was making its first appearances, and that she did so with a text intended to advance the anti-slavery cause, further contributes to its present day relevance, for these two nineteenth-century social movements have had modern manifestations that have shaped our age as fundamentally as they did hers.

Through the work of Jane Tompkins, Mary Kelley, and others, Uncle Tom’s Cabin has played a key role in reorienting the study of the American Renaissance to include women alongside its iconic men—Emerson, Thoreau, Whitman, Melville. In 1995, Joan Hedrick won a Pulitzer Prize for the first full-scale biography of Stowe in half a century. And Uncle Tom entered promptly into the digital era as well. In 1852, the book had strained the technological capacities of its time, requiring, according to its publisher, that three paper mills and more than a hundred book binders remain constantly at work to meet the demand of eager readers. Today’s technology has extended Tom’s reach through a website created at the University of Virginia that has served as a founding model for the digital humanities. “Uncle Tom’s Cabin and American Culture: A Multi Media Archive” is directed by Stephen Railton and funded by the National Endowment for the Humanities, offering texts, images, songs, poems, even film that document the book’s origins, its later renditions on stage and screen, as well as assessments of its history and impact by a range of distinguished scholars. The website makes twelve editions of the book available on a virtual shelf.

David Reynolds, the author of widely read volumes on the nineteenth century, has not only joined the twenty-first century chorus of appreciation for Stowe and her novel. He has reached well beyond his predecessors in his claims for its influence. His book is true to its excessive title: it represents Uncle Tom’s Cabin as not just an influence on American life, but a force nearly unmatched in its social and cultural impact. For Reynolds, Uncle Tom’s Cabin was “central to redefining American democracy on a more egalitarian basis”; it made the Bible “relevant to contemporary life,” and it “replaced the venal religion of the churches with a new, abolitionist Christianity.” It also “established a whole new school of popular antislavery literature,” and at the same time gave rise to the pro-slavery argument, which is customarily seen as emerging in force in the 1830s but in Reynolds’s portrayal does not substantially appear until prompted by Stowe’s novel more than twenty years later.

The book’s dramatic versions were equally revolutionary, in Reynolds’s account, serving even as a “major step toward making theatergoing respectable” and leading also to the creation of the matinee and the long theatrical run. Uncle Tom’s Cabin also influenced James and Howells, and profoundly shaped realist fiction and, later, D.W. Griffith and the emergence of realist film. By century’s end, moreover, Uncle Tom’s Cabin had set off a “chain reaction” that led to Birth of a Nation “and the revitalized Ku Klux Klan” and also “the self-assertion and protest on the part of DuBois and other African-Americans,” resulting in the establishment of the NAACP. Even more than seventy-five years after the publication of Uncle Tom’s Cabin, the appearance of Gone With the Wind was, Reynolds finds, “largely in reply to Stowe.”

This one book did all that? “Chain reaction,” with its invocation of nuclear force, seems a more apt metaphor than the “sword” of Reynolds’s title to capture his assessment of the book and its might. Lincoln may have suggested that Stowe caused a war, but Reynolds offers much more: he assigns to Stowe central responsibility for the unfolding history of much of the following century. As we enter into the sesquicentennial celebration of the Civil War, Harriet Beecher Stowe’s achievement reminds us that we must remember more than battles and statesmen if we are to understand the causes, the conflict and its aftermath. But swords and statesmen and armies and governments and writers and preachers all played their complex and interdependent parts in what Reynolds calls the “Battle for America.” DuBois, Margaret Mitchell, D.W. Griffith, and Henry James, not to mention Lee, Grant, and Lincoln, would likely be surprised to learn that the twenty-first century could imagine that the battle over race and power, not to mention culture and values, was really all about Harriet Beecher Stowe.

Sunday, December 25, 2011

Irony: Obama is the Real Conservative

By E.J. Dionne Jr., Updated: Sunday, December 25, 7:00 PM


At a moment when the nation wonders whether politicians can agree on anything, here is something that unites the Republican presidential candidates — and all of them with President Obama: Everyone agrees that the 2012 election will be a turning point involving one of the most momentous choices in U.S. history.

True, candidates (and columnists) regularly cast an impending election as the most important ever. Campaigning last week in Pella, Iowa, Republican Rick Santorum acknowledged as much. But he insisted that this time, the choice really was that fundamental. “The debate,” he said, “is about who we are.”


Speaking not far away, in Mount Pleasant, Newt Gingrich went even further, and was more specific. “This is the most important election since 1860,” he said, “because there’s such a dramatic difference between the best food-stamp president in history and the best paycheck candidate.” Thus did Gingrich combine historic sweep with a cheap and inaccurate attack. Nonetheless, it says a great deal that Gingrich chose to reach all the way back to the election that helped spark the Civil War.

Mitt Romney was on the same page in a speech in Bedford, N.H. “This is an election not to replace a president but to save a vision of America,” he declared. “It’s a choice between two destinies.” Sounding just like Santorum, he urged voters to ask: “Who are we as Americans, and what kind of America do we want for our children?”

Obama could not agree more. “This is not just another political debate,” the president said in his theme-setting speech in Osawatomie, Kan., earlier this month. “This is a make-or-break moment for the middle class, and for all those who are fighting to get into the middle class.”

On this one, Santorum, Gingrich, Romney and Obama all have it right. For the first time since Barry Goldwater made the effort in 1964, the Republican Party is taking a run at overturning the consensus that has governed U.S. political life since the Progressive era.

Obama is defending a tradition that sees government as an essential actor in the nation’s economy, a guarantor of fair rules of competition, a countervailing force against excessive private power, a check on the inequalities that capitalism can produce, and an instrument that can open opportunity for those born without great advantages.

Today’s Republicans cast the federal government as an oppressive force, a drag on the economy and an enemy of private initiative. Texas Gov. Rick Perry continues to promise, as he did last week during a campaign stop in Davenport, Iowa, to be a president who would make “Washington, D.C., as inconsequential in your life as he can make it.” That far-reaching word “inconsequential” implies a lot more than trims in budgets or taxes.

The GOP is engaged in a wholesale effort to redefine the government help that Americans take for granted as an effort to create a radically new, statist society. Consider Romney’s claim in his Bedford speech: “President Obama believes that government should create equal outcomes. In an entitlement society, everyone receives the same or similar rewards, regardless of education, effort and willingness to take risk. That which is earned by some is redistributed to the others. And the only people who truly enjoy any real rewards are those who do the redistributing — the government.”

Obama believes no such thing. If he did, why are so many continuing to make bundles on Wall Street? As my colleagues Greg Sargent and Paul Krugman have been insisting, Romney is saying things about the president that are flatly, grossly and shamefully untrue. But Romney’s sleight of hand is revealing: Republicans are increasingly inclined to argue that any redistribution (and Social Security, Medicare, student loans, veterans benefits and food stamps are all redistributive) is but a step down the road to some radically egalitarian dystopia.

Obama will thus be the conservative in 2012, in the truest sense of that word. He is the candidate defending the modestly redistributive and regulatory government the country has relied on since the New Deal, and that neither Ronald Reagan nor George W. Bush dismantled. The rhetoric of the 2012 Republicans suggests they want to go far beyond where Reagan or Bush ever went. And here’s the irony: By raising the stakes of 2012 so high, Republicans will be playing into Obama’s hands. The GOP might well win a referendum on the state of the economy. But if this is instead a larger-scale referendum on whether government should be “inconsequential,” Republicans will find the consequences to be very disappointing.

Living in the Multiverse

The accidental universe:
Science's crisis of faith
By Alan P. Lightman

Alan Lightman, a physicist and novelist, teaches at MIT. His new book, Mr g: A Novel About the Creation, will be published in January by Pantheon.
In the fifth century B.C., the philosopher Democritus proposed that all matter was made of tiny and indivisible atoms, which came in various sizes and textures—some hard and some soft, some smooth and some thorny. The atoms themselves were taken as givens. In the nineteenth century, scientists discovered that the chemical properties of atoms repeat periodically (and created the periodic table to reflect this fact), but the origins of such patterns remained mysterious. It wasn’t until the twentieth century that scientists learned that the properties of an atom are determined by the number and placement of its electrons, the subatomic particles that orbit its nucleus. And we now know that all atoms heavier than helium were created in the nuclear furnaces of stars.

The history of science can be viewed as the recasting of phenomena that were once thought to be accidents as phenomena that can be understood in terms of fundamental causes and principles. One can add to the list of the fully explained: the hue of the sky, the orbits of planets, the angle of the wake of a boat moving through a lake, the six-sided patterns of snowflakes, the weight of a flying bustard, the temperature of boiling water, the size of raindrops, the circular shape of the sun. All these phenomena and many more, once thought to have been fixed at the beginning of time or to be the result of random events thereafter, have been explained as necessary consequences of the fundamental laws of nature—laws discovered by human beings.

This long and appealing trend may be coming to an end. Dramatic developments in cosmological findings and thought have led some of the world’s premier physicists to propose that our universe is only one of an enormous number of universes with wildly varying properties, and that some of the most basic features of our particular universe are indeed mere accidents—a random throw of the cosmic dice. In which case, there is no hope of ever explaining our universe’s features in terms of fundamental causes and principles.

It is perhaps impossible to say how far apart the different universes may be, or whether they exist simultaneously in time. Some may have stars and galaxies like ours. Some may not. Some may be finite in size. Some may be infinite. Physicists call the totality of universes the “multiverse.” Alan Guth, a pioneer in cosmological thought, says that “the multiple-universe idea severely limits our hopes to understand the world from fundamental principles.” And the philosophical ethos of science is torn from its roots. As put to me recently by Nobel Prize–winning physicist Steven Weinberg, a man as careful in his words as in his mathematical calculations, “We now find ourselves at a historic fork in the road we travel to understand the laws of nature. If the multiverse idea is correct, the style of fundamental physics will be radically changed.”

The scientists most distressed by Weinberg’s “fork in the road” are theoretical physicists. Theoretical physics is the deepest and purest branch of science. It is the outpost of science closest to philosophy, and religion. Experimental scientists occupy themselves with observing and measuring the cosmos, finding out what stuff exists, no matter how strange that stuff may be. Theoretical physicists, on the other hand, are not satisfied with observing the universe. They want to know why. They want to explain all the properties of the universe in terms of a few fundamental principles and parameters. These fundamental principles, in turn, lead to the “laws of nature,” which govern the behavior of all matter and energy. An example of a fundamental principle in physics, first proposed by Galileo in 1632 and extended by Einstein in 1905, is the following: All observers traveling at constant velocity relative to one another should witness identical laws of nature. From this principle, Einstein derived his theory of special relativity. An example of a fundamental parameter is the mass of an electron, considered one of the two dozen or so “elementary” particles of nature. As far as physicists are concerned, the fewer the fundamental principles and parameters, the better. The underlying hope and belief of this enterprise has always been that these basic principles are so restrictive that only one, self-consistent universe is possible, like a crossword puzzle with only one solution. That one universe would be, of course, the universe we live in. Theoretical physicists are Platonists. Until the past few years, they agreed that the entire universe, the one universe, is generated from a few mathematical truths and principles of symmetry, perhaps throwing in a handful of parameters like the mass of the electron. It seemed that we were closing in on a vision of our universe in which everything could be calculated, predicted, and understood.

However, two theories in physics, eternal inflation and string theory, now suggest that the same fundamental principles from which the laws of nature derive may lead to many different self-consistent universes, with many different properties. It is as if you walked into a shoe store, had your feet measured, and found that a size 5 would fit you, a size 8 would also fit, and a size 12 would fit equally well. Such wishy-washy results make theoretical physicists extremely unhappy. Evidently, the fundamental laws of nature do not pin down a single and unique universe. According to the current thinking of many physicists, we are living in one of a vast number of universes. We are living in an accidental universe. We are living in a universe uncalculable by science.


--------------------------------------------------------------------------------

“Back in the 1970s and 1980s,” says Alan Guth, “the feeling was that we were so smart, we almost had everything figured out.” What physicists had figured out were very accurate theories of three of the four fundamental forces of nature: the strong nuclear force that binds atomic nuclei together, the weak force that is responsible for some forms of radioactive decay, and the electromagnetic force between electrically charged particles. And there were prospects for merging the theory known as quantum physics with Einstein’s theory of the fourth force, gravity, and thus pulling all of them into the fold of what physicists called the Theory of Everything, or the Final Theory. These theories of the 1970s and 1980s required the specification of a couple dozen parameters corresponding to the masses of the elementary particles, and another half dozen or so parameters corresponding to the strengths of the fundamental forces. The next step would then have been to derive most of the elementary particle masses in terms of one or two fundamental masses and define the strengths of all the fundamental forces in terms of a single fundamental force.

There were good reasons to think that physicists were poised to take this next step. Indeed, since the time of Galileo, physics has been extremely successful in discovering principles and laws that have fewer and fewer free parameters and that are also in close agreement with the observed facts of the world. For example, the observed rotation of the ellipse of the orbit of Mercury, 0.012 degrees per century, was successfully calculated using the theory of general relativity, and the observed magnetic strength of an electron, 2.002319 magnetons, was derived using the theory of quantum electrodynamics. More than any other science, physics brims with highly accurate agreements between theory and experiment.
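As a quick sanity check on the first of those figures (an aside, not part of Lightman’s essay): the general-relativistic prediction for Mercury’s anomalous perihelion advance is usually quoted as roughly 43 arcseconds per century, and converting that to degrees recovers the 0.012 figure. A minimal sketch in Python, with the 43-arcsecond value taken as an outside assumption rather than something stated in the essay:

```python
# Convert the commonly quoted general-relativity figure of ~43 arcseconds per
# century (an assumed outside value, not given in the essay) into degrees.
arcsec_per_century = 43.0
degrees_per_century = arcsec_per_century / 3600.0  # 3,600 arcseconds in one degree
print(round(degrees_per_century, 3))               # prints 0.012, matching the essay's figure
```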

Guth started his physics career in this sunny scientific world. Now sixty-four years old and a professor at MIT, he was in his early thirties when he proposed a major revision to the Big Bang theory, something called inflation. We now have a great deal of evidence suggesting that our universe began as a nugget of extremely high density and temperature about 14 billion years ago and has been expanding, thinning out, and cooling ever since. The theory of inflation proposes that when our universe was only about a trillionth of a trillionth of a trillionth of a second old, a peculiar type of energy caused the cosmos to expand very rapidly. A tiny fraction of a second later, the universe returned to the more leisurely rate of expansion of the standard Big Bang model. Inflation solved a number of outstanding problems in cosmology, such as why the universe appears so homogeneous on large scales.

When I visited Guth in his third-floor office at MIT one cool day in May, I could barely see him above the stacks of paper and empty Diet Coke bottles on his desk. More piles of paper and dozens of magazines littered the floor. In fact, a few years ago Guth won a contest sponsored by the Boston Globe for the messiest office in the city. The prize was the services of a professional organizer for one day. “She was actually more a nuisance than a help. She took piles of envelopes from the floor and began sorting them according to size.” He wears aviator-style eyeglasses, keeps his hair long, and chain-drinks Diet Cokes. “The reason I went into theoretical physics,” Guth tells me, “is that I liked the idea that we could understand everything—i.e., the universe—in terms of mathematics and logic.” He gives a bitter laugh. We have been talking about the multiverse.


--------------------------------------------------------------------------------

While challenging the Platonic dream of theoretical physicists, the multiverse idea does explain one aspect of our universe that has unsettled some scientists for years: according to various calculations, if the values of some of the fundamental parameters of our universe were a little larger or a little smaller, life could not have arisen. For example, if the nuclear force were a few percentage points stronger than it actually is, then all the hydrogen atoms in the infant universe would have fused with other hydrogen atoms to make helium, and there would be no hydrogen left. No hydrogen means no water. Although we are far from certain about what conditions are necessary for life, most biologists believe that water is necessary. On the other hand, if the nuclear force were substantially weaker than what it actually is, then the complex atoms needed for biology could not hold together. As another example, if the relationship between the strengths of the gravitational force and the electromagnetic force were not close to what it is, then the cosmos would not harbor any stars that explode and spew out life-supporting chemical elements into space or any other stars that form planets. Both kinds of stars are required for the emergence of life. The strengths of the basic forces and certain other fundamental parameters in our universe appear to be “fine-tuned” to allow the existence of life. The recognition of this fine-tuning led British physicist Brandon Carter to articulate what he called the anthropic principle, which states that the universe must have the parameters it does because we are here to observe it. Actually, the word anthropic, from the Greek for “man,” is a misnomer: if these fundamental parameters were much different from what they are, it is not only human beings who would not exist. No life of any kind would exist.

If such conclusions are correct, the great question, of course, is why these fundamental parameters happen to lie within the range needed for life. Does the universe care about life? Intelligent design is one answer. Indeed, a fair number of theologians, philosophers, and even some scientists have used fine-tuning and the anthropic principle as evidence of the existence of God. For example, at the 2011 Christian Scholars’ Conference at Pepperdine University, Francis Collins, a leading geneticist and director of the National Institutes of Health, said, “To get our universe, with all of its potential for complexities or any kind of potential for any kind of life-form, everything has to be precisely defined on this knife edge of improbability…. [Y]ou have to see the hands of a creator who set the parameters to be just so because the creator was interested in something a little more complicated than random particles.”

Intelligent design, however, is an answer to fine-tuning that does not appeal to most scientists. The multiverse offers another explanation. If there are countless different universes with different properties—for example, some with nuclear forces much stronger than in our universe and some with nuclear forces much weaker—then some of those universes will allow the emergence of life and some will not. Some of those universes will be dead, lifeless hulks of matter and energy, and others will permit the emergence of cells, plants and animals, minds. From the huge range of possible universes predicted by the theories, the fraction of universes with life is undoubtedly small. But that doesn’t matter. We live in one of the universes that permits life because otherwise we wouldn’t be here to ask the question.

The explanation is similar to the explanation of why we happen to live on a planet that has so many nice things for our comfortable existence: oxygen, water, a temperature between the freezing and boiling points of water, and so on. Is this happy coincidence just good luck, or an act of Providence, or what? No, it is simply that we could not live on planets without such properties. Many other planets exist that are not so hospitable to life, such as Uranus, where the temperature is –371 degrees Fahrenheit, and Venus, where it rains sulfuric acid.

The multiverse offers an explanation to the fine-tuning conundrum that does not require the presence of a Designer. As Steven Weinberg says: “Over many centuries science has weakened the hold of religion, not by disproving the existence of God but by invalidating arguments for God based on what we observe in the natural world. The multiverse idea offers an explanation of why we find ourselves in a universe favorable to life that does not rely on the benevolence of a creator, and so if correct will leave still less support for religion.”

Some physicists remain skeptical of the anthropic principle and the reliance on multiple universes to explain the values of the fundamental parameters of physics. Others, such as Weinberg and Guth, have reluctantly accepted the anthropic principle and the multiverse idea as together providing the best possible explanation for the observed facts.

If the multiverse idea is correct, then the historic mission of physics to explain all the properties of our universe in terms of fundamental principles—to explain why the properties of our universe must necessarily be what they are—is futile, a beautiful philosophical dream that simply isn’t true. Our universe is what it is because we are here. The situation could be likened to a school of intelligent fish who one day began wondering why their world is completely filled with water. Many of the fish, the theorists, hope to prove that the entire cosmos necessarily has to be filled with water. For years, they put their minds to the task but can never quite seem to prove their assertion. Then, a wizened group of fish postulates that maybe they are fooling themselves. Maybe there are, they suggest, many other worlds, some of them completely dry, and everything in between.


--------------------------------------------------------------------------------

The most striking example of fine-tuning, and one that practically demands the multiverse to explain it, is the unexpected detection of what scientists call dark energy. Little more than a decade ago, using robotic telescopes in Arizona, Chile, Hawaii, and outer space that can comb through nearly a million galaxies a night, astronomers discovered that the expansion of the universe is accelerating. As mentioned previously, it has been known since the late 1920s that the universe is expanding; it’s a central feature of the Big Bang model. Orthodox cosmological thought held that the expansion is slowing down. After all, gravity is an attractive force; it pulls masses closer together. So it was quite a surprise in 1998 when two teams of astronomers announced that some unknown force appears to be jamming its foot down on the cosmic accelerator pedal. The expansion is speeding up. Galaxies are flying away from each other as if repelled by antigravity. Says Robert Kirshner, one of the team members who made the discovery: “This is not your father’s universe.” (In October, members of both teams were awarded the Nobel Prize in Physics.)

Physicists have named the energy associated with this cosmological force dark energy. No one knows what it is. Not only invisible, dark energy apparently hides out in empty space. Yet, based on our observations of the accelerating rate of expansion, dark energy constitutes a whopping three quarters of the total energy of the universe. It is the invisible elephant in the room of science.

The amount of dark energy, or more precisely the amount of dark energy in every cubic centimeter of space, has been calculated to be about one hundred-millionth (10⁻⁸) of an erg per cubic centimeter. (For comparison, a penny dropped from waist-high hits the floor with an energy of about three hundred thousand—that is, 3 × 10⁵—ergs.) This may not seem like much, but it adds up in the vast volumes of outer space. Astronomers were able to determine this number by measuring the rate of expansion of the universe at different epochs—if the universe is accelerating, then its rate of expansion was slower in the past. From the amount of acceleration, astronomers can calculate the amount of dark energy in the universe.
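The penny comparison is easy to reproduce (an aside, not from the essay): in CGS units a falling object’s energy is simply mass times gravitational acceleration times height, which comes out directly in ergs. The penny mass (about 2.5 grams) and drop height (about 1 meter) below are assumed round numbers, not values Lightman gives:

```python
# Back-of-the-envelope check of the essay's energy figures, in CGS units.
# Assumed inputs (not from the essay): a penny of ~2.5 g dropped from ~1 m.
penny_mass_g = 2.5          # grams
drop_height_cm = 100.0      # centimeters, roughly waist height
g_cm_per_s2 = 980.0         # gravitational acceleration in cm/s^2

# Energy at impact = m * g * h; in CGS units the result is already in ergs.
impact_energy_ergs = penny_mass_g * g_cm_per_s2 * drop_height_cm
print(f"{impact_energy_ergs:,.0f} ergs")   # ~245,000 ergs, i.e. "about three hundred thousand"

# Dark energy density quoted in the essay:
dark_energy_per_cm3 = 1e-8  # ergs per cubic centimeter
print(f"{impact_energy_ergs / dark_energy_per_cm3:.1e}")  # ~2.5e13 cubic cm of space hold that much dark energy
```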

Theoretical physicists have several hypotheses about the identity of dark energy. It may be the energy of ghostly subatomic particles that can briefly appear out of nothing before self-annihilating and slipping back into the vacuum. According to quantum physics, empty space is a pandemonium of subatomic particles rushing about and then vanishing before they can be seen. Dark energy may also be associated with an as-yet-unobserved force field called the Higgs field, which is sometimes invoked to explain why certain kinds of matter have mass. (Theoretical physicists ponder things that other people do not.) And in the models proposed by string theory, dark energy may be associated with the way in which extra dimensions of space—beyond the usual length, width, and breadth—get compressed down to sizes much smaller than atoms, so that we do not notice them.

These various hypotheses give a fantastically large range for the theoretically possible amounts of dark energy in a universe, from something like 10¹¹⁵ ergs per cubic centimeter to –10¹¹⁵ ergs per cubic centimeter. (A negative value for dark energy would mean that it acts to decelerate the universe, in contrast to what is observed.) Thus, in absolute magnitude, the amount of dark energy actually present in our universe is either very, very small or very, very large compared with what it could be. This fact alone is surprising. If the theoretically possible positive values for dark energy were marked out on a ruler stretching from here to the sun, with zero at one end of the ruler and 10¹¹⁵ ergs per cubic centimeter at the other end, the value of dark energy actually found in our universe (10⁻⁸ ergs per cubic centimeter) would be closer to the zero end than the width of an atom.
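The ruler analogy can be checked with a few lines of arithmetic (again, an editorial aside rather than part of the essay). Taking the Earth–Sun distance to be about 1.5 × 10¹³ centimeters and an atom’s width to be about 10⁻⁸ centimeters—both rough outside assumptions—the observed value lands far closer to the zero end than one atom’s width:

```python
# Rough numerical check of the ruler analogy.
# Assumed values (not from the essay): Earth-Sun distance ~1.5e13 cm, atomic width ~1e-8 cm.
ruler_length_cm = 1.5e13     # length of the imagined ruler, here to the sun
ruler_max_value = 1e115      # ergs/cm^3 at the far end of the ruler
observed_value = 1e-8        # ergs/cm^3, the measured dark energy density
atom_width_cm = 1e-8         # rough diameter of an atom

# Where the observed value falls on the ruler, measured from the zero end:
position_cm = ruler_length_cm * (observed_value / ruler_max_value)
print(position_cm)                   # ~1.5e-110 cm
print(position_cm < atom_width_cm)   # True: vastly closer to zero than the width of an atom
```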

On one thing most physicists agree: If the amount of dark energy in our universe were only a little bit different than what it actually is, then life could never have emerged. A little more and the universe would accelerate so rapidly that the matter in the young cosmos could never pull itself together to form stars and thence form the complex atoms made in stars. And, going into negative values of dark energy, a little less and the universe would decelerate so rapidly that it would recollapse before there was time to form even the simplest atoms.

Here we have a clear example of fine-tuning: out of all the possible amounts of dark energy that our universe might have, the actual amount lies in the tiny sliver of the range that allows life. There is little argument on this point. It does not depend on assumptions about whether we need liquid water for life or oxygen or particular biochemistries. As before, one is compelled to ask the question: Why does such fine-tuning occur? And the answer many physicists now believe: The multiverse. A vast number of universes may exist, with many different values of the amount of dark energy. Our particular universe is one of the universes with a small value, permitting the emergence of life. We are here, so our universe must be such a universe. We are an accident. From the cosmic lottery hat containing zillions of universes, we happened to draw a universe that allowed life. But then again, if we had not drawn such a ticket, we would not be here to ponder the odds.


--------------------------------------------------------------------------------

The concept of the multiverse is compelling not only because it explains the problem of fine-tuning. As I mentioned earlier, the possibility of the multiverse is actually predicted by modern theories of physics. One such theory, called eternal inflation, is a revision of Guth’s inflation theory developed by Andrei Linde, Paul Steinhardt, and Alex Vilenkin in the early and mid-1980s. In regular inflation theory, the very rapid expansion of the infant universe is caused by an energy field, like dark energy, that is temporarily trapped in a condition that does not represent the lowest possible energy for the universe as a whole—like a marble sitting in a small dent on a table. The marble can stay there, but if it is jostled it will roll out of the dent, roll across the table, and then fall to the floor (which represents the lowest possible energy level). In the theory of eternal inflation, the dark energy field has many different values at different points of space, analogous to lots of marbles sitting in lots of dents on the cosmic table. Moreover, as space expands rapidly, the number of marbles increases. Each of these marbles is jostled by the random processes inherent in quantum mechanics, and some of the marbles will begin rolling across the table and onto the floor. Each marble starts a new Big Bang, essentially a new universe. Thus, the original, rapidly expanding universe spawns a multitude of new universes, in a never-ending process.

String theory, too, predicts the possibility of the multiverse. Originally conceived in the late 1960s as a theory of the strong nuclear force but soon enlarged far beyond that ambition, string theory postulates that the smallest constituents of matter are not subatomic particles like the electron but extremely tiny one-dimensional “strings” of energy. These elemental strings can vibrate at different frequencies, like the strings of a violin, and the different modes of vibration correspond to different fundamental particles and forces. String theories typically require seven dimensions of space in addition to the usual three, which are compacted down to such small sizes that we never experience them, like a three-dimensional garden hose that appears as a one-dimensional line when seen from a great distance. There are, in fact, a vast number of ways that the extra dimensions in string theory can be folded up, and each of the different ways corresponds to a different universe with different physical properties.

It was originally hoped that from a theory of these strings, with very few additional parameters, physicists would be able to explain all the forces and particles of nature—all of reality would be a manifestation of the vibrations of elemental strings. String theory would then be the ultimate realization of the Platonic ideal of a fully explicable cosmos. In the past few years, however, physicists have discovered that string theory predicts not a unique universe but a huge number of possible universes with different properties. It has been estimated that the “string landscape” contains 10⁵⁰⁰ different possible universes. For all practical purposes, that number is infinite.

It is important to point out that neither eternal inflation nor string theory has anywhere near the experimental support of many previous theories in physics, such as special relativity or quantum electrodynamics, mentioned earlier. Eternal inflation or string theory, or both, could turn out to be wrong. However, some of the world’s leading physicists have devoted their careers to the study of these two theories.


--------------------------------------------------------------------------------

Back to the intelligent fish. The wizened old fish conjecture that there are many other worlds, some with dry land and some with water. Some of the fish grudgingly accept this explanation. Some feel relieved. Some feel like their lifelong ruminations have been pointless. And some remain deeply concerned. Because there is no way they can prove this conjecture. That same uncertainty disturbs many physicists who are adjusting to the idea of the multiverse. Not only must we accept that basic properties of our universe are accidental and uncalculable. In addition, we must believe in the existence of many other universes. But we have no conceivable way of observing these other universes and cannot prove their existence. Thus, to explain what we see in the world and in our mental deductions, we must believe in what we cannot prove.

Sound familiar? Theologians are accustomed to taking some beliefs on faith. Scientists are not. All we can do is hope that the same theories that predict the multiverse also produce many other predictions that we can test here in our own universe. But the other universes themselves will almost certainly remain a conjecture.

“We had a lot more confidence in our intuition before the discovery of dark energy and the multiverse idea,” says Guth. “There will still be a lot for us to understand, but we will miss out on the fun of figuring everything out from first principles.”

One wonders whether a young Alan Guth, considering a career in science today, would choose theoretical physics.

The West and the Rest

By DONALD KAGAN
Published: November 25, 2011

This is a difficult time in which to present an account — and what amounts to a defense — of the West’s rise to pre-eminence and its unequaled influence in shaping the world today. The West is on the defensive, challenged economically by the ascent of China and politically and militarily by a wave of Islamist hatred. Perhaps as great a challenge is internal. The study of Western civilization, which dominated American education after World War II, has long been under attack, and is increasingly hard to find in our schools and colleges. When it is treated at all, the West is maligned because of its history of slavery and imperialism, an alleged addiction to war and its exclusion of women and nonwhites from its rights and privileges. Some criticize its study as narrow, limiting, arrogant and discriminatory, asserting that it has little or no value for those of non-European origins. Or it is said to be of interest chiefly as a horrible example.

CIVILIZATION

The West and the Rest

By Niall Ferguson


Niall Ferguson thinks otherwise. A professor at both Harvard University and the Harvard Business School, quite aware of the faults and blemishes of the West, he flatly rejects the view of those who find nothing worthwhile in it, calling their position “absurd.” He recognizes both good and bad sides and decides that in comparison with other civilizations, the better side “came out on top.”

Many of the observations in “Civilization: The West and the Rest” will not win Ferguson friends among the fashionable in today’s academy. He upbraids critics who speak scornfully of “ ‘Eurocentrism’ as if it were some distasteful prejudice.” “The scientific revolution was, by any scientific measure, wholly Eurocentric.” Ferguson pays due respect to the intellectual and scientific contributions of China and Islam, but makes it clear that modern science and technology are fundamentally Western products. He asks if any non-Western state can simply acquire scientific knowledge without accepting other key Western institutions like “private property rights, the rule of law and truly representative government.”

Ferguson is so unfashionable as to speak in defense of imperialism: “It is a truth almost universally acknowledged in the schools and colleges of the Western world that imperialism is the root cause of nearly every modern problem, . . . a convenient alibi for rapacious dictators like Zimbabwe’s Robert Mugabe.” Contradicting historians who “represent colonial officials as morally equivalent to Nazis or Stalinists,” he points out that in most Asian and African countries “life expectancy began to improve before the end of European colonial rule.”

Ferguson does not attempt a thorough investigation of the many charges made against the West, or a defense against them. Instead, he addresses the interesting and difficult question: “Just why, beginning around 1500, did a few small polities on the western end of the Eurasian landmass come to dominate the rest of the world?” The book’s method, he says, is to tell “a big story,” along with many little ones, but that is not a proper description. Rather than a chronological narrative, Ferguson offers six chapters of what he calls “killer apps,” each addressing a major element in his answer to the question of Western domination: 1) competition, both among and within the European states; 2) science, beginning with the scientific revolution of the 16th and 17th centuries; 3) the rule of law and representative government, based on the rights of private property and representation in elected legislatures; 4) modern medicine; 5) the consumer society that resulted from the Industrial Revolution; and 6) the work ethic. These, he argues, were crucial to the growth of the West’s power, but weak or nonexistent in other societies.

Excellence in these categories, Ferguson says, may explain the West’s remarkable rise, but late in the 19th century “the Rest,” especially Japan, began to catch up in all but internal competition and representative government. By the 1950s, states in East Asia, especially and increasingly China, made great strides in economic modernization and now compete successfully against the West. At present, he says, we are experiencing “the end of 500 years of Western predominance,” and he foresees the possibility of a clash between the declining and rising forces. He wonders “whether the weaker will tip over from weakness to outright collapse.”

What’s worse, Ferguson sees the current financial crisis as “an accelerator of an already well-established trend of relative Western decline.” He worries that there may come a moment when a “seemingly random piece of bad news — perhaps a negative report by a rating agency” panics investors, who lose confidence in the credit of the United States. This could cause disaster, “for a complex adaptive system is in big trouble when a critical mass of its constituents loses faith in its viability.”

Nonetheless, Ferguson has not given up on the West; it still has more “institutional advantages than the Rest.” The lack of political competition, the rule of law, freedom of conscience and a free press help explain why countries like China, Iran and Russia “lag behind Western countries in qualitative indices that measure ‘national innovative development’ and ‘national innovative capacity.’ ” Still, his hopes for continued success do not seem very strong. Although the “Western package” offers “the best available set of economic, social and political institutions,” he questions whether Westerners are still able to recognize it.

An element central to all this is education, especially history, and Ferguson is appalled by the decline of historical teaching and knowledge in the Western world. His conclusion is not encouraging: “The biggest threat to Western civilization is posed not by other civilizations, but by our own pusillanimity — and by the historical ignorance that feeds it.”

“Civilization” is part of his solution. The book is the basis for a television series in Britain, and he told an interviewer that it aims to give a “17-year-old boy or girl . . . a lot of history in a very digestible way.” Yet it must be said that bits of history are what they get, not the kind of “big story” one requires to understand the character and development of Western and other civilizations. We still need a full account of how and why one thing followed another, of cause and consequence, of the role of chance versus the force of inherited tradition.

Over all, Ferguson calls for a return to traditional education, since “at its core, a civilization is the texts that are taught in its schools, learned by its students and recollected in times of tribulation” — by which he means Great Books, and especially Shakespeare. The greatest dangers facing us are probably not “the rise of China, Islam or CO2 emissions,” he writes, but “our own loss of faith in the civilization we inherited from our ancestors.”

Friday, December 23, 2011

How Ayn Rand Seduced Generations of Young Men and Helped Make the U.S. Into a Selfish, Greedy Nation

BY Bruce E. Levine
AlterNet
15 December 2011

Ayn Rand’s “philosophy” is nearly perfect in its immorality, which makes the size of her audience all the more ominous and symptomatic as we enter a curious new phase in our society.... To justify and extol human greed and egotism is to my mind not only immoral, but evil. — Gore Vidal, 1961

Only rarely in U.S. history do writers transform us to become a more caring or less caring nation. In the 1850s, Harriet Beecher Stowe (1811-1896) was a strong force in making the United States a more humane nation, one that would abolish slavery of African Americans. A century later, Ayn Rand (1905-1982) helped make the United States into one of the most uncaring nations in the industrialized world, a neo-Dickensian society where healthcare is only for those who can afford it, and where young people are coerced into huge student-loan debt that cannot be discharged in bankruptcy.

Rand’s impact has been widespread and deep. At the iceberg’s visible tip is the influence she’s had over major political figures who have shaped American society. In the 1950s, Ayn Rand read aloud drafts of what was later to become Atlas Shrugged to her “Collective,” Rand’s ironic nickname for her inner circle of young individualists, which included Alan Greenspan, who would serve as chairman of the Federal Reserve Board from 1987 to 2006.

In 1966, Ronald Reagan wrote in a personal letter, “Am an admirer of Ayn Rand.” Today, Rep. Paul Ryan (R-WI) credits Rand for inspiring him to go into politics, and Sen. Ron Johnson (R-WI) calls Atlas Shrugged his “foundation book.” Rep. Ron Paul (R-TX) says Ayn Rand had a major influence on him, and his son Sen. Rand Paul (R-KY) is an even bigger fan. A short list of other Rand fans includes Supreme Court Justice Clarence Thomas; Christopher Cox, chairman of the Securities and Exchange Commission in George W. Bush’s second administration; and former South Carolina governor Mark Sanford.

But Rand’s impact on U.S. society and culture goes even deeper.

The Seduction of Nathan Blumenthal

Ayn Rand’s books such as The Virtue of Selfishness and her philosophy that celebrates self-interest and disdains altruism may well be, as Vidal assessed, “nearly perfect in its immorality.” But is Vidal right about evil? Charles Manson, who himself did not kill anyone, is the personification of evil for many of us because of his psychological success at exploiting the vulnerabilities of young people and seducing them to murder. What should we call Ayn Rand’s psychological ability to exploit the vulnerabilities of millions of young people so as to influence them not to care about anyone besides themselves?

While Greenspan (tagged “A.G.” by Rand) was the most famous name that would emerge from Rand’s Collective, the second most well-known name to emerge from the Collective was Nathaniel Branden, psychotherapist, author and “self-esteem” advocate. Before he was Nathaniel Branden, he was Nathan Blumenthal, a 14-year-old who read Rand’s The Fountainhead again and again. He later would say, “I felt hypnotized.” He describes how Rand gave him a sense that he could be powerful, that he could be a hero. He wrote one letter to his idol Rand, then a second. To his amazement, she telephoned him, and at age 20, Nathan received an invitation to Ayn Rand’s home. Shortly after, Nathan Blumenthal announced to the world that he was incorporating Rand in his new name: Nathaniel Branden. And in 1955, with Rand approaching her 50th birthday and Branden his 25th, and both in dissatisfying marriages, Ayn bedded Nathaniel.

What followed sounds straight out of Hollywood, but Rand was straight out of Hollywood, having worked for Cecil B. DeMille. Rand convened a meeting with Nathaniel, his wife Barbara (also a Collective member), and Rand’s own husband Frank. To Branden's astonishment, Rand convinced both spouses that a time-structured affair—she and Branden were to have one afternoon and one evening a week together—was “reasonable.” Within the Collective, Rand is purported to have never lost an argument. On his trysts at Rand’s New York City apartment, Branden would sometimes shake hands with Frank before he exited. Later, all discovered that Rand’s sweet but passive husband would leave for a bar, where he began his self-destructive affair with alcohol.

By 1964, the 34-year-old Nathaniel Branden had grown tired of the now 59-year-old Ayn Rand. Still sexually dissatisfied in his marriage to Barbara and afraid to end his affair with Rand, Branden began sleeping with a married 24-year-old model, Patrecia Scott. Rand, now “the woman scorned,” called Branden to appear before the Collective, whose nickname had by now lost its irony for both Barbara and Branden. Rand’s justice was swift. She humiliated Branden and then put a curse on him: “If you have one ounce of morality left in you, an ounce of psychological health—you'll be impotent for the next twenty years! And if you achieve potency sooner, you'll know it’s a sign of still worse moral degradation!”

Rand completed the evening with two welt-producing slaps across Branden’s face. Finally, in a move that Stalin and Hitler would have admired, Rand also expelled poor Barbara from the Collective, declaring her treasonous because Barbara, preoccupied by her own extramarital affair, had neglected to fill Rand in soon enough on Branden's extra-extra-marital betrayal. (If anyone doubts Alan Greenspan’s political savvy, keep in mind that he somehow stayed in Rand’s good graces even though he, fixed up by Branden with Patrecia’s twin sister, had double-dated with the outlaws.)

After being banished by Rand, Nathaniel Branden was worried that he might be assassinated by other members of the Collective, so he moved from New York to Los Angeles, where Rand fans were less fanatical. Branden established a lucrative psychotherapy practice and authored approximately 20 books, 10 of them with either “Self” or “Self-Esteem” in the title. Rand and Branden never reconciled, but he remains an admirer of her philosophy of self-interest.

Ayn Rand’s personal life was consistent with her philosophy of not giving a shit about anybody but herself. Rand was an ardent two-pack-a-day smoker, and when questioned about the dangers of smoking, she loved to light up with a defiant flourish and then scold her young questioners on the “unscientific and irrational nature of the statistical evidence.” After an x-ray showed that she had lung cancer, Rand quit smoking and had surgery for her cancer. Collective members explained to her that many people still smoked because they respected her and her assessment of the evidence; and that since she no longer smoked, she ought to tell them. They told her that she needn’t mention her lung cancer, that she could simply say she had reconsidered the evidence. Rand refused.

How Rand’s Philosophy Seduced Young Minds

When I was a kid, my reading included comic books and Rand’s The Fountainhead and Atlas Shrugged. There wasn’t much difference between the comic books and Rand’s novels in terms of the simplicity of the heroes. What was different was that unlike Superman or Batman, Rand made selfishness heroic, and she made caring about others weakness.

Rand said, “Capitalism and altruism are incompatible....The choice is clear-cut: either a new morality of rational self-interest, with its consequences of freedom, justice, progress and man’s happiness on earth—or the primordial morality of altruism, with its consequences of slavery, brute force, stagnant terror and sacrificial furnaces.” For many young people, hearing that it is “moral” to care only about oneself can be intoxicating, and some get addicted to this idea for life.

I have known several people, professionally and socially, whose lives have been changed by those close to them who became infatuated with Ayn Rand. A common theme is something like this: “My ex-husband wasn’t a bad guy until he started reading Ayn Rand. Then he became a completely selfish jerk who destroyed our family, and our children no longer even talk to him.”

To wow her young admirers, Rand would often tell a story of how a smart-aleck book salesman had once challenged her to explain her philosophy while standing on one leg. She replied: “Metaphysics—objective reality. Epistemology—reason. Ethics—self-interest. Politics—capitalism.” How did that philosophy capture young minds?

Metaphysics—objective reality. Rand offered a narcotic for confused young people: complete certainty and a relief from their anxiety. Rand believed that an “objective reality” existed, and she knew exactly what that objective reality was. It included skyscrapers, industries, railroads, and ideas—at least her ideas. Rand’s objective reality did not include anxiety or sadness. Nor did it include much humor, at least the kind where one pokes fun at oneself. Rand assured her Collective that objective reality did not include Beethoven’s, Rembrandt’s, and Shakespeare’s realities—they were too gloomy and too tragic, basically buzzkillers. Rand preferred Mickey Spillane and, towards the end of her life, “Charlie's Angels.”

Epistemology—reason. Rand’s kind of reason was a “cool-tool” to control the universe. Rand demonized Plato, and her youthful Collective members were taught to despise him. If Rand really believed that the Socratic Method described by Plato of discovering accurate definitions and clear thinking did not qualify as “reason,” why then did she regularly attempt it with her Collective? Also oddly, while Rand mocked dark moods and despair, her “reasoning” directed that Collective members should admire Dostoyevsky, whose novels are filled with dark moods and despair. A demagogue, in addition to hypnotic glibness, must also be intellectually inconsistent, sometimes boldly so. This eliminates challenges to authority by weeding out clear-thinking young people from the flock.

Ethics—self-interest. For Rand, all altruists were manipulators. What could be more seductive to kids who discerned the motives of martyr parents, Christian missionaries and U.S. foreign aiders? Her champions, Nathaniel Branden still among them, feel that Rand’s view of “self-interest” has been horribly misrepresented. For them, self-interest is her hero architect Howard Roark turning down a commission because he couldn’t do it exactly his way. Some of Rand’s novel heroes did have integrity; however, for Rand there is no struggle to discover the distinction between true integrity and childish vanity. Rand’s integrity was her vanity, and it consisted of getting as much money and control as possible, copulating with whomever she wanted regardless of who would get hurt, and her always being right. To equate one’s selfishness, vanity, and egotism with one’s integrity liberates young people from the struggle to distinguish integrity from selfishness, vanity, and egotism.

Politics—capitalism. While Rand often disparaged Soviet totalitarian collectivism, she had little to say about corporate totalitarian collectivism, as she conveniently neglected the reality that giant U.S. corporations, like the Soviet Union, do not exactly celebrate individualism, freedom, or courage. Rand was clever and hypocritical enough to know that you don’t get rich in the United States talking about compliance and conformity within corporate America. Rather, Rand gave lectures titled: “America’s Persecuted Minority: Big Business.” So, young careerist corporatists could embrace Rand’s self-styled “radical capitalism” and feel radical — radical without risk.

Rand’s Legacy

In recent years, we have entered a phase where it is apparently okay for major political figures to publicly embrace Rand despite her contempt for Christianity. In contrast, during Ayn Rand’s life, her philosophy that celebrated self-interest was a private pleasure for the 1 percent, but she was a public embarrassment for them. They used her books to congratulate themselves on the morality of their selfishness, but they publicly steered clear of Rand because of her views on religion and God. Rand, for example, had stated on national television, “I am against God. I don’t approve of religion. It is a sign of a psychological weakness. I regard it as an evil.”

Actually, again inconsistent, Rand did have a God. It was herself. She said:

I am done with the monster of “we,” the word of serfdom, of plunder, of misery, falsehood and shame. And now I see the face of god, and I raise this god over the earth, this god whom men have sought since men came into being, this god who will grant them joy and peace and pride. This god, this one word: “I.”

While Harriet Beecher Stowe shamed Americans about the United States’ dehumanization of African Americans and slavery, Ayn Rand removed Americans’ guilt for being selfish and uncaring about anyone except themselves. Not only did Rand make it “moral” for the wealthy not to pay their fair share of taxes, but she also “liberated” millions of other Americans from caring about the suffering of others, even the suffering of their own children.

The good news is that I’ve seen ex-Rand fans grasp the damage that Rand’s philosophy has done to their lives and then exorcize it from their psyche. Can the United States as a nation do the same thing?