Tuesday, November 30, 2010

The Death Penalty

by John Paul Stevens

Peculiar Institution: America’s Death Penalty in an Age of Abolition
by David Garland
Belknap Press/Harvard University Press, 417 pp., $35.00


David Garland is a well-respected sociologist and legal scholar who taught courses on crime and punishment at the University of Edinburgh before relocating to the United States over a decade ago. His recent Peculiar Institution: America’s Death Penalty in an Age of Abolition is the product of his attempt to learn “why the United States is such an outlier in the severity of its criminal sentencing.” Thus, while the book primarily concerns the death penalty, it also illuminates the broader, dramatic differences between American and Western European prison sentences.

Describing his study, Garland explains:

When I talk to people about my book on capital punishment, the first thing they inevitably ask is, “Is your book for it or against it?” The answer, I tell them, is neither.
In fact, despite its ostensible amorality, his work makes a powerful argument that will persuade many readers that the death penalty is unwise and unjustified.

His explanation of why the United States retains capital punishment is based, in part, on the greater importance of local decision-making as compared with the more centralized European governments with which he was familiar before moving to New York. Some of his eminently readable prose reminds me of Alexis de Tocqueville’s nineteenth-century narrative about his visit to America; it has the objective, thought-provoking quality of an astute observer rather than that of an interested participant in American politics.

As is typical of many Supreme Court opinions rejecting legal arguments advanced by defendants in capital cases, Garland’s prologue begins with a detailed description of a horrible crime that will persuade many readers that the defendant not only deserves the death penalty but also should be subjected to the kind of torture that was common in sixteenth-century England. Garland also describes such torture in detail. This “emotional appeal” of the death penalty, Garland declares, is an important topic in his study.



His first chapter then includes a graphic description of the 1757 execution in Paris, France, of Robert Damiens, who tried to assassinate Louis XV, and an even more graphic description of the 1893 lynching of Henry Smith in Paris, Texas. Each was a gruesome public spectacle witnessed by a large, enthusiastic crowd. Of the latter, Garland writes:

between three and four hundred spectacle lynchings of this kind took place in the South between 1890 and 1940, along with several thousand other lynchings that proceeded with less cruelty, smaller crowds, and little ceremony.
Garland uses the “archetypal Southern lynching scene,” another gruesome execution, and chilling murders to orient his study. Not until page 36 does he pose the question that had already occurred to me: Would his analysis differ if he had initially discussed Michigan’s pathbreaking 1846 decision to abolish capital punishment for all crimes except treason? By 1846, Michigan had not executed anyone for fifteen years. Its legislature regarded the death penalty as a “dead letter,” quite inessential to crime control. Shortly before, two innocent men—one in Canada and one in New York—had been executed. Unsurprisingly, the committee report stressed the “fallibility” of the punishment. Even though Wisconsin and Rhode Island soon followed Michigan’s abolition, Garland seems to discount its importance, seeing it as the work of a small group of liberal reformers with New England backgrounds whose views, he suggests, may not have reflected those of most Michiganders.

Had Garland made the Michigan abolition his starting point, I suspect that readers might have been more inclined to question the death penalty. The execution of innocents is disturbing, both to many today and, I presume, to the Michigan voters who were then willing to endorse their leaders’ reasoned abolitionist positions. Readers will presumably have a similar reaction to his observation that exonerations, “whereby condemned individuals are found to be innocent and are released from custody,” have “become a recurring feature of the system; indeed, since 1973, more than 130 people have been exonerated and freed from death row,” a number of them on the basis of DNA evidence.

Garland’s argument is both historical and contemporary. Chapters 2–6 situate the modern American death penalty within US and European histories of capital punishment. On both continents, capital punishment has roots in gruesome public spectacles: unspeakable torture and postmortem desecrations of offenders’ remains designed, respectively, to maximize suffering and exalt the omnipotence of the sovereign. In Europe, the greater availability both of deportation and of prisons led to reductions in executions, and new techniques like the guillotine made executions somewhat more humane. Eventually, in the modern period, fundamental changes in the timing and character of executions have, where capital punishment survives, profoundly altered its retributive and deterrent potential.

A “lengthy and elaborate legal process has become a central feature of American capital punishment.” As a result, several executions have occurred after a delay of more than twenty years,

and some prisoners currently have been awaiting their executions for more than three decades…. Such delays do not just undermine the death penalty’s deterrent effect; they also spoil its capacity for satisfying retribution.
Changes designed to avoid needless infliction of pain have had the same effect. What once was a frightening public spectacle now resembles painless administration of preoperative anesthesia in the presence of few witnesses. American officials do not enjoy executions; “they seem, in short, embarrassed, as if caught in a transgression.”

Europeans abolished the death penalty in the decades after World War II. History, Garland contends, explains much of this transatlantic difference. In Europe,

the sequence of events was first, the formation, extension, and consolidation of state power; second, the emergence of bureaucratic rationalization; and third, the growth of popular participation.
In the United States, Garland argues, the sequence was reversed. As a result, criminal justice bureaucrats and national parties in Europe—once they became motivated to do so—imposed abolition despite popular opposition. In the United States, abolitionists found the more politicized bureaucracy and the relatively weak national parties inadequate to the task of overriding public support.

Having established that US death penalty policy is largely set locally, Garland turns to describing why and in what ways the United States retains capital punishment. In Chapter 7 he cites a tradition of community-level executions dating to colonial times, frontier beliefs in meeting violence with violence, and pluralism that inhibits solidarity with victims. Chapter 8 reviews the Legal Defense Fund’s litigation, which in 1972 produced, in Furman v. Georgia, a moratorium on executions in the forty-two jurisdictions that authorized them. The backlash was swift, as the following chapter shows in detail. Thirty-four states enacted new death penalty laws before the decade was out. One—Oregon—had not previously authorized capital punishment.

Attacks on Furman, like the related vigorous and continuing criticism of liberal Warren Court decisions protecting the rights of criminal defendants and minority voters, were an important part of the Republican Party’s “Southern strategy.” The history of racism in the South partly explains the appeal of the “states’ rights” arguments that helped move the “solid South” from the Democratic to the Republican column in national elections.

After Furman, Garland argues in Chapter 10, the Supreme Court focused on transforming capital punishment, requiring new procedural protections, reducing the cruelty of executions, and devolving power to “the people” at the local level. The concern with local policymaking that Garland emphasizes, however, has not prevented Supreme Court decisions from eliminating categories of defendants (juveniles and the mentally retarded) and offenses (rape and unintentional killings) from exposure to capital punishment nationwide.

For Garland, the death penalty is “a strange social fact that stands in need of explanation.” He approaches it and debates around it “with the sorts of questions and concepts that anthropologists bring to bear on the exotic cultural practices of a foreign society they are struggling to understand.” In his view, an important reason Americans retain capital punishment is their fascination with death. While neither the glamour nor the gore that used to attend public executions remains today, he observes, capital cases still generate extensive commentary about victims’ deaths and potential deaths of defendants. Great works of literature, like best-selling paperbacks, attract readers by discussing killings and revenge. Garland suggests that the popularity of the mystery story is part of the culture that keeps capital punishment alive. As he explains in Chapter 11, current discourse about death reflects how the purposes that American capital punishment serves have changed over the years.

Garland concludes that capital punishment today is “reasonably well adapted to the purposes that it serves, but deterrent crime control and retributive justice are not prominent among them.” Instead, the death penalty promotes “gratifications,” of “professional and political users, of the mass media, and of its public audience.” In particular, he contends, capital punishment derives “its emotional power, its popular interest, and its perennial appeal” from five types of “death penalty discourse.” They are: (1) political exploitation of the gap between the Furman decision and popular opinion; (2) adversarial legal proceedings featuring cultural tensions between capital punishment and liberal humanism; (3) the political association of capital punishment with larger political and cultural issues, such as civil rights, states’ rights, and crime control; (4) demands for revenge; and (5) the emotional power of imagining killing and death. He concludes that “the American death penalty has been transformed from a penal instrument that puts persons to death to a peculiar institution that puts death into discourse for political and cultural purposes.”

Notably, Garland all but denies that the death penalty serves significant deterrent purposes. Death penalty states, after all, have generally higher crime rates than “abolitionist” ones. For Garland, this differential helps—by eliminating deterrence as one possible explanation—to account for the people’s decisions; it tells us nothing about the wisdom of those decisions.

To illustrate how political and cultural purposes of the death penalty have replaced penal purposes, he writes:

Support for death penalty laws allows politicians to show that they support law enforcement…. California Senator Barbara Boxer bragged that she voted 100 times for the death penalty. And George W. Bush first ran for president in a year when, as governor of Texas, he had presided over the largest number of state executions ever carried out in a single twelve-month period—a total of forty in the year 2000.
Similarly, local elections affect decisions of state prosecutors to seek the death penalty and of state judges to impose it. “In states where judges were until recently empowered to override jury sentences,” Garland explains, “elected judges typically used this power to impose death rather than life. In Alabama the death-to-life ratio of these judicial overrides was ten to one.” In Delaware, where judges are not elected, such decisions favored defendants. The “tight connection between legal decision-making and local politics produces…an obvious risk of bias in capital cases.” Popular opinion has less effect on criminal justice in Europe, where judges and prosecutors are typically tenured civil servants insulated from electoral pressure in individual trials. This difference provides a powerful argument for opponents of judicial elections.

Conservatism in Action

Jonathan Chait
The Meaning Of "Constitutional Conservatism"

The emergence of "Constitutional conservatism" as a new aspect of right-wing thought is about nine parts empty slogan and one part actual idea. When you look at the actual idea, it's fairly scary. Conservatives are correct that the country has changed its original understanding of the Constitution. Those changes have primarily involved making the country more democratic -- we now get to elect Senators, a privilege many conservatives would like to remove. Another change is that the franchise is no longer restricted to white, male property owners. I don't see anybody looking to reverse women's suffrage or restore slavery, but Tea Party Nation founder Judson Phillips thinks the franchise should be taken away from renters:

The Founding Fathers originally said, they put certain restrictions on who gets the right to vote. It wasn’t you were just a citizen and you got to vote. Some of the restrictions, you know, you obviously would not think about today. But one of those was you had to be a property owner. And that makes a lot of sense, because if you’re a property owner you actually have a vested stake in the community. If you’re not a property owner, you know, I’m sorry but property owners have a little bit more of a vested interest in the community than non-property owners.

This particular element of "constitutional conservatism" also hews to the pervasive sense among conservatives that the political process has been captured by poor, lazy leeches who are exploiting the hard-working rich/middle class.

Sunday, November 28, 2010

Laura Hillenbrand - Seabiscuit

Hemingway said that only bullfighters lived their lives "all the way up." I'm not sure what he meant. Perhaps he meant that bullfighters lived their lives in the ring facing injury and death each and every day they went to work. In her story about Seabiscuit, the famous racehorse of the 1930s, Laura Hillenbrand talks of jockeys of that era in the same vein. What kept them going was the transcendence they felt riding that horse around the track, perhaps the same transcendence from regular life that bullfighters felt.

"On the ground, the jockey was fettered and muted, moving in slow motion, the world a sensory vacuum, after the tenfold high of racing speed. In the saddle, emancipated from their bodies, Pollard, Woolf, and all other reinsmen sailed eight feet over the world, emphatically free, emphatically alive. They were Hemingway's bullfighters, living "all the way up."

Laura Hillenbrand, Seabiscuit, p. 80.

Tuesday, November 23, 2010

Mark Twain - Pudd'nhead Wilson

I am coming to the end of my Mark Twain reading for the year. I was going to read the new Volume 1 of the Mark Twain Project's edition of the Autobiography, but I think not. I don't see it as pleasurable reading. I'll probably settle for reading excerpts.

Pudd'nhead is pleasurable Twain reading. You have to hang in there with the plot, for tight plotting is not one of Twain's strengths.

I see a story about miscegenation in the antebellum South, a book noted for its grim humor and its reflections on racism and responsibility. Roxana, a light-skinned mixed-race slave, switches her baby with her white owner's baby. Her natural son, Tom Driscoll, grows up in a privileged household to become a criminal who finances his gambling debts by selling her to a slave trader and who later murders his putative uncle. Meanwhile, Roxy raises Valet de Chambre as a slave. David ("Pudd'nhead") Wilson, an eccentric lawyer, determines the true identities of Tom and Valet. As a result Roxy is exposed, Wilson is elected mayor, Tom is sold into slavery, and Valet, unfitted for his newly won freedom, becomes an illiterate, uncouth landholder. Grim stuff, indeed, but true to life in the 19th century.

The book is dated. Miscegenation is not a big deal today. In Twain's time this must have been explosive stuff. Racism is not dated. The novel stands the test of time despite the vehicle of miscegenation.

We want to see Twain as a racial liberal. Perhaps he was; perhaps he wasn't. I don't know. This book puts him on the right side of history. I suspect his autobiography also puts him on the right side of history.

The Republicans Never Change

Warren Buffett vs. the Broken-Clock Brigade

by Jacob S. Hacker and Paul Pierson
Posted: November 22, 2010


Warren Buffett made headlines the other day when he said on ABC that "people at the high end, people like me, should be paying a lot more taxes. We have it better than we've ever had it."

It's a sign of the times that economic sanity only gets traction when it comes from the mouth of a multi-billionaire. Unfortunately for the American middle class, that traction is still pretty limited. That's because Warren Buffett's inconvenient truth stands in the way of the number-one priority of the newly empowered Republican Party. For the broken-clock brigade, no time is a bad time for more tax cuts for the super-rich.

For thirty years, the share of those at the top has been skyrocketing while the middle class struggles. The top .1% of households -- one out of every 1,000 -- has seen its share of national income almost quintuple, and now takes home about one out of every eight dollars of pre-tax income in the United States.

You'd think that against this backdrop politicians would have asked those who have done so well to pay a little larger share of their stratospheric incomes in taxes. Instead, taxes have been slashed on the highest earners, and it's the middle class and the most economically desperate who are now being told they must tighten their belts: no extensions of unemployment benefits; huge cuts in social programs that will result in big layoffs of teachers and first responders; diminished services for the neediest. The human and economic costs mount.

For the new power-brokers in Washington, however, the times they aren't a'changing. The GOP, with its new majority in the House, backed by a resurgent business community and facing a chastened Democratic Party, is doubling down on the refrain it has sustained for thirty years: the solution to our economic problems is more tax cuts for the most privileged among us, along with the freeing of business from tiresome regulation (like those for Wall Street that might have prevented the current economic calamity).

The GOP's current version of the refrain is that tax breaks for the rich are crucial because they create jobs: poor people don't create jobs, rich people do. As Rand Paul colorfully puts it, don't talk about the gap between the rich and the rest because we are all in this together: "there are no rich, there are no poor." Leave aside that this is a pernicious myth -- the rich may hire but it is a society full of energy, innovation and opportunity, built by scientists, educators and entrepreneurs of all types, that creates jobs.

The easy way to see the GOP's true motives is that no matter what happens in the real world, the clock they read always says the same thing. If budgets are flush, cut (high-end) taxes because the people deserve their money back. If budgets are in deficit, cut (high-end) taxes because it generates new revenues. Peace? Cut taxes. War? As Tom DeLay once said, "nothing is more important in the face of a war than cutting taxes." For the GOP, it's always tax-cut time, and the group that always needs the tax cuts the most is the one at the very top.

Leave the last word to Mr. Buffett. Presented with the GOP argument about the rich fueling the economy, he cut through the smoke: "The rich are always going to say that, you know. 'Just give us more money, and we'll go out and spend more, and then it will all trickle down to the rest of you.' But that has not worked the last 10 years and I hope the American public is catching on."

Jacob S. Hacker and Paul Pierson are the authors of Winner-Take-All Politics: How Washington Made the Rich Richer—and Turned Its Back on the Middle Class.

Monday, November 22, 2010

Yes, There Will Be Blood

We are indeed headed for a showdown in this country. The electorate is under the illusion that when push comes to shove, the two parties will get together and do what's right. This is not true unless the Democrats cave in to the Republicans. My guess is that this is exactly what will happen. Mr. Obama has given no indication that he is willing to fight for what is right.


NYTimes.com
Op-Ed Columnist
There Will Be Blood
By PAUL KRUGMAN
Published: November 22, 2010

Former Senator Alan Simpson is a Very Serious Person. He must be — after all, President Obama appointed him as co-chairman of a special commission on deficit reduction.
So here’s what the very serious Mr. Simpson said on Friday: “I can’t wait for the blood bath in April. ... When debt limit time comes, they’re going to look around and say, ‘What in the hell do we do now? We’ve got guys who will not approve the debt limit extension unless we give ’em a piece of meat, real meat,’ ” meaning spending cuts. “And boy, the blood bath will be extraordinary,” he continued.

Think of Mr. Simpson’s blood lust as one more piece of evidence that our nation is in much worse shape, much closer to a political breakdown, than most people realize.

Some explanation: There’s a legal limit to federal debt, which must be raised periodically if the government keeps running deficits; the limit will be reached again this spring. And since nobody, not even the hawkiest of deficit hawks, thinks the budget can be balanced immediately, the debt limit must be raised to avoid a government shutdown. But Republicans will probably try to blackmail the president into policy concessions by, in effect, holding the government hostage; they’ve done it before.

Now, you might think that the prospect of this kind of standoff, which might deny many Americans essential services, wreak havoc in financial markets and undermine America’s role in the world, would worry all men of good will. But no, Mr. Simpson “can’t wait.” And he’s what passes, these days, for a reasonable Republican.

The fact is that one of our two great political parties has made it clear that it has no interest in making America governable, unless it’s doing the governing. And that party now controls one house of Congress, which means that the country will not, in fact, be governable without that party’s cooperation — cooperation that won’t be forthcoming.

Elite opinion has been slow to recognize this reality. Thus on the same day that Mr. Simpson rejoiced in the prospect of chaos, Ben Bernanke, the Federal Reserve chairman, appealed for help in confronting mass unemployment. He asked for “a fiscal program that combines near-term measures to enhance growth with strong, confidence-inducing steps to reduce longer-term structural deficits.”

My immediate thought was, why not ask for a pony, too? After all, the G.O.P. isn’t interested in helping the economy as long as a Democrat is in the White House. Indeed, far from being willing to help Mr. Bernanke’s efforts, Republicans are trying to bully the Fed itself into giving up completely on trying to reduce unemployment.

And on matters fiscal, the G.O.P. program is to do almost exactly the opposite of what Mr. Bernanke called for. On one side, Republicans oppose just about everything that might reduce structural deficits: they demand that the Bush tax cuts be made permanent while demagoguing efforts to limit the rise in Medicare costs, which are essential to any attempts to get the budget under control. On the other, the G.O.P. opposes anything that might help sustain demand in a depressed economy — even aid to small businesses, which the party claims to love.

Right now, in particular, Republicans are blocking an extension of unemployment benefits — an action that will both cause immense hardship and drain purchasing power from an already sputtering economy. But there’s no point appealing to the better angels of their nature; America just doesn’t work that way anymore.

And opposition for the sake of opposition isn’t limited to economic policy. Politics, they used to tell us, stops at the water’s edge — but that was then.

These days, national security experts are tearing their hair out over the decision of Senate Republicans to block a desperately needed new strategic arms treaty. And everyone knows that these Republicans oppose the treaty, not because of legitimate objections, but simply because it’s an Obama administration initiative; if sabotaging the president endangers the nation, so be it.

How does this end? Mr. Obama is still talking about bipartisan outreach, and maybe if he caves in sufficiently he can avoid a federal shutdown this spring. But any respite would be only temporary; again, the G.O.P. is just not interested in helping a Democrat govern.

My sense is that most Americans still don’t understand this reality. They still imagine that when push comes to shove, our politicians will come together to do what’s necessary. But that was another country.

It’s hard to see how this situation is resolved without a major crisis of some kind. Mr. Simpson may or may not get the blood bath he craves this April, but there will be blood sooner or later. And we can only hope that the nation that emerges from that blood bath is still one we recognize.

The Republican Party=The Old Confederacy

Sunday, Nov 21, 2010 11:01 ET
If Gen. Lee hadn't surrendered at Appomattox ...
By Glenn W. LaFantasie


There was a time in the not so distant past when Americans could safely assume that the Civil War, which claimed 620,000 Northern and Southern lives, resulted in two immutable outcomes: It forever settled the issue that secession was illegal, and it forever abolished the institution of slavery.

Lately, though, those truisms seem not to have been written in stone. Ironically, it’s the Republican Party -- the party of Lincoln and the Northern victors -- that has voiced challenges to the old received wisdom about the legacies of the Civil War. In Texas, Gov. Rick Perry has openly spoken about secession (and opting out of -- in other words, nullifying -- federal programs such as Medicaid); Rand Paul, the Tea Party/Republican senator-elect from Kentucky, has questioned whether the Civil Rights Act of 1964 should have been passed; and a variety of Republicans have argued that the 14th Amendment, or at least a portion of it, should be rescinded.

While some of these political stands might be momentary posturing for the sake of the 2010 midterm elections, these Republican/Red State/Tea Party positions raise again the specter of the Civil War at the very moment when the nation stands ready to commemorate the sesquicentennial of that event. It’s as if the South, the purest of the Red States, didn’t lose the Civil War at all. Or put another way: The Confederacy, often depicted on maps as Red States (as opposed to Union Blue), may have lost the fighting -- forcing Robert E. Lee to surrender to Ulysses S. Grant at Appomattox Court House, Virginia, on April 9, 1865 -- but the social and political values of those Red States live on, nurtured and sustained in a Republican Party that often sounds more Confederate in its ideology than Jefferson Davis, the first and only president of the Confederate States of America.

Sunday, November 21, 2010

A Publishing Phenomenon

Mark Twain’s Autobiography Flying Off the Shelves
By JULIE BOSMAN
Published: November 19, 2010

When editors at the University of California Press pondered the possible demand for “Autobiography of Mark Twain,” a $35, four-pound, 500,000-word doorstopper of a memoir, they kept their expectations modest with a planned print run of 7,500 copies.

Now it is a smash hit across the country, landing on best-seller lists and going back to press six times, for a total print run — so far — of 275,000. The publisher cannot print copies quickly enough, leaving some bookstores and online retailers stranded without copies just as the holiday shopping season begins.

“It sold right out,” said Kris Kleindienst, an owner of Left Bank Books in St. Louis, which first ordered 50 copies and has a dozen people on a waiting list. “You would think only completists and scholars would want a book like this. But there’s an enduring love affair with Mark Twain, especially around here. Anybody within a stone’s throw of the Mississippi River has a Twain attachment.”

Farther upriver, at the Prairie Lights bookstore in Iowa City, Paul Ingram, the book buyer, said he initially ordered 10 copies, but they disappeared almost immediately.

“We are dearly hoping we’ll get more copies in a couple of weeks,” Mr. Ingram said. “I’m sure every bookseller in the world is saying, ‘I should have been sharper, I should have thought this one through more carefully.’ ”

Earlier this week, the book was out of stock at a handful of Barnes & Noble stores in Chicago, Boston and Austin, Tex. On Borders.com, it is back-ordered for at least two to four weeks. Some independent booksellers said they had been told, much to their despair, that they would not receive reorders until mid-December or even January.

“It’s frustrating,” said Rona Brinlee, the owner of the BookMark in Neptune Beach, Fla. “In this age of instant books, why does it take so long to reprint it?”

Those who have been lining up to buy it seem to be a mix of Twain aficionados, history buffs and early Christmas shoppers who gravitate toward big, heavy classic biographies as gifts.

“It’s totally the Dad book of the year,” said Rebecca Fitting, an owner of the Greenlight Bookstore in Fort Greene, Brooklyn. “It’s that autobiography, biography, history category, a certain kind of guy gift book.”

Many booksellers said the memoir has a perfect holiday-gift quality: a widely adored author, a weighty feel, and a unique story behind its publication. (Twain ordered that the book be published a century after his death.)

Most of the content was dictated to Twain’s stenographer in the four years before he died, at 74 in 1910. It is more political than his previous works, by turns frank, funny, angry and full of recollections from his childhood, which deeply influenced books like “Huckleberry Finn.”

A younger generation of readers is discovering Twain for his political writings, Ms. Fitting said.

“He’s surprisingly relevant right now,” she added. “When you look at how much he wrote and the breadth of the subjects he wrote about, you know that if he were alive today, he would totally be a blogger.”

Steve Kettmann, an American writer living in Berlin, said that he tried to buy a copy during a visit to a Borders in Orlando, Fla., but was told that they were sold out and would not receive more copies for four to six weeks. (He went to another Borders nearby, found two copies, and bought them both.)

“I just think that there’s a feeling out there by a lot of people that Mark Twain is one of our greatest writers, and there’s something particularly American about his combination of wit and insight,” Mr. Kettmann said. “He was a wonderful showman. And he was cool, let’s face it. That’s part of it.”

Alex Dahne, a spokeswoman for the University of California Press, said the book was the biggest success the publisher has had in 60 years.

The first print run of “Autobiography” was for 50,000 copies. Thomson-Shore, a small printer in Michigan that is producing the books, has been working overtime and is now producing 30,000 copies a week. To speed up delivery, the printer found bigger-than-usual trucks to carry books to warehouses in Richmond, Calif., and Ewing, N.J. — the trucks carry 10,000 copies instead of the usual 7,000.

The book will reach the No. 7 spot on The New York Times’s hardcover nonfiction best-seller list to be published on Nov. 28, its fourth week on the list. On Friday afternoon it was No. 4 on the BN.com best-seller list, behind “Decision Points,” former President George W. Bush’s memoir; the latest “Diary of a Wimpy Kid,” an illustrated children’s novel by Jeff Kinney; and “Unbroken,” a prisoner-of-war story by Laura Hillenbrand.

“Autobiography of Mark Twain” received a huge lift from excerpts in Granta, Newsweek, Playboy and Harper’s Magazine, and a burst of early media coverage this summer, well in advance of the official Nov. 15 publication date. The publisher created an eye-catching Web site, thisismarktwain.com, complete with audio, black-and-white photos and a timeline of Twain’s life. (Two more 600-page volumes are planned.)

Edward Ash-Milby, a buyer for Barnes & Noble, said the book had already emerged as one of the hottest of the holiday season.

“I believe it has a certain cachet, a gift of quality that says a lot about the giver as well as the recipient,” Mr. Ash-Milby said in an e-mail. “It’s literary, but not too tough to read. The content, itself, is immensely readable, although nonlinear. It can be easily picked up and read in spots without the worry of plot lines or continuity.”

Booksellers seemed to agree that the memoir, which has letters, diary entries, pictures and nearly 200 pages of “explanatory notes,” is a book to be read in small bites.

“I’ve barely had a chance to look at it, but from what I did see, it looked like the kind of book you would never finish, and you would never even think of reading start to finish,” said Mr. Ingram of Prairie Lights. “But it’s the kind of book you would read a little bit of every day of your life.”

While many booksellers were caught flat-footed by the intense interest in the book, others said they saw it coming. The book is currently available at Amazon.com and BN.com. At BookCourt, an independent store in Brooklyn, booksellers initially ordered 100 copies, the general manager, Zack Zook, said.

“We felt from the beginning that it was a title which our neighborhood would gravitate heavily toward,” Mr. Zook said. “There’s genuine interest there. It’s been on our best-seller list now for weeks.”

Powell’s Books in Portland, Ore., ordered 600 copies and has already sold 500. Six hundred more books are on the way.

Ms. Dahne of the University of California Press said the publisher was rushing to get copies to bookstores and promised that they would be there in time for the holidays.

“We feel like, wow, America’s kind of excited about a literary icon,” she said. “There’s something very sweet about the fact that people are interested in a 736-page scholarly tome about Mark Twain.”

Digital Distraction

Growing Up Digital, Wired for Distraction



By MATT RICHTEL
Published: November 21, 2010


REDWOOD CITY, Calif. — On the eve of a pivotal academic year in Vishal Singh’s life, he faces a stark choice on his bedroom desk: book or computer?

Your Brain on Computers: Staying on Task. Articles in this series examine how a deluge of data can affect the way people think and behave.

By all rights, Vishal, a bright 17-year-old, should already have finished the book, Kurt Vonnegut’s “Cat’s Cradle,” his summer reading assignment. But he has managed 43 pages in two months.

He typically favors Facebook, YouTube and making digital videos. That is the case this August afternoon. Bypassing Vonnegut, he clicks over to YouTube, meaning that tomorrow he will enter his senior year of high school hoping to see an improvement in his grades, but without having completed his only summer homework.

On YouTube, “you can get a whole story in six minutes,” he explains. “A book takes so long. I prefer the immediate gratification.”

Students have always faced distractions and time-wasters. But computers and cellphones, and the constant stream of stimuli they offer, pose a profound new challenge to focusing and learning.

Researchers say the lure of these technologies, while it affects adults too, is particularly powerful for young people. The risk, they say, is that developing brains can become more easily habituated than adult brains to constantly switching tasks — and less able to sustain attention.

“Their brains are rewarded not for staying on task but for jumping to the next thing,” said Michael Rich, an associate professor at Harvard Medical School and executive director of the Center on Media and Child Health in Boston. And the effects could linger: “The worry is we’re raising a generation of kids in front of screens whose brains are going to be wired differently.”

But even as some parents and educators express unease about students’ digital diets, they are intensifying efforts to use technology in the classroom, seeing it as a way to connect with students and give them essential skills. Across the country, schools are equipping themselves with computers, Internet access and mobile devices so they can teach on the students’ technological territory.

It is a tension on vivid display at Vishal’s school, Woodside High School, on a sprawling campus set against the forested hills of Silicon Valley. Here, as elsewhere, it is not uncommon for students to send hundreds of text messages a day or spend hours playing video games, and virtually everyone is on Facebook.

The principal, David Reilly, 37, a former musician who says he sympathizes when young people feel disenfranchised, is determined to engage these 21st-century students. He has asked teachers to build Web sites to communicate with students, introduced popular classes on using digital tools to record music, secured funding for iPads to teach Mandarin and obtained $3 million in grants for a multimedia center.

He pushed first period back an hour, to 9 a.m., because students were showing up bleary-eyed, at least in part because they were up late on their computers. Unchecked use of digital devices, he says, can create a culture in which students are addicted to the virtual world and lost in it.

“I am trying to take back their attention from their BlackBerrys and video games,” he says. “To a degree, I’m using technology to do it.”

The same tension surfaces in Vishal, whose ability to be distracted by computers is rivaled by his proficiency with them. At the beginning of his junior year, he discovered a passion for filmmaking and made a name for himself among friends and teachers with his storytelling in videos made with digital cameras and editing software.

He acts as his family’s tech-support expert, helping his father, Satendra, a lab manager, retrieve lost documents on the computer, and his mother, Indra, a security manager at the San Francisco airport, build her own Web site.

But he also plays video games 10 hours a week. He regularly sends Facebook status updates at 2 a.m., even on school nights, and has such a reputation for distributing links to videos that his best friend calls him a “YouTube bully.”

Several teachers call Vishal one of their brightest students, and they wonder why things are not adding up. Last semester, his grade point average was 2.3 after a D-plus in English and an F in Algebra II. He got an A in film critique.

“He’s a kid caught between two worlds,” said Mr. Reilly — one that is virtual and one with real-life demands.

Vishal, like his mother, says he lacks the self-control to favor schoolwork over the computer. She sat him down a few weeks before school started and told him that, while she respected his passion for film and his technical skills, he had to use them productively.

“This is the year,” she says she told him. “This is your senior year and you can’t afford not to focus.”

It was not always this way. As a child, Vishal had a tendency to procrastinate, but nothing like this. Something changed him.

Growing Up With Gadgets

When he was 3, Vishal moved with his parents and older brother to their current home, a three-bedroom house in the working-class section of Redwood City, a suburb in Silicon Valley that is more diverse than some of its elite neighbors.

Thin and quiet with a shy smile, Vishal passed the admissions test for a prestigious public elementary and middle school. Until sixth grade, he focused on homework, regularly going to the house of a good friend to study with him.

But Vishal and his family say two things changed around the seventh grade: his mother went back to work, and he got a computer. He became increasingly engrossed in games and surfing the Internet, finding an easy outlet for what he describes as an inclination to procrastinate.

“I realized there were choices,” Vishal recalls. “Homework wasn’t the only option.”

Several recent studies show that young people tend to use home computers for entertainment, not learning, and that this can hurt school performance, particularly in low-income families. Jacob L. Vigdor, an economics professor at Duke University who led some of the research, said that when adults were not supervising computer use, children “are left to their own devices, and the impetus isn’t to do homework but play around.”

Research also shows that students often juggle homework and entertainment. The Kaiser Family Foundation found earlier this year that half of students from 8 to 18 are using the Internet, watching TV or using some other form of media either “most” (31 percent) or “some” (25 percent) of the time that they are doing homework.

At Woodside, as elsewhere, students’ use of technology is not uniform. Mr. Reilly, the principal, says their choices tend to reflect their personalities. Social butterflies tend to be heavy texters and Facebook users. Students who are less social might escape into games, while drifters or those prone to procrastination, like Vishal, might surf the Web or watch videos.

The technology has created on campuses a new set of social types — not the thespian and the jock but the texter and gamer, Facebook addict and YouTube potato.

“The technology amplifies whoever you are,” Mr. Reilly says.

For some, the amplification is intense. Allison Miller, 14, sends and receives 27,000 texts in a month, her fingers clicking at a blistering pace as she carries on as many as seven text conversations at a time. She texts between classes, at the moment soccer practice ends, while being driven to and from school and, often, while studying.

Most of the exchanges are little more than quick greetings, but they can get more in-depth, like “if someone tells you about a drama going on with someone,” Allison said. “I can text one person while talking on the phone to someone else.”

But this proficiency comes at a cost: she blames multitasking for the three B’s on her recent progress report.

“I’ll be reading a book for homework and I’ll get a text message and pause my reading and put down the book, pick up the phone to reply to the text message, and then 20 minutes later realize, ‘Oh, I forgot to do my homework.’ ”

Some shyer students do not socialize through technology — they recede into it. Ramon Ochoa-Lopez, 14, an introvert, plays six hours of video games on weekdays and more on weekends, leaving homework to be done in the bathroom before school.

Escaping into games can also salve teenagers’ age-old desire for some control in their chaotic lives. “It’s a way for me to separate myself,” Ramon says. “If there’s an argument between my mom and one of my brothers, I’ll just go to my room and start playing video games and escape.”

With powerful new cellphones, the interactive experience can go everywhere. Between classes at Woodside or at lunch, when use of personal devices is permitted, students gather in clusters, sometimes chatting face to face, sometimes half-involved in a conversation while texting someone across the teeming quad. Others sit alone, watching a video, listening to music or updating Facebook.

Students say that their parents, worried about the distractions, try to police computer time, but that monitoring the use of cellphones is difficult. Parents may also want to be able to call their children at any time, so taking the phone away is not always an option.

Other parents wholly embrace computer use, even when it has no obvious educational benefit.

“If you’re not on top of technology, you’re not going to be on top of the world,” said John McMullen, 56, a retired criminal investigator whose son, Sean, is one of five friends in the group Vishal joins for lunch each day.

Sean’s favorite medium is video games; he plays for four hours after school and twice that on weekends. He was playing more but found his habit pulling his grade point average below 3.2, the point at which he felt comfortable. He says he sometimes wishes that his parents would force him to quit playing and study, because he finds it hard to quit when given the choice. Still, he says, video games are not responsible for his lack of focus, asserting that in another era he would have been distracted by TV or something else.

“Video games don’t make the hole; they fill it,” says Sean, sitting at a picnic table in the quad, where he is surrounded by a multimillion-dollar view: on the nearby hills are the evergreens that tower above the affluent neighborhoods populated by Internet tycoons. Sean, a senior, concedes that video games take a physical toll: “I haven’t done exercise since my sophomore year. But that doesn’t seem like a big deal. I still look the same.”

Sam Crocker, Vishal’s closest friend, who has straight A’s but lower SAT scores than he would like, blames the Internet’s distractions for his inability to finish either of his two summer reading books.

“I know I can read a book, but then I’m up and checking Facebook,” he says, adding: “Facebook is amazing because it feels like you’re doing something and you’re not doing anything. It’s the absence of doing something, but you feel gratified anyway.”

He concludes: “My attention span is getting worse.”

The Lure of Distraction


Some neuroscientists have been studying people like Sam and Vishal. They have begun to understand what happens to the brains of young people who are constantly online and in touch.

In an experiment at the German Sport University in Cologne in 2007, boys from 12 to 14 spent an hour each night playing video games after they finished homework.

On alternate nights, the boys spent an hour watching an exciting movie, like “Harry Potter” or “Star Trek,” rather than playing video games. That allowed the researchers to compare the effect of video games and TV.

The researchers looked at how the use of these media affected the boys’ brainwave patterns while sleeping and their ability to remember their homework in the subsequent days. They found that playing video games led to markedly lower sleep quality than watching TV, and also led to a “significant decline” in the boys’ ability to remember vocabulary words. The findings were published in the journal Pediatrics.

Markus Dworak, a researcher who led the study and is now a neuroscientist at Harvard, said it was not clear whether the boys’ learning suffered because sleep was disrupted or, as he speculates, also because the intensity of the game experience overrode the brain’s recording of the vocabulary.

“When you look at vocabulary and look at huge stimulus after that, your brain has to decide which information to store,” he said. “Your brain might favor the emotionally stimulating information over the vocabulary.”

At the University of California, San Francisco, scientists have found that when rats have a new experience, like exploring an unfamiliar area, their brains show new patterns of activity. But only when the rats take a break from their exploration do they process those patterns in a way that seems to create a persistent memory.

In that vein, recent imaging studies of people have found that major cross sections of the brain become surprisingly active during downtime. These brain studies suggest to researchers that periods of rest are critical in allowing the brain to synthesize information, make connections between ideas and even develop the sense of self.

Researchers say these studies have particular implications for young people, whose brains have more trouble focusing and setting priorities.

“Downtime is to the brain what sleep is to the body,” said Dr. Rich of Harvard Medical School. “But kids are in a constant mode of stimulation.”

“The headline is: bring back boredom,” added Dr. Rich, who last month gave a speech to the American Academy of Pediatrics entitled, “Finding Huck Finn: Reclaiming Childhood from the River of Electronic Screens.”

Dr. Rich said in an interview that he was not suggesting young people should toss out their devices, but rather that they embrace a more balanced approach to what he said were powerful tools necessary to compete and succeed in modern life.

The heavy use of devices also worries Daniel Anderson, a professor of psychology at the University of Massachusetts at Amherst, who is known for research showing that children are not as harmed by TV viewing as some researchers have suggested.

Multitasking using ubiquitous, interactive and highly stimulating computers and phones, Professor Anderson says, appears to have a more powerful effect than TV.

Like Dr. Rich, he says he believes that young, developing brains are becoming habituated to distraction and to switching tasks, not to focus.

“If you’ve grown up processing multiple media, that’s exactly the mode you’re going to fall into when put in that environment — you develop a need for that stimulation,” he said.

Vishal can attest to that.

“I’m doing Facebook, YouTube, having a conversation or two with a friend, listening to music at the same time. I’m doing a million things at once, like a lot of people my age,” he says. “Sometimes I’ll say: I need to stop this and do my schoolwork, but I can’t.”

“If it weren’t for the Internet, I’d focus more on school and be doing better academically,” he says. But thanks to the Internet, he says, he has discovered and pursued his passion: filmmaking. Without the Internet, “I also wouldn’t know what I want to do with my life.”

Clicking Toward a Future


The woman sits in a cemetery at dusk, sobbing. Behind her, silhouetted and translucent, a man kneels, then fades away, a ghost.

This captivating image appears on Vishal’s computer screen. On this Thursday afternoon in late September, he is engrossed in scenes he shot the previous weekend for a music video he is making with his cousin.

The video is based on a song performed by the band Guns N’ Roses about a woman whose boyfriend dies. He wants it to be part of the package of work he submits to colleges that emphasize film study, along with a documentary he is making about home-schooled students.

Now comes the editing. Vishal taught himself to use sophisticated editing software in part by watching tutorials on YouTube. He does not leave his chair for more than two hours, sipping Pepsi, his face often inches from the screen, as he perfects the clip from the cemetery. The image of the crying woman was shot separately from the image of the kneeling man, and he is trying to fuse them.

“I’m spending two hours to get a few seconds just right,” he says.

He occasionally sends a text message or checks Facebook, but he is focused in a way he rarely is when doing homework. He says the chief difference is that filmmaking feels applicable to his chosen future, and he hopes colleges, like the University of Southern California or the California Institute of the Arts in Los Angeles, will be so impressed by his portfolio that they will overlook his school performance.

“This is going to compensate for the grades,” he says. On this day, his homework includes a worksheet for Latin, some reading for English class and an economics essay, but they can wait.

For Vishal, there’s another clear difference between filmmaking and homework: interactivity. As he edits, the windows on the screen come alive; every few seconds, he clicks the mouse to make tiny changes to the lighting and flow of the images, and the software gives him constant feedback.

“I click and something happens,” he says, explaining that, by comparison, reading a book or doing homework is less exciting. “I guess it goes back to the immediate gratification thing.”

The $2,000 computer Vishal is using is state of the art and only a week old. It represents a concession by his parents. They allowed him to buy it, despite their continuing concerns about his technology habits, because they wanted to support his filmmaking dream. “If we put roadblocks in his way, he’s just going to get depressed,” his mother says. Besides, she adds, “he’s been making an effort to do his homework.”

At this point in the semester, it seems she is right. The first schoolwide progress reports come out in late September, and Vishal has mostly A’s and B’s. He says he has been able to make headway by applying himself, but also by cutting back his workload. Unlike last year, he is not taking advanced placement classes, and he has chosen to retake Algebra II not in the classroom but in an online class that lets him work at his own pace.

His shift to easier classes might not please college admissions officers, according to Woodside’s college adviser, Zorina Matavulj. She says they want seniors to intensify their efforts. As it is, she says, even if Vishal improves his performance significantly, someone with his grades faces long odds in applying to the kinds of colleges he aspires to.

Still, Vishal’s passion for film reinforces for Mr. Reilly, the principal, that the way to reach these students is on their own terms.

Hands-On Technology

Big Macintosh monitors sit on every desk, and a man with hip glasses and an easygoing style stands at the front of the class. He is Geoff Diesel, 40, a favorite teacher here at Woodside who has taught English and film. Now he teaches one of Mr. Reilly’s new classes, audio production. He has a rapt audience of more than 20 students as he shows a video of the band Nirvana mixing their music, then holds up a music keyboard.

“Who knows how to use Pro Tools? We’ve got it. It’s the program used by the best music studios in the world,” he says.

In the back of the room, Mr. Reilly watches, thrilled. He introduced the audio course last year and enough students signed up to fill four classes. (He could barely pull together one class when he introduced Mandarin, even though he had secured iPads to help teach the language.)

“Some of these students are our most at-risk kids,” he says. He means that they are more likely to tune out school, skip class or not do their homework, and that they may not get healthful meals at home. They may also do their most enthusiastic writing not for class but in text messages and on Facebook. “They’re here, they’re in class, they’re listening.”

Despite Woodside High’s affluent setting, about 40 percent of its 1,800 students come from low-income families and receive a reduced-cost or free lunch. The school is 56 percent Latino, 38 percent white and 5 percent African-American, and it sends 93 percent of its students to four-year or community colleges.

Mr. Reilly says that the audio class provides solid vocational training and can get students interested in other subjects.

“Today mixing music, tomorrow sound waves and physics,” he says. And he thinks the key is that they love not just the music but getting their hands on the technology. “We’re meeting them on their turf.”

It does not mean he sees technology as a panacea. “I’ll always take one great teacher in a cave over a dozen Smart Boards,” he says, referring to the high-tech teaching displays used in many schools.

Teachers at Woodside commonly blame technology for students’ struggles to concentrate, but they are divided over whether embracing computers is the right solution.

“It’s a catastrophe,” said Alan Eaton, a charismatic Latin teacher. He says that technology has led to a “balkanization of their focus and duration of stamina,” and that schools make the problem worse when they adopt the technology.

“When rock ’n’ roll came about, we didn’t start using it in classrooms like we’re doing with technology,” he says. He personally feels the sting, since his advanced classes have one-third as many students as they had a decade ago.

Vishal remains a Latin student, one whom Mr. Eaton describes as particularly bright. But the teacher wonders if technology might be the reason Vishal seems to lose interest in academics the minute he leaves class.

Mr. Diesel, by contrast, does not think technology is behind the problems of Vishal and his schoolmates — in fact, he thinks it is the key to connecting with them, and an essential tool. “It’s in their DNA to look at screens,” he asserts. And he offers another analogy to explain his approach: “Frankenstein is in the room and I don’t want him to tear me apart. If I’m not using technology, I lose them completely.”

Mr. Diesel had Vishal as a student in cinema class and describes him as a “breath of fresh air” with a gift for filmmaking. Mr. Diesel says he wonders if Vishal is a bit like Woody Allen, talented but not interested in being part of the system.

But Mr. Diesel adds: “If Vishal’s going to be an independent filmmaker, he’s got to read Vonnegut. If you’re going to write scripts, you’ve got to read.”

Back to Reading Aloud

Vishal sits near the back of English IV. Marcia Blondel, a veteran teacher, asks the students to open the book they are studying, “The Things They Carried,” which is about the Vietnam War.

“Who wants to read starting in the middle of Page 137?” she asks. One student begins to read aloud, and the rest follow along.

To Ms. Blondel, the exercise in group reading represents a regression in American education and an indictment of technology. The reason she has to do it, she says, is that students now lack the attention span to read the assignments on their own.

“How can you have a discussion in class?” she complains, arguing that she has seen a considerable change in recent years. In some classes she can count on little more than one-third of the students to read a 30-page homework assignment.

She adds: “You can’t become a good writer by watching YouTube, texting and e-mailing a bunch of abbreviations.”

As the group-reading effort winds down, she says gently: “I hope this will motivate you to read on your own.”

It is a reminder of the choices that have followed the students through the semester: computer or homework? Immediate gratification or investing in the future?

Mr. Reilly hopes that the two can meet — that computers can be combined with education to better engage students and can give them technical skills without compromising deep analytical thought.

But in Vishal’s case, computers and schoolwork seem more and more to be mutually exclusive. Ms. Blondel says that Vishal, after a decent start to the school year, has fallen into bad habits. In October, he turned in weeks late, for example, a short essay based on the first few chapters of “The Things They Carried.” His grade at that point, she says, tracks around a D.

For his part, Vishal says he is investing himself more in his filmmaking, accelerating work with his cousin on their music video project. But he is also using Facebook late at night and surfing for videos on YouTube. The evidence of the shift comes in a string of Facebook updates.

Saturday, 11:55 p.m.: “Editing, editing, editing”

Sunday, 3:55 p.m.: “8+ hours of shooting, 8+ hours of editing. All for just a three-minute scene. Mind = Dead.”

Sunday, 11:00 p.m.: “Fun day, finally got to spend a day relaxing... now about that homework...”

Progressive Disappointment with Obama

November 21, 2010, 2:07 am
by Paul Krugman

FDR, Reagan, and Obama

Some readers may recall that back during the Democratic primary Barack Obama shocked many progressives by praising Ronald Reagan as someone who brought America a “sense of dynamism and entrepreneurship that had been missing.” I was among those who found this deeply troubling — because the idea that Reagan brought a transformation in American dynamism is a right-wing myth, not borne out by the facts. (There was a surge in productivity and innovation — but it happened in the 90s, under Clinton, not under Reagan.)

All the usual suspects pooh-poohed these concerns; it was ridiculous, they said, to think of Obama as a captive of right-wing mythology.

But are you so sure about that now?

And here’s this, from Thomas Ferguson: Obama saying

We didn’t actually, I think, do what Franklin Delano Roosevelt did, which was basically wait for six months until the thing had gotten so bad that it became an easier sell politically because we thought that was irresponsible. We had to act quickly.

As Ferguson explains, this is a right-wing smear. What actually happened was that during the interregnum between the 1932 election and the 1933 inauguration — which was much longer then, because the inauguration didn’t take place until March — Herbert Hoover tried to rope FDR into maintaining his policies, including rigid adherence to the gold standard and fiscal austerity. FDR declined to be part of this.

But Obama buys the right-wing smear.

More and more, it’s becoming clear that progressives who had their hearts set on Obama were engaged in a huge act of self-delusion. Once you got past the soaring rhetoric you noticed, if you actually paid attention to what he said, that he largely accepted the conservative storyline, a view of the world, including a mythological history, that bears little resemblance to the facts.

And confronted with a situation utterly at odds with that storyline … he stayed with the myth.

Obama and Job

Essay
Obama and the Book of Job
By JON MEACHAM
Unfortunately for the powerful, the plight of the biblical Job is a story with perennial resonance. A man seemingly rich in the gifts life has to offer, happy and blessed, finds himself — unjustly, from his perspective — bereft. Protected and apparently invincible one day, he is buffeted as God turns his back on his former beloved, producing rage, confusion and self-pity. In the history of the American presidency, reversal happens time and time again: Lyndon Johnson declining to run four years after his landslide victory in 1964, George H. W. Bush losing re-election after winning the Persian Gulf war of 1991, Bill Clinton in the 1994 midterms, and now Barack Obama.


The connection between the trials of Job and the president’s midterm rebuke came to mind as I read what I think is the political book of the season: THE WISDOM BOOKS: Job, Proverbs, and Ecclesiastes (Norton, $35), a new translation and commentary by Robert Alter. A master translator of Hebrew poetry, Alter previously rendered the Pentateuch and the Psalms into affecting contemporary English, and has now turned his attention to what he calls the most mysterious books in the Hebrew Bible.

The Wisdom books have a “distinctive identity” in the context of the narrative of Scripture; there is, Alter writes, “little . . . that is specifically Israelite.” Wisdom literature concerns itself with questions both existential and moral: What is the nature of life, and how are we to conduct ourselves as we go about living? These books are linked to a broader, more universalist tradition in the ancient Near East.

Job and Ecclesiastes are especially atypical, for they are philosophically bleak, asking unanswerable questions. In these books God is great, but he is not necessarily good. Why do the innocent suffer and die? Why are some rich, and others poor? Why are some hearts full, and others perpetually broken? The replies are hardly the stuff of Sunday school lessons. As Job says at one point:


Man born of woman, scant of days and sated with trouble,
like a blossom he comes forth and withers, and flees like a shadow — he will not stay.

The texts make for illuminating reading in a season of widespread economic pain and political upheaval. They should assuage the gloom of the defeated and temper the joy of the victors. “All is mere breath,” says the narrator of Ecclesiastes, adding, “That which was is that which will be, and that which was done is that which will be done, and there is nothing new under the sun.”

Outside politics, President Obama thinks of himself less as a professor or community organizer and more as a writer — a man who observes reality, interprets it internally, and then recasts it on the page in his own voice and through his own eyes. And he is a reader of serious books.

Given that, he might find Alter’s new book congenial. John Boehner is not exactly a case of boils, but the president may feel differently at the moment, and thus the story of Job could be of some use to him.

Like Obama, Job was once the highly favored one:


Would that I were as in moons of yore, as the days when God watched over me,
when he shined his lamp over my head. . . .

But the Lord withdraws his protection, inflicting pain and death and misery on Job, who cries:


Terror rolls over me, pursues my path like the wind. . . .
At night my limbs are pierced, and my sinews know no rest.
With great power he seizes my garment, grabs hold of me at the collar.
He hurls me into the muck, and I become like dust and ashes.

God is having none of it. He will not be questioned by a mortal, even a mortal whom he once loved and who has honored him. Fairly snarling, the Lord taunts Job from a whirlwind: “Where were you when I founded earth? / Tell, if you know understanding.”

Four brilliant, contemptuous chapters of the poetry of power follow this sneering query — or, more precisely, the poetry of God’s power and man’s powerlessness. They are humbling verses, exhausting even. The reader feels berated and beaten, the victim of a mighty torrent from a boastful, cold, imperious God. (This is how Dick Cheney’s vision of unfettered executive power might sound if rendered in ancient Hebrew verse: The Unilateralist in the Whirlwind.)

Job finally surrenders — he has no other choice — and humbles himself, recanting his challenge and repenting in “dust and ashes.” With that, God tries to make amends with gifts of livestock and new children, and there is a telling line in this bittersweet ending. “And all his male and female kinfolk and all who had known him before came and broke bread with him in his house and grieved with him and comforted him for all the harm that the Lord had brought on him.”

Commiseration and communion, however fleeting, are thus given their place in the human enterprise, a reminder that life on this side of the grave is ultimately redeemable (and endurable) only through alliance and affection.

The ethos of resignation that pervades Alter’s translation is hardly cheering. Ecclesiastes (Alter uses the Hebrew title Qohelet, from the root q-h-l, which means “to assemble,” as in the assembling of an audience to hear a philosophical discourse) in particular is all too eloquent and convincing on the question of the provisional nature of life, advising its readers to take comfort in the pleasures of the senses: “There is nothing better for a man than to eat and drink and sate himself with good things through his toil.” We associate such views more with pagan writers than with biblical ones. Alter is unsparing in interpreting verses in all their matter-of-factness about the limits of the human condition.

The Wisdom books force readers to face uncomfortable truths. “There is no remembrance of the first things nor of the last things that will be,” says Ecclesiastes. In a footnote, Alter observes: “This is a radical and deeply disturbing idea for the Hebrew imagination, which, on the evidence of many earlier texts, sets such great store in leaving a remembrance, and envisages the wiping out of remembrance as an ultimate curse.”

And yet, and yet. All is not lost, which should give the president some hope amid the shadows, and should keep the Republicans from thinking that their own course will now be unimpeded. “And I saw that wisdom surpasses folly as light surpasses darkness,” says Ecclesiastes. “The wise man has eyes in his head, and the fool goes in darkness.” The world will never bend itself totally to our purposes, but Job’s example offers us some hope: endure in tribulation, and perhaps all may be well.


Jon Meacham, whose book “American Lion: Andrew Jackson in the White House” won a Pulitzer Prize last year, is writing a biography of Thomas Jefferson. In January, he will become the executive editor at Random House.

Saturday, November 20, 2010

Another Good Palin Story

Palin's Allergy To Expertise
19 Nov 2010 08:31 pm
Real Clear Politics reports that "SarahPAC recently hired Joshua Livestro – a Dutch newspaper columnist who has also contributed to the pro-Palin web site Conservatives4Palin.com – to research [the European debt crisis] for Palin on a freelance basis." Frum is amazed:

[I]t’s hard for me to imagine any expert in any subject who wouldn’t feel it an imperative public duty to talk to Gov. Palin if asked. Instead she turns to a journalist with no formal training in economics and no experience in public finance. ... [W]hy is a potential president relying for economic advice on freelance journalists rather than Nobel Prizewinners? It’s almost as if Gov. Palin finds the idea of expertise – not merely incomprehensible – but actively repugnant.

Well, there's always the danger that someone with more knowledge than she might actually disagree with her. And we can't have that.

Excerpts from the Autobiography of Mark Twain

Excerpts From the ‘Autobiography of Mark Twain’
Published: July 9, 2010



One hundred years after his death, the University of California Press is publishing the “Autobiography of Mark Twain: The Complete and Authoritative Edition” in a series of three volumes, edited by Harriet Elinor Smith and the editors of the Mark Twain Project. It will be the first time the entire text has appeared in print. Here are some of Twain’s spicier comments, all drawn from Volume 1.

ON THEODORE ROOSEVELT

“Theodore Roosevelt is one of the most impulsive men in existence ... He flies from one thing to another with incredible dispatch — throws a somersault and is straightaway back again where he was last week. He will then throw some more somersaults and nobody can foretell where he is finally going to land after the series. Each act of his, and each opinion expressed, is likely to abolish or controvert some previous act or expressed opinion. That is what is happening to him all the time as president.”

ON THE MEANING OF THANKSGIVING

“Thanksgiving Day, a function which originated in New England two or three centuries ago when those people recognized that they really had something to be thankful for — annually, not oftener — if they had succeeded in exterminating their neighbors, the Indians, during the previous twelve months instead of getting exterminated by their neighbors the Indians. Thanksgiving Day became a habit, for the reason that in the course of time, as the years drifted on, it was perceived that the exterminating had ceased to be mutual and was all on the white man’s side, consequently on the Lord’s side, consequently it was proper to thank the Lord for it.”

ON THE AMERICAN BUSINESS CLASS

“The multimillionaire disciples of Jay Gould — that man who in his brief life rotted the commercial morals of this nation and left them stinking when he died — have quite completely transformed our people from a nation with pretty high and respectable ideals to just the opposite of that; that our people have no ideals now that are worthy of consideration; that our Christianity which we have always been so proud of — not to say vain of — is now nothing but a shell, a sham, a hypocrisy; that we have lost our ancient sympathy with oppressed peoples struggling for life and liberty; that when we are not coldly indifferent to such things we sneer at them, and that the sneer is about the only expression the newspapers and the nation deal in with regard to such things.”

Friday, November 19, 2010

Racist Palin

I have always said and continue to say that the core of the contemporary Republican Party is race. The Republican Party is the party of Confederate whites. The majority are, unfortunately, in the South. Palin is the whitest of the white.


Posted: November 18, 2010
by Geoffrey Dunn
From the Huffington Post

Sarah Palin Slams Michelle Obama in Racially Charged Passage From New Book



In passages leaked from her forthcoming book America by Heart, Sarah Palin -- the erstwhile quitter governor of Alaska, who now, by all indications, fancies herself as President of the United States -- has taken another cheap shot at First Lady Michelle Obama.

In a passage on perceptions of racial inequality in the United States, Palin slams President Barack Obama, who, she asserts, "seems to believe" that "America -- at least America as it currently exists -- is a fundamentally unjust and unequal country."

And then she goes after Michelle Obama:

Certainly his wife expressed this view when she said during the 2008 campaign that she had never felt proud of her country until her husband started winning elections. In retrospect, I guess this shouldn't surprise us, since both of them spent almost two decades in the pews of the Reverend Jeremiah Wright's church listening to his rants against America and white people.
The passage -- coming on page 26 in a chapter entitled "We, the People" -- echoes remarks made by Palin on the eve of the midterm elections, at a rally in San Jose, California, at which she mocked remarks made by Michelle Obama during the 2008 campaign: "You know, when I hear people say, or had said during the campaign that they've never been proud of America," Palin spat out. "Haven't they met anybody in uniform yet? I get tears in my eyes when I see that young man, that young woman, walking through the airport in uniform...you too... so proud to be American."

In fact, Michelle Obama's remarks were made (in Madison, Wisconsin, during the 2008 campaign) in a context of Americans being "unified around some basic common issues":

What we have learned over this year is that hope is making a comeback. It is making a comeback. And let me tell you something--for the first time in my adult lifetime, I am really proud of my country. And not just because Barack has done well, but because I think people are hungry for change. And I have been desperate to see our country moving in that direction, and just not feeling so alone in my frustration and disappointment. I've seen people who are hungry to be unified around some basic common issues, and it's made me proud.
Afterwards, the First Lady further clarified her remarks by noting that she was referencing the "record number" of young voters participating in the political process in the 2008 campaign:

For the first time in my lifetime, I am seeing people rolling up their sleeves in way that I haven't seen and really trying to figure this out, and that's the source of pride I was talking about.
The passages from Palin's latest book first appeared at Palingates, where several other pages from America by Heart have also been posted. Palin followed up her comments about Michelle Obama by throwing an elbow at U.S. Attorney General Eric Holder, again with racial overtones:

It also makes sense, then, that the man President Obama made his attorney general, Eric Holder, would call us a "nation of cowards" for failing to come to grips with what he described as the persistence of racism.

Wednesday, November 17, 2010

Republican Delusions

Republican delusions continue, and the electorate falls for them every time.



The Cycle Of Conservative Budget Delusion

Jonathan Chait
Senior Editor
The New Republic


Michael Tanner flays Republicans for the meagerness of their budget savings:

[S]ince the election, Republican leaders have been busy “clarifying” that promise. It now seems that they didn’t actually mean that they would roll back all federal spending to 2008 levels, just domestic, discretionary spending. Entitlement programs such as Social Security and Medicare are off the table, as is defense spending. Homeland security and veterans’ programs would also be spared any cuts. Removing those categories, along with interest on the debt, exempts 83 percent of the budget from any serious cuts. Rolling back the remaining spending to 2008 levels would save barely $100 billion, 2.8 percent of federal spending. That’s a drop in the bucket compared with our $1.3 trillion deficit.
Tanner omits any mention of the fact that Republican-endorsed tax cuts would increase the deficit by far more than the spending cuts would reduce it, in keeping with the conservative movement's policy of refusing to acknowledge the fiscal effects of tax cuts. Still, Tanner's analysis is correct as far as it goes -- the Republican plan to confine spending cuts to domestic discretionary spending is indeed pathetically small in proportion to the scale of the deficit.

But then Tanner concludes with a political warning:

In the run-up to the election, Republicans constantly reassured voters that they understood how they had “lost their way” during the Bush era. If we gave them one more chance, they would leave their big-spending days behind them. Faced with the fiscal nightmare of the Obama-Reid-Pelosi agenda, voters reluctantly gave the car keys back to the Republicans. It’s very early, of course, but if Republicans hope to earn and keep that trust, they are going to have to demonstrate that they are a lot more serious about cutting spending than they have shown us thus far.
This is standard right-wing budget doggerel: Republican politicians have lost their way, and they must hew to the right-wing path or be turned out by the voters.

It fails to explain why, save the eternal blandishments of Washington, Republicans would settle for such meager budget savings in defiance of their political self-interest. Could it be that voters do not want to cut actual government programs? Why yes, it could. From Eric Cantor's memo to Congressional Republicans:

Fast Fact: Over two-thirds of Republican voters believe the budget can be balanced without reducing spending on Social Security or Medicare.
Right. They're totally misinformed about this. Moreover, they believe this in part because Republicans spent two years attacking Democrats for threatening entitlements.

Anyway, Republicans have the same dilemma as before. They can slash entitlement spending and incur the wrath of the voters. Or they can fail to address the deficit, or -- more likely -- make the deficit worse by cutting taxes. And then the conservative movement can explain that they failed because they lost their way, and the cycle can continue.

How the South Became Republican

Wednesday, Nov 17, 2010 07:01 ET
Dittoheads, race and denial
When it comes to civil rights history, Rush and his fans suddenly like to hide behind ... liberal Republicans
By Steve Kornacki

On Tuesday, I learned that there is a circumstance under which right-wingers are happy to align themselves with RINOs: in hindsight, over the issue of civil rights.

I discovered this almost by accident, after including in a post about Rush Limbaugh's confused racial paranoia -- he can't seem to decide whether powerful white Democrats are conspiring to stifle the ambitions of their black counterparts or if they've been intimidated by political correctness into excessive deference -- a line in which I characterized the Republican Party as "an overwhelmingly white party that owes its strength to decades of growth in the South, a result of its 1964 decision to reject the Civil Rights Act."

It's not exactly a revolutionary claim, given the crucial role the South has played in the rise of modern conservatism and the degree to which the region has been remade as a Republican bastion -- a reality driven home by this month's midterms, which left the GOP, as the New York Times put it, "at a stronger position in the South than at any time since Reconstruction." The Civil Rights Act of 1964 is not the only reason for the GOP's evolving dominance in the South (nor is race in general), but it does mark a pivotal turning point in the relationship between the region and both political parties.

But, at least to judge from the e-mail I received on Tuesday, many conservatives aren't comfortable acknowledging this. By Tuesday afternoon, my in box was flooded with irate responses, virtually all of them making the exact same point: A greater percentage of Republicans than Democrats voted for the Civil Rights Act in Congress in 1964. Many of the e-mails included the exact partisan vote breakdown in both chambers. "Care to redo that rejection of the Civil Rights Act you discuss in the article?" one correspondent asked.

Through a quick Google search, I discovered that my piece had been linked at Lucianne.com, the right-wing news forum created a decade ago by Lucianne Goldberg (the same woman who infamously prodded Linda Tripp to rat out Monica Lewinsky to Kenneth Starr). The comments, just like the e-mails I was receiving, were filled with reminders of the partisan breakdown of the '64 vote. Who was this guy Kornacki claiming that the modern Republican Party had capitalized on the South's resistance to civil rights? "[T]he Democrats actually filibustered against the law," one typical comment read. "Simple fact is that the Republicans passed the Civil Rights Act. Only left wing biased historians keep lying about the facts."

A little more searching revealed that Limbaugh himself has been sounding the same theme. "All these were Democrats in the civil rights days that opposed civil rights," he said in a broadcast this past August. "They were Democrats. Bull Connor, Democrat. Lester Maddox, Democrat. George Wallace, Democrat. J. William Fulbright, Democrat." So have other prominent voices on the right. "As with the 1957 and 1960 civil rights acts, it was Republicans who passed the 1964 Civil Rights Act by huge majorities," Ann Coulter wrote back in May. "A distinctly smaller majority of Democrats voted for it."

What Rush and Ann (and the folks who e-mailed me and filled up the comments section at Lucianne.com) don't mention, of course, is that all of the Democrats who voted against civil rights back in the '60s were conservative white Southerners, while most of the Republicans who voted for civil rights were moderate-to-liberal Northerners -- and that in the years following the '64 vote, each of these groups migrated to the other party.

I've written about this history before, but it's worth going over once more. During the "radical Reconstruction" that followed the Civil War, liberal national Republicans installed liberal GOP governments throughout the South, stirring intense feelings of resentment, humiliation and resistance among most Southern whites. When Rutherford B. Hayes, convinced that these whites would be more cooperative if they were left to control their own affairs, ended Reconstruction in 1877, one Republican state government after another fell. Virtually every Southern white voter lined up with the Democratic Party, simply because the GOP had been the party of Reconstruction. Jim Crow laws were quickly enacted, and decades of segregation ensued.

In all this time, Northern Democrats were mostly content to look the other way, seeing the South as a crucial source of electoral support within Congress and in presidential elections. The overwhelming margins that Democratic White House candidates routinely ran up in the region can't be emphasized enough. In 1916, for instance, Woodrow Wilson won 85 percent of the vote in Louisiana, 92 percent in Mississippi and 96 percent in South Carolina; nationally, his share was 49 percent. In 1936, FDR took 98.5 percent in South Carolina and 97 percent in Mississippi. Even Adlai Stevenson, while getting crushed nationally by Dwight Eisenhower, took 70 percent in Georgia in 1952. For all intents and purposes, the Republican Party didn't exist in the South.

Civil rights changed this. The impetus was the Great Migration, the movement of millions of African-Americans away from the South and to the urban centers of the North and West Coast -- places where there were no Jim Crow laws to keep them from voting. Suddenly, civil rights became a priority for the Democratic Party's urban bosses; if they didn't go to bat for these new voters, they'd risk losing their grip on power -- especially with the rival Republican Party, then filled with educated liberal Northerners, compiling a much better record on the issue. (It was Republican Theodore Roosevelt, remember, who invited Booker T. Washington to dinner at the White House in 1901.) No longer could Northern Democrats ignore and abide Jim Crow.

In 1948, Democrats, urged on by an ambitious Minneapolis mayor named Hubert Humphrey, adopted a civil rights plank at their convention. Southern delegates promptly walked out and rallied around South Carolina's Democratic governor, Strom Thurmond, who won four Southern states (and 2.4 percent of the national popular vote) running as a "Dixiecrat" in November. Democrats managed to tamp down the South's anger for the next decade or so, but by '64 the issue could wait no longer. The party's officeholders and voters outside the South uniformly supported the sweeping civil rights bill working its way through Congress. President Lyndon Johnson, who had pushed through a watered-down civil rights bill as the Senate's majority leader back in 1957, did too. When he put his signature on the new law, LBJ supposedly noted that he'd just signed away the South for a generation.

At the same time, the Republican Party, which had racked up a largely admirable record on civil rights, took a sharp turn to the right, nominating for president Barry Goldwater, an Arizona senator who had supported Southern Democrats in their effort to filibuster the civil rights bill. The new law, Goldwater argued, would create a "police state" and foster an "informer" mentality among citizens. By all measures, Goldwater was no racist when it came to his personal views, but in 1964 he made common cause with every white Southerner who wanted to deny African-Americans federally guaranteed equality. These white Southerners, as noted above, had voted Democratic for their whole lives. But in the fall of '64, they rejected LBJ and embraced Goldwater in stunning numbers. In Mississippi, Goldwater claimed 87 percent of the vote. In Alabama, nearly 70 percent. In South Carolina, nearly 60 percent. Mind you, Goldwater only received 38 percent nationally -- one of the worst-ever showings for a major party candidate. He was roundly rejected in every corner of the country. The only voters who liked him? White Southern "Democrats."

Just consider Time magazine's state-by-state handicapping of the race a few weeks before Election Day '64. Mississippi? "Barry's anti-civil rights vote makes him an all but certain winner," the magazine (correctly) predicted. Louisiana? "Barry has been slipping, but the big segregationist vote north of New Orleans should put him over." (It did.) Here's how Time analyzed the politics of the "Democratic" South in the wake of the Civil Rights vote:


[Goldwater is] abloom in the South. Florida's Democratic candidate for Governor, Haydon Burns, said last week that he would not campaign for his party's national ticket, and added: "I expect the Republican candidate will have strong support in Florida." Louisiana's Democratic Governor John McKeithen admits that he may well decide to back Barry. The recent Mississippi Democratic convention was filled with pro-Goldwater sentiment. Georgia's Democratic Senators Richard Russell and Herman Talmadge both predict privately that today Barry could carry their state. Pollster Sam Lubell discovered last week that Goldwater is, as of now, running ahead of Johnson in Florida, Virginia, South Carolina and North Carolina. In Texas, Lubell found Lyndon holding an uneasy lead that could quickly vanish under the pressure of civil rights troubles.


Goldwater ended up carrying five Southern states in '64. The election marked the beginning of the region's steady, decades-long shift to the Republican Party. Think of Richard Nixon's "Southern strategy" (which a future Republican national chairman would apologize for) and Ronald Reagan's decision to tout "states' rights" as he launched his 1980 general election campaign in Philadelphia, Miss. (where three civil rights workers had been murdered just 16 years earlier). It didn't happen overnight, but Southern states began routinely voting for GOP candidates for president (with blips in 1976 and 1992, when the Democrats nominated Southerners), and Democratic officeholders began defecting to the Republican Party (although plenty of entrenched conservative Democratic officials finished out their careers with the party).

As the New York Times noted after this month's midterms, just about the only Democrats left in Congress from the old Confederacy represent majority black districts that were created by the Voting Rights Act after the 1990 census. Otherwise, as the paper put it, "Southern white Democrats in Congress have become as rare as a Dixie blizzard." In effect, the region's white voters have gone from uniformly Democratic to uniformly Republican. And it's obvious when and why the transformation really took hold.

So, yes, Rush, it's true: Plenty of Republicans did support civil rights back in 1964. They were moderates and liberals with names like Margaret Chase Smith, George Aiken, Clifford Case and Jacob Javits -- and as the GOP became an ideologically right-wing, Southern-dominated party they were, in many cases, driven out for being RINOs (Case and Javits both saw their careers end in Republican primaries, as the "New Right" targeted them in its purge campaign of the late '70s). The voters who'd elected them began to favor Democrats more regularly, a response to the GOP's embrace of Southern-style conservatism.

And, yes, Rush, it's also true that plenty of Democrats did oppose the Civil Rights Act in '64. It's just that they became Republicans (or started voting like Republicans) as soon as it became law.

For what it’s worth, I responded to one of my e-mail interlocutors on Tuesday with a (very truncated) version of this history. A few hours later, I received this response:


Was Martin Luther King one of those Southern Republicans? Tell me just what the Liberal Democrats have done for African Americans other than keeping them on the Plantation for voting purposes?

Tuesday, November 16, 2010

What If the South Had Been Allowed to Secede?

The Chronicle Review, October 3, 2010

The Civil War at 150

Abraham Lincoln's election in 1860 precipitated secession, which led to the Civil War. The sesquicentennial of that event, on November 6, marks a period of commemoration, with a cavalcade of new books on the topic.

By Louis P. Masur

The sesquicentennial of the Civil War (1861-65) is nearly upon us. Lincoln's bicentennial, in February 2009, generated scores of celebrations and dozens of books. But that was only a single day. It is safe to say that for the next four years, we will be inundated with reflections and publications.

Two new books and an exhibition offer the opening salvo in what will be a continuing barrage. From 2011 to 2015, major battles and events will be commemorated: Fort Sumter, Bull Run, Antietam, the Emancipation Proclamation, Gettysburg, Vicksburg, Lincoln's re-election, Appomattox. No list is complete. What about the abolition of slavery in Washington, D.C., in 1862, or the battles of Grant's Overland Campaign in Virginia during 1864? Such is the history of the Civil War that small moments gather attention and accrue meaning: three cigars wrapped in Lee's battle orders discovered in 1862 by Union soldiers in a field in Maryland; the great locomotive chase, or military raid, in Georgia that same year; a riot over food shortages in Richmond in 1863. Of course, events will be memorialized differently North and South. In that way, memories of the war will serve to perpetuate the crisis.

Perhaps no event in American history has invited more speculation about whether it could have been avoided, or turned out differently, than the Civil War. It is an intriguing thought experiment to pose questions: What if Lincoln had acquiesced in Southern secession? What if a settlement assuring the perpetuity of slavery through constitutional amendment had been reached in the winter of 1860-61? What if some general at any one of a half-dozen battles had managed to decimate the enemy army? But ultimately such "what if" questions tell us nothing about what was.

Causation is nearly as nettlesome a problem as contingency. One can no more know exactly what caused an event as complex as the Civil War than whether it could have been avoided. That is not to say key factors cannot be isolated: Slavery caused the Civil War—but in what ways? Disagreements over sovereignty and constitutional authority caused the Civil War—but how? Northerners and Southerners saw themselves as different—but why did those differences turn lethal?

Certainly, Lincoln's election in 1860 precipitated secession, which resulted in war, and the sesquicentennial of that event, on November 6, truly marks the beginning of the forthcoming cycle of commemoration. Douglas R. Egerton's Year of Meteors: Stephen Douglas, Abraham Lincoln, and the Election That Brought on the Civil War (Bloomsbury Press, out this month) offers a thorough analysis. The contest featured four candidates: John C. Breckinridge, of Kentucky, nominee of the Southern Democrats; Stephen A. Douglas, of Illinois, candidate of the Northern Democrats; John Bell, of Tennessee, representing the Constitutional Union Party; and, of course, Abraham Lincoln, of the Republican Party, whose very nomination entices us into playing the counterfactual game: What if the Republican convention had not been held in Lincoln's home state, in Chicago, a site chosen over St. Louis by one vote? Egerton does not speculate about what might have occurred had the convention been held in Missouri, but it certainly would have boosted the chances of Edward Bates, who had lived there since before the territory became a state.

From the start of the convention season, extreme secessionists like Robert Barnwell Rhett, of South Carolina, and William Lowndes Yancey, of Alabama, schemed against the expected nomination of Douglas. Egerton emphasizes their "conspiracy" to divide the Democratic Party, enable the Republicans to win, and then lead their states out of the Union. Historians have long noted the political machinations of the most conservative Democrats, but Egerton's focus on a conspiracy goes further. Indeed, he makes too much of it. After all, both men had been outspoken for some time about their desire to create an independent confederacy.

In 1850, Rhett and Yancey had helped organize a convention at Nashville to discuss measures to be taken should Congress ban slavery in the new territories acquired from Mexico. Ten years later—despite a strengthened Fugitive Slave Act, and the Kansas-Nebraska Act, which repealed the 34-year-old prohibition on slavery in the territories north of 36 degrees 30 minutes latitude, and the U.S. Supreme Court's Dred Scott decision, which strengthened slavery's constitutional imprimatur—secessionists had new reasons to fear for slavery. For them, Douglas committed political suicide when he opposed admitting Kansas to statehood in 1858. More heinous was John Brown's raid, on October 16, 1859, on the federal arsenal at Harpers Ferry, Va.

The title of Egerton's book comes from a Walt Whitman poem that touches on the abolitionist and a meteor that fell across Eastern skies in 1859. Herman Melville also wrote a poem about Brown, in which he called him "the meteor of the war." Not to be outdone, Thoreau labeled his life "meteorlike." Executed on December 2, Brown became a martyr to the abolitionist cause, an ominous enough sign for Southerners.

Coming less than five months before the Democratic National Convention, in Charleston, S.C., Brown's raid and execution, as Egerton astutely notes, altered the political landscape. On the Democratic side, it gave fuel to the secessionists and further animated their fears of the Republican Party, whose front-runner for the nomination, William H. Seward, had once suggested that there was a "higher law" than the Constitution and had declared the North-South struggle an "irrepressible conflict."

In a three-hour speech delivered to the Senate after Brown's execution, Seward did try to reposition himself as a moderate, playing to conservatives North and South by denouncing "unconstitutional aggression against slavery." But in the excitement at Chicago, his candidacy sputtered, and on the third ballot, Lincoln, fresh, well spoken, and from a region Republicans needed to win (the results of 1856 showed they had to carry Illinois or Indiana as well as Pennsylvania), was nominated.

Parsing the results of the election has been a favorite exercise of historians ever since, given the significance of Lincoln's becoming president, and the permutations of possible outcomes. Lincoln took 39.82 percent of the popular vote, Douglas 29.46 percent, Breckinridge 18.10 percent, and Bell 12.62 percent. Lincoln won 180 electoral votes, 28 more than needed to claim victory. Combined, his opponents garnered 123 electoral votes. Lincoln carried every Northern state except New Jersey, which he split with Douglas, who also won only Missouri. Bell took Kentucky, Tennessee, and Virginia. Breckinridge carried the remainder of the South. (Lincoln was on the ballot in only five slave states.)

Egerton argues that under almost any hypothetical scenario—a unified Democratic ticket, the Constitutional Union Party's not running a candidate, Douglas's carrying all of Breckinridge's states, all of Bell's, all of New Jersey, and even Illinois—Democrats would not have won. If so, the actions of Southern extremists did not matter in bringing a Republican to the White House.

In Eric Foner's illuminating study of Lincoln and slavery, The Fiery Trial: Abraham Lincoln and American Slavery (W.W. Norton, also October), we leave behind the counterfactual quicksand of Lincoln's election for the firmer, though still unsettled, ground of his attitudes toward slavery. Two days after South Carolina seceded, Lincoln asked Alexander H. Stephens, soon to become vice president of the Confederacy, "Do the people of the South really entertain fears that a Republican administration would, directly, or indirectly, interfere with their slaves, or with them, about their slaves?" Although Lincoln asserted "there is no cause for such fears," Southerners, with John Brown on their minds, were not to be persuaded.

But, as Foner makes clear, the Lincoln who became president posed little threat to the institution of slavery. While antislavery, he was no abolitionist, and he held fast to the belief that slavery in the states where it already existed was a local decision. Like most of his generation, he was no racial egalitarian, and while he believed that all people had the right to the fruits of their labor, he did not envision black people as his equal. For most of his life, he advocated schemes of colonization to expatriate them from the United States. "What I would most desire," he said in 1858, "would be the separation of the white and black races."

Lincoln, however, changed. One of the pleasures of Foner's book is watching a professional historian become enamored of the 16th president. Foner is far from the first to note Lincoln's development. As he acknowledges, it is something of a longstanding truism that the "hallmark of Lincoln's greatness was his capacity for growth." Although the contours of Lincoln's belief system about race and slavery are well known, Foner traces that evolution more completely than any scholar before him. He is especially acute on the president's stubborn faith in colonization, which Lincoln had held to despite objections from his cabinet, finally ceasing to advocate it publicly after issuing the Emancipation Proclamation. He came to accept the enlistment of black soldiers, after first opposing it, and, as he began to think about Reconstruction, he considered the necessity of giving black men the right to vote.

But Foner's plot line on the graph of Lincoln's changing attitudes toward race and slavery is a bit too steep in its upward trajectory. "He began, during the last two years of the war, to imagine an interracial future for the United States," declares Foner. That may be claiming too much. In July 1863, Lincoln wrote to a general in Missouri about the state's plan to enact gradual emancipation starting in 1870 for children who would be freed at age 21. "I believe some plan, substantially being gradual emancipation, would be better for both white and black," he said. Lincoln had no problem with the proposed ending date, "but I am sorry the beginning should have been postponed for seven years, leaving all that time to agitate for the repeal of the whole thing." Six months after the Emancipation Proclamation, he was still willing to allow some of the enslaved to die in slavery.

Yet, as Foner acknowledges, we must also understand Lincoln's fears, pressures, and anxieties. Despite victories at Gettysburg and Vicksburg, the Union war effort had seemed to stall. In August 1864, he was convinced he would not be re-elected, that all would go for naught, not only emancipation but also the Union whose preservation had led him toward emancipation to begin with. What we need to remember is that if Lincoln's election in 1860 initiated the crisis that led to war, it was his re-election, in 1864, that finally provided the mandate that would lead to passage of the 13th Amendment, the end of armed conflict, and the beginning of Reconstruction.

Egerton and Foner offer focused political and intellectual histories. Other sesquicentennial studies to look at, recent and forthcoming, provide sweeping accounts of various Civil War themes: strategy (Donald Stoker, The Grand Design: Strategy and the U.S. Civil War, out in July from Oxford University Press); religion (George C. Rable, God's Almost Chosen Peoples: A Religious History of the American Civil War, coming in November from the University of North Carolina Press); society (David Goldfield, America Aflame: How the Civil War Created a Nation, due in March from Bloomsbury Press); and nationhood (Gary W. Gallagher, The Union War, Harvard University Press, coming in April).

In addition, public history has already begun to play a prominent role in commemorating the war. An exhibition at the National Archives and Records Administration, "Discovering the Civil War," presents a probing, hands-on experience. Part I, "Beginnings," opened in April, and Part II, "Consequences," opens in November.

The curators have figured out new ways to make written and graphic documents exciting. To walk through the exhibition, which focuses on the everyday lives of Americans, is to encounter a steady stream of interactive exhibits. Tag clouds ask questions about the document under review. For example, a petition from a group of women to the Confederate Secretary of War raises the question: Why not let women fight? Video kiosks allow visitors to follow a story based on what aspects most interest them.

My favorite exhibit is "Finding Leaders," which draws on social media to explore the relationships among various Union and Confederate officers. Each soldier has a Facebook page that lists friends, events, and documents. Click on a friend and discover how the two knew each other and the battles in which they fought. I left only under the pressure of two texting teenagers waiting their turn.

If "Discovering the Civil War" is any indication, interest in the schism remains high. On a sultry summer day in Washington, lines extended onto Constitution Avenue. It has been 20 years since Ken Burns's The Civil War first appeared on public television, viewed by tens of millions of people. A new generation has come of age without any similar educational experience about the causes or consequences of a war that commands attention unlike any other event in American history.

One hundred and fifty years later, it is more relevant than ever. Today's Tea Party candidates prefer to see themselves as Revolutionary in origin, but their platform—disdain for the federal government, preference for state and local control, opposition to taxes, often a desire for racial homogeneity—resembles that of the secessionists. The 19th-century rebels themselves made the analogy: "The tea has been thrown overboard; the revolution of 1860 has been initiated," declared the Charleston Mercury when South Carolina seceded.

That our two main political parties have switched sides ideologically since 1860 was largely the result of developments and realignments culminating with the election of Franklin Delano Roosevelt and then the fundamental shift of white Southerners' partisan allegiance in the wake of the civil-rights measures under Lyndon Baines Johnson. It is fitting, perhaps, that Barack Obama has been compared time and again to Lincoln. Obama does not face the breakup of the United States, but he does face rogue states (e.g., Arizona and its immigration law), racial ideologues (Google the name "Tom Tancredo," a former representative from Colorado), and incipient secessionists (Gov. Rick Perry has suggested that Texas could secede). Tea Party enthusiasts denounce Obama with nearly the same fervor with which secessionists denounced Lincoln; like the secessionists, they, too, are supported by a small but vocal and affluent group of predominantly white, middle-aged men.

That is not to say that we are headed for anything more than a continuing culture war, albeit one that has serious consequences for the lives of people. And, to paraphrase William James, analogies leak at all the joints. But then this is not simply an analogy—it represents a historical development that springs directly from the era of the Civil War.

In 1873, Mark Twain and Charles Dudley Warner observed that the Civil War had "uprooted institutions that were centuries old, changed the politics of a people, transformed the social life of half the country, and wrought so profoundly upon the entire national character that the influence cannot be measured short of two or three generations." The sesquicentennial will provide a continuing opportunity to try to fathom those changes and to understand how the nation is still challenged by forces unleashed in those uncompromising years.

Louis P. Masur is chair of American studies at Trinity College in Connecticut and author of The Civil War: A Concise History, forthcoming from Oxford University Press.