Sunday, June 30, 2013

Joseph J. Ellis - Revolutionary Summer

The summer of 1776 was the revolutionary summer in the incipient United States.  In this thrilling example of popular narrative history, historian Joseph Ellis tells the story.  Ellis is my favorite early American historian.

Though the war touched off in 1775, it began in earnest in 1776, when George Washington took his ragtag Continental Army to New York to meet the impending British invasion.  New York City turned out to be indefensible, and Washington was warned that this was so, but he and others thought it had to be defended anyway.

Ellis's framework thesis is that the events in Philadelphia in the summer of 1776 must be considered in conjunction with the unfolding events in New York in that same summer as General Washington prepared to meet the expected British invasion.

The Continental Army was routed in New York.  Washington and his men barely escaped to fight another day: first from Long Island to Manhattan, and then from Manhattan to the mainland and up to White Plains.  If the Howes had pursued, they could have wiped out the Continentals.  They did not pursue because they didn't think they needed to in order to end the rebellion.  Richard Howe was playing diplomat, trying to obtain a surrender without undue bloodshed.  Good thing!  If the Continental Army had been thoroughly annihilated, would the war have ended then and there?  Historians will always debate the question.

It is obvious that Ellis likes John Adams.  As the war commenced, it was Adams more than anybody who held things together in Philadelphia as the Second Continental Congress did its business.  In effect Adams was the secretary of war.  Adams was a conservative revolutionary: dedicated to independence, but convinced the road to it had to be traveled properly and slowly.  Events moved faster than Adams could control.  There was never any doubt in Adams's mind that the colonies would succeed.  You get the feeling from Ellis that there was some kind of mystical conviction on the part of the true believers: once the rebellion reached a certain point, there was no turning back, and the Howes were deluded in thinking they could end the matter in New York.

Saturday, June 29, 2013

The White Party Will Get Even Whiter

Get Ready for More GOP Race Baiting

by Michael Tomasky Jun 28, 2013 4:45 AM EDT

Forget creating a big tent. Some Republicans want their party to simply try to win more white votes.

 You, unsuspecting citizen, probably take the view that the Republican Party is too white. It’s the conventional wisdom, after all, and last year’s election results would seem to have proven the point resoundingly. But you’re obviously not up with the newest thinking in some conservative quarters, which is that the party isn’t white enough, and that the true and only path to victory in the future is to get whiter still. Some disagree, which gives us the makings of a highly entertaining intra-GOP race war playing out as we head into 2016. But given this mad party’s recent history, which side would you bet on winning?


The situation is this. The immigration reform bill passed the Senate yesterday. It will now go to the House. A few weeks ago, as I read things, there were occasional and tepid signals that the House would not take up the Senate bill. Now, by contrast, those signals are frequent and full-throated. For example, yesterday Peter Roskam, a deputy GOP whip in the House, said this: “It is a pipe dream to think that [the Senate] bill is going to go to the floor and be voted on. The House is going to move through in a more deliberative process.”



“Deliberative process” probably means, in this case, killing the legislation. House conservatives, National Journal reports, are increasingly bullish on the idea that they may be able to persuade John Boehner to drop the whole thing.



Last December, such an outcome was supposed to mean disaster for the Republicans. But now, some say the opposite. Phyllis Schlafly and talk-radio opponents of the bill like Laura Ingraham have been saying for a while now that the party doesn’t need Latino votes, it just needs to build up the white vote. And now, they have the social science to prove it, or the “social science” to “prove” it.



Sean Trende, the conservative movement’s heavily asterisked answer to Nate Silver (that is to say, Silver got everything right, and Trende got everything wrong), came out with an analysis this week, headlined “Does GOP Have to Pass Immigration Reform?,” showing that by golly no, it doesn’t. You can jump over there yourself and study all his charts and graphs, but the long and short of it is something like this. Black turnout and Democratic support have both been unusually high in the last two elections, which is true; Democrats have been steadily losing white voters, which is also true; if you move black turnout back down to 2004-ish levels and bump up GOP margins among whites (by what strikes me as a wildly optimistic amount), you reach White Valhalla. Somehow or another, under Trende’s “racial polarization scenario,” it’ll be 2044 before the Democrats again capture 270 electoral votes. Thus is the heat of Schlafly’s rhetoric cooled and given fresh substance via the dispassionate tools of statistics.




Karl Rove says this is bunk. He wrote in The Wall Street Journal yesterday that to win the White House without more Latino support, a Republican candidate would have to equal Ronald Reagan’s 1984 total among whites, which was 63 percent. Rove thinks this unlikely—Trende thinks it’s pessimistic—and counsels some Latino reach-out (naturally, none of them ever says anything about black reach-out). The party used to listen to Rove, but most of them have zoomed well past him to the twilight zone of the far, far right.



These Republicans and the people they represent—that is, the sliver of people they care about representing—don’t want any outreach. They almost certainly won’t let a path to citizenship get through the House. And they’ll attack minorities in other ways, too. It’s been mostly civil rights advocates who’ve denounced the Supreme Court’s Voting Rights Act decision, and one can obviously see why. But trust me, that decision, as Bloomberg’s Josh Green shrewdly noted the day it came down, is a “poisoned chalice” for the GOP.



Why? Just look at what’s already happened since the decision was announced—the party is launching voter-suppression drives in six of the nine freshly liberated states. All the states, of course, are down South. These drives might “work.” But they will attract an enormous amount of negative publicity, and they’ll probably induce massive backlashes and counter-movements. This effort will lead to even greater distrust of the GOP by people of color, and it will reinforce the captive Southern-ness of the party, making it even more Southern than it already is. And Republicans won’t stop, because they can’t stop. Race baiting is their crack pipe.



And here’s the worst part of this story. If the House Republicans kill immigration reform, and Republican parties across the South double down to keep blacks from voting, then they really will need to jack up the white vote—and especially the old white vote—in a huge way to be competitive in 2016 and beyond. Well, they’re not going to do that by mailing out Lawrence Welk CDs. They’re going to run heavily divisive and racialized campaigns, worse than we’ve ever seen out of Nixon or anyone. Their only hope of victory will be to make a prophet of Trende—that is, reduce the Democrats’ share of the white vote to something in the mid- to low-30 percent range. That probably can’t happen, but there’s only one way it might. Run the most racially inflamed campaign imaginable.



That’s the near-term future we’re staring at. We can take satisfaction in the fact that it’s bad for them, but unfortunately, it’s not so good for the country.



Insect Day

We were having such a pleasant early summer afternoon, and then a few ants showed up in the master bathroom. Fudge! How could such a terrible thing happen to such nice people as us? Extermination done, we now hope for no more rude disturbances in the Hudson Castle today. Earlier today I was at Hancock Fabrics in Hoover selecting some material when I noticed a black bug on the floor at my feet and the feet of the lady helping me. I was the one who had to step on and crunch the little bugger, while the clerk, Diane, said she hated to hear bugs being crunched. Ewwwww! she said. But what else was a MAN supposed to do under the circumstances???


Friday, June 28, 2013

Supreme Court Decisions

In two 5-4 decisions, the Supreme Court ruled against DOMA, which denied federal benefits to same-sex couples, and also cleared the way for same-sex marriage in California by letting stand the lower court ruling against Proposition 8, which had banned same-sex couples there from marrying.  This is a joyous moment.

These rulings are a victory for all those who truly believe in equality, who believe we should not erect walls of discrimination that treat certain people as less than.  It is sad and disheartening that there are those in this country who want to deny rights to their brethren, who instead prefer intolerance over acceptance.  That opponents of same-sex marriage think that society will crumble if same-sex couples could marry is incomprehensible to me.

We should love all our brothers and sisters, no matter their beliefs, religion, or lifestyles.  Justice Kennedy should be thanked.

Is Anthony Kennedy 'The First Gay Justice'?

BY Bill Mears
CNN
28 June 2013

Washington (CNN) -- Justice Anthony Kennedy was among the first of his colleagues to arrive Wednesday at the U.S. Supreme Court. His chambers lit up several hours before the last-day release of monumental rulings on same-sex marriage.

There was little doubt that later that morning, this quietly powerful justice would be having a major say in the legal, political, and social path of gay rights moving forward.

And at precisely 10 a.m., Kennedy kicked off the public session with his eloquent majority ruling striking down a key part of a federal law that blocks a range of benefits for legally married gay and lesbian couples.

The Defense of Marriage Act "humiliates tens of thousands of children now being raised by same-sex couples," he said. "The law in question makes it even more difficult for the children to understand the integrity and closeness of their own family and its concord with other families in their community and in their daily lives."

It was vintage Kennedy -- a mix of sweeping rhetoric and practical legal and social considerations.

"If Bill Clinton was 'the first black president,' Anthony Kennedy has now firmly secured his place in history as 'the first gay justice,'" said Michael Dorf, a law professor at Cornell University and a former Kennedy law clerk. "Justice Kennedy makes clear that he not only accepts, but welcomes the task of writing majestic opinions affirming the dignity of gay persons and couples."

Kennedy, a moderate-conservative, is in many ways the "power broker" on the court. He shared the role of a "swing vote" with fellow centrist Sandra Day O'Connor before she retired seven years ago.

"The basic principle is, it's Justice Kennedy's world and you just live in it," said Thomas Goldstein, a private attorney who publishes the well-read SCOTUSblog.com. "Justice O'Connor, having been the most powerful woman in the world, handed the keys to him on her way out the door and said, 'Have fun.' And he took up that invitation."

The thinking goes that with four solid conservatives aligned on the right and four liberals on the left, Kennedy is the man in the middle, often able to cast the deciding vote in contentious cases, assuring his views of the law prevail.

Kennedy has crafted a powerful, if hard to define, judicial legacy -- seemingly in the forefront of every major ruling during his tenure. As an unapologetic "swing" vote, he was the key behind-the-scenes architect of the 2000 Bush v. Gore drama, and of a 1992 opinion upholding abortion rights. He has written majority decisions upholding the rights of homosexual couples, underage killers, and foreign fighters held by the U.S. military in the war on terror.

That was true this past term in several other hot-button cases in which he played a key role by:

• Writing the majority opinion allowing for the continued but limited use of race in the college admissions process, yet making it harder for institutions to use such policies to achieve diversity.

• Siding with his fellow conservatives to strike down a key section of the Voting Rights Act of 1965, weakening federal oversight of states and counties with a past history of discrimination of minority voters at the polls.

• Concluding criminal suspects can be subjected to a police DNA test after arrest but before trial and conviction.

• Blocking a lawsuit by privacy advocates challenging the federal government's sweeping electronic eavesdropping on suspected foreign terrorists and spies.

Kennedy was in the majority 91% of the time in the court's 78 argued cases this term, more than any justice. In divided cases -- where there was no unanimity -- he was on the winning side 83%, again tops on the court.

Sometimes he sides with his more liberal colleagues, as he did in the Defense of Marriage Act case, but he is mostly a reliable conservative vote, especially on business and regulation disputes.

That unpredictability has long made liberals and conservatives equally nervous, but many on the right are more outspoken in their disappointment in the Ronald Reagan nominee, who turns 77 in July.

"Kennedy's style as the 'man in the middle' is often as a 'justice in a muddle,'" said Douglas Kmiec, a law professor at Pepperdine University and a former lawyer in the Reagan and Bush administrations. "He writes cryptically ... suggesting a standard of his own making that is not fully developed."

Despite the rhetoric, Kennedy's moderating force has generally benefited his conservative colleagues.

Of the 23 divided 5-4 cases this term -- including the two marriage cases as well as voting rights -- Kennedy was in the majority 20 times, according to SCOTUSblog.com, again higher than anyone on the nine-member bench. Only on six decisions did he side with the four left-leaning justices: Ruth Bader Ginsburg, Stephen Breyer, Sonia Sotomayor, and Elena Kagan.

But it was his majority opinion in the DOMA case that could have ripple effects for years to come.

"Although Justice Kennedy's opinion explicitly states that it is confined to same-sex marriages that have been recognized by states, it contains reasoning and language that will certainly be used, in later cases, to argue that legal recognition of same-sex marriage by all states is constitutionally required," said University of Notre Dame law professor Richard W. Garnett, a past clerk to former Chief Justice William H. Rehnquist."Almost certainly, and fairly soon, that argument will be presented squarely to the court."

Within the marbled halls of the high court, Kennedy is personally respected by his colleagues, both for the power he wields and for his courtly, low-key manner.

But professionally, justices on the losing side of a big case can often be unsparing in their criticism of Kennedy and his legal reasoning.

Justice Antonin Scalia on Wednesday called Kennedy's analysis in the DOMA case "jaw dropping" and an assertion of "judicial supremacy" that "envisions the Supreme Court standing (or rather enthroned) at the apex of government."

And this from a close friend -- Scalia and Kennedy joined the court a year apart, were born the same year, and live on the same street.

The Sacramento, California, native joined the high court in 1988, the third choice of President Reagan after more conservative nominees Robert Bork and Douglas Ginsburg flamed out.

Along with O'Connor, for almost two decades the two native Westerners carved out a jumpy place in the center. Less driven by practical concerns than O'Connor was, Kennedy has striven for a loftier sense of the law's impact on society.

"He has brought to the bench a combination of a very scholarly, erudite, academic bent," said Brad Berenson, a friend and former law clerk to Kennedy, "and a very practical bent he had developed while practicing law on his own."

Kennedy himself acknowledged the unique role he played for decades on the court. "There is a loneliness" to his job, he once told CNN.

For now, Kennedy, like his eight colleagues, will retreat from the public spotlight. He has some vacation time ahead, mixed with his annual overseas teaching gig in Austria in two weeks.

Then, come the first Monday in October, Kennedy is expected back in his familiar seat, just to the left of Chief Justice John Roberts on the bench. But clearly he is the man in the middle, and the man who in many ways shapes the direction of a divided court.

Digital Natives Still Like Print

No Surprise: Student-Age Digital Natives Still Like Libraries, Print

June 27, 2013

By Dennis Abrams

According to a new report from the Pew Research Center, even though Americans between the ages of 16 and 29 are heavy users of technology, they are more likely than their older counterparts to both use and appreciate libraries as physical spaces.

Perhaps surprisingly, a majority of those polled under the age of 30 said that it was “very important” for libraries to have librarians and books available for borrowing, and few thought that libraries should fully automate or move their services online.

As one of the study’s authors, Kathryn Zickuhr, said:



“Younger Americans — those ages 16-29 — exhibit a fascinating mix of habits and preferences when it comes to reading, libraries, and technology. Almost all Americans under age 30 are online, and they are more likely than older patrons to use libraries’ computer and internet connections; however, they are also still closely bound to print, as three-quarters (75%) of younger Americans say they have read at least one book in print in the past year, compared with 64% of adults ages 30 and older.”



“Similarly, younger Americans’ library usage reflects a blend of traditional and technological services. Americans under age 30 are just as likely as older adults to visit the library, and once there they borrow print books and browse the shelves at similar rates,” Zickuhr told Shelf Awareness. “Some of this stems from the demands of school or work, yet some likely lies in their current personal preferences. And this group’s priorities and expectations for libraries likewise reflect a mix of traditional and technological services.”



But let’s also be honest here: the age group polled, 16-29 year olds, includes a large number of high school and college students, a group that will likely be required to read print books — which still dominate classrooms — and to use libraries on a regular basis.



Among the study’s highlights:



* As with other age groups, younger Americans were significantly more likely to have read an ebook during 2012 than a year earlier. Among all those ages 16-29, 19% read an e-book during 2011, while 25% did so in 2012. At the same time, however, print reading among younger Americans has remained steady: When asked if they had read at least one print book in the past year, the same proportion (75%) of Americans under age 30 said they had both in 2011 and in 2012.



In fact, younger Americans under age 30 are now significantly more likely than older adults to have read a book in print in the past year (75% of all Americans ages 16-29 say this, compared with 64% of those ages 30 and older). And more than eight in ten (85%) older teens ages 16-17 read a print book in the past year, making them significantly more likely to have done so than any other age group.



* The under-30 age group remains anchored in the digital age, but retains a strong relationship with print media and an affinity for libraries. Moreover, younger Americans have a broad understanding of what a library is and can be—a place for accessing printed books as well as digital resources, one that remains at its core a physical space.



Overall, most Americans under age 30 say it is “very important” for libraries to have librarians and books for borrowing; they are more ambivalent as to whether libraries should automate most library services or move most services online. Younger Americans under age 30 are just as likely as older adults to visit the library, and younger patrons borrow print books, browse the shelves, or use research databases at similar rates to older patrons; finally, younger library visitors are more likely to use the computer or internet at a library, and more likely to seek assistance from librarians while doing so.



Additionally, younger patrons are significantly more likely than older library visitors to use the library as a space to sit and read, study, or consume media—some 60% of younger library patrons have done that in the past 12 months, compared with 45% of those ages 30 and older. And most younger Americans say that libraries should have completely separate locations or spaces for different services, such as children’s services, computer labs, reading spaces, and meeting rooms: 57% agree that libraries should “definitely” do this.



Along those lines, patrons and librarians in our focus groups often identified teen hangout spaces as especially important to keep separate from the main reading or lounge areas, not only to reduce noise and interruptions for other patrons, but also to give younger patrons a sense of independence and ownership.



A Nation of Mutts

A more diverse America is coming whether Brooks has it exactly right or not.  Will we have to redefine what it means to be an "American?"




A Nation of Mutts
By DAVID BROOKS

Published: June 27, 2013
Over the past few decades, American society has been transformed in a fit of absence of mind. First, we’ve gone from a low immigrant nation to a high immigrant nation. If you grew up between 1950 and 1985, you grew up at a time when only about 5 percent or 6 percent of American residents were foreign born. Today, roughly 13 percent of American residents are foreign born, and we’re possibly heading to 15 percent.


Moreover, up until now, America was primarily an outpost of European civilization. Between 1830 and 1880, 80 percent of the immigrants came from Northern and Western Europe. Over the following decades, the bulk came from Southern and Central Europe. In 1960, 75 percent of the foreign-born population came from Europe, with European ideas and European heritage.



Soon, we will no longer be an outpost of Europe, but a nation of mutts, a nation with hundreds of fluid ethnicities from around the world, intermarrying and intermingling. Americans of European descent are already a minority among 5-year-olds. European-Americans will be a minority over all in 30 years at the latest, and probably sooner.



If enacted, the immigration reform bill would accelerate these trends. It would further increase immigration levels. According to the Census Bureau, roughly 20 million immigrants will come to this country under current law. The Congressional Budget Office expects another 16 million under the new provisions.



It would boost the rise of non-Europeans. Immigration would be more global. Hispanics are now projected to make up 30 percent of the U.S. population by 2050. We would hit that mark sooner with reform.



In other words, immigration reform won’t transform America. It will just speed up the arrival of a New America that is already guaranteed.



As we stand on the cusp of this New America, it’s understandable to feel some anxiety. If you take sociology and culture seriously, it’s sensible to wonder whether this is the sort of country we want to be. Can we absorb this many immigrants without changing something fundamental?



Let’s make some educated guesses about what the New America will look like. It will almost certainly be economically dynamic. Immigration boosts economic dynamism, and more immigration would boost it more. There would also be a lot of upward striving. Immigrant groups tend to work harder than native groups. They save more. They start businesses at higher rates than natives.



My colleague Anne Snyder delineates several possible changes to the social fabric. Basically we are witnessing the end of the old ethnic-racial order. Traditionally, mainstream America has been defined by the big block of whites, while other big blocks — blacks, Hispanics, Asians — occupied different places on the hierarchy.



Soon there will be no dominant block, just complex networks of fluid streams — Vietnamese, Bengalis, Kazakhs. It’s a bit like the end of the cold war when bipolar thinking had to give way to a radically multipolar mind-set.



Because high immigration is taking place at a time of unprecedentedly low ethnic hostility, we’re seeing high rates of intermarriage. This creates large numbers of hybrid individuals, biracial or triracial people with names like Enrique Cohen-Chan. These people transcend existing categories and soften the social boundaries between groups.



This won’t lead to a bland mélange America but probably a move to ethnic re-orthodoxy. As Alvaro Vargas Llosa points out in his book, “Global Crossings,” the typical pattern is that the more third-generation people assimilate, the more they also value their ethnic roots. We could soon see people with completely unaccented English joining Chinese-American Federations and Honduran-American Support Networks.



The big divides could be along educational lines, not ethnic ones. Because educated people intermarry at higher rates, we could have an educated cosmopolitan class with low ethnic boundaries and a fair bit of integration in white-collar workplaces. Then, underneath, there could be a less-educated, more-balkanized layer, with high residential and professional segregation and more ethnic hostility.



We could also see more ethnic jostling between groups. The most interesting and problematic flashpoint may be between immigrants and African-Americans. We now have this bogus category, “minority,” in which we lump the supposed rainbow coalition of immigrants and blacks. But, in fact, tensions between “minority” groups could soon be more plainly obvious than any solidarity.



Finally, it would make sense that the religion of diversity, which dominates the ethos of our schools, would give way to an ethos of civic cohesion. We won’t have to celebrate diversity because it will be a fact. The problem will be finding the 21st-century thing that binds the fluid network of ethnic cells.



On the whole, this future is exciting. The challenge will be to create a global civilization that is, at the same time, distinctly American. Immigration reform or not, the nation of mutts is coming.



Khaled Hosseini - And the Mountains Echoed

At first, as I was reading, I thought this latest from Hosseini was better than The Kite Runner, and it is a good book, but by the end I was weary of all the characters the author introduces throughout the novel.  I wish his character focus were narrower.

In essence Hosseini writes about the power of place and of early family relationships, the place and family we all come from and their lifetime hold over us, and his place is Afghanistan.  What a mysterious, barren, and tragic land Afghanistan is!

The main story line is the closeness of a brother and sister and what happened to them.  The ending is not as dramatic as The Kite Runner's, but it is fine, even if you don't get the same lump in your throat.

Hosseini is a great storyteller, as good as any we have working today.  He is always worth reading.

Wednesday, June 26, 2013

Paula Deen

I have followed the Paula Deen story with amused detachment.  I don't have a dog in this fight in that I don't care one way or the other about Deen.  I do not consume her products so in that respect I do not care.  Is she a racist?  I don't know and I don't care.  You cannot have a legitimate public discussion in this country about race.  The whole thing is mostly pointless. 

Tuesday, June 25, 2013

A Sad Day

It's a sad day in the history of this country.  The Supreme Court has, in effect, struck down the Voting Rights Act of 1965.  This single law dramatically changed politics in the South.  Before this law was passed, few blacks were eligible to vote because white politicians kept them off the voting rolls.  Now the South is a different place.  Blacks vote and hold elective office all over the place.  By gutting the Voting Rights Act of 1965, the SCOTUS has opened the door for Republicans to reenact restrictions on minority voting rights.  It will happen.  Stay tuned and watch what happens now.

Saturday, June 22, 2013

1940

 The Revolution of 1940: America’s Fight Over Entering World War Two
by Curtis Wortman

Jun 23, 2013 4:45 AM EDT

Not since the Civil War had Americans voted in such a high-stakes presidential election as the unprecedented 1940 race. Two new books and several others recently published relive that tumultuous period before Pearl Harbor—a time much like our own.

We may pledge allegiance to “one nation under God,” but from the start American society has been anything but “indivisible.” The American Revolution split families and pitted neighbor against neighbor before it launched a nation; the Civil War that nearly broke it asunder was, well, a civil war. And let’s not talk about slavery and the brutal subjugation of Native Americans. Between the open battles, we mostly just shout at each other across political lines, like now. Yet the great national commitment to victory in World War II stands out as a singular shining moment of cohesion and unity. The afterglow of that massive war effort and the Allies’ great victory hides a darker reality: the political storm that swept the nation right up to the very day of the Pearl Harbor attack.



‘Roosevelt's Second Act: The Election of 1940 and the Politics of War’ by Richard Moe. 392 p. Oxford University Press. $20.76

The fight between isolationists and interventionists over America’s future role in the world, a fight that turned into a political and sometimes real brawl for the presidency in the 1940 election, proved lower and even more vicious than what passes for political discourse today. Franklin Delano Roosevelt’s 1940 election to an unprecedented third term—breaking a 150-year national taboo—marked a true revolution in American determination to take on global responsibility in a world at war and, often to our regret, ever after.

Relatively few books have been written about the pre-war period, overshadowed as it is by the Second World War and the Great Depression that preceded it, despite its supremely high stakes for the nation’s and the world’s future. It’s a period well worth studying, with high drama among the remarkable people involved in the clashes, such as aviation hero Charles Lindbergh, plutocrat U.S. Ambassador to Great Britain (and father of a future president) Joseph P. Kennedy, Republican presidential candidate Wendell Willkie, movie mogul Jack Warner, publisher Henry Luce, and the ever-calculating, Sphinx-like FDR. Several recent books, including two new histories of the 1940 race for the presidency, fill some major gaps.

Just published, Susan Dunn’s 1940: FDR, Willkie, Lindbergh, Hitler—the Election amid the Storm delivers a richly detailed Making of the President-style look back at the 1940 race. I also got an early look at Richard Moe’s more engaging and nuanced Roosevelt’s Second Act: The Election of 1940 and the Politics of War, due out this fall. Both books remind us how grave a decision the nation faced in choosing a president with global war raging just over the oceanic horizons and just how close America came to abandoning its traditional democratic allies to the seemingly unstoppable forces of totalitarianism and nationalistic hatred.

Placing that election into a wider overview of America at the time, Lynne Olson’s brilliant Those Angry Days: Roosevelt, Lindbergh, and America’s Fight Over World War II, 1939-1941, published in early spring, is hard to put down. She richly portrays the debate (when it wasn’t a shouting match or a fist fight) that swept up America between the isolationists and interventionists. A large majority of Americans deeply opposed sending their boys to fight another war in the Old World, yet even more strongly supported the last free nations holding out against Nazi terror. Most understood that if the Brits (and later the Russians) fell, America would almost certainly confront a very powerful foe in Hitler. But would offering aid to the Allies court war with the Axis powers or prevent it? As Lindbergh, who would emerge as the chief spokesperson for America First, the most organized anti-interventionist movement, told a cheering crowd of Yale students a week before the 1940 election day, “If we desire peace, we need only stop asking for war. Nobody wishes to attack us, and nobody is in a position to do so.”






Olson captures vividly how such seemingly intelligent and well-meaning people like the Yale student-founders of the America First Committee, including future U.S. President Gerald Ford and future Yale President and Ambassador to Great Britain Kingman Brewster, and their counterparts at Harvard such as Ambassador Kennedy’s eldest son Joseph, could agree with The Harvard Crimson editors when they wrote, “We are frankly determined to have peace at any price.” On Capitol Hill, staunchly conservative Republicans—prominent among them Ohio Senator Robert Taft and North Dakota Senator Gerald Nye, several Northeasterners including FDR’s Hyde Park, NY, home district’s U.S. Representative Hamilton Fish—along with a smaller number of anti-foreign Democrats like powerful North Carolina Senator Robert Reynolds, used their obstructive legislative power much like today’s Republicans to kneecap the Administration in its efforts to arm the British and Chinese Nationalists (at war with the Japanese) and grow the paltry American military.

They at least were not fascists. Olson also describes the machinations of more odious anti-British Hitler-sympathizers and anti-Semites such as Ambassador Kennedy, a young architect Philip Johnson, and Hitler’s favorite U.S. industrialist Henry Ford, who shared Anne Morrow Lindbergh’s belief, articulated in her No. 1 bestselling testament The Wave of the Future: A Confession of Faith, that the various violent movements crushing human freedom, above all Hitler’s Nazism, were just “scum on the surface of the wave” as civilization moved into a new “highly scientific, mechanized and material era.” I guess they considered the already well-known existence of the Nazi concentration camps just part of adjusting to the new age dawning upon the world.

However nauseating in retrospect, millions shared their views. Bankrolled by General Robert E. Wood, chairman of retail giant Sears, Roebuck, editorially backed by the Hearst chain of newspapers, publisher Robert McCormick’s Chicago Tribune, New York Daily News publisher Joseph Patterson, given voice by avowed anti-Semite radio priest Father Charles Coughlin, and many other leading media figures, the anti-intervention movement rallies attracted crowds in the tens of thousands, often to hear Lindbergh speak, sometimes to call for FDR’s impeachment and Lindbergh to take power. Many in the movement were thoughtfully opposed to the war, but their numbers included Communist Party members toeing the Stalinist line, German Bund Hitlerites, and unabashed Jew haters.

For all his cynical willingness to manipulate everyone around him, FDR never wavered in his commitment to freedom, republican ideals, and his New Deal social programs. He would bend the truth, dissimulate his thinking behind a bright smile and personal charm, send subordinates to take actions and deny he did, even risk impeachment, to win reelection and provide “all aid short of war” to the Allies. For all his public denials that American boys would ever fight in foreign wars, he was certain that war with Hitler (and eventually Japan) was inevitable. He preferred that until then Britain and the Red Army have American aid to hold off the Axis and that when the time came to fight his country’s military be ready.

Nobody in his Administration played a more central role in bringing the interventionist agenda to fruition in this pre-war period—not even FDR in certain respects— than the still too-little-known Harry Hopkins. The Hopkins Touch: Harry Hopkins and the Forging of the Alliance to Defeat Hitler, by David L. Roll, is the best biography since FDR speechwriter Robert Sherwood’s seminal postwar Pulitzer-winning Roosevelt and Hopkins: An Intimate History, of the extraordinarily brave, deathly ill man who lived down the hall from the president from 1940. Roll had access to newly opened papers from the period when Hopkins spent more time with FDR than anyone, including his wife Eleanor. Despite lacking any official government title or party post, Hopkins rammed FDR’s 1940 third term renomination down a restive Democratic Convention’s throat and then oversaw the common law marriage that would take place between his boss and British Prime Minister Winston Churchill and later Soviet leader Josef Stalin. (After Lend-Lease opened the floodgates to arming the Allies in 1941, Hopkins also administered the greatest arms buildup and transfer of military hardware to foreign powers in world history out of his White House bedroom.)

Before then, however, FDR had good reason to fear his inward-looking nation, still deeply traumatized by the Depression, would reject him and his start-and-stop efforts to rebuild the nation’s paltry national defenses.

Two books follow out the story of how the seemingly indomitable FDR found himself on what appeared to be the verge of defeat on Election Night. Williams College history professor Susan Dunn, author of prize-winning histories of Franklin D. Roosevelt’s presidency, offers in 1940 a deeply researched look into how the darkest of dark horse candidates, Wendell Willkie, and the unwilling-to-declare-his-intentions FDR handled their historically unique nominating conventions—brilliantly in Wendell Willkie’s case and utterly ineptly in FDR’s. She follows out the way their campaigns caught fire as the question of whether to fight Hitler became the true decision for voters.

A liberal Wall Street Democrat until just before the race, and an avowed internationalist, Willkie entered the Republican Convention with seemingly little support. But using his engaging, devil-may-care persona and his Wall Street and media ties, he swept up support and shockingly emerged from the convention as nominee and the frontrunner. From the Luce Time-Life-Fortune publishing empire to the New York Times, virtually every major media outlet came out in support of Willkie over FDR. Even the powerful United Mine Workers union leader John L. Lewis vowed he’d give up his post if Roosevelt won. Dunn shows well the forces at work across the political landscape that enabled the amateur pol Willkie to threaten the greatest political mind in American history right up to Election Night.

For his part, FDR went into the Democratic Party Convention refusing to declare his candidacy. He wanted to be “drafted” into service, much like the first-ever peacetime draft then being proposed for the nation’s armed forces. He refused to attend the convention and nearly caused a party revolt when, for the first time in political history, he demanded his own choice for vice-presidential running mate, Henry Wallace, be nominated. It took an unprecedented convention speech by the immensely popular Eleanor Roosevelt to stop the bitter tide running against him.

The sides were drawn for an epic battle—with both candidates in a sense running against Hitler, or who would prove better able to deal with the world at war.

Richard Moe was Vice President Walter Mondale’s chief of staff and a longtime head of the National Trust for Historic Preservation. In Roosevelt’s Second Act, he offers reliable interpretations of the inner workings of FDR’s always fenced-off mind. Moe’s book is as exciting as a character-driven thriller, insightful as only those who have watched presidents up close can be. Read it when it comes out in September.

As Election Day drew near in 1940, Willkie began to lose the Party’s conservative base. He tacked hard to the right into the isolationist camp. It worked and his standing in the polls surged. Meantime, in the months before the election, despite the isolationist outcry, FDR engineered a controversial Bases-for-Destroyers exchange with the fast-sinking British and the creation of a draft army.

The day before the vote the Republican Party ran an ominous radio commercial addressed to the mothers of America. A chilling voice warned, “When your boy is dying on some battle field in Europe and he’s crying out ‘Mother! Mother!’—don’t blame Franklin D. Roosevelt because he sent your boy to war—blame YOURSELF, because YOU sent Franklin D. Roosevelt back to the White House!”

FDR had to answer such attacks. He may have believed war with Germany was inevitable but he felt forced to declare in a nationally broadcast campaign speech just days before the vote, “Your boys are not going to be sent into any foreign wars…. We will not send our army, navy or air forces to fight in foreign lands….” We know the election result, but the final days of the campaign played out like a political Super Bowl where the outcome wasn’t clear until the very last seconds of play. Even the preternaturally confident FDR thought he had lost the election as he read the first vote counts coming to him at his home in Hyde Park.

Such pronouncements that he would not intervene made it impossible for the third-term president to go to war—except through subterfuge, which he did—without an actual attack on America. One can speculate that had he been freer to pursue the election’s foreign policy mandate the U.S. might have entered the war several months before Pearl Harbor. An earlier buildup of the nation’s armed forces would surely have saved uncountable but likely many lives.

These books help us reflect on today’s deep national divisions, as our messy republican democracy lurches from crisis to crisis and minority viewpoints—still largely Republican—game the system to block worthy legislation and needed government action, even when most citizens believe they are in the nation’s interest.

On the afternoon of December 7, 1941, Senator Nye addressed several thousand rowdy people at a Pittsburgh America First rally. “Whose war is this?” he shouted out. “Roosevelt’s!” roared back the crowd. A short while later they knew that war was America’s.

You Gotta Love Carville!

Republican policies harm people. That's simply the truth. Thanks to Quotable Liberals via Being Liberal for the image. 

Posted by Justin Acuff on Americans Against The Republican Party.

The Humanist Vocation

The Humanist Vocation
By DAVID BROOKS

Published: June 20, 2013

A half-century ago, 14 percent of college degrees were awarded to people who majored in the humanities. Today, only 7 percent of graduates in the country are humanities majors. Even over the last decade alone, the number of incoming students at Harvard who express interest in becoming humanities majors has dropped by a third.


Most people give an economic explanation for this decline. Accounting majors get jobs. Lit majors don’t. And there’s obviously some truth to this. But the humanities are not only being bulldozed by an unforgiving job market. They are committing suicide because many humanists have lost faith in their own enterprise.



Back when the humanities were thriving, the leading figures had a clear definition of their mission and a fervent passion for it. The job of the humanities was to cultivate the human core, the part of a person we might call the spirit, the soul, or, in D.H. Lawrence’s phrase, “the dark vast forest.”



This was the most inward and elemental part of a person. When you go to a funeral and hear a eulogy, this is usually the part they are talking about. Eulogies aren’t résumés. They describe the person’s care, wisdom, truthfulness and courage. They describe the million little moral judgments that emanate from that inner region.



The humanist’s job was to cultivate this ground — imposing intellectual order upon it, educating the emotions with art in order to refine it, offering inspiring exemplars to get it properly oriented.



Somewhere along the way, many people in the humanities lost faith in this uplifting mission. The humanities turned from an inward to an outward focus. They were less about the old notions of truth, beauty and goodness and more about political and social categories like race, class and gender. Liberal arts professors grew more moralistic when talking about politics but more tentative about private morality because they didn’t want to offend anybody.



To the earnest 19-year-old with lofty dreams of self-understanding and moral greatness, the humanities in this guise were bound to seem less consequential and more boring.



So now the humanities are in crisis. Rescuers are stepping forth. On Thursday, the American Academy of Arts and Sciences released a report called “The Heart of the Matter,” making the case for the humanities and social sciences. (I was a member of this large commission, though I certainly can’t take any credit for the result.)



The report is important, and you should read it. It focuses not only on the external goods the humanities can produce (creative thinking, good writing), but also on the internal transformation (spiritual depth, personal integrity). It does lack some of the missionary zeal that hit me powerfully as a college freshman, when the humanities were in better shape.



One of the great history teachers in those days was a University of Chicago professor named Karl Weintraub. He poured his soul into transforming his students’ lives, but, even then, he sometimes wondered if they were really listening. Late in life, he wrote a note to my classmate Carol Quillen, who now helps carry on this legacy as president of Davidson College.



Teaching Western Civ, Weintraub wrote, “seems to confront me all too often with moments when I feel like screaming suddenly: ‘Oh, God, my dear student, why CANNOT you see that this matter is a real, real matter, often a matter of the very being, for the person, for the historical men and women you are looking at — or are supposed to be looking at!’



“I hear these answers and statements that sound like mere words, mere verbal formulations to me, but that do not have the sense of pain or joy or accomplishment or worry about them that they ought to have if they were TRULY informed by the live problems and situations of the human beings back there for whom these matters were real. The way these disembodied words come forth can make me cry, and the failure of the speaker to probe for the open wounds and such behind the text makes me increasingly furious.



“If I do not come to feel any of the love which Pericles feels for his city, how can I understand the Funeral Oration? If I cannot fathom anything of the power of the drive derived from thinking that he has a special mission, what can I understand of Socrates? ... How can one grasp anything about the problem of the Galatian community without sensing in one’s bones the problem of worrying about God’s acceptance?



“Sometimes when I have spent an hour or more, pouring all my enthusiasm and sensitivities into an effort to tell these stories in the fullness in which I see and experience them, I feel drained and exhausted. I think it works on the student, but I do not really know.”



Teachers like that were zealous for the humanities. A few years in that company leaves a lifelong mark.



About The Bill of Rights

Thursday, Jun 20, 2013 06:44 AM CDT


The secret history of the Bill of Rights

Activists today invoke the 2nd and 4th Amendments as if they're kindred spirits with Madison. Here's the real story

By Michael Lind

Is the Bill of Rights — made up of the first 10 amendments to the U.S. Constitution — the foundation of American liberty? So we are told by civil libertarians on the left alarmed by government surveillance programs, and by opponents of gun control on the right. The truth about the Founders and the Bill of Rights, however, is quite at odds with modern civil libertarian mythology.



The term “Founders” is ambiguous. It usually refers to the delegates who drafted today’s federal Constitution in Philadelphia in 1787, but it might as well apply to the members of the state ratifying conventions, who voted to enact it into law. In this case, it doesn’t matter, because a majority of the delegates at the Constitutional Convention rejected proposals by Virginia’s George Mason and others to include a bill of rights in the federal Constitution. The new federal Constitution was then ratified by a majority of the states, even though no bill of rights was included. Neither the drafters nor the ratifiers of the Constitution thought a bill of rights was necessary to protect American liberties.



Why did the authors of the Constitution reject proposals for a bill of rights? The Federalist Papers, written by Alexander Hamilton, James Madison and John Jay to promote ratification of the new Constitution, defends the decision of the framers of the U.S. Constitution to exclude any bill of rights.



In Federalist 84, Hamilton observes that a bill of rights, as a bargain between the people and a separate ruler, is irrelevant in a republic in which the people themselves are the collective sovereign.



It has been several times truly remarked, that bills of rights are in their origin, stipulations between kings and their subjects, abridgments of prerogative in favor of privilege, reservations of rights not surrendered to the prince. …It is evident, therefore, that according to their primitive signification, they [i.e. bills of rights] have no application to constitutions professedly founded upon the power of the people, and executed by their immediate representatives and servants. Here, in strictness, the people surrender nothing, and as they retain every thing, they have no need of particular reservations.



Hamilton also argues that listing some rights in the Constitution might inadvertently endanger other rights, which would be assumed to be unprotected because they were not mentioned:



I go further, and affirm that bills of rights, in the sense and in the extent in which they are contended for, are not only unnecessary in the proposed constitution, but would even be dangerous. They would contain various exceptions to powers which are not granted; and on this very account, would afford a colourable pretext to claim more than were granted.



Hamilton, the founder of the New York Post, did not agree that a bill of rights was necessary to protect freedom of the press:



What signifies a declaration that “the liberty of the press shall be inviolably preserved?” What is the liberty of the press? Who can give it any definition which would not leave the utmost latitude for evasion? I hold it to be impracticable; and from this, I infer, that its security, whatever fine declarations may be inserted in any constitution respecting it, must altogether depend on public opinion, and on the general spirit of the people and of the government.



Hamilton concluded that the regulation of power by the federal Constitution itself, not a laundry list of specific rights, was the best protection of liberty in the new country:



The truth is, after all the declamation we have heard, that the constitution is itself in every rational sense, and to every useful purpose, a bill of rights.



James Madison, the “father of the Constitution,” shared the skepticism of the majority of the Founders about bills of rights. However, the Anti-Federalists, the opponents of a stronger federal government, were particularly influential in slave states like Madison’s Virginia, where they were inspired by some of his fellow slave owners like Thomas Jefferson, George Mason and Patrick Henry. These men were hardly precursors of the ACLU. Mason and Henry in particular objected to the federal Constitution because it did not sufficiently prevent the federal government from intervening in Southern slavery. Unlike George Washington, the only slave-holding president who freed his own slaves at his death, and a supporter of a strong federal government, Mason and Henry were hypocrites who denounced slavery in the abstract while opposing any government power that might infringe upon their despotic personal power over their own slave “property.”



As a delegate at the Constitutional Convention, George Mason, who authored Virginia’s bill of rights, refused to sign the final product, objecting to the federal Constitution because it lacked a bill of rights — and adequate safeguards for slavery. As a delegate to Virginia’s ratifying convention, Mason denounced the Constitution for allowing a two-decade continuation of the slave trade (which lowered the value of the slaves that Virginian planters sold to slave owners in other states) and also for doing too little to secure slavery from federal interference — for example, from a hypothetical federal tax on slavery that would force emancipation:



As much as I value a union of all the states, I would not admit the Southern States into the Union unless they agree to the discontinuance of this disgraceful trade, because it would bring weakness, and not strength, to the Union. And, though this infamous traffic be continued, we have no security for the property of that kind which we have already. There is no clause in the Constitution to secure it; for they may lay such a tax as will amount to manumission [emphasis added]…. Yet they have not secured us the property of the slaves we have already. So that they have “done what they ought not to have done, and have left undone what they ought to have done.”



Another Anti-Federalist opponent of the Constitution, Patrick Henry, feared that the military power of the federal government might be used to end slavery, something that indeed occurred during the Civil War, when President Lincoln justified the Emancipation Proclamation as a war measure. As Thom Hartmann has pointed out, for Southern slave owners like Henry the chief purpose of what became the Second Amendment was to prevent the federal government from interfering with state militias used to repress slaves:



May Congress not say, that every black man must fight? Did we not see a little of this last war? We were not so hard pushed as to make emancipation general; but acts of Assembly passed that every slave who would go to the army should be free.



Ironically, it is to the pressure of the slave-holding oligarchy on Virginia’s federal representatives that we owe the Bill of Rights. To be specific, in running for the first Congress in 1788 James Madison beat his rival James Monroe by only 336 votes out of 2,280. This near-death experience led Madison to do a classic political flip-flop, trying to co-opt his opponents by embracing their cause, the addition of a bill of rights to the Constitution. Pennsylvania’s Sen. Robert Morris sneered that Madison “got frightened in Virginia and wrote a book” — the amendments that became the Bill of Rights.



Madison thought that the states were greater menaces to liberty than the federal government, but his proposal that any federal bill of rights govern the states as well as the federal government died in Congress. (According to today’s judicial doctrine, some but not all of the rights in the first 10 amendments have applied to the states since the passage of the 14th Amendment after the Civil War). Of the 12 amendments drafted by Madison and sent to the states for ratification by Congress, only 10 were initially ratified, becoming today’s Bill of Rights. An 11th, governing pay raises for Congress, was ratified only in 1992 as the 27th amendment, while the 12th, about congressional apportionment, failed to win state ratification.



In introducing his proposed amendments to Congress, Madison acknowledged that his bill of rights was an incoherent philosophical and legal mess:



In some instances they assert those rights which are exercised by the people in forming and establishing a plan of Government. In other instances, they specify those rights which are retained when particular powers are given up to be exercised by the Legislature. In other instances, they specify positive rights, which may seem to result from the nature of the compact. Trial by jury cannot be considered as a natural right, but a right resulting from a social compact which regulates the action of the community, but is as essential to secure the liberty of the people as any one of the pre-existent rights of nature. In other instances, they lay down dogmatic maxims with respect to the construction of the Government; declaring that the legislative, executive, and judicial branches shall be kept separate and distinct.



Madison’s bill of rights was a hodgepodge slapped together hastily to try to conciliate former opponents of the newly ratified federal Constitution. This was a typical case of damage control by a reluctant politician trying to head off a more radical alternative by enacting a watered-down substitute. Madison would have been proud to be remembered as “the Father of the Constitution.” But he would have been appalled to be told that without his Bill of Rights the U.S. would be a tyranny. That was the rhetoric of the Anti-Federalists whom he reluctantly sought to appease.



History has vindicated the skepticism about bills of rights shared by Hamilton, Madison, and a majority of the drafters and ratifiers of the U.S. Constitution. Mere paper guarantees of rights have never been enough to secure liberty in periods when the public is panicked: think of Lincoln's excessive suspension of habeas corpus during the Civil War, or FDR's wartime internment of Japanese-Americans. And the American system of checks and balances has repeatedly, if belatedly, worked to check imbalances of power, as it did when Congress reined in "the imperial presidency" in the 1970s.



In the contemporary debate about civil liberties and government surveillance, absolutist civil libertarians routinely claim that “the Founders” viewed the Bill of Rights as essential to American liberty. But paranoid rhetoric about our allegedly tyrannical government is closer to the rhetoric of the Anti-Federalists who denounced the U.S. Constitution than to the thinking of the Constitution’s drafters, ratifiers and supporters. The real Founders thought little of lists of abstract rights, putting their faith instead in checks and balances and accountability through elections. In the spirit of the real Founders, we should be debating what kind of system of congressional and judicial oversight of executive intelligence activity can best balance individual liberty with national security — and we should leave anti-government paranoia to today’s Anti-Federalists.

Novel Readers as Better Thinkers

Saturday, Jun 15, 2013 08:00 AM CDT


Study: Reading novels makes us better thinkers

New research says reading literary fiction helps people embrace ambiguous ideas and avoid snap judgments

By Tom Jacobs

Are you uncomfortable with ambiguity? It’s a common condition, but a highly problematic one. The compulsion to quell that unease can inspire snap judgments, rigid thinking, and bad decision-making.



Fortunately, new research suggests a simple antidote for this affliction: Read more literary fiction.



A trio of University of Toronto scholars, led by psychologist Maja Djikic, report that people who have just read a short story have less need for what psychologists call “cognitive closure.” Compared with peers who have just read an essay, they expressed more comfort with disorder and uncertainty—attitudes that allow for both sophisticated thinking and greater creativity.



“Exposure to literature,” the researchers write in the Creativity Research Journal, “may offer a (way for people) to become more likely to open their minds.”



Djikic and her colleagues describe an experiment featuring 100 University of Toronto students. After arriving at the lab and providing some personal information, the students read either one of eight short stories or one of eight essays. The fictional stories were by authors including Wallace Stegner, Jean Stafford, and Paul Bowles; the non-fiction essays were by equally illustrious writers such as George Bernard Shaw and Stephen Jay Gould.



Afterwards, each participant filled out a survey measuring their emotional need for certainty and stability. They expressed their agreement or disagreement with such statements as “I don’t like situations that are uncertain” and “I dislike questions that can be answered in many different ways.”



Those who read a short story had significantly lower scores on that test than those who read an essay. Specifically, they expressed less need for order and more comfort with ambiguity. This effect was particularly pronounced among those who reported being frequent readers of either fiction or non-fiction.



So how does literature induce this ease with the unknown? Djikic and her colleagues, Keith Oatley and Mihnea Moldoveanu, have some ideas.



“The thinking a person engages in while reading fiction does not necessarily lead him or her to a decision,” they note. This, they observe, decreases the reader’s need to come to a definitive conclusion.



“Furthermore,” they add, “while reading, the reader can stimulate the thinking styles even of people he or she might personally dislike. One can think along and even feel along with Humbert Humbert in Lolita, no matter how offensive one finds this character. This double release—of thinking through events without concerns for urgency and permanence, and thinking in ways that are different than one’s own—may produce effects of opening the mind.”



The researchers have no idea how long this effect might last. But their discovery that it is stronger in frequent readers suggests such people may gradually become programmed to respond in this way. “It is likely that only when experiences of this kind accumulate to reach some critical mass would they lead to long-term changes of meta-cognitive habits,” they write.



Their results should give people “pause to think about the effect of current cutbacks of education in the arts and humanities,” Djikic and her colleagues add. After all, they note, while success in most fields demands the sort of knowledge gained by reading non-fiction, it also “requires people to become insightful about others and their perspectives.”



If their conclusions are correct, that all-important knowledge can be gained by immersing yourself in a work of literature. There’s no antidote to black-or-white thinking like reading “It was the best of times, it was the worst of times.”

Will Reading Make You Rich?

Friday, Jun 21, 2013 06:44 AM CDT


Will reading make you rich?

Researchers claim reading fiction bestows marketable skills. That's not really what it's for ...

By Laura Miller

If you are an avid reader — or writer — of fiction, chances are you took note of a news item that appeared in the Pacific Standard last week (reprinted in Salon over the weekend). Titled “Study: Reading Novels Makes Us Better Thinkers,” the article, by Tom Jacobs, cited a recent paper out of the University of Toronto indicating that subjects who read a short story scored lower afterward on tests designed to determine “need for cognitive closure” than did people who’d read an essay. The fiction readers were, the researchers concluded, left more “open-minded,” and therefore both more “creative” and “rational” than their nonfiction-reading counterparts.



And it was not just any fiction that did the trick, mind you, but “literary” fiction. What balm to the acolytes of that dwindling corner of the cultural landscape! Many of the novelists, would-be novelists and bookworms who posted the item to their Facebook pages went on with their day ever so slightly perked up. Chances are they did not notice that the cited study was produced by the same circle of Torontonian researchers whose work has prompted similar recent news items. All told, this bunch has announced that reading literature — specifically fiction — makes people more empathetic, less inclined toward “attachment avoidance,” more socially adept and better able to change for the better in personality and temperament. Taken as a whole, they’ve touted the novel as a Swiss Army knife for matters of psychological hygiene.



Although the primary author of the open-mindedness study is Maja Djikic, a secondary author, Keith Oatley, seems to be at the center of this enterprise, which includes a group blog, OnFiction, and a two-year-old journal, Scientific Study of Literature. Oatley, professor emeritus of cognitive psychology at the University of Toronto, is also the author of the 2011 book "Such Stuff as Dreams: The Psychology of Fiction," as well as being a novelist himself. (His "The Case of Emily V." won the 1994 Commonwealth Writers Prize for best first novel.)



The core idea of this academic coterie was expressed by Oatley in a paper published in the Review of General Psychology in 1999: Fiction, he wrote, is not meant to represent "empirical truth" but is rather "a simulation that runs on minds of readers just as computer simulations run on computers." This idea elaborates on the cognitive ability known as "theory of mind" — the capacity to attribute complex mental states to others, in particular states different from one's own — combined with popular hypotheses in evolutionary psychology. Some evolutionary psychologists have speculated that telling and enjoying stories is an adaptation that builds social cohesion. According to this view, stories, by giving us insights into the minds and behavior of others, improve our ability to knit together the groups that have been the secret to humanity's success as a species.



But there’s a more immediate survival agenda behind the research going on in Toronto, too, an agenda that the authors of the open-mindedness study state frankly: “It is hoped that this experiment will stimulate further investigation into the potential of literature in opening closed minds, as well as give one a pause to think about the effects of current cut-backs of education in the arts and humanities.”



The humanities have been under siege in the academy lately, condemned as inessential to the university’s role in preparing students for adult life. In recent years, the percentage of undergraduates pursuing degrees in the humanities has dropped precipitously, and Gov. Rick Scott of Florida has even urged that majors in such fields as art history and anthropology be charged higher tuitions at state universities on the theory that they can’t be directly funneled to gainful employment. The alarm has been raised and, as was announced in the New York Times this week, a new report — requested by a bipartisan committee of legislators and shepherded by the American Academy of Arts and Sciences — calls for a host of measures to defend liberal arts education.



While some treat the study of, say, literature, political science and philosophy as “a waste of time” (in the words of an educator who served on the commission that produced the report), the new initiative argues that “the humanities and social sciences are the heart of the matter, the keeper of the republic — a source of national memory and civic vigor, cultural understanding and communication, individual fulfillment and the ideals we hold in common.” A video on the American Academy of Arts and Sciences’ website accompanying the report features such luminaries as actor John Lithgow, director George Lucas and architect Billie Tsien, who asserts that “the humanities are the immeasurable.”



It’s all very high-minded, but rather abstract. The work done by scholars like Oatley and Djikic offers quantitative evidence that engagement with culture — particularly high culture — results in self-improvement; in other words, they claim to be able to measure Tsien’s “immeasurable.” They offer proof that reading literary fiction (that is, short stories chosen from an anthology of established greats) is more conducive to “creativity and rationality” in subjects than reading essays (chosen from a similar anthology).



I do not doubt Djikic’s commitment to literature, but also cannot help noticing that she’s a research associate at Toronto’s Rotman School of Management and director of the “Self-Development Lab.” Surely the main audience for her case consists of business leaders and officials like Gov. Scott, who only seem interested in education that can be harnessed by big business and made to pay. This casts the terms “creativity” and “rationality” in a somewhat different light. In the world of corporate management, “creativity” is less likely to mean “writing ‘Leaves of Grass’ and otherwise carrying on like the mad ecstatic hobo poet you are” than it is to mean “coming up with new ways to sell colored sugar water.”



Back in the early 2000s, I met a mathematician turned New York investment banker who explained to me that his work involved “inventing new transactions” — which at the time I took to be a bizarre form of corporate doublespeak. I now realize he was a quant, possibly one of the guys who devised new forms of credit default swap or the rest of the arcane financial products that brought about the crash of 2008. Surely he was being creative in inventing all those new transactions, and it was exactly the sort of creativity Rick Scott and his ilk would cheer on and consider well worth funding. The rest of America, however, is left to wonder how much more corporate creativity the middle and working classes can stand.



Nevertheless, these are the powers to whom educators and artists must now make the case for their own livelihoods, and the attitudes of those powers have become ever more utilitarian. Once, a case could be made for liberal education as the forge of the well-rounded individual; now it is expected to be a factory for producing well-engineered cogs. Students see themselves as consumers of a service that will provide them with marketable credentials rather than as human beings seeking guidance in a life-long search for truth, beauty and meaning. To save the liberal arts in academia, scholars like Djikic and Oatley are obliged to frame their findings in terms that present art as good for business rather than irrelevant to it.



Creativity, however, is an unpredictable thing. It is, after all, a species of imagination, and transcendence is its modus operandi. It can be bent to serve the needs and demands of late capitalism, for sure. But even as it exhorts its minions to think outside the box, the corporate class tends to forget that it, too, is a kind of box. When thinkers emerge to show us the way out of that particular box, they will almost certainly be the products of a liberal arts education. Just don’t tell the bosses that.

Is College Worth It?

Illiberal Arts: 'Is College Worth It?' and 'College (Un)bound'

By ANDREW DELBANCO

Published: June 21, 2013

More than a century ago, the president of Harvard, A. Lawrence Lowell, issued a warning to America's colleges and universities. "Institutions," he said, "are rarely murdered. They meet their end by suicide. . . . They die because they have outlived their usefulness, or fail to do the work that the world wants done." Most of the institutions he had in mind are still around today, but the doomsday talk is back. William J. Bennett, secretary of education under President Reagan, and Jeffrey Selingo, an editor at The Chronicle of Higher Education, believe our system is self-destructing. Their tones are different — Bennett and his co-author, David Wilezol, write in an expectant mood of good riddance, while Selingo is sympathetically alarmed — but their views are grimly consistent. College costs are up. Learning and graduation rates are down.

IS COLLEGE WORTH IT?



A Former United States Secretary of Education and a Liberal Arts Graduate Expose the Broken Promise of Higher Education



By William J. Bennett and David Wilezol



278 pp. Thomas Nelson. $22.99.



COLLEGE (UN)BOUND



The Future of Higher Education and What It Means for Students



By Jeffrey J. Selingo



238 pp. New Harvest/Houghton Mifflin Harcourt. $26.

Bennett's basic argument is a familiar one, at least from conservative pundits: "Too many people are going to college." In the search for employment, he believes, a college education confers less advantage than is commonly assumed and leaves students with crushing debt. He would prefer to see the United States emulate countries like Germany, where most young people are tracked into vocational training, and he wants more Americans who do go to college to study science, technology, engineering and mathematics rather than what he calls "irrelevant material." Before attempting college at all, students should "critically evaluate the data: student-loan debt, return on investment, lifetime salary earnings, academic performance, skills training . . . and so on." This seems an improbable strategy for most adolescents and a surrender of hopes and dreams, especially for those whose parents have not gone to college themselves.



Selingo doesn't propose early sorting, but he agrees that the roughly $1 trillion students owe to private and public lenders is often wasted on empty pleasures, citing as an example the 645-foot-long river-rafting feature in the "leisure pool" at Texas Tech. My own sense is that most colleges are filled with hard-working students and teachers. At underfunded, overcrowded community colleges, which enroll more than a third of the almost 18 million American undergraduates, there aren't many leisure pools.



But student debt is certainly too high, and Bennett and Selingo are right that the financial structure of college is breaking down. Private universities face a decline in federal dollars attached to research grants; endowment returns are unlikely to achieve the double-digit norms of a few years ago; and the relentless rise in tuition (which Bennett blames partly on the ready availability of government grants and loans) is unsustainable. At public institutions, which enroll three times as many students as private colleges, the problems are worse. After a sharp drop in the state appropriations that once kept the price of attendance affordable, tuition there has been rising even faster.



Bennett approaches these issues from a strong anti-government, pro-business perspective that leads to some odd contradictions. He commends for-profit universities even though at many for-profits graduation rates are low and student debt levels high. He scolds the federal government for violating “simple, sound banking principles” by lending money to students with “no credit history” but praises “private banks that, at large risk to themselves,” do the same thing.



Even if there were a quick fix for the fiscal problems, other problems remain. According to Selingo, today’s students “regard their professors as service providers, just like a cashier at the supermarket or a waiter in a restaurant.” He sees “a power shift in the classroom” as students evaluate their teachers through questionnaires “eerily similar to customer satisfaction surveys from department stores.” And, all too often, when professors evaluate students we know the result: grade inflation.



What to do? Selingo envisions a fundamental shift in how degrees are awarded — not on the basis of credit hours completed but on competency demonstrated. He sees students taking instruction, whether at a traditional college or through an independent online provider, using “adaptive learning technologies” that “adjust to the speed at which an individual student learns.” Each student’s progress would be continually tested and achievement recognized by a certificate or “badge” that would be more reliable than today’s diplomas, which are essentially based on time spent in class rather than on how much students have actually learned.



Successful machine-teaching is being pioneered by the Open Learning Initiative at Carnegie Mellon University, which Selingo rightly calls the “Cadillac” of online education. And some true believers in the online future — whether on the relatively modest scale of the Carnegie Mellon program or in the form of Massive Open Online Courses (MOOCs) — are convinced that all but the wealthiest colleges will be swept away by economic pressure and technological innovation. Selingo is right to doubt it. More likely, colleges will become increasingly stratified. Private institutions will retreat from their commitment to discounting tuition for needy students and will serve mainly the affluent, while public colleges, in order to cut costs, will rely more on technology and part-time faculty.



As different as they are in tone, these books share the assumption that education is mostly about what Selingo calls “information delivery.” If students get their information from multiple providers and demonstrate mastery of what they have learned, colleges will lose their monopoly on issuing marketable credentials. Some venture capitalists think this will happen soon, which is why there’s a surging number of for-profit education entrepreneurs.



So far, traditional colleges have fended off the challenge, mainly because they’re favored by the existing accreditation system. But Selingo suggests that they won’t — and shouldn’t — hold out much longer. For his part, Bennett dismisses most colleges as venues for “drinking, drugs, partying, sex,” though he adds, with evident reluctance, “sometimes learning.” He simply wants them to go away, allowing for some screamingly obvious exceptions: “If you get into Stanford . . . you should probably go.”



The colleges that survive will be those, in Selingo’s words, that “prove their worth.” Fair enough. But there’s a problem with this formulation, which presumes a narrow definition of worth that can be captured in data like rates of early job attainment or levels of lifetime income.



In times of economic stress, it’s entirely reasonable for students and families to demand evidence that paying for college makes sense. Bennett construes college as a business proposition, but Selingo allows himself to reflect on what’s sacrificed in such a view: “I worry at times about what might be lost in an unbound, personalized experience for students. Will they discover subjects they never knew existed? If a computer is telling them where to sit for class discussions, will they make those random connections that lead to lifelong friends? Will they be able to develop friendships and mentors if they move from provider to provider?”



These are the right questions. In striving to "prove their worth," America's colleges risk losing their value as places young people enter as adventurous adolescents and from which they emerge as intellectually curious adults. Such a loss could never be compensated by any gain.



Thursday, June 20, 2013

The Evil of Ayn Rand

Is Rand Paul's Love of Ayn Rand a 'Conspiracy'?

By Jonathan Chait

My item on Rand Paul the other day, predictably, went over quite badly in the libertarian community. The Insomniac Libertarian, in an item wonderfully headlined "Obama Quisling Jonathan Chait Smears Rand Paul," complains that my Paul piece "never discloses that [my] wife is an Obama campaign operative." A brief annotated response:



1. I question the relevance of the charge, since Rand Paul is not running against Obama.



2. In point of fact, my wife is not an Obama campaign operative and has never worked for Obama’s campaign, or his administration, or volunteered for his campaign, or any campaign, and does not work in politics at all.



3. I question the headline labeling me an “Obama quisling,” a construction that implies that I have betrayed Obama, which seems to be the opposite of the Insomniac Libertarian’s meaning.



4. For reasons implied by points one through three, I urge the Insomniac Libertarian to familiarize himself with some of the science linking sleep deprivation to impaired brain function.



A more substantive, though still puzzling, retort comes from the Atlantic’s Conor Friedersdorf, a frequent bête noire of mine on subjects relating to Ayn Rand and Ron or Rand Paul. Friedersdorf raises two objections to my piece, which traced Rand Paul’s odd admission that he is “not a firm believer in democracy” to his advocacy of Randian thought. Friedersdorf first charges that the intellectual connection between Paul and Rand is sheer paranoia:



Chait takes the quote and turns it into a conspiracy … As I read this, I couldn't help but think of Chait as a left-leaning analog to the character in Bob Dylan's "Talkin' John Birch Paranoid Blues." Those Objectivists were coming around/They were in the air / They were on the Ground/ They wouldn't give me no peace. For two thousand years, critics of unmediated democracy have warned about the masses abusing individuals and minorities. The American system was built from the very beginning to check democratic excesses.



But if Rand Paul distrusts democracy he must've gotten it from Ayn Rand.



A conspiracy? Am I imagining that Rand Paul has been deeply influenced by Ayn Rand? Paul himself has discussed the deep influence her work had on his own thinking. In college he wrote a series of letters and columns either quoting Rand or knocking off her theories. He used a congressional hearing to describe one of her novels at tedious length. How is this a conspiracy?



Friedersdorf proceeds to argue that Rand is not really very militant anyway:



It's also interesting that Chait regards Rand's formulation as "militant." Let's look at it again. "I do not believe that a majority can vote a man's life, or property, or freedom away from him." Does Chait believe that a democratic majority should be able to vote a man's life or freedom away? …



In the political press, it happens again and again: libertarian leaning folks are portrayed as if they're radical, extremist ideologues, even when they're expressing ideas that are widely held by Americans across the political spectrum.



Well, here we come to a deeper disagreement about Ayn Rand. My view of her work is pretty well summarized in a review-essay I wrote in 2009, tying together two new biographies of Rand with some of the Randian strains that were gaining new currency in the GOP. My agenda here is not remotely hidden, but maybe I need to put more cards on the table. I've described her worldview as inverted Marxism — a conception of politics as a fundamental struggle between a producer class and a parasite class.



What I really mean is, I find Rand evil. Friedersdorf’s view is certainly far more nuanced and considerably more positive than mine. He’s a nice, intelligent person and a good writer, but we’re not going to agree on this.



Friedersdorf waves away Rand's (and Rand Paul's) distrust of democracy as the same fears everybody has about democracy. Well, no. Lots of us consider democracy imperfect or vulnerable, but most of us are very firm believers in democracy. Rand viewed the average person with undisguised contempt, and her theories pointed clearly toward cruelty in pursuit of her fanatical class analysis. A seminal scene in Atlas Shrugged describes the ideological errors of a series of characters leading up to their violent deaths, epitomizing the class-warfare hatred her work embodied, which inspired Whittaker Chambers to observe, "From almost any page of Atlas Shrugged, a voice can be heard, from painful necessity, commanding: 'To the gas chambers — go!'"



Randism has never been tried as the governing philosophy of a country, so it remains conjecture that her theories would inevitably lead to repression if put into practice at a national level. But we do have a record of the extreme repression with which she ran her own cult, which at its height was a kind of totalitarian ministate. You can read her biographies, or at least my review, to get a sense of the mind-blowing repression, abuse, and corruption with which she terrorized her followers.



But the upshot is that I strongly dispute Friedersdorf’s premise that Rand’s theories are a variant of democracy, any more than Marx’s are. In fact, I find the existence of powerful elected officials who praise her theories every bit as disturbing to contemplate as elected officials who praise Marxism. Even if you take care to note some doctrinal differences with Rand, in my view we are talking about a demented, hateful cult leader and intellectual fraud. People who think she had a lot of really good ideas should not be anywhere near power.



Wednesday, June 19, 2013

A Necessary War?

Misunderstanding the Civil War

 150 Years of Misunderstanding the Civil War

As the anniversary of the Battle of Gettysburg approaches, it's time for America to question the popular account of a war that tore apart the nation.

Tony Horwitz Jun 19 2013, 2:10 PM ET

In early July, on the 150th anniversary of the Battle of Gettysburg, pilgrims will crowd Little Round Top and the High Water Mark of Pickett's Charge. But venture beyond these famous shrines to battlefield valor and you'll find quiet sites like Iverson's Pits, which recall the inglorious reality of Civil War combat.



On July 1st, 1863, Alfred Iverson ordered his brigade of North Carolinians across an open field. The soldiers marched in tight formation until Union riflemen suddenly rose from behind a stone wall and opened fire. Five hundred rebels fell dead or wounded "on a line as straight as a dress parade," Iverson reported. "They nobly fought and died without a man running to the rear. No greater gallantry and heroism has been displayed during this war."



Soldiers told a different story: of being "sprayed by the brains" of men shot in front of them, or hugging the ground and waving white kerchiefs. One survivor informed the mother of a comrade that her son was "shot between the Eye and ear" while huddled in a muddy swale. Of others in their ruined unit he wrote: "left arm was cut off, I think he will die... his left thigh hit and it was cut off." An artilleryman described one row of 79 North Carolinians executed by a single volley, their dead feet perfectly aligned. "Great God! When will this horrid war stop?" he wrote. The living rolled the dead into shallow trenches--hence the name "Iverson's Pits," now a grassy expanse more visited by ghost-hunters than battlefield tourists.



This and other scenes of unromantic slaughter aren't likely to get much notice during the Gettysburg sesquicentennial, the high water mark of Civil War remembrance. Instead, we'll hear a lot about Joshua Chamberlain's heroism and Lincoln's hallowing of the Union dead.



It's hard to argue with the Gettysburg Address. But in recent years, historians have rubbed much of the luster from the Civil War and questioned its sanctification. Should we consecrate a war that killed and maimed over a million Americans? Or should we question, as many have in recent conflicts, whether this was really a war of necessity that justified its appalling costs?



"We've decided the Civil War is a 'good war' because it destroyed slavery," says Fitzhugh Brundage, a historian at the University of North Carolina. "I think it's an indictment of 19th century Americans that they had to slaughter each other to do that."



Similar reservations were voiced by an earlier generation of historians known as revisionists. From the 1920s to 40s, they argued that the war was not an inevitable clash over irreconcilable issues. Rather, it was a "needless" bloodbath, the fault of "blundering" statesmen and "pious cranks," mainly abolitionists. Some revisionists, haunted by World War I, cast all war as irrational, even "psychopathic."



World War II undercut this anti-war stance. Nazism was an evil that had to be fought. So, too, was slavery, which revisionists--many of them white Southerners--had cast as a relatively benign institution and dismissed as a genuine source of sectional conflict. Historians who came of age during the Civil Rights Movement placed slavery and emancipation at the center of the Civil War. This trend is now reflected in textbooks and popular culture. The Civil War today is generally seen as a necessary and ennobling sacrifice, redeemed by the liberation of four million slaves.



But cracks in this consensus are appearing with growing frequency, for example in studies like America Aflame, by historian David Goldfield. Goldfield states on the first page that the war was "America's greatest failure." He goes on to fault politicians, extremists, and the influence of evangelical Christianity for polarizing the nation to the point where compromise or reasoned debate became impossible.



Unlike the revisionists of old, Goldfield sees slavery as the bedrock of the Southern cause and abolition as the war's great achievement. But he argues that white supremacy was so entrenched, North and South, that war and Reconstruction could never deliver true racial justice to freed slaves, who soon became subject to economic peonage, Black Codes, Jim Crow, and rampant lynching.



Nor did the war knit the nation back together. Instead, the South became a stagnant backwater, a resentful region that lagged and resisted the nation's progress. It would take a century and the Civil Rights struggle for blacks to achieve legal equality, and for the South to emerge from poverty and isolation. "Emancipation and reunion, the two great results of this war, were badly compromised," Goldfield says. Given these equivocal gains, and the immense toll in blood and treasure, he asks: "Was the war worth it? No."



Few contemporary scholars go as far as Goldfield, but others are challenging key tenets of the current orthodoxy. Gary Gallagher, a leading Civil War historian at the University of Virginia, argues that the long-reigning emphasis on slavery and liberation distorts our understanding of the war and of how Americans thought in the 1860s. "There's an Appomattox syndrome--we look at Northern victory and emancipation and read the evidence backward," Gallagher says.



Very few Northerners went to war seeking or anticipating the destruction of slavery. They fought for Union, and the Emancipation Proclamation was a means to that end: a desperate measure to undermine the South and save a democratic nation that Lincoln called "the last best hope of earth."



Gallagher also feels that hindsight has dimmed recognition of how close the Confederacy came to achieving its aims. "For the South, a tie was as good as a win," he says. It needed to inflict enough pain to convince a divided Northern public that defeating the South wasn't worth the cost. This nearly happened at several points, when rebel armies won repeated battles in 1862 and 1863. As late as the summer of 1864, staggering casualties and the stalling of Union armies brought a collapse in Northern morale, cries for a negotiated peace, and the expectation that anti-war (and anti-black) Democrats would take the White House. The fall of Atlanta that September narrowly saved Lincoln and sealed the South's eventual surrender.



Allen Guelzo, director of Civil War studies at Gettysburg College, adds the Pennsylvania battle to the roster of near-misses for the South. In his new book, Gettysburg: The Last Invasion, he identifies points when Lee's army came within minutes of breaking the Union line. If it had, he believes the already demoralized Army of the Potomac "would have gone to pieces." With a victorious Southern army on the loose, threatening Northern cities, "it would have been game over for the Union."



"So much of the violence in the Civil War is laundered or sanctified by emancipation, but that result was by no means inevitable."Imagining these and other scenarios isn't simply an exercise in "what if" history, or the fulfillment of Confederate fantasy fiction. It raises the very real possibility that many thousands of Americans might have died only to entrench secession and slavery. Given this risk, and the fact that Americans at the time couldn't see the future, Andrew Delbanco wonders if we ourselves would have regarded the defeat of the South as worth pursuing at any price. "Vindicated causes are easy to endorse," he observes in The Abolitionist Imagination.



Recent scholarship has also cast new light on the scale and horror of the nation's sacrifice. Soldiers in the 1860s didn't wear dog tags, the burial site of most was unknown, and casualty records were sketchy and often lost. Those tallying the dead in the late 19th century relied on estimates and assumptions to arrive at a figure of 618,000, a toll that seemed etched in stone until just a few years ago.



But J. David Hacker, a demographic historian, has used sophisticated analysis of census records to revise the toll upward by 20%, to an estimated 750,000, a figure that has won wide acceptance from Civil War scholars. If correct, the Civil War claimed more lives than all other American wars combined, and the increase in population since 1860 means that a comparable war today would cost 7.5 million lives.



This horrific toll doesn't include the more than half million soldiers who were wounded and often permanently disabled by amputation, lingering disease, psychological trauma and other afflictions. Veterans themselves rarely dwelled on this suffering, at least in their writing. "They walled off the horror and mangling and tended to emphasize the nobility of sacrifice," says Allen Guelzo. So did many historians, who cited the numbing totals of dead and wounded but rarely delved into the carnage or its societal impact.



That's changed dramatically with pioneering studies such as Drew Gilpin Faust's This Republic of Suffering, a 2008 examination of "the work of death" in the Civil War: killing, dying, burying, mourning, counting. "Civil War history has traditionally had a masculine view," says Faust, now president of Harvard. "It's all about generals and statesmen and glory." From reading the letters of women during the war, though, she sensed the depth of Americans' fear, grief, and despair. Writing her book amid "the daily drumbeat of loss" in coverage of Iraq and Afghanistan reinforced her focus on the horrors of this earlier war.



"When we go to war, we ought to understand the costs," she says. "Human beings have an extraordinary capacity to forget that. Americans went into the Civil War imagining glorious battle, not gruesome disease and dismemberment."



Disease, in fact, killed roughly twice as many soldiers as did combat; dysentery and diarrhea alone killed over 44,000 Union soldiers, more than ten times the Northern dead at Gettysburg. Amputations were so routine, Faust notes, that soldiers and hospital workers frequently described severed limbs stacked "like cord wood," or heaps of feet, legs and arms being hauled off in carts, as if from "a human slaughterhouse." In an era before germ theory, surgeons' unclean saws and hands became vectors for infection that killed a quarter or more of the 60,000 or so men who underwent amputation.



Other historians have exposed the savagery and extent of the war that raged far from the front lines, including guerrilla attacks, massacres of Indians, extra-judicial executions and atrocities against civilians, some 50,000 of whom may have died as a result of the conflict. "There's a violence within and around the Civil War that doesn't fit the conventional, heroic narrative," says Fitzhugh Brundage, whose research includes torture during the war. "When you incorporate these elements, the war looks less like a conflict over lofty principles and more like a cross-societal bloodletting."



In other words, it looks rather like ongoing wars in the Middle East and Afghanistan, which have influenced today's scholars and also their students. Brundage sees a growing number of returning veterans in his classes at the University of North Carolina, and new interest in previously neglected aspects of the Civil War era, such as military occupation, codes of justice, and the role of militias and insurgents.



More broadly, he senses an opening to question the limits of war as a force for good. Just as the fight against Nazism buttressed a moral vision of the Civil War, so too have the last decade's conflicts given us a fresh and cautionary viewpoint. "We should be chastened by our inability to control war and its consequences," Brundage says. "So much of the violence in the Civil War is laundered or sanctified by emancipation, but that result was by no means inevitable."



It's very hard, however, to see how emancipation might have been achieved by means other than war. The last century's revisionists thought the war was avoidable because they didn't regard slavery as a defining issue or evil. Almost no one suggests that today. The evidence is overwhelming that slavery was the "cornerstone" of the Southern cause, as the Confederacy's vice-president stated, and the source of almost every aspect of sectional division.



Slaveholders also resisted any infringement of their right to human property. Lincoln, among many others, advocated the gradual and compensated emancipation of slaves. This had been done in the British West Indies, and would later end slavery in Brazil and Cuba. In theory it could have worked here. Economists have calculated that the cost of the Civil War, estimated at over $10 billion in 1860 dollars, would have been more than enough to buy the freedom of every slave, purchase them land, and even pay reparations. But Lincoln's proposals for compensated emancipation fell on deaf ears, even in wartime Delaware, which was behind Union lines and clung to only 2,000 slaves, about 1.5% of the state's population.



Nor is there much credible evidence that the South's "peculiar institution" would have peacefully waned on its own. Slave-grown cotton was booming in 1860, and slaves in non-cotton states like Virginia were being sold to Deep South planters at record prices, or put to work on railroads and in factories. "Slavery was a virus that could attach itself to other forms," says historian Edward Ayers, president of the University of Richmond. "It was stronger than it had ever been and was growing stronger."



Most historians believe that without the Civil War, slavery would have endured for decades, possibly generations. Though emancipation was a byproduct of the war, not its aim, and white Americans clearly failed during Reconstruction to protect and guarantee the rights of freed slaves, the post-war amendments enshrined the promise of full citizenship and equality in the Constitution for later generations to fulfill.



What this suggests is that the 150th anniversary of the Civil War is too narrow a lens through which to view the conflict. We are commemorating the four years of combat that began in 1861 and ended with Union victory in 1865. But Iraq and Afghanistan remind us, yet again, that the aftermath of war matters as much as its initial outcome. Though Confederate armies surrendered in 1865, white Southerners fought on by other means, wearing down a war-weary North that was ambivalent about if not hostile to black equality. Looking backwards, and hitting the pause button at the Gettysburg Address or the passage of the 13th amendment, we see a "good" and successful war for freedom. If we focus instead on the run-up to war, when Lincoln pledged to not interfere with slavery in the South, or pan out to include the 1870s, when the nation abandoned Reconstruction, the story of the Civil War isn't quite so uplifting.



But that also is an arbitrary and insufficient frame. In 1963, a century after Gettysburg, Martin Luther King Jr. invoked Lincoln's words and the legacy of the Civil War in calling on the nation to pay its "promissory note" to black Americans, which it finally did, in part, by passing Civil Rights legislation that affirmed and enforced the amendments of the 1860s. In some respects, the struggle for racial justice, and for national cohesion, continues still.



From the distance of 150 years, Lincoln's transcendent vision at Gettysburg of a "new birth of freedom" seems premature. But he himself acknowledged the limits of remembrance. Rather than simply consecrate the dead with words, he said, it is for "us the living" to rededicate ourselves to the unfinished work of the Civil War.