Monday, February 28, 2011

Wisconsin Like Dixie?

Dixie Madison
Republicans want Wisconsin to become just like the South.

by Ed Kilgore from the New Republic

As Wisconsin Governor Scott Walker tries to strip away the collective bargaining rights of public-sector unions, many liberals have latched onto the idea that his real goal is to dismantle the labor movement and the infrastructure of the Democratic Party. That is almost certainly one of his aims, but it’s not the whole story.

Walker also has an economic vision for his state—one which is common currency in the Republican Party today, but hitherto alien in a historically progressive, unionist Midwestern state like Wisconsin. It is based on a theory of economic growth that is not only anti-statist but aggressively pro-corporate: relentlessly focused on breaking the backs of unions; slashing worker compensation and benefits; and subsidizing businesses in order to attract capital from elsewhere and avoid its flight to even more benighted locales. Students of economic development will recognize it as the “smokestack-chasing” model of growth adopted by desperate developing countries around the world, which have attempted to use their low costs and poor living conditions as leverage in the global economy. And students of American economic history will recognize it as the “Moonlight and Magnolias” model of development, which is native to the Deep South.

Just take a look at the broader policy context of the steps Walker is taking in Wisconsin. While simultaneously battling unions and calling for budget cuts, he’s made the state’s revenue quandary much worse by seeking to cut corporate taxes and boost “economic development incentives” (another term for tax subsidies and other public concessions) to businesses considering operations in Wisconsin. This is philosophically identical to the approach taken by new South Carolina Governor Nikki Haley, who hired a union-busting attorney to head up the state labor department and touted the state’s anti-union environment as a key to its prospects, explaining, “We’re going to fight the unions and I needed a partner to help me do it.” Despite large budget shortfalls, she’s also proposed to eliminate corporate income taxes and pay for it by restoring a sales tax on food. The common thread here is the quasi-religious belief that reducing business costs for corporations is the Holy Grail of economic development, while all other public and private goods should be measured strictly by their impact on the corporate bottom line.

Even before the arrival of Haley, this was the default model of economic growth in Southern states for decades—as the capital-starved, low-wage region concluded that the way it could compete economically with other states was to emphasize its comparative advantages: low costs, a large pool of relatively poor workers, “right to work” laws that discouraged unionization, and a small appetite for environmental or any other sort of regulation. So, like an eager Third-World country, the South sought to attract capital by touting and accentuating these attributes, rather than trying to build Silicon Valleys or seek broad-based improvements in the quality of life. Only during the last several decades, when Southern leaders like Arkansas’s Bill Clinton and North Carolina’s Jim Hunt called for economic strategies that revolved around improving public education and spawning home-grown industries, was the hold of the “Moonlight and Magnolias” approach partially broken. And now it’s back with a vengeance, but no longer just in the South.

Members of the modern Republican Party, and the “Tea Party movement” in particular, gravitate naturally toward models of growth that treat public programs and investments as mere obstacles in the path of dynamic corporate “job creators.” Many look South in admiration: Just last week, Minnesota Tea Party heroine and possible presidential candidate Michele Bachmann visited South Carolina and told an audience that she was happy to join them in a “GOP paradise.” And Scott Walker is hardly alone among Midwestern Republican governors in pursuing an agenda that combines business-tax cuts and other incentives with attacks on public investments and Southern-style hostility to unions. That’s also the agenda of Ohio’s John Kasich, and while Michigan’s Rick Snyder and Indiana’s Mitch Daniels have stepped back from efforts to assault collective bargaining rights, they are devotees of the idea that low taxes and deregulation are essential to economic growth, regardless of the impact on public services and investments.

Why is this model of economic growth so appealing to the Tea Party? For one, it tends to jibe very well with the Ayn Randian belief in producerism: the idea that “job creators”—business owners—are the only source of economic growth in society, and that everyone else—the workers, government employees, and the poor—are just “useless eaters” shackling those who exercise individual initiative. While many Democrats are baffled by Scott Walker’s attack on the unions—shouldn’t he be focused on jobs rather than eliminating workers’ protections? they ask—the fact is that today’s conservatives believe this is the right and only way to create jobs. The same delusion is present at the federal level, where House Republicans insist that deregulation and spending cuts are the only ways to create jobs. That doesn’t sound like a formula for job growth, unless you account for the conviction that rolling back the public sector, and in the process impoverishing the middle-class families that depend on its services, is essential to keep costs low enough for corporations to work their magic. The fact that the “beneficiaries” who get jobs as a result of this corporate development model will have to work for lower wages and fewer benefits, and suffer from poor schools and a degraded environment, is beside the point.

The Tea Party’s love of “Moonlight and Magnolias” economics also fits with its disturbing affinity for other Old South concepts, which developed during Dixie’s long era of resistance to unionization, “big government” meddling with economic and social life, limits on natural-resource exploitation, and judicial tampering with property rights and states’ rights. Most remarkable is the spread of “Tenther” interposition and nullification theories, which hold that the states should have special sovereign rights to thwart federal policies in ways not considered legitimate since the eras of Reconstruction and the civil rights movement. These have been widely touted by conservatives across the country (notably 2010 Senate candidates Sharron Angle of Nevada and Joe Miller of Alaska) and even by House Majority Leader Eric Cantor (who has spoken warmly of the “Repeal Amendment” that would let states collectively kill federal laws).

The problem with this Southern theory of growth is that it won’t work: Economic development experts usually deride “Moonlight and Magnolias” approaches to job creation, noting that they track the outmoded first and second “waves” of basic economic development theory—which emphasized crude economic races to the bottom—as opposed to third and fourth “waves” that focus on worker skills, quality of life, public-private partnerships, innovation, and sustainability. If Wisconsin and other states—not to mention the country as a whole—end up adopting these atavistic economic ideals, they will simply begin to resemble the dysfunctional Old South societies that spawned them in the first place.

So what is at stake in Wisconsin, and across the country, is not just the pay and benefits of public employees, or their collective bargaining rights, or the specific programs facing the budgetary knife. We are contesting whether Americans who are not “job creators,” by virtue of wealth, should be considered anything more than cannon fodder in an endless war between states—and countries—over who can attract the most capital by slashing the most regulations. In this sense, standing up to Scott Walker is a truly worthy fight.

Sunday, February 27, 2011

Up from Skinner

Beyond Intellectualism

On becoming an anti-intellectual intellectual

Robert Wright | February 14, 2011


I spent much of high school trying not to be interested in ideas. I studied hard and made good grades, but I didn't hang out with the nerds. This was partly because hanging out with nerds wasn't cool and partly because the kind of intellectualism they exuded didn't enthrall me. They talked about Camus and Sartre and Nietzsche -- people I hadn't heard much about in my life as an Army brat and people my mildly anti-intellectual father would have disdained had anyone explained to him who they were.

Then my sister's husband (an aspiring psychologist whose preference for graduate school over employment my father wasn't wild about) suggested I read Beyond Freedom and Dignity by B.F. Skinner.

As intellectuals go, Skinner was pretty dismissive of intellectuals -- at least the ones who blathered unproductively about "freedom" and "dignity," the ones he considered insufficiently hard-nosed and scientific.

Look, he said, people are animals. Kind of like laboratory rats, except taller. Their behavioral proclivities are a product of the positive and negative reinforcements they've gotten in the past. Want to build a better society? Discern the links between past reinforcement and future proclivity, and then adjust society's disbursement of reinforcements accordingly. No need to speculate about unobservable states of mind or ponder the role of "free will" or any other imponderables. Epistemology, phenomenology, metaphysics, and 25 cents will get you a ride on the New York subway.

This was my kind of intellectual -- an anti-intellectual intellectual! I became an ardent Skinnerian.

The ardor eventually faded. I ended up spending a fair amount of my writing career disagreeing with Skinner. He believed, for example, that people are almost infinitely malleable. In his utopian novel, Walden Two, he takes readers to a magical place where things like jealousy and envy are becoming relics of the primitive past, thanks to the masterful deployment of positive and negative reinforcement during childhood.

In high school, I bought into this view, but in college, a reference to the "sociobiology" controversy on the cover of Time magazine caught my eye, and I started looking into the Darwinian underpinnings of human behavior. This train of thought culminated -- about two decades after I encountered Skinner -- in my book The Moral Animal, a full-throated defense of evolutionary psychology.

The book wasn't unrelievedly anti-Skinnerian. Positive and negative reinforcement do shape us, and my Skinnerian roots led me to emphasize that fact much more than the average Darwinian. But natural selection has placed limits on how easy that shaping is. Jealousy can be tamed, but good luck killing it.

There is, though, one big Skinnerian theme to which I stayed true. It has to do with the "freedom" referred to in Skinner's title. Once you can explain an organism's behavior entirely as a product of genetic heritage and environmental history -- which Skinner considered doable in principle -- there's no behavior left to attribute to free will. So why obsess over the "culpability" of criminals? If you need to lock them up to keep society safe, fine, but don't pretend they "deserve" to suffer in some deep philosophical sense.

Here evolutionary psychology proved complementary to Skinner's view. It explained how natural selection had ingrained in us the intuition that wrongdoers deserve punishment, that their suffering somehow rights the moral scales. And once you've reduced a philosophical intuition to a mere instinct, a product of our species' natural history, its rightness, in my view, comes into question. So I've argued that punishment isn't a moral good in itself and is warranted only to the extent that it either keeps criminals off the street or deters would-be criminals. (Here, as elsewhere, my arguments haven't carried the day; the intrinsic goodness of retribution remains part of judicial doctrine.)

Perhaps my biggest departure from the Skinnerian line has been the time I've spent pondering things like free will and the mind-body problem, two probably related conundrums that I consider more challenging than, as I recall, Skinner did. But even this un-Skinnerian fascination I probably owe to Skinner, because I had never given much thought to free will or consciousness until I watched with awe as he casually tossed them aside.

I've held on to the essential spirit of Skinner -- which, I now see, was also the spirit of my father. By that I don't mean anti-intellectualism as much as a bedrock pragmatism. Got a problem? Analyze it as cleanly as possible, and then, having seen its roots, solve it. And don't waste time dropping the names of any fancy French philosophers. This is still my basic view.

The Coming Showdown

Op-Ed Columnist
Why Wouldn’t the Tea Party Shut It Down?
By FRANK RICH
Published: February 26, 2011

No one remembers anything in America, especially in Washington, so the history of the Great Government Shutdown of 1995 is being rewritten with impunity by Republicans flirting with a Great Government Shutdown of 2011. The bottom line of the revisionist spin is this: that 2011 is no 1995. Should the unthinkable occur on some coming budget D-Day — or perhaps when the deadline to raise the federal debt ceiling arrives this spring — the G.O.P. is cocksure that it can pin the debacle on the Democrats.

In the right’s echo chamber, voters are seen as so fed up with deficits that they’ll put principle over temporary inconveniences — like, say, a halt in processing new Social Security applicants or veterans’ benefit checks. Who needs coddled government workers to deal with those minutiae anyway? As Mike Huckabee has cheerfully pointed out, many more federal services are automated now than in the olden days of the late 20th century. Phone trees don’t demand pensions.

Remarkably (or not), much of the Beltway press has bought the line that comparisons between then and now are superficial. Sure, Bill Clinton, like Barack Obama, was bruised by his first midterms, with his party losing the House to right-wing revolutionaries hawking the Contract With America, a Tea Party ur-text demanding balanced budgets. But after that, we’re instructed, the narratives diverge. John Boehner is no bomb-throwing diva like Newt Gingrich, whose petulant behavior inspired the famous headline “Cry Baby” in The Daily News. A crier — well, yes — but Boehner’s too conventional a conservative to foment a reckless shutdown. Obama, prone to hanging back from Congressional donnybrooks, bears scant resemblance to the hands-on Clinton, who clamored to get into the ring with Newt.

Those propagating the 2011-is-not-1995 line also assume that somehow Boehner will prevent the new G.O.P. insurgents from bringing down the government they want to bring down. But if Gingrich couldn’t control his hard-line freshman class of 73 members in 1995 — he jokingly referred to them then as “a third party” — it’s hard to imagine how the kinder, gentler Boehner will control his 87 freshmen, many of them lacking government or legislative experience, let alone the gene for compromise. In the new Congress’s short history, the new speaker has already had trouble controlling his caucus. On Friday Gingrich made Boehner’s task harder by writing a Washington Post op-ed plea that the G.O.P. stick to its guns.

The 2011 rebels are to the right of their 1995 antecedents in any case. That’s why this battle, ostensibly over the deficit, is so much larger than the sum of its line-item parts. The highest priority of America’s current political radicals is not to balance government budgets but to wage ideological warfare in Washington and state capitals alike. The relatively few dollars that would be saved by the proposed slashing of federal spending on Planned Parenthood and Head Start don’t dent the deficit; the cuts merely savage programs the right abhors. In Wisconsin, where state workers capitulated to Gov. Scott Walker’s demands for financial concessions, the radical Republicans’ only remaining task is to destroy labor’s right to collective bargaining.

That’s not to say there is no fiscal mission in the right’s agenda, both nationally and locally — only that the mission has nothing to do with deficit reduction. The real goal is to reward the G.O.P.’s wealthiest patrons by crippling what remains of organized labor, by wrecking the government agencies charged with regulating and policing corporations, and, as always, by rewarding the wealthiest with more tax breaks. The bankrupt moral equation codified in the Bush era — that tax cuts tilted to the highest bracket were a higher priority even than paying for two wars — is now a given. The once-bedrock American values of shared sacrifice and equal economic opportunity have been overrun.

In this bigger picture, the Wisconsin governor’s fawning 20-minute phone conversation with a prankster impersonating the oil billionaire David Koch last week, while entertaining, is merely a footnote. The Koch Industries political action committee did contribute to Walker’s campaign (some $43,000) and did help underwrite Tea Party ads and demonstrations in Madison. But this governor is merely a petty-cash item on the Koch ledger — as befits the limited favors he can offer Koch’s mammoth, sprawling, Kansas-based industrial interests.

Look to Washington for the bigger story. As The Los Angeles Times recently reported, Koch Industries and its employees form the largest bloc of oil and gas industry donors to members of the new House Energy and Commerce Committee, topping even Exxon Mobil. And what do they get for that largess? As a down payment, the House budget bill not only reduces financing for the Environmental Protection Agency but also prohibits its regulation of greenhouse gases.

Here again, the dollars that will be saved are minute in terms of the federal deficit, but the payoff to Koch interests from a weakened E.P.A. is priceless. The same dynamic is at play in the House’s reduced spending for the Securities and Exchange Commission, the Internal Revenue Service, and the Commodity Futures Trading Commission (charged with regulation of the esoteric Wall Street derivatives that greased the financial crisis). The reduction in the deficit will be minimal, but the bottom lines for the Kochs and their peers, especially on Wall Street, will swell.

These special interests will stay in the closet next week when the Tea Partiers in the House argue (as the Gingrich cohort once did) that their only agenda is old-fashioned fiscal prudence. The G.O.P. is also banking on the presumption that Obama will bide his time too long, as he did in the protracted health care and tax-cut melees, and allow the Fox News megaphone, not yet in place in ’95, to frame the debate. Listening to the right’s incessant propaganda, you’d never know that the latest Pew survey found that Americans want to increase, not decrease, most areas of federal spending — and by large margins in the cases of health care and education.

The G.O.P. leadership faced those same headwinds from voters in ’95. As Boehner, then on the Gingrich team, told The Times in a January 1996 post-mortem, the G.O.P. had tested the notion of talking about “balancing the budget and Medicare in the same sentence” and discovered it would bring “big trouble.” Gingrich’s solution, he told The Times then, was simple: “We learned that if you talked about ‘preserving’ and ‘protecting’ Medicare, it worked.” Which it did until it didn’t — at which point the Gingrich revolution imploded.

Rather hilariously, the Republicans’ political gurus still believe that Gingrich’s ruse can work. In a manifesto titled “How the G.O.P. Can Win the Budget Battle” published in The Wall Street Journal last week, Fred Barnes of Fox News put it this way: “Bragging about painful but necessary cuts to Medicare scares people. Stressing the goal of saving Medicare won’t.” But the G.O.P. is trotting out one new political strategy this time. Current House leaders, mindful that their ’95 counterparts’ bravado backfired, constantly reiterate that they are “not looking for a government shutdown,” as Paul Ryan puts it. They seem to believe that if they repeat this locution often enough it will inoculate them from blame should a shutdown happen anyway — when, presumably, they are not looking.

Maybe, but no less an authority than Dick Armey, these days a leading Tea Party operative, thinks otherwise. Back in ’95, as a Gingrich deputy, he had been more bellicose than most in threatening a shutdown, as Bill Clinton recounts in his memoirs. But in 2006, Armey told a different story when reminiscing to an interviewer, Ryan Sager: “Newt’s position was, presidents get blamed for shutdowns, and he cited Ronald Reagan. My position was, Republicans get blamed for shutdowns. I argued that it is counterintuitive to the average American to think that the Democrat wants to shut down the government. They’re the advocates of the government. It is perfectly logical to them that Republicans would shut it down, because we’re seen as antithetical to government.”

Armey’s logic is perfect indeed, but logic is not the rage among his ideological compatriots this year. Otherwise, the Tea Party radicals might have figured out the single biggest difference between 1995 and 2011 — the state of the economy. Last time around, America was more or less humming along with an unemployment rate of 5.6 percent. This time we are still digging out of the worst financial disaster since the Great Depression, with an unemployment rate of 9 percent and oil prices on the rise. To even toy with shutting down the government in this uncertain climate is to risk destabilizing the nascent recovery, with those in need of the government safety net (including 43 million Americans on food stamps) doing most of the suffering.

Not that the gravity of this moment will necessarily stop the right from using the same playbook as last time. Still heady with hubris from the midterms — and having persuaded themselves that Gingrich’s 1995 history can’t possibly repeat itself — radical Republicans are convinced that deficit-addled voters are on their side no matter what. The president, meanwhile, is playing his cards close to his vest. Let’s hope he knows that he, not the speaker, is the player holding a full house, and that he will tell the country in no uncertain terms that much more than money is on the table.

Saturday, February 26, 2011

Shock Doctrine, U.S.A.
By PAUL KRUGMAN
Published: February 24, 2011


In 2003, the Bush administration put occupied Iraq under the rule of officials chosen for political loyalty rather than expertise. As many readers may recall, the results were spectacular — in a bad way. Instead of focusing on the urgent problems of a shattered economy and society, which would soon descend into a murderous civil war, those Bush appointees were obsessed with imposing a conservative ideological vision. Indeed, with looters still prowling the streets of Baghdad, L. Paul Bremer, the American viceroy, told a Washington Post reporter that one of his top priorities was to “corporatize and privatize state-owned enterprises” — Mr. Bremer’s words, not the reporter’s — and to “wean people from the idea the state supports everything.”

The story of the privatization-obsessed Coalition Provisional Authority was the centerpiece of Naomi Klein’s best-selling book “The Shock Doctrine,” which argued that it was part of a broader pattern. From Chile in the 1970s onward, she suggested, right-wing ideologues have exploited crises to push through an agenda that has nothing to do with resolving those crises, and everything to do with imposing their vision of a harsher, more unequal, less democratic society.

Which brings us to Wisconsin 2011, where the shock doctrine is on full display.

In recent weeks, Madison has been the scene of large demonstrations against the governor’s budget bill, which would deny collective-bargaining rights to public-sector workers. Gov. Scott Walker claims that he needs to pass his bill to deal with the state’s fiscal problems. But his attack on unions has nothing to do with the budget. In fact, those unions have already indicated their willingness to make substantial financial concessions — an offer the governor has rejected.

What’s happening in Wisconsin is, instead, a power grab — an attempt to exploit the fiscal crisis to destroy the last major counterweight to the political power of corporations and the wealthy. And the power grab goes beyond union-busting. The bill in question is 144 pages long, and there are some extraordinary things hidden deep inside.

For example, the bill includes language that would allow officials appointed by the governor to make sweeping cuts in health coverage for low-income families without having to go through the normal legislative process.

And then there’s this: “Notwithstanding ss. 13.48 (14) (am) and 16.705 (1), the department may sell any state-owned heating, cooling, and power plant or may contract with a private entity for the operation of any such plant, with or without solicitation of bids, for any amount that the department determines to be in the best interest of the state. Notwithstanding ss. 196.49 and 196.80, no approval or certification of the public service commission is necessary for a public utility to purchase, or contract for the operation of, such a plant, and any such purchase is considered to be in the public interest and to comply with the criteria for certification of a project under s. 196.49 (3) (b).”

What’s that about? The state of Wisconsin owns a number of plants supplying heating, cooling, and electricity to state-run facilities (like the University of Wisconsin). The language in the budget bill would, in effect, let the governor privatize any or all of these facilities at whim. Not only that, he could sell them, without taking bids, to anyone he chooses. And note that any such sale would, by definition, be “considered to be in the public interest.”

If this sounds to you like a perfect setup for cronyism and profiteering — remember those missing billions in Iraq? — you’re not alone. Indeed, there are enough suspicious minds out there that Koch Industries, owned by the billionaire brothers who are playing such a large role in Mr. Walker’s anti-union push, felt compelled to issue a denial that it’s interested in purchasing any of those power plants. Are you reassured?

The good news from Wisconsin is that the upsurge of public outrage — aided by the maneuvering of Democrats in the State Senate, who absented themselves to deny Republicans a quorum — has slowed the bum’s rush. If Mr. Walker’s plan was to push his bill through before anyone had a chance to realize his true goals, that plan has been foiled. And events in Wisconsin may have given pause to other Republican governors, who seem to be backing off similar moves.

But don’t expect either Mr. Walker or the rest of his party to change those goals. Union-busting and privatization remain G.O.P. priorities, and the party will continue its efforts to smuggle those priorities through in the name of balanced budgets.

Mr. Jefferson's Books

Though I am not a fan of Jefferson, it would be fun to see his library.



A Founding Father’s Books Turn Up
By SAM ROBERTS
Published: February 21, 2011


Photo caption: Thomas Jefferson, in a portrait by Gilbert Stuart. Jefferson obtained thousands of books in his lifetime, but many were sold by his heirs to pay off his debts.
Twenty-eight Jefferson titles in 74 volumes were discovered recently in the collection of Washington University in St. Louis, immediately elevating its library to the third-largest repository of books belonging to Jefferson, after the Library of Congress and the University of Virginia.

“My reaction was: ‘Yes! It makes sense,’ ” said Shirley K. Baker, Washington University’s vice chancellor for scholarly resources and dean of university libraries. “It strikes me as particularly appropriate these are in Missouri. Jefferson bought this territory, and we in Missouri identify with him and honor him. And I was thrilled at the detective work our curators had done.”

The Washington University library learned of the Jefferson bonanza a few months ago from Endrina Tay, project manager for the Thomas Jefferson’s Libraries project at Monticello, the former president’s home near Charlottesville, Va., a National Historic Landmark. She has been working since 2004 to reconstruct Jefferson’s collection and make the titles and supplemental reference materials available online. Jefferson had several collections, including 6,700 books that he sold to the Library of Congress in 1815 after the British burned Washington. Writing to John Adams that “I cannot live without books” and confessing to a “canine appetite for reading,” Jefferson immediately started another collection that swelled to 1,600 books by the time he died on July 4, 1826. That collection became known as his retirement library.

Those books were dispersed after Jefferson’s heirs reluctantly decided to sell them at auction in 1829 to pay off Jefferson’s debts; auction catalogs survive, but not a record of who bought the books. The retirement collection is the least known of Jefferson’s libraries and one in which classics were represented in disproportionately greater numbers than politics and the law. He cataloged all 1,600 books according to “the faculties of the human mind,” like memory, reason and imagination, and then classified them further. Many were in French or Italian.

“Currently Monticello and the University of Virginia have the largest concentrations of books from the retirement library,” said Kevin J. Hayes, an English professor at the University of Central Oklahoma and the author of “The Road to Monticello: The Life and Mind of Thomas Jefferson.” “This new find would put Washington University among them. The question I would like to answer is: Do they contain any marginalia? Sometimes Jefferson wrote in his books; his marginalia would enhance both the scholarly and the cultural value of the books immeasurably.”

The answer is yes. Jefferson initialed his books (to affirm his ownership), often corrected typographical errors in the texts and also occasionally wrote marginal notes or comments about the substance. Researchers are combing the newly discovered collection to find such notations.

“These books add a dimension to the study of the life of Jefferson at Monticello,” Ms. Tay said. “They expand our understanding and give us a tangible connection. It helps us understand how Jefferson used his books — whether they were well worn, which means he read them often. Some have annotations, and two architectural volumes include notations of calculations that Jefferson made.”

She explained that while there was no plan to reassemble the retirement library at Monticello, all of the information about it would be placed in a database.

“The physical collection is not as critical as what it represents intellectually,” Ms. Tay said. “What did he read? Where did he get his ideas? What influenced him?”

Armed with the auction catalog, Ms. Tay found letters suggesting that Joseph Coolidge of Boston, who met one of Jefferson’s granddaughters at Monticello and later married her, submitted lists of the books he wanted to buy.

Coolidge wrote Nicholas Philip Trist, who married another Jefferson granddaughter, saying, “If there are any books which have T. J. notes or private marks, they would be interesting to me.” He added, “I beg you to interest yourself in my behalf in relation to the books; remember that his library will not be sold again, and that all the memorials of T. J. for myself and children, and friends, must be secured now! — this is the last chance!”

Ms. Tay also found an annotated auction catalog with the letter “C” written next to a number of items, which seemed to indicate that Coolidge had bid successfully.

While she was tracking down the retirement library, one of her fellow Monticello scholars, Ann Lucas Birle, was researching a book about the Coolidges and, searching Google Books, found a reference in The Harvard Register to a gift in 1880 from a Coolidge son-in-law, Edmund Dwight, to a fellow Harvard alumnus and possible relative, William Greenleaf Eliot, a founder of Washington University.

“It could have been his parents have died, he’s left with 3,000 books, what should he do with these that would really do good?” Dean Baker said. “A great-uncle just founded a new university. If you send them to a university that doesn’t even have 3,000 books, it could make a world of difference.”

The discovery that the 3,000 books in the Coolidge collection included 74 that once belonged to Jefferson means that about half of his retirement library has been accounted for. It has also prompted a search by librarians at Washington University to determine whether any other books in the Coolidge collection had been Jefferson’s.

Friday, February 25, 2011

Libraries and e-books

USA Today



E-books are a hot story at libraries
Public libraries across the USA are seeing a surge in demand for electronic books at a time when many are facing budget cuts that make the demand difficult to satisfy.

By Emily Spartz




OverDrive Inc., which supplies electronic books to 13,000 libraries worldwide, reported a 200% increase in e-book circulation in 2010 from 2009.

Though library officials see the value in providing e-books, many don't have the money to keep up.

"Libraries are facing huge budget cuts all across the country," says Audra Caplan, president of the Public Library Association.

In 2007, first lady Laura Bush recognized the Charlotte Mecklenburg Library in North Carolina as one of the USA's best. Now that library is closing four branches and laying off employees.

"We did see a huge uptick in new users and use of our e-books," said Linda Raymond, the library's materials management manager. "And no, we don't have a way to address it because of our budget."

To lend out titles, libraries buy e-book licenses from publishers. A single-copy license lets a library lend an e-book to a user for a set time, says David Burleigh, OverDrive's director of marketing. Once returned, it's available to another patron.

Licenses vary widely in price, depending on the title, says Jodi Fick of Siouxland Libraries in Sioux Falls, S.D. A license for John Grisham's The Confession costs Siouxland $28.95. For a printed book, the library would have gotten a 40% discount, said Kim Koblank, who oversees buying for Siouxland.
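To make the lending mechanics concrete, here is a minimal Python sketch of the one-copy, one-patron-at-a-time model Burleigh describes; the class, method, and patron names are illustrative assumptions, not OverDrive's actual system.

from datetime import datetime, timedelta

class EbookLicense:
    """One purchased license = one digital copy that one patron can borrow at a time.
    Illustrative sketch only; not OverDrive's real data model."""

    def __init__(self, title, loan_days=14):
        self.title = title          # e.g., "The Confession"
        self.loan_days = loan_days  # the "set time" of the loan
        self.borrower = None
        self.due = None

    def checkout(self, patron, now=None):
        """Lend the copy if it is free, or if its previous loan has expired."""
        now = now or datetime.now()
        if self.borrower is not None and now < self.due:
            return False  # still checked out; the next patron must wait
        self.borrower = patron
        self.due = now + timedelta(days=self.loan_days)
        return True

    def give_back(self):
        """An early return frees the copy for the next patron immediately."""
        self.borrower = None
        self.due = None

# One $28.95 license serves patrons strictly one at a time.
copy = EbookLicense("The Confession")
assert copy.checkout("patron_a")        # first borrower gets the copy
assert not copy.checkout("patron_b")    # second is blocked until it comes back
copy.give_back()
assert copy.checkout("patron_b")        # available again after the return

Under this model, a licensed e-book behaves like a single physical copy, which is why surging demand translates into waiting lists rather than extra simultaneous loans.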

Maryland's Montgomery County Public Libraries have had the budget for new materials slashed from $6.4 million two years ago to $3 million, said Barbara Webb, chief of collection and technology management. That doesn't leave enough money for traditional materials, much less e-books, she said, adding that the system is trying to redirect money toward e-books.

Library officials also are concerned about the different formats among e-readers, said Denise Davis, deputy director of the Sacramento Public Library Authority. "If you chose the wrong path, you've made an enormous investment in a technology that was very short-lived," she said.

This spring, the Sacramento library will test a program to lend out 100 Barnes & Noble Nook e-readers. The Nooks will be preloaded and patrons can return them at any of 28 branches, Davis said.

The Hennepin County Library in Minneapolis is increasing its e-book capacity. Last year, the system spent $35,000 on e-books. This year, it's spending $350,000, despite budget cuts, said Gail Mueller Schultz, head of collection management.

Library patron Beth Hindbjorgen of Sioux Falls, S.D., welcomes the investment. "I think it's the way we're heading for a lot of readers."

The Republican Shakedown

Robert Reich, former Secretary of Labor; Professor at Berkeley; author of "Aftershock: The Next Economy and America's Future"
Posted: February 24, 2011 12:41


You can't fight something with nothing. But as long as Democrats refuse to talk about the almost unprecedented buildup of income, wealth, and power at the top -- and the refusal of the super-rich to pay their fair share of the nation's bills -- Republicans will convince people it's all about government and unions.

Republicans claim to have a mandate from voters for the showdowns and shutdowns they're launching. Governors say they're not against unions but voters have told them to cut costs, and unions are in the way. House Republicans say they're not seeking a government shutdown but standing on principle. "Republicans' goal is to cut spending and reduce the size of government," says House leader John Boehner, "not to shut it down." But if a shutdown is necessary to achieve the goal, so be it.

The Republican message is that bloated government is responsible for the lousy economy that most people continue to experience. Cut the bloat, and jobs and wages will return.

Nothing could be further from the truth, but for some reason Obama and the Democrats aren't responding with the truth. Their response is: We agree but you're going too far. Government employees should give up some more wages and benefits but don't take away their bargaining rights. Private-sector unionized workers should make more concessions but don't bust the unions. Non-defense discretionary spending should be cut but don't cut so much.

In the face of showdowns and shutdowns, the "you're right but you're going too far" response doesn't hack it. If Republicans are correct on principle, they're more likely to be seen as taking a strong principled stand than as going "too far." If they're basically correct that the problem is too much government spending, why not go as far as possible to cut the bloat?

The truth that Obama and Democrats must tell is that government spending has absolutely nothing to do with high unemployment, declining wages, falling home prices, and all the other horribles that continue to haunt most Americans.

Indeed, too little spending will prolong the horribles for years more because there's not enough demand in the economy without it.

The truth is that while the proximate cause of America's economic plunge was Wall Street's excesses leading up to the crash of 2008, its underlying cause -- and the reason the economy continues to be lousy for most Americans -- is that so much income and wealth have been going to the very top that the vast majority no longer has the purchasing power to lift the economy out of its doldrums. Americans aren't buying cars (they bought 17 million new cars in 2005, just 12 million last year). They're not buying homes (7.5 million in 2005, 4.6 million last year). They're not going to the malls (high-end retailers are booming but Wal-Mart's sales are down).

Only the richest 5 percent of Americans are back in the stores because their stock portfolios have soared. The Dow Jones Industrial Average has doubled from its crisis low. Wall Street pay is up to record levels. Total compensation and benefits at the 25 major Wall Street firms had been $130 billion in 2007, before the crash; now it's close to $140 billion.

But a strong recovery can't be built on the purchases of the richest 5 percent.

The truth is that if the super-rich paid their fair share of taxes, government wouldn't be broke. If Governor Scott Walker hadn't handed out tax breaks to corporations and the well-off, Wisconsin wouldn't be in a budget crisis. If Washington hadn't extended the Bush tax cuts for the rich, eviscerated the estate tax, and created loopholes for private-equity and hedge-fund managers, the federal budget wouldn't look nearly as bad.

And if America had higher marginal tax rates and more tax brackets at the top -- for those raking in $1 million, $5 million, $15 million a year -- the budget would look even better. We wouldn't be firing teachers or slashing Medicaid or hurting the most vulnerable members of our society. We wouldn't be in a tizzy over Social Security. We'd slow the rise in health care costs but we wouldn't cut Medicare. We'd cut defense spending and lop off subsidies to giant agribusinesses but we wouldn't view the government as our national nemesis.

The final truth is that as income and wealth have risen to the top, so has political power. The reason all of this is proving so difficult to get across is that the super-rich, such as the Koch brothers, have been using their billions to corrupt politics, hoodwink the public, and enlarge and entrench their outsized fortunes. They're bankrolling Republicans who are mounting showdowns and threatening shutdowns, and who want the public to believe government spending is the problem.

They are behind the Republican shakedown.

These are the truths that Democrats must start telling, and soon. Otherwise the Republican shakedown may well succeed.

Wednesday, February 23, 2011

The Tea Party

Jonathan Chait
What Is the Tea Party?

Senior Editor

February 23, 2011 | 4:18


Pew has another survey of Tea Party sympathizers, and it's clear once again that the movement is nothing more or less than conservative Republicans.



The Tea Party is essentially a re-branding campaign for the GOP base. It's a successful effort, and one that springs largely though not entirely from the grassroots itself. Conservatives like to imagine that the Tea Party is some incarnation of the popular will, asleep for many years and finally awakened under Obama, and bristle at any analysis that diminishes the world-historical import of the phenomenon. So let me be clear. The Tea Party represents a significant minority of Americans. It's influential. (It allowed conservatives to disown the failures of the Bush administration and to lend them a populist imprimatur.) But it's not anything more than an organizing rubric for the GOP base.

Tuesday, February 22, 2011

Marginalia

I'd love to own a book with margin notes written by Mark Twain.



By DIRK JOHNSON
Published: February 20, 2011

Photo caption: Mark Twain left a comment about “Huckleberry Finn” in his copy of “The Pen and the Book” by Walter Besant.


Photo caption (The New York Times): A collection of books with notations by well-known writers includes one scribbled in by Ben Hecht.

The book, about making a profit in publishing, scarcely qualifies as a literary masterpiece. It is highly valuable, instead, because a reader has scribbled in the margins of its pages.

The scribbler was Mark Twain, who had penciled, among other observations, a one-way argument with the author, Walter Besant, that “nothing could be stupider” than using advertising to sell books as if they were “essential goods” like “salt” or “tobacco.” On another page, Twain made some snide remarks about the big sums being paid to another author of his era, Mary Baker Eddy, the founder of Christian Science.

Like many readers, Twain was engaging in marginalia, writing comments alongside passages and sometimes giving an author a piece of his mind. It is a rich literary pastime, sometimes regarded as a tool of literary archaeology, but it has an uncertain fate in a digitized world.

“People will always find a way to annotate electronically,” said G. Thomas Tanselle, a former vice president of the John Simon Guggenheim Memorial Foundation and an adjunct professor of English at Columbia University. “But there is the question of how it is going to be preserved. And that is a problem now facing collections libraries.”

These are the sorts of matters pondered by the Caxton Club, a literary group founded in 1895 by 15 Chicago bibliophiles. With the Newberry Library, it is sponsoring a symposium in March titled “Other People’s Books: Association Copies and the Stories They Tell.”

The symposium will feature a new volume of 52 essays about association copies — books once owned or annotated by the authors — and ruminations about how they enhance the reading experience. The essays touch on works that connect President Lincoln and Alexander Pope; Jane Austen and William Cowper; Walt Whitman and Henry David Thoreau.

Marginalia was more common in the 1800s. Samuel Taylor Coleridge was a prolific margin writer, as were William Blake and Charles Darwin. In the 20th century it mostly came to be regarded like graffiti: something polite and respectful people did not do.

Paul F. Gehl, a curator at the Newberry, blamed generations of librarians and teachers for “inflicting us with the idea” that writing in books makes them “spoiled or damaged.”

But marginalia never vanished. When Nelson Mandela was imprisoned in South Africa in 1977, a copy of Shakespeare was circulated among the inmates. Mandela wrote his name next to the passage from “Julius Caesar” that reads, “Cowards die many times before their deaths.”

Studs Terkel, the oral historian, was known to admonish friends who would read his books but leave them free of markings. He told them that reading a book should not be a passive exercise, but rather a raucous conversation.

Books with markings are increasingly seen these days as more valuable, not just for a celebrity connection but also for what they reveal about the community of people associated with a work, according to Heather Jackson, a professor of English at the University of Toronto.

Professor Jackson, who will speak at the symposium, said examining marginalia reveals a pattern of emotional reactions among everyday readers that might otherwise be missed, even by literary professionals.

“It might be a shepherd writing in the margins about what a book means to him as he’s out tending his flock,” Professor Jackson said. “It might be a schoolgirl telling us how she feels. Or maybe it’s lovers who are exchanging their thoughts about what a book means to them.”

Just about anyone who has paged through a used college textbook has seen marginalia, and often added comments of their own.

Not everyone values marginalia, said Paul Ruxin, a member of the Caxton Club. “If you think about the traditional view that the book is only about the text,” he said, “then this is kind of foolish, I suppose.”

David Spadafora, president of the Newberry, said marginalia enriches a book, as readers infer other meanings, and lends it historical context. “The digital revolution is a good thing for the physical object,” he said. As more people see historical artifacts in electronic form, “the more they’re going to want to encounter the real object.”

The collection at the Newberry includes a bound copy of “The Federalist” once owned by Thomas Jefferson. Besides penciling his initials in the book, Jefferson wrote those of the founding fathers alongside their essays, which had originally been published anonymously.

“It’s pretty interesting to hold a book that Jefferson held,” Mr. Spadafora said. “Besides that, if we know what books were in his library in the years leading to the writing of the Declaration of Independence, it tells us something about what might have inspired his intellect.”

In her markings, Rose Caylor gave us a sense of her husband, the playwright Ben Hecht. In her copy of “A Child of the Century,” which Mr. Hecht wrote, she had drawn an arrow pointing to burns on a page. “Strikes matches on books,” she noted about her husband, who was a smoker.

Some lovers of literature even conjure dreamy notions about those who have left marginalia for them to find. In his poem “Marginalia,” Billy Collins, the former American poet laureate, wrote about how a previous reader had stirred the passions of a boy just beginning high school and reading “The Catcher in the Rye.”

As the poem describes it, he noticed “a few greasy smears in the margin” and a message that was written “in soft pencil — by a beautiful girl, I could tell.” It read, “Pardon the egg salad stains, but I’m in love.”

Friday, February 18, 2011

The Fraudulent Deficit Debate

NYTimes.com
Op-Ed Columnist
Willie Sutton Wept
By PAUL KRUGMAN
Published: February 17, 2011




There are three things you need to know about the current budget debate. First, it’s essentially fraudulent. Second, most people posing as deficit hawks are faking it. Third, while President Obama hasn’t fully avoided the fraudulence, he’s less bad than his opponents — and he deserves much more credit for fiscal responsibility than he’s getting.

About the fraudulence: Last month, Howard Gleckman of the Tax Policy Center described the president as the “anti-Willie Sutton,” after the holdup artist who reputedly said he robbed banks because that’s where the money is. Indeed, Mr. Obama has lately been going where the money isn’t, making a big deal out of a freeze on nonsecurity discretionary spending, which accounts for only 12 percent of the budget.

But that’s what everyone does. House Republicans talk big about spending cuts — but focus solely on that same small budget sliver.

And by proposing sharp spending cuts right away, Republicans aren’t just going where the money isn’t, they’re also going when the money isn’t. Slashing spending while the economy is still deeply depressed is a recipe for slower economic growth, which means lower tax receipts — so any deficit reduction from G.O.P. cuts would be at least partly offset by lower revenue.

The whole budget debate, then, is a sham. House Republicans, in particular, are literally stealing food from the mouths of babes — nutritional aid to pregnant women and very young children is one of the items on their cutting block — so they can pose, falsely, as deficit hawks.

What would a serious approach to our fiscal problems involve? I can summarize it in seven words: health care, health care, health care, revenue.

Notice that I said “health care,” not “entitlements.” People in Washington often talk as if there were a program called Socialsecuritymedicareandmedicaid, then focus on things like raising the retirement age. But that’s more anti-Willie Suttonism. Long-run projections suggest that spending on the major entitlement programs will rise sharply over the decades ahead, but the great bulk of that rise will come from the health insurance programs, not Social Security.

So anyone who is really serious about the budget should be focusing mainly on health care. And by focusing, I don’t mean writing down a number and expecting someone else to make that number happen — a dodge known in the trade as a “magic asterisk.” I mean getting behind specific actions to rein in costs.

By that standard, the Simpson-Bowles deficit commission, whose work is now being treated as if it were the gold standard of fiscal seriousness, was in fact deeply unserious. Its report “was one big magic asterisk,” Bob Greenstein of the Center on Budget and Policy Priorities told The Washington Post’s Ezra Klein. So is the much-hyped proposal by Paul Ryan, the G.O.P.’s supposed deep thinker du jour, to replace Medicare with vouchers whose value would systematically lag behind health care costs. What’s supposed to happen when seniors find that they can’t afford insurance?

What would real action on health look like? Well, it might include things like giving an independent commission the power to ensure that Medicare only pays for procedures with real medical value; rewarding health care providers for delivering quality care rather than simply paying a fixed sum for every procedure; limiting the tax deductibility of private insurance plans; and so on.

And what do these things have in common? They’re all in last year’s health reform bill.

That’s why I say that Mr. Obama gets too little credit. He has done more to rein in long-run deficits than any previous president. And if his opponents were serious about those deficits, they’d be backing his actions and calling for more; instead, they’ve been screaming about death panels.

Now, even if we manage to rein in health costs, we’ll still have a long-run deficit problem — a fundamental gap between the government’s spending and the amount it collects in taxes. So what should be done?

This brings me to the seventh word of my summary of the real fiscal issues: if you’re serious about the deficit, you should be willing to consider closing at least part of this gap with higher taxes. True, higher taxes aren’t popular, but neither are cuts in government programs. So we should add to the roster of fundamentally unserious people anyone who talks about the deficit — as most of our prominent deficit scolds do — as if it were purely a spending issue.

The bottom line, then, is that while the budget is all over the news, we’re not having a real debate; it’s all sound, fury, and posturing, telling us a lot about the cynicism of politicians but signifying nothing in terms of actual deficit reduction. And we shouldn’t indulge those politicians by pretending otherwise.

Thursday, February 17, 2011

The Information

The Information
How the Internet gets inside us.
by Adam Gopnik
from The New Yorker

February 14, 2011
Books explaining why books no longer matter come in many flavors.
When the first Harry Potter book appeared, in 1997, it was just a year before the universal search engine Google was launched. And so Hermione Granger, that charming grind, still goes to the Hogwarts library and spends hours and hours working her way through the stacks, finding out what a basilisk is or how to make a love potion. The idea that a wizard in training might have, instead, a magic pad where she could inscribe a name and in half a second have an avalanche of news stories, scholarly articles, books, and images (including images she shouldn’t be looking at) was a Quidditch broom too far. Now, having been stuck with the library shtick, she has to go on working the stacks in the Harry Potter movies, while the kids who have since come of age nudge their parents. “Why is she doing that?” they whisper. “Why doesn’t she just Google it?”

That the reality of machines can outpace the imagination of magic, and in so short a time, does tend to lend weight to the claim that the technological shifts in communication we’re living with are unprecedented. It isn’t just that we’ve lived one technological revolution among many; it’s that our technological revolution is the big social revolution that we live with. The past twenty years have seen a revolution less in morals, which have remained mostly static, than in means: you could already say “fuck” on HBO back in the eighties; the change has been our ability to tweet or IM or text it. The set subject of our novelists is information; the set obsession of our dons is what it does to our intelligence.

The scale of the transformation is such that an ever-expanding literature has emerged to censure or celebrate it. A series of books explaining why books no longer matter is a paradox that Chesterton would have found implausible, yet there they are, and they come in the typical flavors: the eulogistic, the alarmed, the sober, and the gleeful. When the electric toaster was invented, there were, no doubt, books that said that the toaster would open up horizons for breakfast undreamed of in the days of burning bread over an open flame; books that told you that the toaster would bring an end to the days of creative breakfast, since our children, growing up with uniformly sliced bread, made to fit a single opening, would never know what a loaf of their own was like; and books that told you that sometimes the toaster would make breakfast better and sometimes it would make breakfast worse, and that the cost for finding this out would be the price of the book you’d just bought.

All three kinds appear among the new books about the Internet: call them the Never-Betters, the Better-Nevers, and the Ever-Wasers. The Never-Betters believe that we’re on the brink of a new utopia, where information will be free and democratic, news will be made from the bottom up, love will reign, and cookies will bake themselves. The Better-Nevers think that we would have been better off if the whole thing had never happened, that the world that is coming to an end is superior to the one that is taking its place, and that, at a minimum, books and magazines create private space for minds in ways that twenty-second bursts of information don’t. The Ever-Wasers insist that at any moment in modernity something like this is going on, and that a new way of organizing data and connecting users is always thrilling to some and chilling to others—that something like this is going on is exactly what makes it a modern moment. One’s hopes rest with the Never-Betters; one’s head with the Ever-Wasers; and one’s heart? Well, twenty or so books in, one’s heart tends to move toward the Better-Nevers, and then bounce back toward someplace that looks more like home.


Among the Never-Betters, the N.Y.U. professor Clay Shirky—the author of “Cognitive Surplus” and many articles and blog posts proclaiming the coming of the digital millennium—is the breeziest and seemingly most self-confident. “Seemingly,” because there is an element of overdone provocation in his stuff (So people aren’t reading Tolstoy? Well, Tolstoy sucks) that suggests something a little nervous going on underneath. Shirky believes that we are on the crest of an ever-surging wave of democratized information: the Gutenberg printing press produced the Reformation, which produced the Scientific Revolution, which produced the Enlightenment, which produced the Internet, each move more liberating than the one before. Though it may take a little time, the new connective technology, by joining people together in new communities and in new ways, is bound to make for more freedom. It’s the Wired version of Whig history: ever better, onward and upward, progress unstopped. In John Brockman’s anthology “Is the Internet Changing the Way You Think?,” the evolutionary psychologist John Tooby shares the excitement—“We see all around us transformations in the making that will rival or exceed the printing revolution”—and makes the same extended parallel to Gutenberg: “Printing ignited the previously wasted intellectual potential of huge segments of the population. . . . Freedom of thought and speech—where they exist—were unforeseen offspring of the printing press.”

Shirky’s and Tooby’s version of Never-Betterism has its excitements, but the history it uses seems to have been taken from the back of a cereal box. The idea, for instance, that the printing press rapidly gave birth to a new order of information, democratic and bottom-up, is a cruel cartoon of the truth. If the printing press did propel the Reformation, one of the biggest ideas it propelled was Luther’s newly invented absolutist anti-Semitism. And what followed the Reformation wasn’t the Enlightenment, a new era of openness and freely disseminated knowledge. What followed the Reformation was, actually, the Counter-Reformation, which used the same means—i.e., printed books—to spread ideas about what jerks the reformers were, and unleashed a hundred years of religious warfare. In the seventeen-fifties, more than two centuries later, Voltaire was still writing in a book about the horrors of those other books that urged burning men alive in auto-da-fé. Buried in Tooby’s little parenthetical—“where they exist”—are millions of human bodies. If ideas of democracy and freedom emerged at the end of the printing-press era, it wasn’t by some technological logic but because of parallel inventions, like the ideas of limited government and religious tolerance, very hard won from history.

Of course, if you stretch out the time scale enough, and are sufficiently casual about causes, you can give the printing press credit for anything you like. But all the media of modern consciousness—from the printing press to radio and the movies—were used just as readily by authoritarian reactionaries, and then by modern totalitarians, to reduce liberty and enforce conformity as they ever were by libertarians to expand it. As Andrew Pettegree shows in his fine new study, “The Book in the Renaissance,” the mainstay of the printing revolution in sixteenth-century Europe was not dissident pamphlets but royal edicts, printed by the thousand: almost all the new media of that day were working, in essence, for kinglouis.gov.

Even later, full-fledged totalitarian societies didn’t burn books. They burned some books, while keeping the printing presses running off such quantities that by the mid-fifties Stalin was said to have more books in print than Agatha Christie. (Recall that in “1984” Winston’s girlfriend works for the Big Brother publishing house.) If you’re going to give the printed book, or any other machine-made thing, credit for all the good things that have happened, you have to hold it accountable for the bad stuff, too. The Internet may make for more freedom a hundred years from now, but there’s no historical law that says it has to.

Many of the more knowing Never-Betters turn for cheer not to messy history and mixed-up politics but to psychology—to the actual expansion of our minds. The argument, advanced in Andy Clark’s “Supersizing the Mind” and in Robert K. Logan’s “The Sixth Language,” begins with the claim that cognition is not a little processing program that takes place inside your head, Robby the Robot style. It is a constant flow of information, memory, plans, and physical movements, in which as much thinking goes on out there as in here. If television produced the global village, the Internet produces the global psyche: everyone keyed in like a neuron, so that to the eyes of a watching Martian we are really part of a single planetary brain. Contraptions don’t change consciousness; contraptions are part of consciousness. We may not act better than we used to, but we sure think differently than we did.

Cognitive entanglement, after all, is the rule of life. My memories and my wife’s intermingle. When I can’t recall a name or a date, I don’t look it up; I just ask her. Our machines, in this way, become our substitute spouses and plug-in companions. Jerry Seinfeld said that the public library was everyone’s pathetic friend, giving up its books at a casual request and asking you only to please return them in a month or so. Google is really the world’s Thurber wife: smiling patiently and smugly as she explains what the difference is between eulogy and elegy and what the best route is to that little diner outside Hackensack. The new age is one in which we have a know-it-all spouse at our fingertips.

But, if cognitive entanglement exists, so does cognitive exasperation. Husbands and wives deny each other’s memories as much as they depend on them. That’s fine until it really counts (say, in divorce court). In a practical, immediate way, one sees the limits of the so-called “extended mind” clearly in the mob-made Wikipedia, the perfect product of that new vast, supersized cognition: when there’s easy agreement, it’s fine, and when there’s widespread disagreement on values or facts, as with, say, the origins of capitalism, it’s fine, too; you get both sides. The trouble comes when one side is right and the other side is wrong and doesn’t know it. The Shakespeare authorship page and the Shroud of Turin page are scenes of constant conflict and are packed with unreliable information. Creationists crowd cyberspace every bit as effectively as evolutionists, and extend their minds just as fully. Our trouble is not the over-all absence of smartness but the intractable power of pure stupidity, and no machine, or mind, seems extended enough to cure that.

The books by the Better-Nevers are more moving than those by the Never-Betters for the same reason that Thomas Gray was at his best in that graveyard: loss is always the great poetic subject. Nicholas Carr, in “The Shallows,” William Powers, in “Hamlet’s BlackBerry,” and Sherry Turkle, in “Alone Together,” all bear intimate witness to a sense that the newfound land, the ever-present BlackBerry-and-instant-message world, is one whose price, paid in frayed nerves and lost reading hours and broken attention, is hardly worth the gains it gives us. “The medium does matter,” Carr has written. “As a technology, a book focuses our attention, isolates us from the myriad distractions that fill our everyday lives. A networked computer does precisely the opposite. It is designed to scatter our attention. . . . Knowing that the depth of our thought is tied directly to the intensity of our attentiveness, it’s hard not to conclude that as we adapt to the intellectual environment of the Net our thinking becomes shallower.”

These three Better-Nevers have slightly different stories to tell. Carr is most concerned about the way the Internet breaks down our capacity for reflective thought. His testimony about how this happened in his own life is plangent and familiar, but he addles it a bit by insisting that the real damage is being done at the neurological level, that our children are having their brains altered by too much instant messaging and the like. This sounds impressive but turns out to be redundant. Of course the changes are in their brains; where else would they be? It’s the equivalent of saying that playing football doesn’t just affect a kid’s fitness; it changes the muscle tone that creates his ability to throw and catch footballs.

Powers’s reflections are more family-centered and practical. He recounts, very touchingly, stories of family life broken up by the eternal consultation of smartphones and computer monitors:



Somebody excuses themselves for a bathroom visit or a glass of water and doesn’t return. Five minutes later, another of us exits on a similarly mundane excuse along the lines of “I have to check something.” . . . Where have all the humans gone? To their screens, of course. Where they always go these days. The digital crowd has a way of elbowing its way into everything, to the point where a family can’t sit in a room together for half an hour without somebody, or everybody, peeling off. . . . As I watched the Vanishing Family Trick unfold, and played my own part in it, I sometimes felt as if love itself, or the acts of heart and mind that constitute love, were being leached out of the house by our screens.


He then surveys seven Wise Men—Plato, Thoreau, Seneca, the usual gang—who have something to tell us about solitude and the virtues of inner space, all of it sound enough, though he tends to overlook the significant point that these worthies were not entirely in favor of the kinds of liberties that we now take for granted and that made the new dispensation possible. (He knows that Seneca instructed the Emperor Nero, but sticks in a footnote to insist that the bad, fiddling-while-Rome-burned Nero asserted himself only after he fired the philosopher and started to act like an Internet addict.)

Similarly, Nicholas Carr cites Martin Heidegger for having seen, in the mid-fifties, that new technologies would break the meditational space on which Western wisdoms depend. Since Heidegger had not long before walked straight out of his own meditational space into the arms of the Nazis, it’s hard to have much nostalgia for this version of the past. One feels the same doubts when Sherry Turkle, in “Alone Together,” her touching plaint about the destruction of the old intimacy-reading culture by the new remote-connection-Internet culture, cites studies that show a dramatic decline in empathy among college students, who apparently are “far less likely to say that it is valuable to put oneself in the place of others or to try and understand their feelings.” What is to be done? Other Better-Nevers point to research that’s supposed to show that people who read novels develop exceptional empathy. But if reading a lot of novels gave you exceptional empathy, university English departments should be filled with the most compassionate and generous-minded of souls, and, so far, they are not.

One of the things that John Brockman’s collection on the Internet and the mind illustrates is that when people struggle to describe the state that the Internet puts them in they arrive at a remarkably familiar picture of disassociation and fragmentation. Life was once whole, continuous, stable; now it is fragmented, multi-part, shimmering around us, unstable and impossible to fix. The world becomes Keats’s “waking dream,” as the writer Kevin Kelly puts it.

The odd thing is that this complaint, though deeply felt by our contemporary Better-Nevers, is identical to Baudelaire’s perception about modern Paris in 1855, or Walter Benjamin’s about Berlin in 1930, or Marshall McLuhan’s in the face of three-channel television (and Canadian television, at that) in 1965. When department stores had Christmas windows with clockwork puppets, the world was going to pieces; when the city streets were filled with horse-drawn carriages running by bright-colored posters, you could no longer tell the real from the simulated; when people were listening to shellac 78s and looking at color newspaper supplements, the world had become a kaleidoscope of disassociated imagery; and when the broadcast air was filled with droning black-and-white images of men in suits reading news, all of life had become indistinguishable from your fantasies of it. It was Marx, not Steve Jobs, who said that the character of modern life is that everything falls apart.

We must, at some level, need this to be true, since we think it’s true about so many different kinds of things. We experience this sense of fracture so deeply that we ascribe it to machines that, viewed with retrospective detachment, don’t seem remotely capable of producing it. If all you have is a hammer, the saying goes, everything looks like a nail; and, if you think the world is broken, every machine looks like the hammer that broke it.

It is an intuition of this kind that moves the final school, the Ever-Wasers, when they consider the new digital age. A sense of vertiginous overload is the central experience of modernity, they say; at every moment, machines make new circuits for connection and circulation, as obvious-seeming as the postal service that let eighteenth-century scientists collaborate by mail, or as newfangled as the Wi-Fi connection that lets a sixteen-year-old in New York consult a tutor in Bangalore. Our new confusion is just the same old confusion.

Among Ever-Wasers, the Harvard historian Ann Blair may be the most ambitious. In her book “Too Much to Know: Managing Scholarly Information Before the Modern Age,” she makes the case that what we’re going through is like what others went through a very long while ago. Against the cartoon history of Shirky or Tooby, Blair argues that the sense of “information overload” was not the consequence of Gutenberg but already in place before printing began. She wants us to resist “trying to reduce the complex causal nexus behind the transition from Renaissance to Enlightenment to the impact of a technology or any particular set of ideas.” Anyway, the crucial revolution was not of print but of paper: “During the later Middle Ages a staggering growth in the production of manuscripts, facilitated by the use of paper, accompanied a great expansion of readers outside the monastic and scholastic contexts.” For that matter, our minds were altered less by books than by index slips. Activities that seem quite twenty-first century, she shows, began when people cut and pasted from one manuscript to another; made aggregated news in compendiums; passed around précis. “Early modern finding devices” were forced into existence: lists of authorities, lists of headings.

Everyone complained about what the new information technologies were doing to our minds. Everyone said that the flood of books produced a restless, fractured attention. Everyone complained that pamphlets and poems were breaking kids’ ability to concentrate, that big, good handmade books were ignored, swept aside by printed works that, as Erasmus said, “are foolish, ignorant, malignant, libelous, mad.” The reader consulting a card catalogue in a library was living a revolution as momentous, and as disorienting, as our own. The book index was the search engine of its era, and needed to be explained at length to puzzled researchers—as, for that matter, did the Hermione-like idea of “looking things up.” That uniquely evil and necessary thing, the comprehensive review of many different books on a related subject, with the necessary oversimplification of their ideas that it demanded, was already around in 1500, and already being accused of missing all the points. In the period when many of the big, classic books that we no longer have time to read were being written, the general complaint was that there wasn’t enough time to read big, classic books.

Blair’s and Pettegree’s work on the relation between minds and machines, and the combination of delight and despair we find in their collisions, leads you to a broader thought: at any given moment, our most complicated machine will be taken as a model of human intelligence, and whatever media kids favor will be identified as the cause of our stupidity. When there were automatic looms, the mind was like an automatic loom; and, since young people in the loom period liked novels, it was the cheap novel that was degrading our minds. When there were telephone exchanges, the mind was like a telephone exchange, and, in the same period, since the nickelodeon reigned, moving pictures were making us dumb. When mainframe computers arrived and television was what kids liked, the mind was like a mainframe and television was the engine of our idiocy. Some machine is always showing us Mind; some entertainment derived from the machine is always showing us Non-Mind.

Armed with such parallels, the Ever-Wasers smile condescendingly at the Better-Nevers and say, “Of course, some new machine is always ruining everything. We’ve all been here before.” But the Better-Nevers can say, in return, “What if the Internet is actually doing it?” The hypochondriac frets about this bump or that suspicious freckle and we laugh—but sooner or later one small bump, one jagged-edge freckle, will be the thing for certain. Worlds really do decline. “Oh, they always say that about the barbarians, but every generation has its barbarians, and every generation assimilates them,” one Roman reassured another when the Vandals were at the gates, and next thing you knew there wasn’t a hot bath or a good book for another thousand years.

And, if it was ever thus, how did it ever get to be thus in the first place? The digital world is new, and the real gains and losses of the Internet era are to be found not in altered neurons or empathy tests but in the small changes in mood, life, manners, feelings it creates—in the texture of the age. There is, for instance, a simple, spooky sense in which the Internet is just a loud and unlimited library in which we now live—as if one went to sleep every night in the college stacks, surrounded by pamphlets and polemics and possibilities. There is the sociology section, the science section, old sheet music and menus, and you can go to the periodicals room anytime and read old issues of the New Statesman. (And you can whisper loudly to a friend in the next carrel to get the hockey scores.) To see that that is so is at least to drain some of the melodrama from the subject. It is odd and new to be living in the library; but there isn’t anything odd and new about the library.

Yet surely having something wrapped right around your mind is different from having your mind wrapped tightly around something. What we live in is not the age of the extended mind but the age of the inverted self. The things that have usually lived in the darker recesses or mad corners of our mind—sexual obsessions and conspiracy theories, paranoid fixations and fetishes—are now out there: you click once and you can read about the Kennedy autopsy or the Nazi salute or hog-tied Swedish flight attendants. But things that were once external and subject to the social rules of caution and embarrassment—above all, our interactions with other people—are now easily internalized, made to feel like mere workings of the id left on its own. (I’ve felt this myself, writing anonymously on hockey forums: it is easy to say vile things about Gary Bettman, the commissioner of the N.H.L., with a feeling of glee rather than with a sober sense that what you’re saying should be tempered by a little truth and reflection.) Thus the limitless malice of Internet commenting: it’s not newly unleashed anger but what we all think in the first order, and have always in the past socially restrained if only thanks to the look on the listener’s face—the monstrous music that runs through our minds is now played out loud.

A social network is crucially different from a social circle, since the function of a social circle is to curb our appetites and of a network to extend them. Everything once inside is outside, a click away; much that used to be outside is inside, experienced in solitude. And so the peacefulness, the serenity that we feel away from the Internet, and which all the Better-Nevers rightly testify to, has less to do with being no longer harried by others than with being less oppressed by the force of your own inner life. Shut off your computer, and your self stops raging quite as much or quite as loud.

It is the wraparound presence, not the specific evils, of the machine that oppresses us. Simply reducing the machine’s presence will go a long way toward alleviating the disorder. Which points, in turn, to a dog-not-barking-in-the-nighttime detail that may be significant. In the Better-Never books, television isn’t scanted or ignored; it’s celebrated. When William Powers, in “Hamlet’s BlackBerry,” describes the deal his family makes to have an Unplugged Sunday, he tells us that the No Screens agreement doesn’t include television: “For us, television had always been a mostly communal experience, a way of coming together rather than pulling apart.” (“Can you please turn off your damn computer and come watch television with the rest of the family,” the dad now cries to the teen-ager.)

Yet everything that is said about the Internet’s destruction of “interiority” was said for decades about television, and just as loudly. Jerry Mander’s “Four Arguments for the Elimination of Television,” in the nineteen-seventies, turned on television’s addictive nature and its destruction of viewers’ inner lives; a little later, George Trow proposed that television produced the absence of context, the disintegration of the frame—the very things, in short, that the Internet is doing now. And Bill McKibben ended his book on television by comparing watching TV to watching ducks on a pond (advantage: ducks), in the same spirit in which Nicholas Carr leaves his computer screen to read “Walden.”

Now television is the harmless little fireplace over in the corner, where the family gathers to watch “Entourage.” TV isn’t just docile; it’s positively benevolent. This makes you think that what made television so evil back when it was evil was not its essence but its omnipresence. Once it is not everything, it can be merely something. The real demon in the machine is the tirelessness of the user. A meatless Monday has advantages over enforced vegetarianism, because it helps release the pressure on the food system without making undue demands on the eaters. In the same way, an unplugged Sunday is a better idea than turning off the Internet completely, since it demonstrates that we can get along just fine without the screens, if only for a day.

Hermione, stuck in the nineties, never did get her iPad, and will have to manage in the stacks. But perhaps the instrument of the new connected age was already in place in fantasy. For the Internet screen has always been like the palantír in Tolkien’s “Lord of the Rings”—the “seeing stone” that lets the wizards see the entire world. Its gift is great; the wizard can see it all. Its risk is real: evil things will register more vividly than the great mass of dull good. The peril isn’t that users lose their knowledge of the world. It’s that they can lose all sense of proportion. You can come to think that the armies of Mordor are not just vast and scary, which they are, but limitless and undefeatable, which they aren’t.

Thoughts are bigger than the things that deliver them. Our contraptions may shape our consciousness, but it is our consciousness that makes our credos, and we mostly live by those. Toast, as every breakfaster knows, isn’t really about the quality of the bread or how it’s sliced or even the toaster. For man cannot live by toast alone. It’s all about the butter. ♦

The Crazy Right Wing

by Paul Krugman

February 17, 2011, 1:13 pm
Agnotology
An interesting exchange between John Quiggin and Jonathan Chait on right-wing agnotology — that is, culturally induced ignorance or doubt. The specific issue is birtherism, the claim that Barack Obama was born in Kenya or anyway not in America, which polls indicate is a view held by a majority of Republican primary voters.

Quiggin suggests that right-wingers aren’t really birthers in their hearts; it’s just that affirming birtherism is a sort of badge of belonging, a shibboleth in the original biblical sense. Chait counters that much of the modern right lives in a mental universe in which liberal elites hide the truth, and in which they, through their access to Fox News etc., know things the brainwashed masses don’t.

My view is that Quiggin is right as far as right-wing politicians are concerned: for the most part they know that Obama was born here, that he isn’t a socialist, that there are no death panels, and so on, but feel compelled to pretend to be crazy as a career move. But I think Chait has it right on the broader movement.

I mean, I see it all the time on economic statistics: point out that inflation remains fairly low, that the Fed isn’t really printing money, whatever, and you get accusations that the data are being falsified, that you yourself are cherry-picking by using the same measures you’ve always used, whatever. There really is epistemic closure: if the facts don’t support certain prejudices, that’s because They are hiding the truth, which we true believers know.

And don’t get me started on climate change.

Wednesday, February 16, 2011

The Reagan Mythology

Op-Ed Columnist
Reagan and Reality
By BOB HERBERT
Published: February 14, 2011

The image that many, perhaps most, Americans have of the nation’s 40th president is largely manufactured. Reagan has become this larger-than-life figure who all but single-handedly won the cold war, planted the Republican Party’s tax-cut philosophy in the resistant soil of the liberal Democrats and is the touchstone for all things allegedly conservative, no matter how wacky or extreme.

Eugene Jarecki’s documentary, “Reagan,” does a first-rate job of respectfully separating the real from the mythical, the significant from the nonsense. The truth is that Ronald Reagan, at one time or another, was all over the political map. Early on, he was a liberal Democrat and admirer of Franklin Roosevelt. Reagan’s family received much-needed help from the New Deal during the Depression.

It is well known that Reagan was the head of the Screen Actors Guild. And though he was staunchly anti-Communist, he did not finger anyone when he appeared before the rabid House Un-American Activities Committee. But Mr. Jarecki learned that at the height of the Red Scare, Reagan had been secretly cooperating with the F.B.I. He was registered officially as Informant T-10.

No less than other public figures, Reagan was complicated. He was neither the empty suit that his greatest detractors would have you believe nor the conservative god of his most slavish admirers. He was a tax-cutter who raised taxes in seven of the eight years of his presidency. He was a budget-cutter who nearly tripled the federal budget deficit.

The biggest problem with Reagan, as we look back at his presidency in search of clues that might help us meet the challenges of today, is that he presented himself — and has since been presented by his admirers — as someone committed to the best interests of ordinary, hard-working Americans. Yet his economic policies, Reaganomics, dealt a body blow to that very constituency.

Mark Hertsgaard, the author of “On Bended Knee: The Press and the Reagan Presidency,” says in the film, “You cannot be fair in your historical evaluation of Ronald Reagan if you don’t look at the terrible damage his economic policies did to this country.”

Paul Volcker, who served as chairman of the Federal Reserve during most of the Reagan years, commented in the film about the economist Arthur Laffer’s famous curve, which, incredibly, became a cornerstone of national economic policy. “The Laffer Curve,” said Mr. Volcker, “was presented as an intellectual support for the idea that reducing taxes would produce more revenues, and that was, I think, considered by most people a pretty extreme interpretation of what would happen.”

Toward the end of his comment, the former Fed chairman chuckled as if still amused by the idea that this was ever taken seriously.

What we get with Reagan is a series of disconnects and contradictions that have led us to a situation in which a president widely hailed as a hero of the working class set in motion policies that have been mind-bogglingly beneficial to the wealthy and devastating to working people and the poor.

“It is important that we stop idolizing our public figures, lionizing them,” said Mr. Jarecki, in an interview. He views Reagan as a gifted individual and does not give short shrift in the film to Reagan’s successes in his dealings with the Soviet Union and other elements of what Mr. Jarecki called “the positive side of Ronald Reagan.” The film also has interviews with many Reagan stalwarts, including James Baker and George Shultz.

But when all is said and done, it is the economic revolution that gained steam during the Reagan years and is still squeezing the life out of the middle class and the poor that is Reagan’s most significant legacy. A phony version of that legacy is relentlessly promoted by right-wingers who shamelessly pursue the interests of the very rich while invoking the Reagan brand to give the impression that they are in fact the champions of ordinary people.

Reagan’s son, Ron, says in the film that he believes his father “was vulnerable to the idea that poor people were somehow poor because it was their fault.” A clip is then shown of Ronald Reagan referring to “the homeless who are homeless, you might say, by choice.”

“Reagan,” an HBO documentary, will be shown on Presidents’ Day to U.S. military personnel on the American Forces Network. It will soon be available in theaters and on home video. It is an important corrective to the fantasy of Reagan that has gotten such a purchase on American consciousness.

On SS and Medicare

--------------------------------------------------------------------------------
February 16, 2011, 9:54 am
There Is Still No Such Thing As Socialsecuritymedicareandmedicaid
by Paul Krugman
And President Obama, I’m glad to see, knows that:

The truth is Social Security is not the huge contributor to the deficit that the other two entitlements are. I’m confident we can get Social Security done in the same way that Ronald Reagan and Tip O’Neill were able to get it done, by parties coming together, making some modest adjustments. I think we can avoid slashing benefits, and I think we can make it stable and stronger for not only this generation but for the next generation.

It’s also important to realize that the conceptual issues are very different for Social Security than they are for M&M. For SS, we decide the level of benefits; for M&M, we can’t do that, because health costs for any individual are unpredictable; so cost-savings on the health-care programs essentially involve deciding what we’ll pay for rather than how much we’ll pay. (Death panels!)

I like the way Jonathan Bernstein puts it:

In my view, those who are upset about the long-term federal budget deficit should talk about it in terms of what it is, health care costs. Just as the phrase “weapons of mass destruction” encourages sloppy thinking (because nuclear weapons are not really similar at all to chemical and biological weapons in lots of important ways), talking about “entitlements” confuses the budget situation. I could see “Medicare and Medicaid” or, perhaps, “government health programs,” but not entitlements.

Tuesday, February 15, 2011

Bogart

Books of The Times
Talent Is What Made Him Dangerous
By MICHIKO KAKUTANI
Published: February 14, 2011

He was the quintessential American hero — loyal, unsentimental, plain-spoken. An idealist wary of causes and ideology. A romantic who hid his deeper feelings beneath a tough veneer. A renegade who subscribed to an unshakeable code of honor.


Humphrey Bogart starred in “The Maltese Falcon” in 1941.

TOUGH WITHOUT A GUN

The Life and Extraordinary Afterlife of Humphrey Bogart

By Stefan Kanfer

288 pages. Alfred A. Knopf. $26.95.


He was cool before cool became cool.

He looked good in a trench coat or a dinner jacket, and often had a cigarette in one hand or a glass of Scotch in the other. He could even make a bow tie or a fedora look hot. For legions of fans, he was a mensch in the fickle world of Hollywood — a man of his word and a consummate pro. Whatever he did he did well and with a minimum of fuss: no fancy words, no intellectual pretensions, just simple grace under pressure. The French tried to claim him as an existentialist and others described him as an old-fashioned stoic, but he would have dismissed such labeling with a sardonic wisecrack.

“He was a man who tried very hard to be Bad because he knew it was easier to get along in the world that way,” Peter Bogdanovich said. “He always failed because of an innate goodness which surely nauseated him.”

Katharine Hepburn said: “He walked straight down the center of the road. No maybes. Yes or no.”

Raymond Chandler said he could “be tough without a gun.”

More than a half-century after his death, Humphrey Bogart remains an iconic star and an enduring symbol — celebrated at college film festivals and revival movie theaters, and immortalized on a United States postage stamp. The American Film Institute named him the country’s greatest male screen star (Hepburn was its No. 1 actress), and Entertainment Weekly named him the top movie legend of all time.

Jean-Luc Godard and Woody Allen have paid tribute to him in movies, and so has Bugs Bunny. He’s been written about perceptively by critics like Kenneth Tynan and Richard Schickel. And he’s been the focus of memoirs by his wife Lauren Bacall and his son Stephen, as well as the subject of a wide array of biographies including a very well done 1997 volume by A. M. Sperber and Eric Lax.

So why another book on Bogart? Having written durable biographies of Groucho Marx and Marlon Brando, the critic Stefan Kanfer seems to have wanted to add Bogart to his portfolio of American originals. He doesn’t really contribute anything significantly new to the record, recycling lots of well-known, much-recounted stories about the actor’s life and work, and his effort to frame those stories by looking at the Bogart legend and its enduring power feels a little contrived: by now, dissections of Bogie’s on-screen and off-screen personas and his almost mythic aura are highly familiar too.

Still, for readers who simply can’t get enough of Bogart (or members of younger generations who have been dwelling in an Internet echo chamber somewhere), this is a perfectly engaging book. It does an evocative job of conveying Bogart’s uncommon and enduring mystique, and it gives the reader a palpable sense of the sadly truncated arc of his life.

Mr. Kanfer briskly sketches in Bogart’s upper-class upbringing in New York as the son of a prominent physician, who became addicted to morphine, and a well-known illustrator, who was an ardent feminist. Young Humphrey, we’re reminded, was a rebellious, alienated adolescent — think a World War I-era Holden Caulfield — who bounced from one private school to another, eventually getting thrown out of Phillips Andover because, Mr. Kanfer writes, “his grades had fallen so precipitously,” not, as Andover legend has it, because he “had thrown grapefruits through the headmaster’s window.”

After more or less stumbling into an acting career, Bogart served a lengthy apprenticeship, which, for all its frustrations, helped fine-tune his craft. From early roles on Broadway playing juvenile, “tennis anyone?” types, he made his way to a succession of gangster roles in Hollywood. By one account, his first 45 movies had him getting hanged or electrocuted eight times, sentenced to life imprisonment nine times, and cut down by bullets a dozen times.

Bogart’s big break came when George Raft turned down starring roles in “High Sierra,” and then “The Maltese Falcon,” and from there, it was on to the movies with which he has become synonymous, including “Casablanca,” “To Have and Have Not,” “The Big Sleep” and “The Treasure of the Sierra Madre.”

Of “Casablanca,” Mr. Kanfer writes: “It was, and would remain, a Humphrey Bogart movie because he was the one who furnished the work with a moral center. There was no other player who could have so credibly inhabited the role of Rick Blaine, expatriate, misanthrope, habitual drinker, and, ultimately, the most self-sacrificing, most romantic Hollywood hero of the war years. To watch him in this extraordinary feature was not only to see a character rise to the occasion. It was to see a performer mature, to become the kind of man American males yearned to be. When Humphrey Bogart started filming ‘Casablanca’ on May 25, 1942, he was a star without stature; when he finished, on August 1, he was the most important American film actor of his time and place.”

Although many readers might wish that Mr. Kanfer had spent more time explicating Bogart’s major work and less time plodding through a chronicle of his lesser films — his Brando biography suffered from a similar flaw — this volume nonetheless provides a conscientious chronicle of its subject’s evolution as a performer, and it leaves the reader with a haunting picture of Bogart’s brave struggle with esophageal cancer in the last year of his life.

In his eulogy of Bogart, who died at 57 in 1957, John Huston described the fountains of Versailles, where a sharp-toothed pike kept the carp active so that they never got fat and complacent. “Bogie,” he said, “took rare delight in performing a similar duty in the fountains of Hollywood,” adding, “he was endowed with the greatest gift a man can have: talent. The whole world came to recognize it. . . . We have no reason to feel sorry for him — only for ourselves for having lost him. He is quite irreplaceable.”

The Farce of Constitutional Conservatism

The Short, Happy Life Of Constitutional Conservatism
Jonathan Chait


This is a bit of a legislative stunt, but it's a revealing one:

More remarkable was the House vote on a motion offered by the Democrats, which sought to recommit the bill with instructions to add language ensuring that surveillances would only be conducted in compliance with the U.S. Constitution.
That motion lost on a 186-234 vote.
All 234 "no" votes came from Republicans, including two dozen members who minutes later would vote against extension of the surveillance authorities.
Remember "Constitutional conservatism"? I'll let Charles Krauthammer refresh your memory:

What originalism is to jurisprudence, constitutionalism is to governance: a call for restraint rooted in constitutional text. Constitutionalism as a political philosophy represents a reformed, self-regulating conservatism that bases its call for minimalist government - for reining in the willfulness of presidents and legislatures - in the words and meaning of the Constitution. ...

Some liberals are already disdaining the new constitutionalism, denigrating the document's relevance and sneering at its public recitation. They sneer at their political peril. In choosing to focus on a majestic document that bears both study and recitation, the reformed conservatism of the Obama era has found itself not just a symbol but an anchor.

Right, this is total crap. The whole theory is nothing more than a slogan justifying uninhibited right-wing judicial activism. If you're not even willing to approve a rote formulation requiring compliance with the Constitution, you don't care about the Constitution at all except as a weapon to advance your agenda.

Sunday, February 13, 2011

What Lincoln Meant to the Slaves

February 12, 2011, 10:39 pm
What Lincoln Meant to the Slaves
By STEVEN HAHN

Disunion follows the Civil War as it unfolded.



The enormous excitement and anticipation of the 1860 presidential election campaign spread into unexpected corners of the United States. Indeed, during the months surrounding the contest, and especially after Americans learned of Abraham Lincoln’s victory, reports circulated across the Southern states of political attentiveness and restlessness among the slaves.

Southern newspapers noted the slaves’ attraction to “every political speech” and their disposition to “linger around” the hustings or courthouse square “and hear what the orators had to say.” But even more significantly, witnesses told of elevated hopes and expectations among the slaves that Lincoln intended “to set them all free.” And once Lincoln assumed office and fighting erupted between the Union and Confederacy, hopes and expectations seemed to inspire actions. Slaves’ response to the election of 1860 and their ideas about Lincoln’s intentions suggest that they, too, were important actors in the country’s drama of secession and war, and that they may have had an unappreciated influence on its outcome.

Scholars and the interested public have long debated Lincoln’s views on slavery and how they influenced his policies as president. How committed was he to abolition? What was he prepared to do? Could he imagine a world in which white and black people lived together in peace and freedom? For many slaves, at least at first, the answer was clear: Lincoln’s election meant emancipation.

On one Virginia plantation, a group of slaves celebrated Lincoln’s inauguration by proclaiming their freedom and marching off their owner’s estate. In Alabama, some slaves had come to believe that “Lincoln is soon going to free them all,” and had begun “making preparations to aid him when he makes his appearance,” according to local whites. A runaway slave in Louisiana told his captors in late May 1861 that “the North was fighting for the Negroes now and that he was as free as his master.” Shortly thereafter, a nearby planter conceded that “the Negroes have gotten a confused idea of Lincoln’s Congress meeting and of the war; they think it is all to help them and they expected for ‘something to turn up.’”

The slaves, of course, had no civil or political standing in American society on the eve of the Civil War; they were chattel property subject to the power and domination of their owners, and effectively “outside” formal politics. But they were unwilling to accept their assignment to political oblivion. Relying on scattered literacy, limited mobility and communication networks they constructed over many years, slaves had been learning important lessons about the political history of the United States and Western Hemisphere. They heard about the Haitian Revolution and the abolition of slavery in the British West Indies; they knew of a developing antislavery movement in the Northern states and of slaves escaping there; and they heard of a new Republican Party, apparently committed to ending their captivity.

Some slaves discovered that John C. Fremont was the first Republican candidate for president in 1856 and, like William Webb, a slave in Kentucky and Mississippi, held clandestine meetings to consider what might come of it. But it was Lincoln, four years later, who riveted their imaginations. Even as a candidate he was the topic of news and debate on countless plantations. In the view of one slaveholder, slaves simply “know too much about Lincoln . . . for our own safety and peace of mind.” News spread quickly, recalled Booker T. Washington, who grew up in western Virginia: “During the campaign when Lincoln was first a candidate for the presidency, the slaves on our far-off plantation, miles from any railroad or large city or daily newspaper, knew what the issues involved were.”


Of course, the slaves’ expectations that Lincoln and the Republicans were intent on abolishing slavery were for the most part misplaced. Lincoln’s policy in 1860 and 1861 was to restrict the expansion of slavery into the federal territories of the West but also to concede that slavery in the states was a local institution, beyond the reach of the federal government. At the very time that slaves were imagining Lincoln as their ally, Lincoln was assuring slaveholders that he would uphold the Constitution and the Fugitive Slave Law and make no moves against them and their property.

Yet slaves were fortified in their beliefs by the dire predictions many slaveholders were making and by the secessionist movement that led to the creation of the Confederacy. They knew as well as anyone else in the country that the likelihood of civil war was growing and, by sharing information and interpreting the course of political events, they readied themselves to act – not only to escape their bonds, but to do their part to make the war about their freedom, whether the North wanted it that way or not.

Thus the case of Harry Jarvis. Born a slave on the eastern shore of Virginia, Jarvis took to the woods for several weeks after the Civil War began, where he survived owing to fellow slaves who brought him news and food. Then, seizing an opportunity, Jarvis headed to Fort Monroe, 35 miles away, where Union troops were stationed, and asked commanding General Benjamin Butler “to let me enlist.” Although Butler rebuffed Jarvis and told him “it wasn’t a black man’s war,” Jarvis stood his political ground: “I told him it would be a black man’s war before they got through.”


A wood engraving of “contraband” slaves escaping to Fort Monroe, Va.
Like many other politicized slaves, Jarvis seems to have understood the stakes of the Civil War far better than the combatants themselves. And by testing their expectations, they began to reshape federal policy. By the time of the first Battle of Bull Run, General Butler had declared fugitive slaves within Union lines to be “contrabands of war,” and Congress soon confirmed the policy. Before too much longer, as Northern armies moved into the densely populated slave plantation districts of South Carolina and the lower Mississippi Valley, slaves crossed the Northern lines by the thousands, at once depriving the Confederacy of needed labor and forcing the Lincoln administration to reevaluate its position on slavery.

By the early fall of 1862, Lincoln had decided to issue an Emancipation Proclamation and enroll African Americans in the Union Army and Navy. Bold initiatives these were, revolutionary in effect, and wholly unimagined when the war began: except by the slaves whose actions helped bring them about. Lincoln’s political sensibilities had finally caught up to theirs.



--------------------------------------------------------------------------------