Sunday, January 30, 2011

Literary Profiling

The Perils of Literary Profiling
By GEOFF NICHOLSON
Published: January 28, 2011

It’s probably time to update the list on my Facebook profile for the books I “like.” If you think that “liking” a book is a fairly nebulous and meaningless concept, you’ll get no argument from me. I made the list a couple of years back and jotted down the first few titles that came into my head (“Gravity’s Rainbow,” “The Big Sleep,” “More Pricks Than Kicks” and “The Anatomy of Melancholy,” if you must know). They weren’t selected entirely at random — they’re all books I think are great — but I didn’t spend much time pondering the selection, and on another day I might well have chosen four completely different titles.

Looking at the list now, however, I can see that it contains elements of the pretentious littérateur and the moody loner, both of which are obviously to be avoided. And if the horror of the Arizona shooting has taught us anything, it’s that some place a high value on what can be gleaned from a man’s reading habits, whether actual or simply professed. I have no idea if Jared Lee Loughner was really a great reader of Plato, Lewis Carroll, “The Will to Power” or “The Communist Manifesto,” as he claimed, but he wanted the world to think he was. And perhaps you really can judge a man by the books he displays on his bookshelf (or keeps on his Kindle).

In which case I pray that no F.B.I. agent, criminal profiler or (worst of all) news pundit ever gets a look at my bookshelves. There, alongside Swift, Plato, Lewis Carroll and Marx, you’d find the Marquis de Sade, Mickey Spillane, Hitler and Ann Coulter. Books are acquired for all kinds of reasons, including curiosity, irony, guilty pleasure and the desire to understand the enemy (not to mention free review copies), but you try telling that to a G-man. It seems perfectly obvious to me that owning a copy of “Mein Kampf” doesn’t mean you’re a Nazi, but then I would say that, wouldn’t I?

Thanks to Timothy W. Ryback’s “Hitler’s Private Library,” we now know that Hitler read “Don Quixote,” “Uncle Tom’s Cabin” and “Gulliver’s Travels,” and considered them “among the great works of world literature,” in Ryback’s words. This is problematic enough, since a taste for great literature is supposed to make us more humane and empathetic, isn’t it? But then Ryback tells us that Hitler had “mastered” the writings of Karl May, an ultraprolific German author of cowboy novels featuring the characters Old Surehand and Old Shatterhand. Hitler’s fondness for tales of the Old West may mean something, but if you’re trying to understand Hitler there are more obvious places to start.

Reading habits would seem to be relevant enough to someone’s biography, especially if that person is a writer. In his study “Built of Books,” Thomas Wright attempts to reconstruct the contents of Oscar Wilde’s library, which was dispersed and auctioned off between his arrest and his imprisonment. It’s worth knowing (though hardly surprising) that Wilde read the Greek and Roman classics, Shakespeare, Hegel and some French pornography, but difficulties arise when Wright tries, in a self-described moment of “quixotic madness,” to make the book a partial autobiography, in the belief that if he read everything Wilde had read, Wilde would become a “Socratic mentor, who would help me give birth to a new self.”

Forging a deep link between criminals and their books can be even more quixotic. Ed Sanders, in “The Family,” tells us that one of Charles Manson’s favorite books was Robert Heinlein’s “Stranger in a Strange Land,” but we’re also told that Manson was barely literate. Both John Hinckley Jr. (Reagan’s would-be assassin) and Mark David Chapman (the murderer of John Lennon) have been connected to “The Catcher in the Rye,” Hinckley by having a copy in his hotel room, Chapman by calmly reading the book outside the Dakota apartment building while waiting for the police to arrive after he shot Lennon. But it’s hardly surprising that a book that has sold well over 35 million copies has occasionally fallen into the hands of criminals.

The psychiatrist Fredric Wertham got things about as wrong as can be when he argued, in “Seduction of the Innocent” (1954), that the fact that prisoners and juvenile delinquents read “crime comics” meant that comics were causing, or at least stoking, their criminal tendencies. Current evidence suggests that if criminals read at all — and let’s not forget how many prisoners are functionally illiterate — then they read much the same books as the rest of us, business and self-help books included. This, anyway, was the conclusion reached by Glennor Shirley, who in 2003 conducted a survey of prison librarians for the American Library Association and learned that prisoners’ favorite writers included Stephen King, Robert Ludlum, Danielle Steel and “all authors of westerns.” Admittedly, she is referring to their reading habits in prison rather than outside. (Here I might add that when I learned one of my short stories had appeared in an anthology used in a prison literacy program, I was relieved to know I was part of the solution, not part of the problem.)

The self-help nature of some prison reading can be disturbing. Scandal arose in Britain two years ago when it was revealed that prisoners were being allowed to read “inappropriate” books, including the memoirs of other, more successful criminals, stories of prison escapes and, inevitably, “Mein Kampf.” (More cheeringly, Avi Steinberg, the author of the recent memoir “Running the Books: The Adventures of an Accidental Prison Librarian,” reports a vogue for Anne Frank’s diary among some female prisoners he worked with.)

For my part, I can say that the presence of Mickey Spillane’s books on my shelves hasn’t persuaded me that I need to start packing heat, nor has ownership of an Ann Coulter volume moved me to denounce Arabs. Such explorations of the dark side notwithstanding, most of us read books that reinforce the opinions and tendencies we already have. And yet and yet . . . the fact is, books really do have the power to influence and change people. That’s why some of us like them so much. A generation of European students (not to mention Russian revolutionaries) read “The Communist Manifesto” and thought it made a great deal of sense. Right now somebody is reading Jack Kerouac and deciding to go on the road, or reading William Burroughs and thinking it might be a lark to become a gentleman junkie.

So, if you actually did examine my bookshelves you could probably reach some reasonably accurate conclusions about my age, class, nationality, sexuality and so on. You would see that I’m not some dangerous, volatile, politically extreme nut job. Rather, you would decide that I’m a bookish, cosmopolitan sophisticate, with broad, quirky and unpredictable interests, a taste for literary experimentation, a sense of history, a serious man with a sense of humor and a wide range of sympathies. At any rate, that’s what I’d like you to think.

Douglas R. Egerton - Year of Meteors

We are into the 150th anniversary of the beginning of the Civil War. I plan to read in this important area of American history in the coming months.

This book is about the crucial 1860 presidential election. The Democrats split into two factions, North and South, over slavery, ensuring a Republican victory. Senator Seward of New York was the odds-on favorite to win the nomination of the new Republican Party, but the largely unknown Abraham Lincoln won it instead, partly because of doubts about whether Seward, with his progressive stance on slavery, could carry the border states, and partly because Lincoln had a shrewd promoter in the person of his friend David Davis. It didn't hurt that the Republican convention was held in Chicago, which allowed Davis to pack the Wigwam (where the convention was held) with Lincoln supporters.

The author makes it clear that secession was inevitable and that there was nothing that Lincoln or anybody else could have done to stop it. The "fire-eaters," as Egerton calls the rabid secessionists, were determined to lead their states out of the Union regardless. No compromise was possible with these people. I did not realize that one of the key leaders of the fire-eaters was William Lowndes Yancey of Alabama.

Stephen Douglas ignited the final drive toward disunion with his popular sovereignty doctrine for Kansas/Nebraska. Douglas was a big race-baiter, no doubt about it. But Egerton points out how Douglas spent his final days supporting Lincoln and doing everything he could, in vain as it turned out, to save the Union and prevent war.

Mr. Lincoln resolutely refused to back off from his position, and the position of his new party, that slavery should be confined to where it already existed and should not be allowed to spread into the territories.

One side truly believed in slavery, that it was the natural and divine order of things, and the other side believed that slavery was wrong. That was the essence of the rub, and so war was inevitable.

The author analyzes the election returns and concludes that there was no way Lincoln, as the Republican nominee, could have lost the 1860 presidential election. This was just what the secessionists wanted, for their one goal was to split from the Union and form a new government that protected slavery.

Saturday, January 29, 2011

Anthony Hopkins

I see the movie "The Rite." Anthony Hopkins is the best actor working today. I believe his every character. He's the only actor I can think of whose movie I would see just because he's in it.

Friday, January 28, 2011

In Defense of Psychoanalysis

Freud’s ideas have become part of the fabric of everyday life—yet his methods are going out of favour. Robert Rowland Smith argues that the professionals have got it wrong ...

From INTELLIGENT LIFE Magazine, Winter 2010

It is just over a century since psychoanalysis was first recognised as a science. In 1909 Sigmund Freud gave five lectures at Clark University in Massachusetts that surveyed and explained the fledgling discipline’s achievements to that point—the interpretation of dreams, the analysis of hysteria, the meaning behind jokes, the reasons we make stupid mistakes. Key to them all was the operation of the unconscious, the back-seat driver whispering to us to behave in ways we’d officially disown.

Later, Freud was to remark that his discovery amounted to a third and final nail in the coffin of human pride. The first was Copernicus’s bubble-bursting calculation that the Earth orbits the sun, thus displacing mankind from its central position in the universe. Second came Darwin’s finding that rather than being God’s special creature, descended from Adam and Eve, man was a monkey. And now Freud’s own postulation of an unconscious implied that we were strangers even to ourselves.

In adding to this demoralising ledger of human limits, however, Freud had unlocked a hitherto concealed dimension. Formerly obscure or ignored parts of the mental map now had a legend, and psychoanalysis established itself as the compass by which the terra incognita could be navigated. Before long the unconscious had slipped off the couch and entered the lingua franca, and today it’s virtually impossible to talk about human behaviour without drawing more or less explicitly on Freud’s lexicon. Not only do we speak readily about “unconscious” motivation, but we’ll happily deploy fancy psychoanalytic concepts like “being in denial” in the most ordinary conversations.

Yet for all its seepage into everyday life, psychoanalysis finds itself routinely denounced, even by those in its intellectual debt. Setting aside the practical objections (becoming an analysand involves five sessions a week, at perhaps £70 per session, over many years), psychoanalysis, they say, reduces everything to sex. Worse, it does so in a form that looks misogynistic. As for its being a science, that’s laughable: believing that a fireside chat with a patient about their childhood can disclose the deep structure of the psyche is plain arrogant. Not to mention the potential for planting thoughts in the patient’s mind which happen to prove the theory you set out with.

So it’s not surprising that in the face of these perceived flaws psychoanalysis’s therapeutic rival, Cognitive Behavioural Therapy (CBT), has gained ground. Although both approaches pursue the same outcome—happy patients—the underlying method couldn’t be more different. Where psychoanalysis sifts the inner self to shift the outer, CBT adjusts external behaviour to ameliorate the internal state. Psychoanalysis gets to the root cause, often lying in one’s early years, where CBT focuses on the presenting issue. CBT is much more short-term, usually limited to about 30 sessions; doesn’t talk about erotic life unless it comes up; and generally takes an empirical approach that’s easily associated with the scientific. And where psychoanalysis leaves patients haplessly to work through their own psychic detritus, CBT sets homework.

The cause of CBT has also been served by the wider health system, in which all activity is now measured to within an inch of its life, targets become paramount, practitioners get held mercilessly to account, and patients transmogrify into customers demanding accessibility. In this transparent and shadowless world, CBT provides the comforting illusion that the lugubrious terrain of mental health can yield to instant illumination under a striplight. And because it positions itself among the empirical sciences, it enjoys an affinity with pharmacologically oriented psychiatry in which symptoms, should they fail to be dissolved by therapy, can be handily lined up with drugs. Needless to say, this is a system that plays well with the pharmaceutical giants.

The irony is that in becoming more “scientific”, CBT becomes less therapeutic. Now, Freud himself liked to be thought of as a scientist (he began his career in neurology, working on the spinal ganglia), but it’s the non-scientific features that make psychoanalysis the more, not the less, powerful. I’m referring to the therapeutic relationship itself. Although like psychoanalysis largely a talking cure, CBT prefers to set aside the emotions in play between doctor and patient. Psychoanalysis does the reverse. To the annoyance no doubt of many a psychoanalytic patient, the very interaction between the two becomes the subject-matter of the therapy.

This emotional muddling between analyst and patient is known in the trade as “transference”, and it’s important because it’s the way most of our relationships play out in the real world—as ambiguously defined contracts. This isn’t to say the analyst is short of techniques for managing that muddle, but it is to say that there’s no naively “clinical” position to be assumed. The consulting room thus transforms itself into a laboratory in which patients can learn about their impact on someone else in real time, and thus grow in self-awareness—which is the prerequisite for self-improvement.

The respected therapist and writer Irvin Yalom, among others, argues that depression and associated forms of sadness stem from an inability to make good contact with others. Relationships are fundamental to happiness. And so a science that has the courage to include the doctor’s relationship with the patient within the treatment itself, and to work with it, is a science already modelling the solution it prescribes. What psychoanalysis loses in scientific stature, it gains in humanity.

Dangerous Game

The Latest Budget Numbers Look Bad. But The Truth Is Even Worse
by William Galston, from The New Republic

I’ve just spent a snowed-in day plowing through the Congressional Budget Office’s latest ten-year budget and economic outlook. The short-term outlook is grim enough, with an estimated deficit of $1.5 trillion—a new record, and the third consecutive 13-figure result. As for the long-term outlook, it’s not as bad as you’ve read; it’s worse.

Here’s why the headlines understate the gravity of our situation. CBO is required to use current law as the basis for its estimates—to assume, for example, that all the Bush tax cuts will expire at the end of 2012, that Medicare payments to physicians will be cut sharply, and that the alternative minimum tax will be allowed to affect millions more Americans. Using these assumptions, under which taxes as a share of GDP would be allowed to increase by five percentage points by 2014 and keep rising thereafter, we’d have a cumulative deficit of about $7 trillion over the next decade, and debt held by the public would increase from 62 percent to 77 percent of GDP. Using more politically realistic assumptions, the cumulative deficit would be about $12 trillion, and debt held by the public would reach 97 percent of GDP, the highest level since 1946 (when it was headed down, not up).

I don’t know many economists, liberal or conservative, who view this prospect with equanimity. The CBO certainly doesn’t; its report states that “Although deficits during or shortly after a recession generally hasten economic recovery, persistent deficits and continually mounting debt would have ... negative economic consequences for the United States”—among them, reduced investment, output, and incomes; less room for maneuver when the next economic crisis erupts; and, worst of all, a higher probability that investors would eventually lose confidence in our country’s creditworthiness and demand much higher interest rates. While no one can predict when that “tipping point” might occur, the report notes that as the global economic recovery gathers strength, investors will be less inclined to purchase U.S. government debt as a safe haven and will focus instead on its rising risks.

What should we do? Answer: Use the tax compromise struck during the lame-duck session of the 111th Congress as a two-year window to get our house in order. By the time that legislation expires, we should agree on a long-term plan that halts and then reverses our downward fiscal plunge. How should we do it? A number of high-level commissions have come to roughly the same conclusion: We need a grand bargain that deals structurally with both spending and our outdated tax code. Can we do it? Yes, because we’ve done it before—in the 1990s, to be precise, starting with the bipartisan budget deal that may have cost George H. W. Bush a second presidential term, continuing with Bill Clinton’s brave 1993 austerity budget, and concluding with an agreement between the Clinton administration and the Gingrich-led House Republicans.

Let’s look at a snapshot of the eight Clinton years (all numbers expressed as a percentage of GDP).

                          1993      2000
Total spending            21.4      18.2
Domestic discretionary     3.4       3.0
Deficit                    3.9      -2.4 (surplus)
Debt held by the public   49.3      34.7

During the Clinton years, spending of all kinds declined as a share of GDP, deficits turned into surpluses, and debt held by the public barely budged: $3.2 trillion at the beginning, $3.4 trillion at the end. During the Bush years, by contrast, every form of spending increased substantially, surpluses turned into deficits, and debt held by the public almost doubled, from $3.3 trillion to $5.8 trillion.

The experience of the Clinton years contradicts a standard Republican talking point: increased revenues aren’t always spent. The Clinton administration used the proceeds from a higher top marginal tax rate to reduce the deficit. And those rates didn’t exactly suppress economic growth or job generation either.

I wish I could say that today’s elected officials are as serious about our fiscal future as Clinton was. Republicans are proposing damaging cuts in domestic discretionary spending (a small part of the total budget) while ducking far more important drivers of the long-term deficit problem—such as entitlements. Democrats seem to be hoping that they can score tactical political points by hanging back and forcing the Republicans to put specifics on the table. And although the State of the Union address left the door open to a serious discussion, not even the president’s supporters are claiming that it etched a profile in fiscal courage. (As intended, his investment message had far more resonance.) Senator Kent Conrad, the Democratic chair of the Senate Budget Committee, is publicly lamenting the president’s unwillingness to join in a budget summit with the bipartisan congressional leadership.

Memo to Republicans: You’re rightly critical of George W. Bush’s fiscal performance. But there is no evidence—none—that you can get the deficit and debt under control with your preferred combination of spending cuts and tax cuts. Have you noticed that Paul Ryan’s famous Roadmap allows the national debt to reach 100 percent of GDP? Do you care about facts?

Memo to Democrats: Denouncing the proposal offered by the president’s commission as a “cat food” budget for the elderly is a political talking-point, not a serious argument. Is Dick Durbin no longer liberal enough for you? Have you forgotten that fiscal restraint and full employment were partners, not adversaries, little more than a decade ago?

Memo to Obama: During your 2008 campaign, you said that the president has to be able to walk and chew gum at the same time. You were absolutely right. You can talk, as you should, about vital public investments and take the lead, as you must, to head off a fiscal train wreck.

During the next two years, we have to do what Clinton did, and more. We have to restrain discretionary spending (defense as well as domestic) and boost revenues. And because what was only a cloud on the horizon in the 1990s—the retirement of the baby boom generation—is now a swelling reality, we have to do what Clinton didn’t: Namely, find a way to reduce the anticipated increases in entitlement spending without imposing hardship on the lower-income elderly who depend on these programs for a decent and dignified old age.

Sunday, January 23, 2011

True Grit by Charles Portis

I like this book because I like the gritty, lawless environment, which reminds me of Cormac McCarthy. Mattie is determined, and she evolves from a girl simply wanting to avenge her father's death into a force who possesses the true grit she seeks.

I wish that Portis gave more strength to Mattie, as the Coen brothers do. The Mattie of the book is slightly weaker and less confident. The Mattie of the film is more appealing to me.

The novel is simple and straightforward. It plays no tricks. It tells the story, with no frills.

Award Nominees

HILLEL ITALIE 01/22/11 07:35 PM

NEW YORK — Jonathan Franzen is back in the awards circle.

Franzen's "Freedom," among last year's most highly praised novels, is a finalist for the National Book Critics Circle awards. Franzen had been bypassed for the National Book Awards, judged by fellow authors, but was an obvious choice for a prize voted on by reviewers, many of whom placed "Freedom" on their annual best-of lists. None of the fiction nominees for the National Book Award, including winner Jaimy Gordon's "Lord of Misrule," was chosen for the critics circle prize.

The 31 nominees in six competitive categories (autobiography has six finalists) announced Saturday were an international blend of popular authors such as Franzen, Christopher Hitchens and Patti Smith and the kind of lesser-known picks critics pride themselves on, such as the 101-year-old German-Dutch novelist Hans Keilson, cited for the acclaimed "Comedy in a Minor Key." Finalists were published by Random House Inc., Simon & Schuster and other major New York houses, and by McSweeney's, Graywolf Press and the Feminist Press.

Winners will be announced March 10. There are no cash prizes.

Nominees also included Jennifer Egan's novel "A Visit From the Goon Squad," Isabel Wilkerson's history "The Warmth of Other Suns" and memoirs by Hitchens ("Hitch-22") and Smith, whose "Just Kids" won an NBA for nonfiction. Books written in foreign languages but available in English translation also are eligible, so fiction finalists besides Egan and Franzen included Keilson and Israel's David Grossman for "To the End of the Land." The fifth nominee was Irish novelist Paul Murray for "Skippy Dies."

The nonfiction choices were Wilkerson, S.C. Gwynne's "Empire of the Summer Moon," Jennifer Homans' ballet history "Apollo's Angels," Barbara Demick's "Nothing to Envy" and Siddhartha Mukherjee's "The Emperor of All Maladies."

Subjects in the biography category included Somerset Maugham, Crazy Horse and Charlie Chan. The finalists were Sarah Bakewell's "How to Live, or a Life of Montaigne"; Selina Hastings' "The Secret Lives of Somerset Maugham"; Yunte Huang's "Charlie Chan"; Thomas Powers' "The Killing of Crazy Horse"; and Tom Segev's "Simon Wiesenthal."

Besides Hitchens and Smith, autobiography nominees were Darin Strauss' "Half a Life"; David Dow's "The Autobiography of an Execution"; Rahna Reiko Rizzuto's "Hiroshima in the Morning"; and Kai Bird's "Crossing Mandelbaum Gate."

Former U.S. poet laureate Kay Ryan ("The Best of It") and prize-winning poet-translator Anne Carson ("Nox") were poetry finalists, along with Kathleen Graber's "The Eternal City," Terrance Hayes' "Lighthead" and C.D. Wright's "One with Others."

Edmund Morris - Colonel Roosevelt (2)

I finish the third and concluding volume of Morris's fabulous biography of Theodore Roosevelt. So many biographies of TR have been written over the decades. This is the best comprehensive recapitulation of his life for our time.

This volume traces TR's life after he left the Presidency in March of 1909. He preferred to be called "Colonel," a title from his San Juan Hill charge, rather than "Mr. President," as is the custom for former presidents today.

TR went on a massive safari with his son Kermit shortly after leaving the White House. He killed dozens of "specimens" for scientific purposes. The record of this animal slaughter is nauseating to the modern sensibility. I do not understand how TR or anybody else could slaughter innocent animals in Africa.

He and his son Kermit went on a dangerous, months-long journey down a river called the River of Doubt in South America in 1913/1914. TR and his party were literally out of touch with the outside world for months, something incomprehensible to the modern sensibility. Roosevelt almost died in South America. At one point, racked with malaria, he begged his party to leave him to die. The journey weakened him considerably and most likely contributed to his demise at the age of 60.

TR maintained his absorbing interest in politics. The journey in South America came after he bolted the GOP in 1912 to mount the most successful third-party bid for the presidency. This split the Republican Party and led to the election of Woodrow Wilson.

TR was an early proponent of military preparedness, long before America entered World War I. He glorified war and seriously wanted to lead a volunteer force in Europe, but he was sensibly turned down by Wilson.

Just months before he died, TR lost his youngest son, Quentin, whose plane was shot down over France. TR most certainly died of a broken heart at the age of 60 on January 6, 1919.

Theodore Roosevelt was the most amazing person to occupy the office of President. He was a naturalist, an author of more than 20 books and about 150,000 surviving letters, and by far the most accomplished intellectual in Presidential history. He far surpasses Thomas Jefferson.

Morris's work will be the definitive traditional biography of TR for our time.

Life

January 18, 2011


by Adam Frank

What exactly are we looking for? What fuels so much of the passion and intensity behind the debates over religion, the debates between religions and the debates surrounding science and religion? At the heart of these debates you will often find the issue of "knowing."

Knowing if God exists, or not. Knowing how the Universe began and if a creator was necessary, or not. Knowing how human beings "became" and what constitutes appropriate moral codes in light of that becoming. Always and again, the emphasis is on knowledge, on the certainty of understanding something, of knowing some fact and its meaning. What a tragic mistake.

The great comparative mythologist Joseph Campbell once said, "People don't want the meaning of life, they want the experience of life." He could not have hit the nail more firmly on the head.


One thing I have never understood in the vitriol that people manage to dredge up in these science vs. religion battles is their lack of clarity about goals. Is human spiritual endeavor really about "knowing" the existence of a superbeing? Is this academic "knowing," as in "I can prove this to be true," really what lies behind the spiritual genius of people like the 13th-century Sufi poet Rumi, the 13th-century Zen teacher Dogen, or more modern examples like Martin Luther King or Gandhi?

There are many reasons human beings institutionalized their spiritual longing into religions. Those reasons often devolved into considerations of power, control and real estate. Those institutions certainly have needed to enforce creed and doctrine, i.e. "knowledge."

But the reasons individuals find their lives transformed by spiritual longing are intimate and deeply personal affairs having little to do with dusty "proofs for the existence of God." As all those "spiritual but not religious" folks popping up in surveys on religion will tell you, the essence of the question is about experience, not facts.

In a similar vein, in the pro-science/anti-religion camps one often hears the quest for understanding the universe put in equally ultimate, quasi-theological terms. Finding the final theory, the Theory of Everything, is held up as a kind of moment "when the truth shall be revealed once and for all." While many practicing scientists might not see it this way, the scientific knowledge/enlightenment trope has been there in popular culture for a long time, reaching all the way back to Faust and up through movies like Pi.

As the philosopher Jean-Paul Sartre once said "Even if God did exist, that would change nothing." One way to interpret his meaning was that a formulaic "knowledge" of a superbeing's existence is beside the point when the real issue before us every day, all day is the verb "to be."

It’s the act of being that gives rise to our suffering and our moments of enlightenment. Right there, right in the very experience of life, is the warm, embodied truth we long for so completely.

Spirituality, at its best, points us away from easy codifications when it shows us how to immerse ourselves in the simple, inescapable act of being. Science at its root is also an expression of reverence and awe for the endlessly varied, resonantly beautiful experience we can find ourselves immersed in. So knowing the meaning of life as encoded in a religious creed on a page or an equation on a blackboard is not the issue. A deeper, richer experience of this one life: that is the issue!

So, can we stop thinking that discussions about science and religion have to focus on who has the best set of facts?

When it comes to the natural world, it's hard to see how science is not going to win the "facts" war hands down. But if we broaden our view to see being as the central issue, then connections between science and spiritual longing might be seen in an entirely different light.


Lives of the Philosophers

Lives of the Philosophers, Warts and All
By SARAH BAKEWELL
Published: January 20, 2011

If the proof of a pudding is in the eating, and the proof of a rule is in the exceptions, where should we look for the proof of a philosophy?


EXAMINED LIVES

From Socrates to Nietzsche

By James Miller



For Friedrich Nietzsche, the answer was obvious: to test a philosophy, find out if you can live by it. This is “the only critique of a philosophy that is possible and that proves something,” he wrote in 1874. It’s also the form of critique that is generally overlooked in the philosophy faculties of universities. Nietzsche therefore dismissed the professional discipline as irrelevant, a “critique of words by means of other words,” and devoted himself to pursuing an idiosyncratic philosophical quest outside the academy. As for texts, he wrote, “I for one prefer reading Diogenes Laertius” — the popular third-century Epicurean author of a biographical compilation called “Lives of the Eminent Philosophers.” If the proof of philosophy lies in life, then what could be more useful than reading about how the great philosophers have lived?

As James Miller shows in his fascinating “Examined Lives,” choosing Diogenes Laertius over more rigorous treatises was provocative because it challenged an idea already predominant in Nietzsche’s time: that a philosophy should be objectively valid, without the need to refer to particular quirks or life experiences on the part of its originator. Diogenes Laertius represents an older tradition, which sees philosophy not as a set of precepts but as something one learns by following a wise man — sometimes literally following him wherever he goes, listening, and observing how he handles situations. The “Lives” offers its readers a vicarious opportunity to try this with a number of philosophers, and see whose way works best.

Miller has now had the superb idea of taking Diogenes Laertius as a model, while simultaneously using this model to test whether such an approach can still offer us anything of value. He covers 12 philosophers: Socrates, Plato, Diogenes the Cynic (not to be confused with Laertius), Aristotle, Seneca, Augustine, Montaigne, Descartes, Rousseau, Kant, Emerson and Nietzsche. In each case, he explores the life selectively, looking for “crux” points and investigating how ideas of the philosophical life have changed. Few readers will be astounded to learn that philosophers make as much of a mess of their lives as anyone else. But Miller, a professor of politics at the New School and author of a biography of Michel Foucault, among other books, does not rest with digging out petty failings or moments of hypocrisy. He shows us philosophers becoming ever more inclined to reflect on these failings, and suggests that this makes their lives more rather than less worth studying.

His starting point is Socrates, the most mythologized of all thinkers, the original source of the statement that “the unexamined life is not worth living” and the philosopher whose life became the measure for all others. Early biographers wrote with awe of Socrates’ strange, itinerant approach to wisdom; of his habit of hanging around the marketplace striking up conversations with any passer-by willing to talk or of standing motionless in the street all night while he thought a problem through. But what really set him apart was his death, which redefined his whole life. Condemned by a panel of 501 Athenian citizens to kill himself with hemlock, Socrates carried out the sentence with perfect composure and in full rational awareness — or so the myth has it. No greater confirmation of the value of a philosopher’s existence could be imagined. As Socrates himself said, “Don’t you think that actions are more reliable evidence than words?”

The rest of “Examined Lives” can be read as a history of other philosophers’ failures to measure up to this ideal, either in their deaths or their lives. One of Miller’s great transitional figures is the Roman court-philosopher Seneca. Living half a millennium after Socrates, he too was condemned to death by suicide. He accepted his fate with Socratic courage, but his death itself was difficult. He slit his wrists before begging for a cup of hemlock and retiring to a hot bath to expire. The messiness of his death reflected a morally messy life. For, while his writings promoted wisdom, balance, restraint and detachment, Seneca himself was forced into numerous compromises in the service of his protégé and employer, the murderous emperor Nero. He even helped Nero plot the murder of Agrippina, the emperor’s own mother. The strain was evident. “I am not wise,” Seneca wrote; “nor . . . shall I ever be.” Yet he also advised his favorite correspondent, Lucilius, to “harmonize talk with life.”

Other philosophers suffered even more self-division, particularly those who succumbed to mental illness. Diogenes the Cynic lived in a clay jar, masturbated on the street and embraced snow-covered statues. His sanity sounds shaky at best, yet there is no doubting his importance: he inspired the early Stoics and thus influenced the whole of Western thought. Immanuel Kant, most rational of thinkers, ended his life in an obsessive-compulsive hell, endlessly consulting thermometers and barometers, and stopping dead in his tracks whenever he felt warm on a walk because he was afraid that breaking into a sweat would kill him. And Nietzsche wrote some of his most incisive works while in the early stages of the syphilitic dementia that really did kill him.

The most striking of Miller’s subjects is René Descartes, another “transitional” figure and a very strange person. We associate Descartes with the attempt to give mathematical clarity to philosophy, yet he was driven to this by a series of terrifying, irrational visions in 1619, and by life experiences ranging from vagabondage to periods of reclusive withdrawal. He made gripping use of this story in his “Discourse on Method,” the very work where he also set out his criteria of total certainty.

As Miller notes, Descartes opened up two divergent paths in philosophy. One was the old tradition, in which one seeks a better life and recounts the search in a personal narrative. The other led to the impersonal discipline now prevalent in universities, which in theory can be practiced by anyone. Yet Descartes himself would barely have understood this separation. For him, a philosophical life required both the quest for precision and the intense personal experience that drove one to it.

Miller concludes that his 12 philosophical lives offer a moral that is “neither simple nor uniformly edifying.” It amounts mainly to the idea that philosophy can offer little or no consolation, and that the examined life is, if anything, “harder and less potentially rewarding” for us than it was for Socrates.

Yet his entire book conveys a sense that the genuinely philosophical examination of a life can still lead us somewhere radically different from other kinds of reflection. At the end of his chapter on Descartes, Miller cites the 20th-century phenomenologist Edmund Husserl, whom he identifies as the major exception to the rule that places most post-Cartesian thinkers on one side or the other of the personal/impersonal divide. Apropos of Descartes, Husserl wrote, “Anyone who seriously intends to become a philosopher must ‘once in his life’ withdraw into himself and attempt, within himself, to overthrow and build anew all the sciences that, up to then, he has been accepting.”

It is an extraordinary thing to do: a project that remains “quite personal,” as Husserl admitted, yet that reaches in to seize the whole world and redesign it from the very foundation. Perhaps this is what still distinguishes the philosophical life: that “once in a lifetime” convulsion, in which one reinvents reality around oneself. It is a project doomed to fail, and compromises will always be made. But what, in life, could be more interesting?



Sarah Bakewell is the author, most recently, of “How to Live: Or, A Life of Montaigne in One Question and Twenty Attempts at an Answer.”

The Impact of the Internet

The internet: is it changing the way we think?

American writer Nicholas Carr's claim that the internet is not only shaping our lives but physically altering our brains has sparked a lively and ongoing debate, says John Naughton. Below, a selection of writers and experts offer their opinions.

John Naughton, The Observer, Sunday 15 August 2010

Every 50 years or so, American magazine the Atlantic lobs an intellectual grenade into our culture. In the summer of 1945, for example, it published an essay by the Massachusetts Institute of Technology (MIT) engineer Vannevar Bush entitled "As We May Think". It turned out to be the blueprint for what eventually emerged as the world wide web. Two summers ago, the Atlantic published an essay by Nicholas Carr, one of the blogosphere's most prominent (and thoughtful) contrarians, under the headline "Is Google Making Us Stupid?".

"Over the past few years," Carr wrote, "I've had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn't going – so far as I can tell – but it's changing. I'm not thinking the way I used to think. I can feel it most strongly when I'm reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument and I'd spend hours strolling through long stretches of prose. That's rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I'm always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle."

The title of the essay is misleading, because Carr's target was not really the world's leading search engine, but the impact that ubiquitous, always-on networking is having on our cognitive processes. His argument was that our deepening dependence on networking technology is indeed changing not only the way we think, but also the structure of our brains.

Carr's article touched a nerve and has provoked a lively, ongoing debate on the net and in print (he has now expanded it into a book, The Shallows: What the Internet Is Doing to Our Brains). This is partly because he's an engaging writer who has vividly articulated the unease that many adults feel about the way their modi operandi have changed in response to ubiquitous networking. Who bothers to write down or memorise detailed information any more, for example, when they know that Google will always retrieve it if it's needed again? The web has become, in a way, a global prosthesis for our collective memory.

It's easy to dismiss Carr's concern as just the latest episode of the moral panic that always accompanies the arrival of a new communications technology. People fretted about printing, photography, the telephone and television in analogous ways. It even bothered Plato, who argued that the technology of writing would destroy the art of remembering.

But just because fears recur doesn't mean that they aren't valid. There's no doubt that communications technologies shape and reshape society – just look at the impact that printing and the broadcast media have had on our world. The question that we couldn't answer before now was whether these technologies could also reshape us. Carr argues that modern neuroscience, which has revealed the "plasticity" of the human brain, shows that our habitual practices can actually change our neuronal structures. The brains of illiterate people, for example, are structurally different from those of people who can read. So if the technology of printing – and its concomitant requirement to learn to read – could shape human brains, then surely it's logical to assume that our addiction to networking technology will do something similar?

Not all neuroscientists agree with Carr and some psychologists are sceptical. Harvard's Steven Pinker, for example, is openly dismissive. But many commentators who accept the thrust of his argument seem not only untroubled by its far-reaching implications but are positively enthusiastic about them. When the Pew Research Centre's Internet & American Life project asked its panel of more than 370 internet experts for their reaction, 81% of them agreed with the proposition that "people's use of the internet has enhanced human intelligence".

Others argue that the increasing complexity of our environment means that we need the net as "power steering for the mind". We may be losing some of the capacity for contemplative concentration that was fostered by a print culture, they say, but we're gaining new and essential ways of working. "The trouble isn't that we have too much information at our fingertips," says the futurologist Jamais Cascio, "but that our tools for managing it are still in their infancy. Worries about 'information overload' predate the rise of the web... and many of the technologies that Carr worries about were developed precisely to help us get some control over a flood of data and ideas. Google isn't the problem – it's the beginning of a solution."

Sarah Churchwell, academic and critic

Is the internet changing our brains? It seems unlikely to me, but I'll leave that question to evolutionary biologists. As a writer, thinker, researcher and teacher, what I can attest to is that the internet is changing our habits of thinking, which isn't the same thing as changing our brains. The brain is like any other muscle – if you don't stretch it, it gets both stiff and flabby. But if you exercise it regularly, and cross-train, your brain will be flexible, quick, strong and versatile.

In one sense, the internet is analogous to a weight-training machine for the brain, as compared with the free weights provided by libraries and books. Each method has its advantage, but used properly one works you harder. Weight machines are directive and enabling: they encourage you to think you've worked hard without necessarily challenging yourself. The internet can be the same: it often tells us what we think we know, spreading misinformation and nonsense while it's at it. It can substitute surface for depth, imitation for originality, and its passion for recycling would surpass the most committed environmentalist.

In 10 years, I've seen students' thinking habits change dramatically: if information is not immediately available via a Google search, students are often stymied. But of course what a Google search provides is not the best, wisest or most accurate answer, but the most popular one.

But knowledge is not the same thing as information, and there is no question to my mind that the access to raw information provided by the internet is unparalleled and democratising. Admittance to elite private university libraries and archives is no longer required, as they increasingly digitise their archives. We've all read the jeremiads that the internet sounds the death knell of reading, but people read online constantly – we just call it surfing now. What they are reading is changing, often for the worse; but it is also true that the internet increasingly provides a treasure trove of rare books, documents and images, and as long as we have free access to it, then the internet can certainly be a force for education and wisdom, and not just for lies, damned lies, and false statistics.

In the end, the medium is not the message, and the internet is just a medium, a repository and an archive. Its greatest virtue is also its greatest weakness: it is unselective. This means that it is undiscriminating, in both senses of the word. It is indiscriminate in its principles of inclusion: anything at all can get into it. But it also – at least so far – doesn't discriminate against anyone with access to it. This is changing rapidly, of course, as corporations and governments seek to exert control over it. Knowledge may not be the same thing as power, but it is unquestionably a means to power. The question is, will we use the internet's power for good, or for evil? The jury is very much out. The internet itself is disinterested: but what we use it for is not.

Sarah Churchwell is a senior lecturer in American literature and culture at the University of East Anglia

Naomi Alderman, novelist

If I were a cow, nothing much would change my brain. I might learn new locations for feeding, but I wouldn't be able to read an essay and decide to change the way I lived my life. But I'm not a cow, I'm a person, and therefore pretty much everything I come into contact with can change my brain.

It's both a strength and a weakness. We can choose to seek out brilliant thinking and be challenged and inspired by it. Or we can find our energy sapped by an evening with a "poor me" friend, or become faintly disgusted by our own thinking if we've read too many romance novels in one go. As our bodies are shaped by the food we eat, our brains are shaped by what we put into them.

So of course the internet is changing our brains. How could it not? It's not surprising that we're now more accustomed to reading short-form pieces, to accepting a Wikipedia summary, rather than reading a whole book. The claim that we're now thinking less well is much more suspect. If we've lost something by not reading 10 books on one subject, we've probably gained as much by being able to link together ideas easily from 10 different disciplines.

But since we're not going to dismantle the world wide web any time soon, the more important question is: how should we respond? I suspect the answer is as simple as making time for reading. No single medium will ever give our brains all possible forms of nourishment. We may be dazzled by the flashing lights of the web, but we can still just step away. Read a book. Sink into the world of a single person's concentrated thoughts.

Time was when we didn't need to be reminded to read. Well, time was when we didn't need to be encouraged to cook. That time's gone. None the less, cook. And read. We can decide to change our own brains – that's the most astonishing thing of all.

Ed Bullmore, psychiatrist

Whether or not the internet has made a difference to how we use our brains, it has certainly begun to make a difference to how we think about our brains. The internet is a vast and complex network of interconnected computers, hosting an equally complex network – the web – of images, documents and data. The rapid growth of this huge, manmade, information-processing system has been a major factor stimulating scientists to take a fresh look at the organisation of biological information-processing systems like the brain.

It turns out that the human brain and the internet have quite a lot in common. They are both highly non-random networks with a "small world" architecture, meaning that there is both dense clustering of connections between neighbouring nodes and enough long-range short cuts to facilitate communication between distant nodes. Both the internet and the brain have a wiring diagram dominated by a relatively few, very highly connected nodes or hubs; and both can be subdivided into a number of functionally specialised families or modules of nodes. It may seem remarkable, given the obvious differences between the internet and the brain in many ways, that they should share so many high-level design features. Why should this be?
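To make the "small world" idea concrete, here is a minimal sketch of the kind of measurement Bullmore is describing. It is not from the article: it assumes Python with the networkx library, and it uses the standard Watts-Strogatz model to show how a little long-range rewiring shortens paths between distant nodes while local clustering stays high.

# Illustrative sketch (not from the article): small-world metrics with networkx.
import networkx as nx

n, k, p = 1000, 10, 0.1  # nodes, neighbours per node, rewiring probability

# A regular ring lattice: densely clustered, but distant nodes are many hops apart.
lattice = nx.watts_strogatz_graph(n, k, p=0.0)

# Rewiring a small fraction of edges adds long-range short cuts,
# the hallmark of a "small world" network.
small_world = nx.connected_watts_strogatz_graph(n, k, p)

for name, g in [("ring lattice", lattice), ("small world", small_world)]:
    print(name,
          "| clustering:", round(nx.average_clustering(g), 3),
          "| average path length:", round(nx.average_shortest_path_length(g), 2))

In the Watts-Strogatz model the rewired network typically keeps most of its clustering while its average path length drops sharply, which is the combination of properties the paragraph above attributes to both the brain and the internet.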

One possibility is that the brain and the internet have evolved to satisfy the same general fitness criteria. They may both have been selected for high efficiency of information transfer, economical wiring cost, rapid adaptivity or evolvability of function and robustness to physical damage. Networks that grow or evolve to satisfy some or all of these conditions tend to end up looking the same.

Although there is much still to understand about the brain, the impact of the internet has helped us to learn new ways of measuring its organisation as a network. It has also begun to show us that the human brain probably does not represent some unique pinnacle of complexity but may have more in common than we might have guessed with many other information-processing networks.

Ed Bullmore is professor of psychiatry at the University of Cambridge

Geoff Dyer, writer

Sometimes I think my ability to concentrate is being nibbled away by the internet; other times I think it's being gulped down in huge, Jaws-shaped chunks. In those quaint days before the internet, once you made it to your desk there wasn't much to distract you. You could sit there working or you could just sit there. Now you sit down and there's a universe of possibilities – many of them obscurely relevant to the work you should be getting on with – to tempt you. To think that I can be sitting here, trying to write something about Ingmar Bergman and, a moment later, on the merest whim, can be watching a clip from a Swedish documentary about Don Cherry – that is a miracle (albeit one with a very potent side-effect, namely that it's unlikely I'll ever have the patience to sit through an entire Bergman film again).

Then there's the outsourcing of memory. From the age of 16, I got into the habit of memorising passages of poetry and compiling detailed indexes in the backs of books of prose. So if there was a passage I couldn't remember, I would spend hours going through my books, seeking it out. Now, in what TS Eliot, with great prescience, called "this twittering world", I just google the key phrase of the half-remembered quote. Which is great, but it's drained some of the purpose from my life.

Exactly the same thing has happened now that it's possible to get hold of out-of-print books instantly on the web. That's great too. But one of the side incentives to travel was the hope that, in a bookstore in Oregon, I might finally track down a book I'd been wanting for years. All of this searching and tracking down was immensely time-consuming – but only in the way that being alive is time-consuming.

Colin Blakemore, neurobiologist

It's curious that some of the most vociferous critics of the internet – those who predict that it will produce generations of couch potatoes, with minds of mush – are the very sorts of people who are benefiting most from this wonderful, liberating, organic extension of the human mind. They are academics, scientists, scholars and writers, who fear that the extraordinary technology that they use every day is a danger to the unsophisticated.

They underestimate the capacity of the human mind – or rather the brain that makes the mind – to capture and capitalise on new ways of storing and transmitting information. When I was at school I learned by heart great swathes of poetry and chunks of the Bible, not to mention page after page of science textbooks. And I spent years at a desk learning how to do long division in pounds, shillings and pence. What a waste of my neurons, all clogged up with knowledge and rules that I can now obtain with the click of a mouse.

I have little doubt that the printing press changed the way that humans used their memories. It must have put out of business thousands of masters of oral history and storytelling. But our brains are so remarkably adept at putting unused neurons and virgin synaptic connections to other uses. The basic genetic make-up of Homo sapiens has been essentially unchanged for a quarter of a million years. Yet 5,000 years ago humans discovered how to write and read; 3,000 years ago they discovered logic; 500 years ago, science. These revolutionary advances in the capacity of the human mind occurred without genetic change. They were products of the "plastic" potential of human brains to learn from their experience and reinvent themselves.

At its best, the internet is no threat to our minds. It is another liberating extension of them, as significant as books, the abacus, the pocket calculator or the Sinclair ZX80.

Just as each of those leaps of technology could be (and were) put to bad use, we should be concerned about the potentially addictive, corrupting and radicalising influence of the internet. But let's not burn our PCs or stomp on our iPads. Let's not throw away the liberating baby with the bathwater of censorship.

Colin Blakemore is professor of neuroscience at the University of Oxford

Ian Goodyer, psychiatrist

The key contextual point here is that the brain is a social organ and is responsive to the environment. All environments are processed by the brain, whether it's the internet or the weather – it doesn't matter. Do these environments change the brain? Well, they could and probably do in evolutionary time.

The internet is just one of a whole range of characteristics that could change the brain and it would do so by altering the speed of learning. But the evidence that the internet has a deleterious effect on the brain is zero. In fact, by looking at the way human beings learn in general, you would probably argue the opposite. If anything, the opportunity to have multiple sources of information provides a very efficient way of learning and certainly as successful as learning through other means.

It is being argued that the information coming into the brain from the internet is the wrong kind of information. It's too short, it doesn't have enough depth, so there is a qualitative loss. It's an interesting point, but the only way you could argue it is to say that people are misusing the internet. It's a bit like saying to someone who's never seen a car before and has no idea what it is: "Why don't you take it for a drive and you'll find out?" If you seek information on the internet like that, there's a good chance you'll have a crash. But that's because your experience has yet to inculcate what a car is. I don't think you can argue that those latent processes are going to produce brain pathology.

I think the internet is a fantastic tool and one of the great wonders of the world, if not the greatest. Homo sapiens must just learn to use it properly.

Ian Goodyer is professor of psychiatry at the University of Cambridge

Maryanne Wolf, cognitive neuroscientist
I am an apologist for the reading brain. It represents a miracle that springs from the brain's unique capacity to rearrange itself to learn something new. No one, however, knows what this reading brain will look like in one more generation.

No one today fully knows what is happening in the brains of children as they learn to read while immersed in digitally dominated mediums a minimum of six to seven hours a day (Kaiser report, 2010). The present reading brain's circuitry is a masterpiece of connections linking the most basic perceptual areas to the most complex linguistic and cognitive functions, like critical analysis, inference and novel thought (ie, "deep reading processes"). But this brain is only one variation of the many that are possible. Therein lies the cerebral beauty and the cerebral rub of plasticity.

Understanding the design principles of the plastic reading brain highlights the dilemma we face with our children. It begins with the simple fact that we human beings were never born to read. Depending on several factors, the brain rearranges critical areas in vision, language and cognition in order to read. Which circuit parts are used depends on factors like the writing system (eg English v Chinese); the formation (eg how well the child is taught); and the medium (eg a sign, a book, the internet). For example, the Chinese reading brain requires more cortical areas involved in visual memory than the English reader because of the thousands of characters. In its formation, the circuit utilises fairly basic processes to decode and, with time and cognitive effort, learns to incorporate "deep reading processes" into the expert reading circuit.

The problem is that because there is no single reading brain template, the present reading brain never needs to develop. With far less effort, the reading brain can be "short-circuited" in its formation with little time and attention (either in milliseconds or years) to the deep reading processes that contribute to the individual reader's cognitive development.

The problem of a less potentiated reading brain becomes more urgent in the discussion about technology. The characteristics of each reading medium reinforce the use of some cognitive components and potentially reduce reliance on others. Whatever any medium favours (eg, slow, deep reading v rapid information-gathering) will influence how the reader's circuit develops over time. In essence, we human beings are not just the product of what we read, but how we read.

For me, the essential question has become: how well will we preserve the critical capacities of the present expert reading brain as we move to the digital reading brain of the next generation? Will the youngest members of our species develop their capacities for the deepest forms of thought while reading or will they become a culture of very different readers – with some children so inured to a surfeit of information that they have neither the time nor the motivation to go beyond superficial decoding? In our rapid transition into a digital culture, we need to figure out how to provide a full repertoire of cognitive skills that can be used across every medium by our children and, indeed, by ourselves.

Maryanne Wolf is the author of Proust and the Squid: The Story and Science of the Reading Brain, Icon Books, 2008

Bidisha, writer and critic

The internet is definitely affecting the way I think, for the worse. I fantasise about an entire month away from it, with no news headlines, email inboxes, idle googling or instant messages, the same way retirees contemplate a month in the Bahamas. The internet means that we can never get away from ourselves, our temptations and obsessions. There's something depressing about knowing I can literally and metaphorically log on to the same homepage, wherever I am in the world.

My internet use and corresponding brain activity follow a distinct pattern of efficiency. There's the early morning log-on, the quick and accurate scan of the day's news, the brisk queries and scheduling, the exchange of scripts of articles or edited book extracts.

After all this good stuff, there's what I call the comet trail: the subsequent hours-long, bitty, unsatisfying sessions of utter timewasting. I find myself looking up absolute nonsense only tangentially related to my work, fuelled by obsessions and whims and characterised by topic-hopping, bad spelling, squinting, forum lurking and comically wide-ranging search terms. I end up having created nothing myself, feeling isolated, twitchy and unable to sleep, with a headache and painful eyes, not having left the house once.

The internet enables you to look up anything you want and get it slightly wrong. It's like a never-ending, trashy magazine sucking all time, space and logic into its bottomless maw. And, like all trashy magazines, it has its own tone, slang and lexicon. I was tempted to construct this piece in textspeak, Tweet abbreviations or increasingly abusive one-liners to demonstrate the level of wit the internet has facilitated – one that is frighteningly easy to mimic and perpetuate. What we need to counteract the slipshod syntax, off-putting abusiveness, unruly topic-roaming and frenetic, unreal "social networking" is good, old-fashioned discipline. We are the species with the genius to create something as wondrous as the internet in the first place. Surely we have enough self-control to stay away from Facebook.

• This article was amended on 27 August 2010 to change "quarter of a billion years" to "quarter of a million years".

What It All Means

What It All Means
By SUSAN NEIMAN
Published: January 20, 2011


By the time they contemplate an application to graduate school, philosophy students have learned that it isn’t merely tacky to display an interest in questions about the meaning of life; it’s a major professional risk. For many decades, British and American students were urged to turn their attention instead to the meaning of language, while European philosophers were engaged in more portentous-sounding but equally arcane diversions. There were occasional, notable exceptions — Stanley Cavell’s work showed how the meaning of one’s words and the meaning of one’s life might coincide — but then Cavell was long considered to stand outside the real business of philosophy. Even more fatally, students were introduced to the history of philosophy as a series of epistemological puzzles decontaminated from troubling concerns about what it means to be human, a question that had moved the great philosophers of the canon as surely as any thoughtful 18-year-old.

ALL THINGS SHINING

Reading the Western Classics to Find Meaning in a Secular Age

By Hubert Dreyfus and Sean Dorrance Kelly

254 pp. Free Press. $26.


Against this background, Hubert Dreyfus and Sean Dorrance Kelly’s “All Things Shining” is certainly good news. Here, two distinguished philosophers from the heart of the profession offer a meditation on the meaning of life, in a sharp, engaging style that will appeal to readers both within the academy and beyond it. They provide a compressed narrative of changes in Western understanding of human existence over the course of nearly three millenniums, and argue that reading great works of literature allows us to rediscover the reverence, gratitude and amazement that were available in Homeric times. These qualities, they believe, can be cultivated to provide a bulwark against the nihilism they rightly view as threatening our ability to lead meaningful lives in the 21st century. “The gods have not withdrawn or abandoned us,” they conclude. “We have kicked them out.”

In 2011, it’s disconcerting to read that we have been released from the ancient temptation to monotheism; much of the world hasn’t heard the news. But if Dreyfus and Kelly neglect those for whom monotheism remains a live option, they have much to say to those for whom it doesn’t. Mediated and suspicious, they argue, we have lost a way of being in the world that the Greeks found natural. The reason so many of us feel so miserable is that we can neither find meaning in ourselves alone nor give up the longing to find it somewhere else. “All Things Shining” offers fascinating readings of works of literature chosen to illuminate this narrative — from Aeschylus, Dante and Melville to David Foster Wallace and Elizabeth Gilbert — as well as passionate glimpses of the attitudes toward the world the authors urge us to regain.

Dreyfus and Kelly begin with those happy polytheists, the Greeks, who were less reflective than we are, and less convinced that they were in control of the world. This left them open to experience a world in which things shine as works of art do, to feel gratitude not only for the bounties of nature but for human excellence in all its forms, itself regarded as a gift.

The authors bring that relation alive, but their narrative of its loss raises many questions. These start with their reading of Homer himself, which highlights Helen’s blithe description, in the “Odyssey,” of the affair with Paris that ignited the Trojan War — “an odd choice,” they comment, “for dinner party conversation in the Menelaus household.” They make it intelligible by arguing that in Homer’s world, erotic excellence is a sacred gift like any other human excellence, to be cherished without moral reflection. Something about this is true and important, but their conclusion that Homer’s world did not view the Trojan War as lamentable is hard to swallow. True, Homer doesn’t explicitly condemn Hector’s death; “he just describes how it affects Hector’s father, Priam.” But that’s how great artists work, and few scenes in literature are more affecting. Similarly problematic is the use of Odysseus’ praise of Achilles to argue that death in the Trojan War was simply accepted by the Greeks as one means to bring forth human excellence. For they ignore Achilles’ reply: far better, he says, to be a serf among the living than to lord it over the dead.

The Sprint

Your Money
With Retirement Savings, It’s a Sprint to the Finish
By TARA SIEGEL BERNARD
Published: January 21, 2011

What would you do if your financial planner prescribed the following advice? Save and invest diligently for 30 years, then cross your fingers and pray your investments will double over the last decade before you retire.


You might as well go to Las Vegas.

Yet that’s exactly what many professionals and fancy financial calculators have been telling consumers for years, argues Michael Kitces, director of research at the Pinnacle Advisory Group in Columbia, Md., who recently illustrated this notion in his blog, Nerd’s Eye View.

The advice is never delivered in those exact words, of course. Instead, this is the more familiar refrain: save a healthy slice of your salary from the start of your career, invest it in a diversified portfolio and then you should be able to retire with relative ease.

The problem is that even if you do everything right and save at a respectable rate, you’re still relying on the market to push you to the finish line in the last decade before retirement. Why? Reaching your goal is highly dependent on the power of compounding — or the snowball effect, where your pile of money grows at a faster clip as more interest (or investment growth) grows on top of more interest. In fact, you’re actually counting on your savings, in real dollars and cents, to double during that home stretch.

But if you’re dealt a bad set of returns during an extended period of time just before you retire or shortly thereafter, your plan could be thrown wildly off track. Many baby boomers know the feeling all too well, given the stock market’s weak showing during the last decade.

“The way the math really works out is unbelievably dependent on the final few years,” Mr. Kitces said. “I just don’t think we’ve really acknowledged just what a leap the very last part really is.”

Consider the numbers for a 26-year-old who earns $40,000 annually, with a long-term savings target of $1 million. To get there, she’s told to save 8 percent of her salary each year over her 40-year career. (We assumed an annual investment return of 7 percent, and 3 percent annual salary growth, to keep pace with inflation). Yet after 31 years of diligent savings, her portfolio is worth just slightly more than $483,000.

To clear the $1 million mark, her portfolio essentially must double in the nine years before she retires, and the market must cooperate (unless she finds a way to travel back in time and significantly increase her savings).

Should the markets misbehave, however, delivering a mere 2 percent return over the 10 years before retirement (not all that hard to imagine, considering the return of a portfolio split between stock and bonds over the last decade), she falls short by about a third. Her portfolio would be worth only about $640,000. The chart accompanying this column illustrates this.
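
For readers who want to check the arithmetic, here is a minimal sketch in Python (my own illustration, not Mr. Kitces's model or the column's spreadsheet). It assumes contributions land at the start of each year and that returns compound annually; the column doesn't spell out those conventions, so treat the output as approximate.

    # Rough projection of the column's example: $40,000 starting salary,
    # 8 percent saved each year, 3 percent annual raises.
    # Assumption (mine, not the column's): each year's contribution is added
    # first, then the stated return applies to the whole balance.
    def project(salary=40_000, save_rate=0.08, salary_growth=0.03, returns=None, years=40):
        if returns is None:
            returns = [0.07] * years              # flat 7 percent a year by default
        balance = 0.0
        for year, r in enumerate(returns):
            contribution = salary * (1 + salary_growth) ** year * save_rate
            balance = (balance + contribution) * (1 + r)
        return balance

    print(round(project(years=31)))                        # roughly $483,000 after 31 years at 7 percent
    print(round(project()))                                # roughly $1 million after the full 40 years
    print(round(project(returns=[0.07]*30 + [0.02]*10)))   # roughly $640,000 if the last decade returns 2 percent

The last two lines make the column's point concrete: the same saver, making the same contributions, ends up about a third short simply because the weak decade arrives at the end, when the balance is largest.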

You can quibble with our assumptions in this example. But a similar pattern emerges regardless of your financial targets and projected returns, Mr. Kitces says. So if your target is to save $500,000 or $2 million, and if you assume a 6 percent return or a higher 10 percent, you’re still relying on your investments to roughly double in the final years before retirement.

Of course, an extended period of dismal returns during any point in your career can inflict damage. But the homestretch before retirement is often the most anxiety-inducing because workers have neither the time nor the financial capacity to recover before they begin taking withdrawals. “Getting the bad 2 percent decade in the earlier years has far less impact because there are fewer contributions already invested,” Mr. Kitces said. “Conversely, when the bad returns come in the final 10 years, no reasonable amount of savings will make up the shortfall.”

So what’s an investor to do about all of this, especially as one of the other pillars of retirement savings — pensions — disappears? And who’s to say how Social Security may change by the time that 26-year-old retires?

Most of the solutions, if you can call them that, fall into the “easier said than done” category. If you can’t handle the uncertainty of missing your financial targets, you can try to save more and create a less volatile portfolio, Mr. Kitces says, which may also provide a firmer retirement date.

And naturally, the earlier you start saving, the sooner you’re likely to reach the critical mass you’ll need for compounding to accelerate (assuming the markets provide some lift in the first half of your career). But you will still need to save more than many retirement calculators suggest, since they’re likely to recommend saving a lower amount when you have such a long time horizon. Then you can end up in the same predicament, where you are heavily leaning on market returns in the years before retirement.

“What the wise person does is save a large amount of money when they are young,” said William Bernstein, author of “The Investor’s Manifesto: Preparing for Prosperity, Armageddon and Everything in Between” and other investing books. “And if they can do that, when they are older, they can cut back on their equity allocation. When you’ve won the game, you stop playing the game.”

But that can be hard to accomplish when you have other needs competing for those dollars, whether it’s a down payment for a house, a 529 college savings plan or starting a business. Or perhaps you’re already living on less because you’re unemployed (or underemployed) or because health insurance consumes a significant chunk of your income.

“It’s the cruel irony of retirement planning that those people who most need the markets’ help have the least financial capacity to take the risk,” said Milo Benningfield, a financial planner in San Francisco. “Meanwhile, the people who can afford the risk are the ones who least need to take it.”

A more prudent course of action is a flexible one that acknowledges the many possibilities and accounts for ideal and less-than-ideal spending amounts.

Try using different assumptions for the years leading up to retirement, suggests Scott Hanson, a financial planner at Hanson McClain in Sacramento. If you want to retire in 25 years, for instance, you might use a return assumption of 8 percent for the first 15 years of savings, then reduce that rate to 6 percent or less in the final decade, he says.

“Here’s the catch: most folks aren’t saving enough using standard growth assumptions,” he said. “If they begin to use lower growth assumptions in order to ensure their retirement, they’ll fall further behind and become even more discouraged.”
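
Mr. Hanson's two-stage idea is easy to try with the hypothetical project() sketch above, since it accepts a different return for each year (the 8 percent/6 percent split is his illustration; the resulting figure is mine, not his):

    # 15 years assumed at 8 percent, then a more cautious 6 percent for the final decade
    split_assumptions = [0.08] * 15 + [0.06] * 10
    print(round(project(returns=split_assumptions)))   # a 25-year projection under the split assumptions

Comparing that result with a flat 8 percent run shows how far the same savings rate falls behind once the more conservative final decade is assumed, which is exactly the discouragement Mr. Hanson describes.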

But simply going through these exercises may help the reality sink in. At the very least, it will show how imprecise even the most sophisticated projections may be.

“The actual date I get to check out with my target sum to retirement is much more uncertain than we give it credit to be,” Mr. Kitces said. “It’s more like 40 years, plus or minus five to 10 years. If you want more certainty, you can have it, but you have to save more and take less risk.”

Saturday, January 22, 2011

Stealing the Constitution


Stealing the Constitution
Garrett Epps
January 20, 2011 | This article appeared in the February 7, 2011 edition of The Nation.


In October I spent a crisp Saturday in the windowless basement of a suburban Virginia church attending a seminar on "The Substance and Meaning of the Constitution." I was told the secrets the "elite" have concealed from the people: the Constitution is based on the Law of Moses; Mosaic law was brought to the West by the ancient Anglo-Saxons, who were probably the Ten Lost Tribes of Israel; the Constitution restores the fifth-century kingdom of the Anglo-Saxons.

Garrett Epps, a law professor at the University of Baltimore and a former reporter for the Washington Post.


There's more: virtually all of modern American life and government is unconstitutional. Social Security, the Federal Reserve, the Environmental Protection Agency, the Civil Rights Act of 1964, hate crime laws—all flatly violate God's law. State governments are not required to observe the Bill of Rights; the First Amendment establishes "The Religion of America," which is "nondenominational" Christianity.

The instructor was Lester Pearce, an Arizona judge and the brother of state senator Russell Pearce, author of Arizona's anti-immigrant law, SB 1070. (Perhaps not surprisingly, Lester tended to digress about how he cracks down on Mexican immigrants in court.) Pearce got rapt attention from the fifty people in the audience, although one boy near me spent his time perfecting a detailed sketch of an assault rifle.

These were earnest citizens who had come to learn about America and its Constitution. What they were being taught was poisonous rubbish.

Americans today are frightened and disoriented. In the midst of uncertainty, they are turning to the Constitution for tools to deal with crisis. The far right—the toxic coalition of Fox News talking heads, radio hosts, angry "patriot" groups and power-hungry right-wing politicians—is responding to this demand by feeding their fellow citizens mythology and lies.

The seminar I attended was organized by the National Center for Constitutional Studies, nestled securely in the metropolis of Malta, Idaho (2000 Census population 177, white population 174). The NCCS was the cold war brainchild of the late W. Cleon Skousen, a prominent John Bircher. The center and its crazed ideology have been taken up by Glenn Beck, who touts its educational programs on his TV show. Civic groups, school districts and even some city governments across the country have been persuaded to sponsor daylong seminars by the "nonpartisan" NCCS; its speakers are visiting high schools to distribute pocket copies of the Constitution. Skousen's massive "guide" to the Constitution, The Making of America: The Substance and Meaning of the Constitution, is currently No. 14 on Amazon's "constitutional history" bestseller list—and has ranked as high as No. 4 in the past year.

Beck is not the only commentator who is espousing such extremist notions. Popular authors Thomas Woods Jr. and Kevin Gutzman, in their book Who Killed the Constitution?, argue that Brown v. Board of Education should be overturned. Not even the Constitution is safe from the "constitutionalists": Fox News commentator Andrew Napolitano recently called the popular election of senators "the only part of the Constitution that is itself unconstitutional." A gathering of conservative law professors and activists at the 2010 convention of the Federalist Society, after gloating about the right-wing triumph in the off-year elections, advocated calling a constitutional convention to strip Congress of its current powers. House majority leader Eric Cantor supports a constitutional amendment to permit the state legislatures to repeal federal laws.

The new Republican majority in the House decided to kick off Congress with a televised reading of "the Constitution" by members. I use the quotation marks because the Constitution they read was edited so that members wouldn't have to read embarrassing anachronisms like Article I, Section 2, which counted a slave as three-fifths of a white person. (Poignantly, the language in the First Amendment about "the right of the people peaceably to assemble" was read by Representative Gabrielle Giffords, who was shot at a constituent meeting two days later.) They also have enacted a rule requiring that every new piece of legislation include a "constitutional authority" statement explaining why Congress has the power to pass it. (The false implication is that previous Congresses enacted laws willy-nilly, with no attention to that body's powers.)

Conservative lawmakers increasingly claim that the "original intent" of the Constitution's framers and the views of the right wing of the Republican Party are one and the same. Newly elected Senator Mike Lee of Utah has endorsed state "nullification" of the healthcare law. And far-right Republican Congresswoman Michele Bachmann has set up a "Constitution school" for new members of Congress; Justice Antonin Scalia (in other contexts a stickler for the separation of powers) has agreed to join Bachmann's faculty.

Scalia's injudicious involvement with House Republicans underscores the new boldness of conservative federal judges in adopting the rhetoric and ideas of the hard right. Scalia has repeatedly said that direct election of senators is "a bad idea." He recently said that the Equal Protection Clause provides no protection for women against discrimination because when it was adopted "nobody thought it was directed against sex discrimination." Federal District Judge Roger Vinson of Florida, who is hearing a challenge to the new healthcare program, recently cast doubt on its constitutionality in an opinion that cited, among other things, a Wall Street Journal op-ed as its "authority."

It's easy to understand why conservative politicians and judges are trying to align their political program with a strained reading of the Constitution: Social Security, Medicare, environmental protection and aid to education have broad popular support. Even the healthcare program, so reviled by the Republican Party, will be almost impossible to repeal using the legislative process.

* * *

So the right is seeking to win by changing the rules. Progressive, democratically enacted policy choices are unconstitutional, they argue. A document that over time has become more democratic and egalitarian is being rewritten as a charter of privilege and inequality. This shouldn't be allowed to happen.

Why has the right done such a good job of putting out its invented "Constitution"? Some of the responsibility lies with progressive legal scholars, who are well situated to explain the Constitution to the public. It isn't that they have failed; it's that they seldom try. Scholars from top schools hold forth with polysyllabic theories of hermeneutics that ordinary citizens can't fathom. Meanwhile, conservatives don't hesitate to speak directly to the public—and, often, to dumb down the Constitution. They purvey a simple myth: anyone who doesn't support the far-right version of the Constitution is at best unpatriotic, at worst a traitor.

Enough of that. The Constitution belongs to all of us. It's time to take it back from those who are trying to steal it in plain sight. Our Constitution wasn't written to rig the political game but to allow us to play it without killing one another. It created a government and gave that government the power it needed to function.

That seems elementary, but the right claims that the Constitution was designed to prevent America from abandoning the tallow-candle purity of the Anglo-Saxon past. Any innovative government program, the argument runs, must be unconstitutional, or the framers would have predicted it in so many words. But the Constitution wasn't a revival; it was something brand-new—the first national written constitution in Western history. The framers wanted to impel change, not prevent it.

Conservatives also claim that the Constitution was set up to restrain the federal government. If so, there's precious little evidence of it. The actual text of the Constitution is overwhelmingly concerned with making sure the new government had enough power; the framers thought the old Articles of Confederation were fatally weak. Sure, they didn't want to set up a government that could throw people in jail without a good reason, or steal their property, or do away with free elections. The original Constitution prohibited oppressive practices, and the Bill of Rights added other restrictions.

But the document as a whole is much more concerned with what the government can do—not with what it can't. From the beginning it was empowered to levy taxes, to raise armies, to make war, to set the rules of commerce and to bind the nation through treaties and international agreements. There's no sign of the libertarian fairyland many on the far right have invented. Rather, the Constitution allowed for a government adequate to the challenges facing a modern nation.

In particular, the Constitution was not written to weaken an overreaching Congress but to strengthen an enfeebled one. The old Articles of Confederation had set up a Congress with the power only to beg states for money and recommend laws for them to enact. That didn't work; the country found itself headed for bankruptcy and disaster. To replace that old Congress, the Constitution created a bicameral Congress with a long and impressive list of textual powers. It also gives this Congress the power "to make all laws which shall be necessary and proper for carrying into execution," not only those specific powers but "all other powers vested by this Constitution in the government of the United States, or in any department or officer thereof."

That's a lot of power. And over the years, the government has sometimes needed it, to deal with civil war, economic calamity and internal disorder.

Another myth is that the Constitution was created to "protect" the states from federal power. Again, if that's true, it's not because of anything actually in the Constitution. The Constitution includes limits—but they are mostly limits on state governments and corresponding increases in federal power. The idea that states have rights, or that they are sovereign, appears nowhere in the original Constitution. And constitutional amendments have repeatedly imposed further limits on the states while granting more power to Congress.

One of the pet peeves of the right is the "intrusion" of ideas from international law into American law. Senators at the confirmation hearings for Justice Sonia Sotomayor demanded (and, regrettably, got) a promise that she would never rely on international law. A measure adopted by voters in Oklahoma in November forbids state courts from even looking to "the legal precepts of other nations or cultures" or "international law."

This is not a defense of the Constitution; it is a mutilation. The framers knew a great deal of international law. The document itself mentions many sources of international law: treaties (a major source of international law, they are part of "the supreme law of the land"); "the law of nations," which designates customary international law; and "admiralty and maritime jurisdiction," among others.

The most important truth about the Constitution is this: it was written as a set of rules by which living people could solve their own problems, not as a "dead hand" restricting their options. Strikingly, many important questions, from the nature of the Supreme Court to the composition of the cabinet, are left to Congress. There's ample evidence in the text that the framers didn't think of themselves as peering into the future and settling all questions; instead, they wrote a document that in essence says, "Work it out."



These conclusions come from a careful reading of the Constitution, not from some hazy idea of a "living Constitution." The "living Constitution" is a whipping boy of the right. Progressives supposedly believe, in Ron Paul's words, that "government may unilaterally change the terms of its contract with the American people." Right-wing historian Kevin Gutzman writes that Supreme Court justices use the "myth of a living Constitution" to "write their own views into law on some of the most contentious issues of our day."

This "debate" is a mystification. The far right views the Constitution as something like the "killing jar" scientists use to preserve butterflies, freezing the country under glass, preventing social change and stripping the democratic process of its effectiveness. The issue in constitutional interpretation is not whether the Constitution is a living document; it is whether the United States is a living nation.

That simple reality is often obscured by conservatives' claim that they, and only they, follow the framers' "original intent." Originalism, writes scholar David Forte in The Heritage Guide to the Constitution, "implies that those who make, interpret, and enforce the law ought to be guided by the meaning of the United States Constitution—the supreme law of the land—as it was originally written." Who could be against that? Nobody, Forte writes, except those who believe that the Constitution has "no fixed meaning."

This notion—that there is somehow a fixed, binding, single intent hidden in each phrase of the Constitution—confuses the Constitution with the Bible. The idea of a single, literal, intended meaning of a biblical text gained primacy during the Reformation. The religious historian Jaroslav Pelikan sees in early Protestant theology the origins of American constitutional discourse. Luther and the other Reformers believed that "Scripture had to be not interpreted but delivered from interpretations to speak for itself." What mattered to Luther was "the original intent and sensus literalis [literal meaning]" of the words of the Bible.

The general Protestant notion of "original intent" was elaborated a century ago, when a group of American evangelical Christians published a set of essays on "the fundamentals" of Christian belief. In large part, fundamentalism was a revolt against "higher criticism"—scholarship that studied the Bible like any other literary work in history. Rejecting this approach, fundamentalists believed that the Bible is the literal word of God; all parts of it are created directly by the breath of God into the human soul. The inspiration is not general but verbal—God has fixed not just the ideas in the Bible but the very words in which they were written. Thus every word has a fixed meaning, immune from question by history; and all the words fit together into one divine whole. This "true" meaning must be zealously guarded against corrupt worldly forces—the "higher critics"—seeking to contaminate it with modern, un-Christian ideas.

* * *

So influential has biblical fundamentalism been in this country that these attitudes are now cultural rather than specifically religious values. In fact, "originalists" have an enemy just as the fundamentalists did. Like the "higher critics," the supposed advocates of the "living Constitution" are smooth-talking "elite" deceivers who want to replace the good old Constitution with their personal views.

But that's one of the right's biggest lies. "We are all federalists, we are all republicans," Thomas Jefferson said in his first inaugural address. And we are all "originalists." But many constitutional interpreters find the "intent" of the framers and ratifiers of the Constitution in, well, what the Constitution says.

If the Constitution says that Congress has the power to regulate "commerce with foreign nations, and among the several states, and with the Indian tribes," we look around us and see what "commerce" today consists of. If the village "barber chirurgeon" has been replaced by a nationwide for-profit hospital chain and a system of group health insurance, then the power of Congress tracks that change. That's an act of interpretation, to be sure; but it's no more of one than the Da Vinci Code–style charade engaged in by many far-right "originalists."

At their baldest and strongest, originalists claim that the nation is bound by their own opinion of what was in the minds of the framers. For all their claims of superior virtue, "originalists" agree that what the framers said governs; they just want to control what counts as what the founders said.

Recognizing the problems inherent in the quest for "original intent," a number of originalists have moved on to what they call a quest for "original public meaning," or the "original understanding." That is, they say, we should consult history to determine what ordinary people in 1787 (or 1866, or whenever a specific provision was written) would have thought the words meant. Justice Scalia, for one, considers that inquiry pretty straightforward: "Often—I dare say usually—that is easy to discern and simple to apply." But as practiced by Scalia, that tends to reduce itself to, "Trust me, I knew the framers and here's what they would have said."

Consider Scalia's concurrence in Citizens United v. Federal Election Commission. In that case, the conservative majority gutted federal restrictions on expenditures by corporations during elections. In his dissent, Justice John Paul Stevens challenged the right on "originalist" grounds. During the founding period, he noted, most political thinkers distrusted the corporate form of organization. That might be true, Scalia replied, but only because in the eighteenth century corporations were associated with monopoly privileges: "Modern corporations do not have such privileges, and would probably have been favored by most of our enterprising Founders."

This dishonest contortion exemplifies the problems with "original meaning." Scalia is essentially saying, "They didn't really know what they thought; luckily, I do." If we adopt his definition of the founders' intent, then we the people lose all right to interpret or even really read our own Constitution. At best, we must accept the dictates of historians, who often disagree.

At worst, though, we are delivered into the hands of Justice Scalia and his ilk. Judges usually know very little about history. But an "originalist" like Scalia is utterly confident about his power to pluck the "easy, simple" meaning from the air. By a bizarre coincidence, the "easy, simple" meaning usually coincides with the program of the twenty-first-century judicial right.

Serious originalist scholarship is very useful as one way of learning more about the Constitution. But in the hands of judges like Scalia or demagogues like Glenn Beck, it is really a kind of intellectual weapon designed to hide from ordinary citizens what is in plain sight—the text of the Constitution and the present circumstances to which it must be applied.

That text, and those circumstances, are the tools we the people need in order to fight back. To save our Constitution, we have to read it. What's remarkable is how few people actually do this before proclaiming their opinions. God knows lawyers don't. In most law schools, constitutional law courses don't even begin with the text. Instead, on day one, students read the 1803 case of Marbury v. Madison. That's the case in which the Supreme Court for the first time announced the doctrine of "judicial review," which allows it to review state and federal laws and invalidate those that, in its judgment, don't comply with the Constitution. Marbury is a terrific case; but the doctrine it embodies isn't written in the Constitution. So at the very beginning of their study, most lawyers leave the text behind, and never return to it.

Ordinary citizens also resist reading the Constitution. They think it's dull. In 1987, the American novelist E.L. Doctorow found no poetry in it. "It is five thousand words long but reads like fifty thousand," he reported sadly. "It lacks high rhetoric and shows not a trace of wit, as you might expect, having been produced by a committee of lawyers. It uses none of the tropes of literature to create empathetic states in the mind of the reader."

Doctorow was wrong. The Constitution as a whole takes effort to read; but once one puts in the effort—several readings, all the way through, and some serious thought about what one has read—it reveals a surprising, indeed sometimes dazzling, array of meanings. By turns political, legal, epic and poetic, it shows us a number of strategies for dealing with contemporary challenges.

How do we read the Constitution then? A citizen who seeks to understand the Constitution should not assume that the answers lie in Supreme Court cases. For one thing, many important constitutional questions have never come before the Court. Some, indeed, can never be heard by any court—they constitute what judges and scholars call "political questions," which must be worked out by other branches of the government.

Second, the courts may get it wrong. In 1857, the Supreme Court announced that Americans of African descent were not and never could become citizens. A bitter political struggle, and an even more bitter Civil War, produced a national consensus that this decision was profoundly wrong even on the day it was announced. More recent decisions, from Roe v. Wade to Citizens United, have provoked profound criticism by political leaders and ordinary citizens. Citizens are not "wrong" because they disagree with the Court.

At its most basic level, reading the Constitution requires the tools that Vladimir Nabokov urged readers to bring to any text: imagination, memory, a dictionary and a willingness to use all three when the going gets tough.

Read the Constitution and measure it against the absurd claims we hear every day. This is a matter of life and death for our Republic. We won't find the Tea Party manifesto there; nor will we find the agenda of progressive advocacy groups. What we will find is a set of political tools and a language that fair-minded citizens, progressive or conservative, can use to talk through our disagreements.

Trapped in that ghastly church basement last year, I made a resolution that I would try to help rescue the Constitution from "constitutionalists." Here and now I say to Nation readers that if any group of citizens anywhere wants to meet in a church basement to discuss these issues, I will either go there to help or try to find someone who will. It's time for progressive constitutional scholars to stop mumbling about deconstruction and speak up for democracy.

Ordinary Americans love the Constitution at least as much as far-right ideologues. It's our Constitution too.

It's time to take it back.