Another Easter is coming to an end. If I could, I would have worn the white coat and pink carnation of my youth to church this morning. The blue blazer of my middle age had to suffice. What do we have to look forward to now? The Masters Tournament? Tiger is gonna win. No mystery there---boring! The Final Four ending of March Madness? Do you really care? I didn't think so. Me either. The ... end of another spring semester? Been there, done that. Wake me up when it's over. I'd rather have a conversation with a blood-spatter expert criminologist. Mother's Day? Gotta admit, one of the highlights of the year: an excuse for an expensive meal out. Memorial Day? Don't get overly excited thinking about it. It will wreck your health. Then summer. Whoop-de-do.
Sunday, March 31, 2013
Tracy Thompson - The New Mind of the South
The classic text about the "mind of the South" was written by W.J. Cash and published in 1941. Over the years I have read numerous references to this book. I have it on my shelf but have never read it. Now comes a journalist saying that she has written the sequel. I can't judge that claim, since I have not read Cash, so I take this book on its own terms. It's entertaining but hardly memorable. More later.
The author, who grew up in Atlanta, devotes a whole chapter to her native city. She presents a city that has forgotten its past in its pursuit of total commercialism. I have no use for Atlanta.
Her main point is that the South is changing because of increasing ethnic diversity. How will this change our beloved South? Only time will tell.
Her other main point is the rural poverty and the desertion of the rural South. Her case subject is Clarksdale, Mississippi. I did not know this city was in such dire straits, but I do know of the desertion of the rural South. I suspect this is irreversible.
Saturday, March 30, 2013
On the Road with Kerouac
Old habits can be hard to break. I am trying to quit coffee. Last night I dreamed I was drinking coffee in a coffeehouse in North Beach with Jack Kerouac. We had just come from the alley named for him by City Lights. Jack was typing on his manual typewriter as we talked. Dean was sitting there also, though he didn't say a word. Jack was saying something as he typed about having "nothing to offer anybody except my own confusion" when Truman Capote stuck his head in the door and asked in a big husky voice (go figure), "Would you boys like to go down the street for a drink?" He saw Jack typing and laughed so loud the building shook. "Go away, Truman," said Jack. Truman was still laughing as he walked away. Jack looked at me and said, "Happiness is in realizing that it is all a giant strange dream."
Coffee, anyone?
Wednesday, March 27, 2013
Hooray for Today!
You are right, Thomas Wolfe: you can't go home again. But then, who would WANT to go home again? Sure, it would be nice to see parents, family, and loved ones who have gone on, and I could go for that for as long as possible. But the rest of it? It would get old in a hurry. I am happy in 2013 despite all of the stress, anguish, and heartburn of today. Give me my bypass, cataract surgery, and scones today over what we had in 1966 any day!
Monday, March 25, 2013
William Landay - Defending Jacob
This crime novel is pretty good: it tells a good story and also raises the question of behavioral genetics. The day may come when people can be genetically mapped for their propensity for violence. This is sometimes called the "murder gene." Are some people genetically predisposed to violence? What should be done when this can be predicted?
A 14-year-old boy is murdered---stabbed---in a park on his way to school. A 14-year-old schoolmate is indicted for the crime. The boy charged with murder is the son of the assistant DA. The father refuses to consider that his son might be guilty. The mother slowly comes to think that maybe her son did commit this murder. The father, who is himself the son of a man serving life in prison for murder and who wonders if his patrimony includes a gene predicting violence in his offspring, is convinced that a local man named Patz committed the crime. The case goes to trial. The trial is halted and the son declared innocent when Patz commits suicide and leaves a note admitting his guilt. But that is not the end of the story: the conclusion is unexpected and dramatic.
Good story---thought-provoking---fun read.
Saturday, March 23, 2013
A Review of The New Mind of the South
by Jonathan Yardley
The mixture of pride and shame with which Southerners such as Faulkner and Cash have viewed the South has scarcely vanished as the region has undergone remarkable change since World War II. Thompson conveys some sense of that pride (in this case prideful self-delusion) in her conversations with members of the United Daughters of the Confederacy and its certifiably ridiculous “young person’s auxiliary called the Children of the Confederacy,” who to this day cling to the preposterous belief that “the South had not fought to preserve slavery, and that this false accusation was an effort to smear the reputation of the South’s gallant leaders.” UDC lobbyists and other Lost Cause proponents have had extraordinary influence on the cowardly textbook industry, which has caved in to their demands that the antebellum and Civil War periods be presented to high school students in the most favorable light, i.e., that “slavery was a benign institution” and that “the Civil War was fought over the issue of states’ rights.”
As Thompson says, there has long been a “Southern genius for living in an imagined past where racial tension was nonexistent, strangers would stop to help you if you had a flat tire, white people were sweet to black people and black people loved them right back, and everyone went to church on Sunday.” This “lack of historical awareness” is one of the basic characteristics Thompson attributes to Southerners, along with three others: “Southerners are conservative people,” they are notable for “sheer adaptability,” and they have “a certain lack of self-awareness.” All this is legitimate enough and in its essentials echoes much to be found in Cash, but the problem is that these essentially are characteristics of Southern-born and -raised whites, whereas in fact the South now is inhabited by large numbers of (a) people who have immigrated there from other parts of the country, (b) African Americans, who are returning to the South in great numbers, and (c) Latinos and others, Asians most particularly, who have come there from other countries.
Thompson is quick to acknowledge this but doesn’t seem to grasp that it is now just about impossible to generalize about Southerners and the South, because even though the traits mentioned above can still be found in many places and people, they can’t be found in all. That doesn’t keep her from trying. She says that “the South is finally disentangling itself from the Confederacy,” that “being a twentieth-century Southerner means coming to a better appreciation of the tensions created by . . . our dual identities as Americans and as Southerners,” that the South “exemplifies the progress that has been made on race, and the distance there still is to go,” and that it retains its old “sense of community” but far more inclusively than in the past.
True enough, but what it says to me is that the South is becoming not more Southern but more American. Yes, many of its old folkways live on, the good ones and the bad ones, and the Southern accent persists doughtily (and bravely) in this age of homogenization, and it’s still a whole lot hotter down there than in the iron New England dark, but these carry ever less weight as external influences — mainly all those outsiders who have moved in — leave the South no choice except to change. She’s quite right that in some respects Atlanta is “a very Southern city . . . in its inferiority complex, in its defensive need to be validated as a ‘world-class’ city, Southern in its reflexive need to sugarcoat racial realities, Southern in its resilience and adaptability in the face of calamity” — but in most other respects it’s pretty much indistinguishable from Houston or Kansas City.
My own Southern connections are scarcely as deep as Thompson’s, yet for about a quarter-century I was an adopted Southerner and happy to be one, as well as an ardent reader and booster of Southern literature. But it’s people like me — people who came to the South from elsewhere, who adopted some of its customs and byways but who also carried in their carpetbags their own beliefs and economic practices — who have had much to do with making the South a quite different place. It’s as much a definable region now as it ever was, but it’s much less a state of mind.
THE NEW MIND OF THE SOUTH
By Tracy Thompson
Simon & Schuster. 263 pp. $26
Nagel the Heretic
Who is Thomas Nagel and why are so many of his fellow academics condemning him?
Mar 25, 2013, Vol. 18, No. 27 • By ANDREW FERGUSON
Last fall, a few days before Halloween and about a month after the publication of Mind and Cosmos, the controversial new book by the philosopher Thomas Nagel, several of the world’s leading philosophers gathered with a group of cutting-edge scientists in the conference room of a charming inn in the Berkshires. They faced one another around a big table set with pitchers of iced water and trays of hard candies wrapped in cellophane and talked and talked, as public intellectuals do. PowerPoint was often brought into play.
The title of the “interdisciplinary workshop” was “Moving Naturalism Forward.” For those of us who like to kill time sitting around pondering the nature of reality—personhood, God, moral judgment, free will, what have you—this was the Concert for Bangladesh. The biologist Richard Dawkins was there, author of The Blind Watchmaker, The Selfish Gene, and other bestselling books of popular science, and so was Daniel Dennett, a philosopher at Tufts and author of Consciousness Explained and Darwin’s Dangerous Idea: Evolution and the Meanings of Life. So were the authors of Why Evolution Is True, The Really Hard Problem: Meaning in a Material World, Everything Must Go: Metaphysics Naturalized, and The Atheist’s Guide to Reality: Enjoying Life without Illusions—all of them books that to one degree or another bring to a larger audience the world as scientists have discovered it to be.
Contemporary philosophers have a name for the way you and I see the world, a world filled with other people, with colors and sounds, sights and sensations, things that are good and things that are bad and things that are very good indeed: ourselves, who are able, more or less, to make our own way through life, by our own lights. Philosophers call this common view the “manifest image.” Daniel Dennett pointed out at the conference that modern science, at least since the revelations of Darwin, has been piling up proof that the manifest image is not really accurate in any scientific sense. Rather science—this vast interlocking combine of genetics, neuroscience, evolutionary biology, particle physics—tells us that the components of the manifest image are illusory.
Color, for instance: That azalea outside the window may look red to you, but in reality it has no color at all. The red comes from certain properties of the azalea that absorb some kinds of light and reflect other kinds of light, which are then received by the eye and transformed in our brains into a subjective experience of red. And sounds, too: Complex vibrations in the air are soundless in reality, but our ears are able to turn the vibrations into a car alarm or a cat’s meow or, worse, the voice of Mariah Carey. These capacities of the human organism are evolutionary adaptations. Everything about human beings, by definition, is an evolutionary adaptation. Our sense that the colors and sounds exist “out there” and not merely in our brain is a convenient illusion that long ago increased the survival chances of our species. Powered by Darwin, modern science proceeds, in Dennett’s phrase, as a “universal corrosive,” destroying illusions all the way up and all the way down, dismantling our feelings of freedom and separate selfhood, our morals and beliefs, a mother’s love and a patient’s prayer: All in reality are just “molecules in motion.”
The most famous, most succinct, and most pitiless summary of the manifest image’s fraudulence was written nearly 20 years ago by the geneticist Francis Crick: “ ‘You,’ your joys and your sorrows, your memories and your ambitions, your sense of personal identity and free will, are in fact no more than the behavior of a vast assembly of nerve cells and their associated molecules. Who you are is nothing but a pack of neurons.”
This view is the “naturalism” that the workshoppers in the Berkshires were trying to move forward. Naturalism is also called “materialism,” the view that only matter exists; or “reductionism,” the view that all life, from tables to daydreams, is ultimately reducible to pure physics; or “determinism,” the view that every phenomenon, including our own actions, is determined by a preexisting cause, which was itself determined by another cause, and so on back to the Big Bang. The naturalistic project has been greatly aided by neo-Darwinism, the application of Darwin’s theory of natural selection to human behavior, including areas of life once assumed to be nonmaterial: emotions and thoughts and habits and perceptions. At the workshop the philosophers and scientists each added his own gloss to neo-Darwinian reductive naturalism or materialistic neo-Darwinian reductionism or naturalistic materialism or reductive determinism. They were unanimous in their solid certainty that materialism—as we’ll call it here, to limit the number of isms—is the all-purpose explanation for life as we know it.
One notable division did arise among the participants, however. Some of the biologists thought the materialist view of the world should be taught and explained to the wider public in its true, high-octane, Crickian form. Then common, nonintellectual people might see that a purely random universe without purpose or free will or spiritual life of any kind isn’t as bad as some superstitious people—religious people—have led them to believe.
Daniel Dennett took a different view. While it is true that materialism tells us a human being is nothing more than a “moist robot”—a phrase Dennett took from a Dilbert comic—we run a risk when we let this cat, or robot, out of the bag. If we repeatedly tell folks that their sense of free will or belief in objective morality is essentially an illusion, such knowledge has the potential to undermine civilization itself, Dennett believes. Civil order requires the general acceptance of personal responsibility, which is closely linked to the notion of free will. Better, said Dennett, if the public were told that “for general purposes” the self and free will and objective morality do indeed exist—that colors and sounds exist, too—“just not in the way they think.” They “exist in a special way,” which is to say, ultimately, not at all.
On this point the discussion grew testy at times. I was reminded of the debate among British censors over the publication of Lady Chatterley’s Lover half a century ago. “Fine for you or me,” one prosecutor is said to have remarked, “but is this the sort of thing you would leave lying about for your wife or servant to read?”
There was little else to disturb the materialists in their Berkshire contentment. Surveys have shown that vast majorities of philosophers and scientists call themselves naturalists or materialists. Nearly all popular science books, not only those written by the workshoppers, conclude that materialism offers the true picture of reality. The workshoppers seemed vexed, however, knowing that not everyone in their intellectual class had yet tumbled to the truth of neo-Darwinism. A video of the workshop shows Dennett complaining that a few—but only a few!—contemporary philosophers have stubbornly refused to incorporate the naturalistic conclusions of science into their philosophizing, continuing to play around with outmoded ideas like morality and sometimes even the soul.
“I am just appalled to see how, in spite of what I think is the progress we’ve made in the last 25 years, there’s this sort of retrograde gang,” he said, dropping his hands on the table. “They’re going back to old-fashioned armchair philosophy with relish and eagerness. It’s sickening. And they lure in other people. And their work isn’t worth anything—it’s cute and it’s clever and it’s not worth a damn.”
There was an air of amused exasperation. “Will you name names?” one of the participants prodded, joking.
“No names!” Dennett said.
The philosopher Alex Rosenberg, author of The Atheist’s Guide, leaned forward, unamused.
“And then there’s some work that is neither cute nor clever,” he said. “And it’s by Tom Nagel.”
There it was! Tom Nagel, whose Mind and Cosmos was already causing a derangement among philosophers in England and America.
Dennett sighed at the mention of the name, more in sorrow than in anger. His disgust seemed to drain from him, replaced by resignation. He looked at the table.
“Yes,” said Dennett, “there is that.”
Around the table, with the PowerPoint humming, they all seemed to heave a sad sigh—a deep, workshop sigh.
Tom, oh Tom . . . How did we lose Tom . . .
Thomas Nagel may be the most famous philosopher in the United States—a bit like being the best power forward in the Lullaby League, but still. His paper “What Is It Like to Be a Bat?” was recognized as a classic when it was published in 1974. Today it is a staple of undergraduate philosophy classes. His books range with a light touch over ethics and politics and the philosophy of mind. His papers are admired not only for their philosophical provocations but also for their rare (among modern philosophers) simplicity and stylistic clarity, bordering sometimes on literary grace.
Nagel occupies an endowed chair at NYU as a University Professor, a rare and exalted position that frees him to teach whatever course he wants. Before coming to NYU he taught at Princeton for 15 years. He dabbles in the higher journalism, contributing articles frequently to the New York Review of Books and now and then to the New Republic. A confirmed atheist, he lacks what he calls the sensus divinitatis that leads some people to embrace the numinous. But he does possess a finely tuned sensus socialistis; his most notable excursion into politics was a book-length plea for the confiscation of wealth and its radical redistribution—a view that places him safely in the narrow strip of respectable political opinion among successful American academics.
For all this and more, Thomas Nagel is a prominent and heretofore respected member of the country’s intellectual elite. And such men are not supposed to write books with subtitles like the one he tacked onto Mind and Cosmos: Why the Materialist Neo-Darwinian Conception of Nature Is Almost Certainly False.
Imagine if your local archbishop climbed into the pulpit and started reading from the Collected Works of Friedrich Nietzsche. “What has gotten into Thomas Nagel?” demanded the evolutionary psychologist Steven Pinker, on Twitter. (Yes, even Steven Pinker tweets.) Pinker inserted a link to a negative review of Nagel’s book, which he said “exposed the shoddy reasoning of a once-great thinker.” At the point where science, philosophy, and public discussion intersect—a dangerous intersection these days—it is simply taken for granted that by attacking naturalism Thomas Nagel has rendered himself an embarrassment to his colleagues and a traitor to his class.
The Guardian awarded Mind and Cosmos its prize for the Most Despised Science Book of 2012. The reviews were numerous and overwhelmingly negative; one of the kindest, in the British magazine Prospect, carried the defensive headline “Thomas Nagel is not crazy.” (Really, he’s not!) Most other reviewers weren’t so sure about that. Almost before the ink was dry on Nagel’s book the UC Berkeley economist and prominent blogger Brad DeLong could be found gathering the straw and wood for the ritual burning. DeLong is a great believer in neo-Darwinism. He has coined the popular term “jumped-up monkeys” to describe our species. (Monkeys because we’re descended from primates; jumped-up because evolution has customized us with the ability to reason and the big brains that go with it.)
DeLong was particularly offended by Nagel’s conviction that reason allows us to “grasp objective reality.” A good materialist doesn’t believe in objective reality, certainly not in the traditional sense. “Thomas Nagel is not smarter than we are,” he wrote, responding to a reviewer who praised Nagel’s intelligence. “In fact, he seems to me to be distinctly dumber than anybody who is running even an eight-bit virtual David Hume on his wetware.” (What he means is, anybody who’s read the work of David Hume, the father of modern materialism.) DeLong’s readers gathered to jeer as the faggots were placed around the stake.
“Thomas Nagel is of absolutely no importance on this subject,” wrote one. “He’s a self-contradictory idiot,” opined another. Some made simple appeals to authority and left it at that: “Haven’t these guys ever heard of Richard Dawkins and Daniel Dennett?” The hearts of still others were broken at seeing a man of Nagel’s eminence sink so low. “It is sad that Nagel, whom my friends and I thought back in the 1960’s could leap over tall buildings with a single bound, has tripped over the Bible and fallen on his face. Very sad.”
Nagel doesn’t mention the Bible in his new book—or in any of his books, from what I can tell—but among materialists the mere association of a thinking person with the Bible is an insult meant to wound, as Bertie Wooster would say. Directed at Nagel, a self-declared atheist, it is more revealing of the accuser than the accused. The hysterical insults were accompanied by an insistence that the book was so bad it shouldn’t upset anyone.
“Evolutionists,” one reviewer huffily wrote, “will feel they’ve been ravaged by a sheep.” Many reviewers attacked the book on cultural as well as philosophical or scientific grounds, wondering aloud how a distinguished house like Oxford University Press could allow such a book to be published. The Philosophers’ Magazine described it with the curious word “irresponsible.” How so? In Notre Dame Philosophical Reviews, the British philosopher John Dupré explained. Mind and Cosmos, he wrote, “will certainly lend comfort (and sell a lot of copies) to the religious enemies of Darwinism.” Simon Blackburn of Cambridge University made the same point: “I regret the appearance of this book. It will only bring comfort to creationists and fans of ‘intelligent design.’ ”
But what about fans of apostasy? You don’t have to be a biblical fundamentalist or a young-earth creationist or an intelligent design enthusiast—I’m none of the above, for what it’s worth—to find Mind and Cosmos exhilarating. “For a long time I have found the materialist account of how we and our fellow organisms came to exist hard to believe,” Nagel writes. “It is prima facie highly implausible that life as we know it is the result of a sequence of physical accidents together with the mechanism of natural selection.” The prima facie impression, reinforced by common sense, should carry more weight than the clerisy gives it. “I would like to defend the untutored reaction of incredulity to the reductionist neo-Darwinian account of the origin and evolution of life.”
The incredulity is not simply a matter of scientific ignorance, as the materialists would have it. It arises from something more fundamental and intimate. The neo-Darwinian materialist account offers a picture of the world that is unrecognizable to us—a world without color or sound, and also a world without free will or consciousness or good and evil or selves or, when it comes to that, selflessness. “It flies in the face of common sense,” he says. Materialism is an explanation for a world we don’t live in.
Nagel’s tone is measured and tentative, but there’s no disguising the book’s renegade quality. There are flashes of exasperation and dismissive impatience. What’s exhilarating is that the source of Nagel’s exasperation is, so to speak, his own tribe: the “secular theoretical establishment and the contemporary enlightened culture which it dominates.” The establishment today, he says, is devoted beyond all reason to a “dominant scientific naturalism, heavily dependent on Darwinian explanations of practically everything, and armed to the teeth against attacks from religion.” I’m sure Nagel would recoil at the phrase, but Mind and Cosmos is a work of philosophical populism, defending our everyday understanding from the highly implausible worldview of a secular clerisy. His working assumption is, in today’s intellectual climate, radical: If the materialist, neo-Darwinian orthodoxy contradicts common sense, then this is a mark against the orthodoxy, not against common sense. When a chain of reasoning leads us to deny the obvious, we should double-check the chain of reasoning before we give up on the obvious.
Nagel follows the materialist chain of reasoning all the way into the cul de sac where it inevitably winds up. Nagel’s touchier critics have accused him of launching an assault on science, when really it is an assault on the nonscientific uses to which materialism has been put. Though he does praise intelligent design advocates for having the nerve to annoy the secular establishment, he’s no creationist himself. He has no doubt that “we are products of the long history of the universe since the big bang, descended from bacteria through millions of years of natural selection.” And he assumes that the self and the body go together. “So far as we can tell,” he writes, “our mental lives, including our subjective experiences, and those of other creatures are strongly connected with and probably strictly dependent on physical events in our brains and on the physical interaction of our bodies with the rest of the physical world.” To believe otherwise is to believe, as the materialists derisively say, in “spooky stuff.” (Along with jumped-up monkeys and moist robots and countless other much-too-cute phrases, the use of spooky stuff proves that our popular science writers have spent a lot of time watching Scooby-Doo.) Nagel doesn’t believe in spooky stuff.
Materialism, then, is fine as far as it goes. It just doesn’t go as far as materialists want it to. It is a premise of science, not a finding. Scientists do their work by assuming that every phenomenon can be reduced to a material, mechanistic cause and by excluding any possibility of nonmaterial explanations. And the materialist assumption works really, really well—in detecting and quantifying things that have a material or mechanistic explanation. Materialism has allowed us to predict and control what happens in nature with astonishing success. The jaw-dropping edifice of modern science, from space probes to nanosurgery, is the result.
But the success has gone to the materialists’ heads. From a fruitful method, materialism becomes an axiom: If science can’t quantify something, it doesn’t exist, and so the subjective, unquantifiable, immaterial “manifest image” of our mental life is proved to be an illusion.
Here materialism bumps up against itself. Nagel insists that we know some things to exist even if materialism omits or ignores or is oblivious to them. Reductive materialism doesn’t account for the “brute facts” of existence—it doesn’t explain, for example, why the world exists at all, or how life arose from nonlife. Closer to home, it doesn’t plausibly explain the fundamental beliefs we rely on as we go about our everyday business: the truth of our subjective experience, our ability to reason, our capacity to recognize that some acts are virtuous and others aren’t. These failures, Nagel says, aren’t just temporary gaps in our knowledge, waiting to be filled in by new discoveries in science. On its own terms, materialism cannot account for brute facts. Brute facts are irreducible, and materialism, which operates by breaking things down to their physical components, stands useless before them. “There is little or no possibility,” he writes, “that these facts depend on nothing but the laws of physics.”
In a dazzling six-part tour de force rebutting Nagel’s critics, the philosopher Edward Feser provided a good analogy to describe the basic materialist error—the attempt to stretch materialism from a working assumption into a comprehensive explanation of the world. Feser suggests a parody of materialist reasoning: “1. Metal detectors have had far greater success in finding coins and other metallic objects in more places than any other method has. 2. Therefore we have good reason to think that metal detectors can reveal to us everything that can be revealed” about metallic objects.
But of course a metal detector only detects the metallic content of an object; it tells us nothing about its color, size, weight, or shape. In the same way, Feser writes, the methods of “mechanistic science are as successful as they are in predicting and controlling natural phenomena precisely because they focus on only those aspects of nature susceptible to prediction and control.”
Meanwhile, they ignore everything else. But this is a fatal weakness for a theory that aspires to be a comprehensive picture of the world. With magnetic resonance imaging, science can tell us which parts of my brain light up when, for example, I glimpse my daughter’s face in a crowd; the bouncing neurons can be observed and measured. Science cannot quantify or describe the feelings I experience when I see my daughter. Yet the feelings are no less real than the neurons.
The point sounds more sentimental than it is. My bouncing neurons and my feelings of love and obligation are unquestionably bound together. But the difference between the neurons and the feelings, the material and the mental, is a qualitative difference, a difference in kind. And of the two, reductive materialism can capture only one.
“The world is an astonishing place,” Nagel writes. “That it has produced you, and me, and the rest of us is the most astonishing thing about it.” Materialists are in the business of banishing astonishment; they want to demystify the world and human beings along with it, to show that everything we see as a mystery is reducible to components that aren’t mysterious at all. And they cling to this ambition even in cases where doing so is obviously fruitless. Neo-Darwinism insists that every phenomenon, every species, every trait of every species, is the consequence of random chance, as natural selection requires. And yet, Nagel says, “certain things are so remarkable that they have to be explained as non-accidental if we are to pretend to a real understanding of the world.” (The italics are mine.)
Among these remarkable, nonaccidental things are many of the features of the manifest image. Consciousness itself, for example: You can’t explain consciousness in evolutionary terms, Nagel says, without undermining the explanation itself. Evolution easily accounts for rudimentary kinds of awareness. Hundreds of thousands of years ago on the African savannah, where the earliest humans evolved the unique characteristics of our species, the ability to sense danger or to read signals from a potential mate would clearly help an organism survive.
So far, so good. But the human brain can do much more than this. It can perform calculus, hypothesize metaphysics, compose music—even develop a theory of evolution. None of these higher capacities has any evident survival value, certainly not hundreds of thousands of years ago when the chief aim of mental life was to avoid getting eaten. Could our brain have developed and sustained such nonadaptive abilities by the trial and error of natural selection, as neo-Darwinism insists? It’s possible, but the odds, Nagel says, are “vanishingly small.” If Nagel is right, the materialist is in a pickle. The conscious brain that is able to come up with neo-Darwinism as a universal explanation simultaneously makes neo-Darwinism, as a universal explanation, exceedingly unlikely.
A similar argument holds for our other cognitive capacities. “The evolution story leaves the authority of reason in a much weaker position,” he writes. Neo-Darwinism tells us that we have the power of reason because reason was adaptive; it must have helped us survive, back in the day. Yet reason often conflicts with our intuition or our emotion—capacities that must also have been adaptive and essential for survival. Why should we “privilege” one capacity over another when reason and intuition conflict? On its own terms, the scheme of neo-Darwinism gives us no standard by which we should choose one adaptive capacity over the other. And yet neo-Darwinists insist we embrace neo-Darwinism because it conforms to our reason, even though it runs against our intuition. Their defense of reason is unreasonable.
So too our moral sense. We all of us have confidence, to one degree or another, that “our moral judgments are objectively valid”—that is, while our individual judgments might be right or wrong, what makes them right or wrong is real, not simply fantasy or opinion. Two and two really do make four. Why is this confidence inherent in our species? How was it adaptive? Neo-Darwinian materialists tell us that morality evolved as a survival mechanism (like everything else): We developed an instinct for behavior that would help us survive, and we called this behavior good as a means of reinforcing it. We did the reverse for behavior that would hurt our chances for survival: We called it bad. Neither type of behavior was good or bad in reality; such moral judgments are just useful tricks human beings have learned to play on ourselves.
Yet Nagel points out that our moral sense, even at the most basic level, developed a complexity far beyond anything needed for survival, even on the savannah—even in Manhattan. We are, as Nagel writes, “beings capable of thinking successfully about good and bad, right and wrong, and discovering moral and evaluative truths that do not depend on [our] own beliefs.” And we behave accordingly, or try to. The odds that such a multilayered but nonadaptive capacity should become a characteristic of the species through natural selection are, again, implausibly long.
Nagel’s reliance on “common sense” has roused in his critics a special contempt. One scientist, writing in the Huffington Post, calls it Nagel’s “argument from ignorance.” In the Nation, the philosophers Brian Leiter and Michael Weisberg could only shake their heads at the once-great philosopher’s retrogression from sophisticated thinking to common sense.
“This style of argument,” they write, “does not, alas, have a promising history.” Once upon a time, after all, our common-sense intuitions told us the sun traveled across the sky over a flat earth. Materialistic science has since taught us otherwise.
Not all intuitions are of the same kind, though. It is one thing for me to be mistaken in my intuition about the shape of the planet; it’s another thing to be mistaken about whether I exist, or whether truth and falsehood exist independently of my say-so, or whether my “self” has some degree of control over my actions. Indeed, a person couldn’t correct his mistaken intuitions unless these intuitions were correct—unless he was a rational self capable of distinguishing the true from the false and choosing one over the other. And it is the materialist attack on those intuitions—“common sense”—that Nagel finds absurd.
Leiter and Weisberg, like most of his other critics, were also agog that Nagel has the nerve to pronounce on matters that they consider purely scientific, far beyond his professional range. A philosopher doubting a scientist is a rare sight nowadays. With the general decline of the humanities and the success of the physical sciences, the relationship of scientists to philosophers of science has been reversed. As recently as the middle of the last century, philosophers like Bertrand Russell and A. J. Ayer might feel free to explain to scientists the philosophical implications of what they were doing. Today the power is all on the side of the scientists: One false move and it’s back to your sandbox, philosophy boy.
And so some philosophers have retreated into the same sort of hyperspecialization that has rendered scientists from different subdisciplines practically incapable of communicating with each other. Now these philosophers, practicing what they call “experimental philosophy,” can pride themselves on being just as incomprehensible as scientists. Other philosophers, like Dennett, have turned their field into a handmaiden of science: meekly and gratefully accepting whatever findings the scientists come up with—from brain scans to the Higgs boson—which they then use to demonstrate the superiority of hardheaded science to the airy musings of old-fashioned “armchair philosophy.”
In this sense too Nagel is a throwback, daring not only to interpret science but to contradict scientists. He admits it’s “strange” when he relies “on a philosophical claim to refute a scientific theory supported by empirical evidence.” But he knows that when it comes to cosmology, scientists are just as likely to make an error of philosophy as philosophers are to make an error of science. And Nagel is accused of making large errors indeed. According to Leiter and Weisberg and the others, he is ignorant of how science is actually done these days.
Nagel, say Leiter and Weisberg, overestimates the importance of materialism, even as a scientific method. He’s attacking a straw man. He writes as though “reductive materialism really were driving the scientific community.” In truth, they say, most scientists reject theoretical reductionism. Fifty years ago, many philosophers and scientists might have believed that all the sciences were ultimately reducible to physics, but modern science doesn’t work that way. Psychologists, for example, aren’t trying to reduce psychology to biology; and biologists don’t want to boil biology down to chemistry; and chemists don’t want to reduce chemistry to physics. Indeed, an evolutionary biologist—even one who’s a good materialist—won’t refer to physics at all in the course of his work!
Mar 25, 2013, Vol. 18, No. 27 • By ANDREW FERGUSON
Last fall, a few days before Halloween and about a month after the publication of Mind and Cosmos, the controversial new book by the philosopher Thomas Nagel, several of the world’s leading philosophers gathered with a group of cutting-edge scientists in the conference room of a charming inn in the Berkshires. They faced one another around a big table set with pitchers of iced water and trays of hard candies wrapped in cellophane and talked and talked, as public intellectuals do. PowerPoint was often brought into play.
The title of the “interdisciplinary workshop” was “Moving Naturalism Forward.” For those of us who like to kill time sitting around pondering the nature of reality—personhood, God, moral judgment, free will, what have you—this was the Concert for Bangladesh. The biologist Richard Dawkins was there, author of The Blind Watchmaker, The Selfish Gene, and other bestselling books of popular science, and so was Daniel Dennett, a philosopher at Tufts and author of Consciousness Explained and Darwin’s Dangerous Idea: Evolution and the Meanings of Life. So were the authors of Why Evolution Is True, The Really Hard Problem: Meaning in a Material World, Everything Must Go: Metaphysics Naturalized, and The Atheist’s Guide to Reality: Enjoying Life without Illusions—all of them books that to one degree or another bring to a larger audience the world as scientists have discovered it to be.
Contemporary philosophers have a name for the way you and I see the world, a world filled with other people, with colors and sounds, sights and sensations, things that are good and things that are bad and things that are very good indeed: ourselves, who are able, more or less, to make our own way through life, by our own lights. Philosophers call this common view the “manifest image.” Daniel Dennett pointed out at the conference that modern science, at least since the revelations of Darwin, has been piling up proof that the manifest image is not really accurate in any scientific sense. Rather science—this vast interlocking combine of genetics, neuroscience, evolutionary biology, particle physics—tells us that the components of the manifest image are illusory.
Color, for instance: That azalea outside the window may look red to you, but in reality it has no color at all. The red comes from certain properties of the azalea that absorb some kinds of light and reflect other kinds of light, which are then received by the eye and transformed in our brains into a subjective experience of red. And sounds, too: Complex vibrations in the air are soundless in reality, but our ears are able to turn the vibrations into a car alarm or a cat’s meow or, worse, the voice of Mariah Carey. These capacities of the human organism are evolutionary adaptations. Everything about human beings, by definition, is an evolutionary adaptation. Our sense that the colors and sounds exist “out there” and not merely in our brain is a convenient illusion that long ago increased the survival chances of our species. Powered by Darwin, modern science proceeds, in Dennett’s phrase, as a “universal acid,” destroying illusions all the way up and all the way down, dismantling our feelings of freedom and separate selfhood, our morals and beliefs, a mother’s love and a patient’s prayer: All in reality are just “molecules in motion.”
The most famous, most succinct, and most pitiless summary of the manifest image’s fraudulence was written nearly 20 years ago by the geneticist Francis Crick: “ ‘You,’ your joys and your sorrows, your memories and your ambitions, your sense of personal identity and free will, are in fact no more than the behavior of a vast assembly of nerve cells and their associated molecules. Who you are is nothing but a pack of neurons.”
This view is the “naturalism” that the workshoppers in the Berkshires were trying to move forward. Naturalism is also called “materialism,” the view that only matter exists; or “reductionism,” the view that all life, from tables to daydreams, is ultimately reducible to pure physics; or “determinism,” the view that every phenomenon, including our own actions, is determined by a preexisting cause, which was itself determined by another cause, and so on back to the Big Bang. The naturalistic project has been greatly aided by neo-Darwinism, the application of Darwin’s theory of natural selection to human behavior, including areas of life once assumed to be nonmaterial: emotions and thoughts and habits and perceptions. At the workshop the philosophers and scientists each added his own gloss to neo-Darwinian reductive naturalism or materialistic neo-Darwinian reductionism or naturalistic materialism or reductive determinism. They were unanimous in their solid certainty that materialism—as we’ll call it here, to limit the number of isms—is the all-purpose explanation for life as we know it.
One notable division did arise among the participants, however. Some of the biologists thought the materialist view of the world should be taught and explained to the wider public in its true, high-octane, Crickian form. Then common, nonintellectual people might see that a purely random universe without purpose or free will or spiritual life of any kind isn’t as bad as some superstitious people—religious people—have led them to believe.
Daniel Dennett took a different view. While it is true that materialism tells us a human being is nothing more than a “moist robot”—a phrase Dennett took from a Dilbert comic—we run a risk when we let this cat, or robot, out of the bag. If we repeatedly tell folks that their sense of free will or belief in objective morality is essentially an illusion, such knowledge has the potential to undermine civilization itself, Dennett believes. Civil order requires the general acceptance of personal responsibility, which is closely linked to the notion of free will. Better, said Dennett, if the public were told that “for general purposes” the self and free will and objective morality do indeed exist—that colors and sounds exist, too—“just not in the way they think.” They “exist in a special way,” which is to say, ultimately, not at all.
On this point the discussion grew testy at times. I was reminded of the debate among British censors over the publication of Lady Chatterley’s Lover half a century ago. “Fine for you or me,” one prosecutor is said to have remarked, “but is this the sort of thing you would leave lying about for your wife or servant to read?”
There was little else to disturb the materialists in their Berkshire contentment. Surveys have shown that vast majorities of philosophers and scientists call themselves naturalists or materialists. Nearly all popular science books, not only those written by the workshoppers, conclude that materialism offers the true picture of reality. The workshoppers seemed vexed, however, knowing that not everyone in their intellectual class had yet tumbled to the truth of neo-Darwinism. A video of the workshop shows Dennett complaining that a few—but only a few!—contemporary philosophers have stubbornly refused to incorporate the naturalistic conclusions of science into their philosophizing, continuing to play around with outmoded ideas like morality and sometimes even the soul.
“I am just appalled to see how, in spite of what I think is the progress we’ve made in the last 25 years, there’s this sort of retrograde gang,” he said, dropping his hands on the table. “They’re going back to old-fashioned armchair philosophy with relish and eagerness. It’s sickening. And they lure in other people. And their work isn’t worth anything—it’s cute and it’s clever and it’s not worth a damn.”
There was an air of amused exasperation. “Will you name names?” one of the participants prodded, joking.
“No names!” Dennett said.
The philosopher Alex Rosenberg, author of The Atheist’s Guide, leaned forward, unamused.
“And then there’s some work that is neither cute nor clever,” he said. “And it’s by Tom Nagel.”
There it was! Tom Nagel, whose Mind and Cosmos was already causing a derangement among philosophers in England and America.
Dennett sighed at the mention of the name, more in sorrow than in anger. His disgust seemed to drain from him, replaced by resignation. He looked at the table.
“Yes,” said Dennett, “there is that.”
Around the table, with the PowerPoint humming, they all seemed to heave a sad sigh—a deep, workshop sigh.
Tom, oh Tom . . . How did we lose Tom . . .
Thomas Nagel may be the most famous philosopher in the United States—a bit like being the best power forward in the Lullaby League, but still. His paper “What Is It Like to Be a Bat?” was recognized as a classic when it was published in 1974. Today it is a staple of undergraduate philosophy classes. His books range with a light touch over ethics and politics and the philosophy of mind. His papers are admired not only for their philosophical provocations but also for their rare (among modern philosophers) simplicity and stylistic clarity, bordering sometimes on literary grace.
Nagel occupies an endowed chair at NYU as a University Professor, a rare and exalted position that frees him to teach whatever course he wants. Before coming to NYU he taught at Princeton for 15 years. He dabbles in the higher journalism, contributing articles frequently to the New York Review of Books and now and then to the New Republic. A confirmed atheist, he lacks what he calls the sensus divinitatis that leads some people to embrace the numinous. But he does possess a finely tuned sensus socialistis; his most notable excursion into politics was a book-length plea for the confiscation of wealth and its radical redistribution—a view that places him safely in the narrow strip of respectable political opinion among successful American academics.
For all this and more, Thomas Nagel is a prominent and heretofore respected member of the country’s intellectual elite. And such men are not supposed to write books with subtitles like the one he tacked onto Mind and Cosmos: Why the Materialist Neo-Darwinian Conception of Nature Is Almost Certainly False.
Imagine if your local archbishop climbed into the pulpit and started reading from the Collected Works of Friedrich Nietzsche. “What has gotten into Thomas Nagel?” demanded the evolutionary psychologist Steven Pinker, on Twitter. (Yes, even Steven Pinker tweets.) Pinker inserted a link to a negative review of Nagel’s book, which he said “exposed the shoddy reasoning of a once-great thinker.” At the point where science, philosophy, and public discussion intersect—a dangerous intersection these days—it is simply taken for granted that by attacking naturalism Thomas Nagel has rendered himself an embarrassment to his colleagues and a traitor to his class.
The Guardian awarded Mind and Cosmos its prize for the Most Despised Science Book of 2012. The reviews were numerous and overwhelmingly negative; one of the kindest, in the British magazine Prospect, carried the defensive headline “Thomas Nagel is not crazy.” (Really, he’s not!) Most other reviewers weren’t so sure about that. Almost before the ink was dry on Nagel’s book the UC Berkeley economist and prominent blogger Brad DeLong could be found gathering the straw and wood for the ritual burning. DeLong is a great believer in neo-Darwinism. He has coined the popular term “jumped-up monkeys” to describe our species. (Monkeys because we’re descended from primates; jumped-up because evolution has customized us with the ability to reason and the big brains that go with it.)
DeLong was particularly offended by Nagel’s conviction that reason allows us to “grasp objective reality.” A good materialist doesn’t believe in objective reality, certainly not in the traditional sense. “Thomas Nagel is not smarter than we are,” he wrote, responding to a reviewer who praised Nagel’s intelligence. “In fact, he seems to me to be distinctly dumber than anybody who is running even an eight-bit virtual David Hume on his wetware.” (What he means is, anybody who’s read the work of David Hume, the father of modern materialism.) DeLong’s readers gathered to jeer as the faggots were placed around the stake.
“Thomas Nagel is of absolutely no importance on this subject,” wrote one. “He’s a self-contradictory idiot,” opined another. Some made simple appeals to authority and left it at that: “Haven’t these guys ever heard of Richard Dawkins and Daniel Dennett?” The hearts of still others were broken at seeing a man of Nagel’s eminence sink so low. “It is sad that Nagel, whom my friends and I thought back in the 1960’s could leap over tall buildings with a single bound, has tripped over the Bible and fallen on his face. Very sad.”
Nagel doesn’t mention the Bible in his new book—or in any of his books, from what I can tell—but among materialists the mere association of a thinking person with the Bible is an insult meant to wound, as Bertie Wooster would say. Directed at Nagel, a self-declared atheist, it is more revealing of the accuser than the accused. The hysterical insults were accompanied by an insistence that the book was so bad it shouldn’t upset anyone.
“Evolutionists,” one reviewer huffily wrote, “will feel they’ve been ravaged by a sheep.” Many reviewers attacked the book on cultural as well as philosophical or scientific grounds, wondering aloud how a distinguished house like Oxford University Press could allow such a book to be published. The Philosophers’ Magazine described it with the curious word “irresponsible.” How so? In Notre Dame Philosophical Reviews, the British philosopher John Dupré explained. Mind and Cosmos, he wrote, “will certainly lend comfort (and sell a lot of copies) to the religious enemies of Darwinism.” Simon Blackburn of Cambridge University made the same point: “I regret the appearance of this book. It will only bring comfort to creationists and fans of ‘intelligent design.’ ”
But what about fans of apostasy? You don’t have to be a biblical fundamentalist or a young-earth creationist or an intelligent design enthusiast—I’m none of the above, for what it’s worth—to find Mind and Cosmos exhilarating. “For a long time I have found the materialist account of how we and our fellow organisms came to exist hard to believe,” Nagel writes. “It is prima facie highly implausible that life as we know it is the result of a sequence of physical accidents together with the mechanism of natural selection.” The prima facie impression, reinforced by common sense, should carry more weight than the clerisy gives it. “I would like to defend the untutored reaction of incredulity to the reductionist neo-Darwinian account of the origin and evolution of life.”
The incredulity is not simply a matter of scientific ignorance, as the materialists would have it. It arises from something more fundamental and intimate. The neo-Darwinian materialist account offers a picture of the world that is unrecognizable to us—a world without color or sound, and also a world without free will or consciousness or good and evil or selves or, when it comes to that, selflessness. “It flies in the face of common sense,” he says. Materialism is an explanation for a world we don’t live in.
Nagel’s tone is measured and tentative, but there’s no disguising the book’s renegade quality. There are flashes of exasperation and dismissive impatience. What’s exhilarating is that the source of Nagel’s exasperation is, so to speak, his own tribe: the “secular theoretical establishment and the contemporary enlightened culture which it dominates.” The establishment today, he says, is devoted beyond all reason to a “dominant scientific naturalism, heavily dependent on Darwinian explanations of practically everything, and armed to the teeth against attacks from religion.” I’m sure Nagel would recoil at the phrase, but Mind and Cosmos is a work of philosophical populism, defending our everyday understanding from the highly implausible worldview of a secular clerisy. His working assumption is, in today’s intellectual climate, radical: If the materialist, neo-Darwinian orthodoxy contradicts common sense, then this is a mark against the orthodoxy, not against common sense. When a chain of reasoning leads us to deny the obvious, we should double-check the chain of reasoning before we give up on the obvious.
Nagel follows the materialist chain of reasoning all the way into the cul de sac where it inevitably winds up. Nagel’s touchier critics have accused him of launching an assault on science, when really it is an assault on the nonscientific uses to which materialism has been put. Though he does praise intelligent design advocates for having the nerve to annoy the secular establishment, he’s no creationist himself. He has no doubt that “we are products of the long history of the universe since the big bang, descended from bacteria through millions of years of natural selection.” And he assumes that the self and the body go together. “So far as we can tell,” he writes, “our mental lives, including our subjective experiences, and those of other creatures are strongly connected with and probably strictly dependent on physical events in our brains and on the physical interaction of our bodies with the rest of the physical world.” To believe otherwise is to believe, as the materialists derisively say, in “spooky stuff.” (Along with jumped-up monkeys and moist robots and countless other much-too-cute phrases, the use of spooky stuff proves that our popular science writers have spent a lot of time watching Scooby-Doo.) Nagel doesn’t believe in spooky stuff.
Materialism, then, is fine as far as it goes. It just doesn’t go as far as materialists want it to. It is a premise of science, not a finding. Scientists do their work by assuming that every phenomenon can be reduced to a material, mechanistic cause and by excluding any possibility of nonmaterial explanations. And the materialist assumption works really, really well—in detecting and quantifying things that have a material or mechanistic explanation. Materialism has allowed us to predict and control what happens in nature with astonishing success. The jaw-dropping edifice of modern science, from space probes to nanosurgery, is the result.
But the success has gone to the materialists’ heads. From a fruitful method, materialism becomes an axiom: If science can’t quantify something, it doesn’t exist, and so the subjective, unquantifiable, immaterial “manifest image” of our mental life is proved to be an illusion.
Here materialism bumps up against itself. Nagel insists that we know some things to exist even if materialism omits or ignores or is oblivious to them. Reductive materialism doesn’t account for the “brute facts” of existence—it doesn’t explain, for example, why the world exists at all, or how life arose from nonlife. Closer to home, it doesn’t plausibly explain the fundamental beliefs we rely on as we go about our everyday business: the truth of our subjective experience, our ability to reason, our capacity to recognize that some acts are virtuous and others aren’t. These failures, Nagel says, aren’t just temporary gaps in our knowledge, waiting to be filled in by new discoveries in science. On its own terms, materialism cannot account for brute facts. Brute facts are irreducible, and materialism, which operates by breaking things down to their physical components, stands useless before them. “There is little or no possibility,” he writes, “that these facts depend on nothing but the laws of physics.”
In a dazzling six-part tour de force rebutting Nagel’s critics, the philosopher Edward Feser provided a good analogy to describe the basic materialist error—the attempt to stretch materialism from a working assumption into a comprehensive explanation of the world. Feser suggests a parody of materialist reasoning: “1. Metal detectors have had far greater success in finding coins and other metallic objects in more places than any other method has. 2. Therefore we have good reason to think that metal detectors can reveal to us everything that can be revealed” about metallic objects.
But of course a metal detector only detects the metallic content of an object; it tells us nothing about its color, size, weight, or shape. In the same way, Feser writes, the methods of “mechanistic science are as successful as they are in predicting and controlling natural phenomena precisely because they focus on only those aspects of nature susceptible to prediction and control.”
Meanwhile, they ignore everything else. But this is a fatal weakness for a theory that aspires to be a comprehensive picture of the world. With magnetic resonance imaging, science can tell us which parts of my brain light up when, for example, I glimpse my daughter’s face in a crowd; the bouncing neurons can be observed and measured. Science cannot quantify or describe the feelings I experience when I see my daughter. Yet the feelings are no less real than the neurons.
The point sounds more sentimental than it is. My bouncing neurons and my feelings of love and obligation are unquestionably bound together. But the difference between the neurons and the feelings, the material and the mental, is a qualitative difference, a difference in kind. And of the two, reductive materialism can capture only one.
“The world is an astonishing place,” Nagel writes. “That it has produced you, and me, and the rest of us is the most astonishing thing about it.” Materialists are in the business of banishing astonishment; they want to demystify the world and human beings along with it, to show that everything we see as a mystery is reducible to components that aren’t mysterious at all. And they cling to this ambition even in cases where doing so is obviously fruitless. Neo-Darwinism insists that every phenomenon, every species, every trait of every species, is the consequence of random chance, as natural selection requires. And yet, Nagel says, “certain things are so remarkable that they have to be explained as non-accidental if we are to pretend to a real understanding of the world.” (The italics are mine.)
Among these remarkable, nonaccidental things are many of the features of the manifest image. Consciousness itself, for example: You can’t explain consciousness in evolutionary terms, Nagel says, without undermining the explanation itself. Evolution easily accounts for rudimentary kinds of awareness. Hundreds of thousands of years ago on the African savannah, where the earliest humans evolved the unique characteristics of our species, the ability to sense danger or to read signals from a potential mate would clearly help an organism survive.
So far, so good. But the human brain can do much more than this. It can perform calculus, hypothesize metaphysics, compose music—even develop a theory of evolution. None of these higher capacities has any evident survival value, certainly not hundreds of thousands of years ago when the chief aim of mental life was to avoid getting eaten. Could our brain have developed and sustained such nonadaptive abilities by the trial and error of natural selection, as neo-Darwinism insists? It’s possible, but the odds, Nagel says, are “vanishingly small.” If Nagel is right, the materialist is in a pickle. The conscious brain that is able to come up with neo-Darwinism as a universal explanation simultaneously makes neo-Darwinism, as a universal explanation, exceedingly unlikely.
A similar argument holds for our other cognitive capacities. “The evolution story leaves the authority of reason in a much weaker position,” he writes. Neo-Darwinism tells us that we have the power of reason because reason was adaptive; it must have helped us survive, back in the day. Yet reason often conflicts with our intuition or our emotion—capacities that must also have been adaptive and essential for survival. Why should we “privilege” one capacity over another when reason and intuition conflict? On its own terms, the scheme of neo-Darwinism gives us no standard by which we should choose one adaptive capacity over the other. And yet neo-Darwinists insist we embrace neo-Darwinism because it conforms to our reason, even though it runs against our intuition. Their defense of reason is unreasonable.
So too our moral sense. We all of us have confidence, to one degree or another, that “our moral judgments are objectively valid”—that is, while our individual judgments might be right or wrong, what makes them right or wrong is real, not simply fantasy or opinion. Two and two really do make four. Why is this confidence inherent in our species? How was it adaptive? Neo-Darwinian materialists tell us that morality evolved as a survival mechanism (like everything else): We developed an instinct for behavior that would help us survive, and we called this behavior good as a means of reinforcing it. We did the reverse for behavior that would hurt our chances for survival: We called it bad. Neither type of behavior was good or bad in reality; such moral judgments are just useful tricks human beings have learned to play on ourselves.
Yet Nagel points out that our moral sense, even at the most basic level, developed a complexity far beyond anything needed for survival, even on the savannah—even in Manhattan. We are, as Nagel writes, “beings capable of thinking successfully about good and bad, right and wrong, and discovering moral and evaluative truths that do not depend on [our] own beliefs.” And we behave accordingly, or try to. The odds that such a multilayered but nonadaptive capacity should become a characteristic of the species through natural selection are, again, implausibly long.
Nagel’s reliance on “common sense” has roused in his critics a special contempt. One scientist, writing in the Huffington Post, calls it Nagel’s “argument from ignorance.” In the Nation, the philosophers Brian Leiter and Michael Weisberg could only shake their heads at the once-great philosopher’s retrogression from sophisticated thinking to common sense.
“This style of argument,” they write, “does not, alas, have a promising history.” Once upon a time, after all, our common-sense intuitions told us the sun traveled across the sky over a flat earth. Materialistic science has since taught us otherwise.
Not all intuitions are of the same kind, though. It is one thing for me to be mistaken in my intuition about the shape of the planet; it’s another thing to be mistaken about whether I exist, or whether truth and falsehood exist independently of my say-so, or whether my “self” has some degree of control over my actions. Indeed, a person couldn’t correct his mistaken intuitions unless these intuitions were correct—unless he was a rational self capable of distinguishing the true from the false and choosing one over the other. And it is the materialist attack on those intuitions—“common sense”—that Nagel finds absurd.
Leiter and Weisberg, like most of his other critics, were also agog that Nagel has the nerve to pronounce on matters that they consider purely scientific, far beyond his professional range. A philosopher doubting a scientist is a rare sight nowadays. With the general decline of the humanities and the success of the physical sciences, the relationship of scientists to philosophers of science has been reversed. As recently as the middle of the last century, philosophers like Bertrand Russell and A. J. Ayer might feel free to explain to scientists the philosophical implications of what they were doing. Today the power is all on the side of the scientists: One false move and it’s back to your sandbox, philosophy boy.
And so some philosophers have retreated into the same sort of hyperspecialization that has rendered scientists from different subdisciplines practically incapable of communicating with each other. Now these philosophers, practicing what they call “experimental philosophy,” can pride themselves on being just as incomprehensible as scientists. Other philosophers, like Dennett, have turned their field into a handmaiden of science: meekly and gratefully accepting whatever findings the scientists come up with—from brain scans to the Higgs boson—which they then use to demonstrate the superiority of hardheaded science to the airy musings of old-fashioned “armchair philosophy.”
In this sense too Nagel is a throwback, daring not only to interpret science but to contradict scientists. He admits it’s “strange” when he relies “on a philosophical claim to refute a scientific theory supported by empirical evidence.” But he knows that when it comes to cosmology, scientists are just as likely to make an error of philosophy as philosophers are to make an error of science. And Nagel is accused of making large errors indeed. According to Leiter and Weisberg and the others, he is ignorant of how science is actually done these days.
Nagel, say Leiter and Weisberg, overestimates the importance of materialism, even as a scientific method. He’s attacking a straw man. He writes as though “reductive materialism really were driving the scientific community.” In truth, they say, most scientists reject theoretical reductionism. Fifty years ago, many philosophers and scientists might have believed that all the sciences were ultimately reducible to physics, but modern science doesn’t work that way. Psychologists, for example, aren’t trying to reduce psychology to biology; and biologists don’t want to boil biology down to chemistry; and chemists don’t want to reduce chemistry to physics. Indeed, an evolutionary biologist—even one who’s a good materialist—won’t refer to physics at all in the course of his work!
And this point is true, as Nagel himself writes in his book: Theoretical materialism, he says, “is not a necessary condition of the practice of any of those sciences.” Researchers can believe in materialism or not, as they wish, and still make scientific progress. (This is another reason why it’s unconvincing to cite scientific progress as evidence for the truth of materialism.) But the critics’ point is also disingenuous. If materialism is true as an explanation of everything—and they insist it is—then psychological facts, for example, must be reducible to biology, and then down to chemistry, and finally down to physics. If they weren’t reducible in this way, they would (ta-da!) be irreducible. And any fact that’s irreducible would, by definition, be uncaused and undetermined; meaning it wouldn’t be material. It might even be spooky stuff.
On this point Leiter and Weisberg were gently chided by the prominent biologist Jerry Coyne, who was also a workshopper in the Berkshires. He was delighted by their roasting of Nagel in the Nation, but he accused them of going wobbly on materialism—of shying away from the hard conclusions that reductive materialism demands. It’s not surprising that scientists in various disciplines aren’t actively trying to reduce all science to physics; that would be a theoretical problem that is only solvable in the distant future. However: “The view that all sciences are in principle reducible to the laws of physics,” he wrote, “must be true unless you’re religious.” Either we’re molecules in motion or we’re not.
You can sympathize with Leiter and Weisberg for fudging on materialism. As a philosophy of everything it is an undeniable drag. As a way of life it would be even worse. Fortunately, materialism is never translated into life as it’s lived. As colleagues and friends, husbands and mothers, wives and fathers, sons and daughters, materialists never put their money where their mouth is. Nobody thinks his daughter is just molecules in motion and nothing but; nobody thinks the Holocaust was evil, but only in a relative, provisional sense. A materialist who lived his life according to his professed convictions—understanding himself to have no moral agency at all, seeing his friends and enemies and family as genetically determined robots—wouldn’t just be a materialist: He’d be a psychopath. Say what you will about Leiter and Weisberg and the workshoppers in the Berkshires. From what I can tell, none of them is a psychopath. Not even close.
Applied beyond its own usefulness as a scientific methodology, materialism is, as Nagel suggests, self-evidently absurd. Mind and Cosmos can be read as an extended paraphrase of Orwell’s famous insult: “One has to belong to the intelligentsia to believe things like that: no ordinary man could be such a fool.” Materialism can only be taken seriously as a philosophy through a heroic feat of cognitive dissonance; pretending, in our abstract, intellectual life, that values like truth and goodness have no objective content even as, in our private life, we try to learn what’s really true and behave in a way we know to be good. Nagel has sealed his ostracism from the intelligentsia by idly speculating why his fellow intellectuals would undertake such a feat.
“The priority given to evolutionary naturalism in the face of its implausible conclusions,” he writes, “is due, I think, to the secular consensus that this is the only form of external understanding of ourselves that provides an alternative to theism.”
In a recent review in the New York Review of Books of Where the Conflict Really Lies, by the Christian philosopher Alvin Plantinga, Nagel told how instinctively he recoils from theism, and how hungry he is for a reasonable alternative. “If I ever found myself flooded with the conviction that what the Nicene Creed says is true,” he wrote, “the most likely explanation would be that I was losing my mind, not that I was being granted the gift of faith.” He admits that he finds the evident failure of materialism as a worldview alarming—precisely because the alternative is, for a secular intellectual, unthinkable. He calls this intellectual tic “fear of religion.”
“I speak from experience, being strongly subject to this fear,” he wrote not long ago in an essay called “Evolutionary Naturalism and the Fear of Religion.” “I want atheism to be true and am made uneasy by the fact that some of the most intelligent and well-informed people I know are religious believers. It isn’t just that I don’t believe in God and, naturally, hope that I’m right in my belief. It’s that I hope there is no God! I don’t want there to be a God; I don’t want the universe to be like that.”
Nagel believes this “cosmic authority problem” is widely shared among intellectuals, and I believe him. It accounts for the stubbornness with which they cling to materialism—and for the hostility that greets an intellectual who starts to wander off from the herd. Materialism must be true because it “liberates us from religion.” The positive mission Nagel undertakes in Mind and Cosmos is to outline, cautiously, a possible Third Way between theism and materialism, given that the first is unacceptable—emotionally, if not intellectually—and the second is untenable. Perhaps matter itself has a bias toward producing conscious creatures. Nature in that case would be “teleological”—not random, not fully subject to chance, but tending toward a particular end. Our mental life would be accounted for—phew!—without reference to God.
I don’t think Nagel succeeds in finding his Third Way, and I doubt he or his successors ever will, but then I have biases of my own. There’s no doubting the honesty and intellectual courage—the free thinking and ennobling good faith—that shine through his attempt.
Andrew Ferguson is a senior editor at The Weekly Standard.
On Enjoying Literature
Straight Through the Heart
By DEAN BAKOPOULOS
Published: March 22, 2013
As each semester begins at Grinnell College, a small liberal arts school nestled in the Iowa prairie, I get numerous e-mails from students pleading for a spot in my fiction workshop. The wait list is long, and as much as I’d love to take credit for the course’s popularity, I’m learning it’s less about the teacher and more about the way fiction writers approach the teaching of literature.
Many of these students aren’t English majors — in our dynamic department, majors tend to geek out on theory and critical reading courses from the start. And unlike most M.F.A. students I’ve taught, these undergraduates tend not to consider writing a career choice. They never ask for my agent’s e-mail.
Instead, each semester, I meet students who might be afraid of traditional English courses, but are drawn by the oddly warm and fuzzy phrase “creative writing.” In most academic work, we teach students to discuss other people’s ideas, before they attempt to formulate their own. We withhold the challenge of creation. But in creative writing, we read a few books and then we’re off. By semester’s end, a seeming mystery, I have a roomful of young people in love with reading stories and telling their own. Almost all of them write better sentences and cleaner paragraphs too.
I realized that what I’m really instructing them in is reading as a process of seduction. Consider how one falls in love: by fixating on certain attributes of the beloved. The way he looks in his brown cords. The way she flips her hair from her face. The flecks in her eyes, the twitch in his smile. We do not yet know the whole person, but we are lured by primal responses to a few details. We get to the classic final lines of “The Great Gatsby” or see Lily Briscoe finishing her painting in “To the Lighthouse,” and we want to go back to Page 1 and start again, to know the novel more deeply.
It took a while for me to figure out how to offer students any kind of instruction in this. As an undergraduate myself, when it came time to write an essay on Aphra Behn or Theodore Dreiser, I found I had no idea what to say about them. My professors, and their graduate assistants, usually agreed. They pointed me to secondary texts, which confused me even more. Later, as a teaching assistant in one of the nation’s best English departments, I still had no idea what to say about a piece of literature. I only knew to teach the works that I liked to read. And so that’s where I began with my students. Not exactly a strong teaching philosophy, perhaps, but now that I am both an author and experienced teacher, I still ground the discussion of a well-known work of fiction in that basic question: What did you like about this story? Show me your favorite lines.
A cynical friend of mine calls this the Book Club pedagogy, akin to treating literature as a string of Facebook statuses about our feelings. But think about the first work of literature that blew your mind. Whether it was Salinger or Ellison or Austen, or a Munro story you came across in a waiting-room copy of The New Yorker, there was most likely a moment, a snippet of dialogue or flight of lyricism that exploded in your squirrelly little heart. Perhaps you put an exclamation point in the margin and yellowed the sentence with a highlighter. You felt real energy there — a stirring in your soul, and you wanted more. Excited to find a kindred consciousness, you wanted to understand how a writer could make you feel that intensity with nothing more than words on a page.
In my classes, we read great fiction obsessively, and then attempt to see how a writer managed to affect us. We try to understand which elements — diction, syntax, point of view and so forth — made us feel that way. After we spend several weeks reading this way, wondering how the author made us shiver like that, we try our own hand. I ask students to begin with “green lines,” to isolate writing so good it makes one writer envious of another. Which parts do they wish they had written themselves? Students start to understand how their own writing works, where it ripples with energy.
Obviously, this is great fun for a pack of aspiring novelists, but why does such a motley assortment of computer science majors and chemistry students flock to these classes? For one thing, there is, at first, no reason to understand the historical significance or theoretical implications of a given work. It begins with a reader in the room with a story. Reading like a writer, as we do in workshops, provides a ground floor for any student. The question “What was your favorite moment in a story?” is an easy entry point for both a student schooled in the finest prep academy and a science major straight out of a substandard district. Anyone can find a favorite line. Placing further pressure on those lines — Why did you like it? What changed at that moment that brought energy to the text? — can help students trust their instincts: they were on to something! It’s a less intimidating approach to literature, free from the burden of historical background and devoid of grad-school jargon.
Back when I was teaching first-year composition at a large state school, I’d often lament with my colleagues that so many of our incoming students hated to read (we were instructed not to use texts more than a few pages long). We bemoaned the fact that many had left high school without even knowing how to write a sentence.
But how can you teach someone to master language or read literature until he’s fallen in love with it? Maybe in place of first-year composition we should be teaching first-year fiction. In a creative writing workshop, students begin to think about literature as stories to love, the way many of them did as children. Instead of deconstructing a text (that terrible word, text), they begin to understand the well-crafted sentence and the way it energizes and adds power to a larger story. After reading masterworks and feeling the effects a writer can have on their own souls, they want to get out their laptops and try doing the same thing.
What they really want is to have some kind of firsthand, visceral relationship with a book — to see what it’s like to take a work apart and put it back together — using great stories as structural models, just the way the kids I grew up with in Detroit fell in love with cars by spending weekends trying to make derelict Ford Mustangs run again. When the engine finally starts, when you figure out how to make it fire, it’s an incredibly powerful learning experience.
Love, after all, isn’t a passive process. Just as a chemistry student doesn’t want to lean back and watch an experiment in class, my students don’t like to be told to sit around and admire something simply because it is theoretically or historically significant. They want to formulate their own theorem, to write their own code.
By teaching the pleasures of writing our own stories, we remind them of the pleasures of reading and of the power of literature, something they may have experienced with Harry Potter but lost when they wrote a five-paragraph essay about Hawthorne. For one semester, at least, we do the work because we grow to love the work. After that? Well, with love, all things are possible.
Dean Bakopoulos, a professor of English at Grinnell College, is the author, most recently, of the novel “My American Unhappiness.”
Spring Break
Like the students and the teachers and our hard-working state legislature, we regular working folks need a spring break. It would give me a chance to clean out my car and maybe ride down to Clanton and get some fudge at Heaton Pecan Farm. Maybe rearrange the junk in my garage. Is it warm enough to go to Orange Beach? Alas, gotta keep my nose to the grindstone. Some people have all the luck, but not me.
Getting Ahead in Life
The best way to get ahead in life is to inherit money, make money, and marry money. I turned out 0 for 3. The best I can do is to pay my bills on time, not bounce checks, and enjoy a Reese's cup now and then. It ain't so bad once you get used to it.
Kimberley Stewart, Chris Hall, Freddy Hudson and 3 others like this.
Don Waller: A handsome stud like you not marrying for money? For Christ's sakes. Where are your priorities?
Michael Dew: Four marriages is most definitely on the list.
Wednesday, March 20, 2013
The Iraq Anniversary
We remember the Iraq invasion, which started 10 years ago today. What a shameful and terrible part of our recent history. Bush and Cheney should be tried for war crimes. There is no way to justify this war.
Tuesday, March 19, 2013
Building a Movie Language in Layers
By Dennis Lim
15 March 2013
New York Times
“I won’t always give satisfying answers,” the filmmaker Shane Carruth said, by way of warning, in an interview in early January. The premiere of his long-awaited second feature, “Upstream Color,” at the Sundance Film Festival was weeks away at the time. He was finishing the sound mix and working out the details of a self-distribution plan. But the greatest source of anxiety was the prospect of having to talk about his movie.
“I hate even the idea of a synopsis,” Mr. Carruth said. “When stories are really working, when you’re providing subtextual exploration and things that are deeply layered, you’re obligated to not say things out loud.”
Subtexts and layers abound in his new film, which combines elements of an abduction plot, a love story and a cosmic science experiment. “Upstream Color” trended heavily on Twitter when Sundance announced its lineup, and the anticipation has much to do with the cult status of Mr. Carruth’s first feature, “Primer,” the Dramatic Grand Jury Prize winner at that festival in 2004.
Mr. Carruth, 40, a former engineer and self-taught filmmaker, made “Primer,” a deadpan time-travel fantasy, for a reported $7,000, taking on the roles of director, writer, producer, actor, cinematographer, editor and composer. A feat of DIY enterprise and polymathic ingenuity, the film lent itself to repeat viewings and endless theorizing about its laws of physics and metaphysics, which ensured a robust afterlife in home video and on message boards.
On the phone recently Mr. Carruth said that the response to “Upstream Color,” which screens at the New Directors/New Films series this month before opening on April 5, has been overwhelmingly positive. But what irks him is the suggestion that the film, like “Primer,” is a puzzle movie in need of solving. “It’s funny that some of the early reviews used words like opaque and obscure,” he said. “And then they list the plot, beat by beat, and pretty much nail it. You’re sort of like, well, what was so opaque then?”
While the plot elements are not hard to glean, many of them boggle the mind. A young woman named Kris (Amy Seimetz) ingests a hypnotic drug that leaves a gap in her memory and a worm in her body, which is extracted in a surgical procedure that involves transferring the parasite to a pig. She becomes involved with Jeff (Mr. Carruth), who may have undergone a similar trauma. Odd as it is, their plight — as the film suggests with repeated cutaways to orchid harvesters and pig pens, not to mention ample quotations from Thoreau’s “Walden” — exists within a larger cycle of nature.
More than any of these curious details, the most provocative aspect of “Upstream Color” is the way it unfolds, as a skein of associations and in a barrage of fragmentary images and clipped conversations. Increasingly prone to slippage and ellipsis, the film builds to a wordless finale in which, as Mr. Carruth put it, “everything deteriorates into the ether.”
“I believe that it’s trying something new in terms of film language,” said Mr. Carruth, who in conversation projects a quiet self-assurance.
Even more than “Primer,” which plunged characters and viewers into a seductive haze of confusion, “Upstream Color” is a movie about the limits of knowledge that doubles as an experiment in inference. “It’s about people building their own narratives when they don’t have anything to hold on to,” Mr. Carruth said. “They’re accumulating their identities out of whatever they find around them.”
An open-ended fable whose protagonists are buffeted by forces they neither control nor comprehend, “Upstream Color” has a pronounced metaphorical aspect. “You could do the same story about any number of things where people are being affected by outside factors that they can’t quite speak to,” Mr. Carruth said. “The end result is the same, whether you’re talking about religious or cosmic or political beliefs, or being affected by a chemical or by a relationship.”
Mr. Carruth took on almost as many roles as on “Primer” and is also overseeing the marketing and distribution. He assumed near-total control again not just for economic reasons but also because of “the hope that there would be something gained from these things all coming from the same mind,” he said.
He cast his co-star, Ms. Seimetz, an actress and filmmaker, on the basis of her feature directing debut, “Sun Don’t Shine” (set to open in May), convinced that she could tune in to his wavelength. “She’s a storyteller,” Mr. Carruth said. “I knew she would get it.”
Ms. Seimetz, who sees “Upstream Color” as “a movie about deep emotional loss,” said that she and Mr. Carruth “clicked on a filmmaking level.” She added, “I think you can only make a film like this with a kindred spirit.”
Mr. Carruth’s next project, “The Modern Ocean,” which he hopes to shoot this year, was conceived as a romance between an oceanographer and the daughter of a shipping magnate but has now “morphed into this other massive thing that is impossibly tragic,” he said. “I’m really curious about how far it can be pushed.”
By “it,” he explained, he means a more exploratory approach to narrative, which he has found in movies as different as Paul Thomas Anderson’s “Punch-Drunk Love” and Steven Soderbergh’s “Solaris.”
While others succumb to gimmickry and doomsaying, Mr. Carruth remains a believer in the untapped potential of cinema. “Everybody’s saying we’ve got to go 3-D or virtual reality or choose your own adventure,” he said. “But there are other ways forward. I don’t think we’re done with film by a long shot.”
Monday, March 18, 2013
Evolution and Existentialism
Evolution and Existentialism, an Intellectual Odd Couple
By David P. Barash
Interdisciplinary efforts, for all their ostensible appeal, are more often praised than practiced, especially when it comes to combining the humanities and sciences. Nonetheless, connecting two intellectual perspectives that seem to be poles apart, and that have had very different fates, helps sweep away some common misconceptions—nay, fears—about modern scientific thought.
Let's look, therefore, at evolutionary biology and existential philosophy.
The former is experiencing rapid, perhaps exponential, growth, while the latter appears to have had its day and is (unjustifiably, in my opinion) in decline. Collectors of oxymorons—"freezer burn," "jumbo shrimp," "military intelligence"—might well appreciate the prospect of "evolutionary existentialism." They might also ask whether the bottom line involves mere alliterative appeal, in which case why not "molecular metaphysics" or "epigenetic epistemology."
Here's why these two seemingly strange bedfellows belong together: They are, in fact, a compatible couple. What they share suggests that science has not completely destroyed our understanding of free will, as so many critics contend. A philosophy of "human meaning" can coexist quite well with a science of "genetic influence."
First let's turn to some of the prominent incompatibilities between the two. Existentialism has, as one of its organizing principles, the notion that human beings have no "essence." As Jean-Paul Sartre famously put it, "existence precedes essence." For existentialists, there is no Platonic form of the person, no ideal self of which our corporeal reality is a pale instantiation. Rather, we define ourselves, give ourselves meaning, establish our essence only via our existence, by what we do, how we choose to live our individual lives. We have no "human nature," just our own intentions.
Thus choice is especially important for existentialists, because we are free; in Sartre's paradoxical words, we are "condemned to be free." In a universe devoid of purpose and uncaring about people, it is our job to give meaning to our lives.
That is vastly different from evolutionary premises. At the heart of an evolutionary view of human nature—or of hippopotamus, halibut, or hickory-tree nature—is the idea that living things are a concatenation of genes, jousting with other, similar genes to get ahead. Free, conscious, intentional choices seem out of place for a creature who is merely the physical manifestation of DNA programmed to succeed.
For evolutionary biologists, all living things have a purpose. It is neither divine nor Platonic. It is also not a choice, at least for nonhuman species, because their purpose is generated, quite simply, by the reward that natural selection provides for creatures that succeed in projecting their genes into future generations. Living things are survival vehicles for their potentially immortal genes. Biologically speaking, that is what they are—and all that they are.
At this point, most existentialists can be expected to disagree.
For evolutionary biologists, behavior is one way genes go about promoting themselves. Other ways are by producing a body that is durable, adapted to its ecological situation, capable of various physiological feats like growth, metabolism, repair, and so on. Probably the most obvious way in which behavior promotes genes is the powerful inclination that adults (of any sexually reproducing species) have to mate, and then, depending on the species, to care for their offspring. Seen in that light, our essence—our genotype—seems to precede our existence, contrary to what the existentialists would have us believe. We are, in a sense, slaves to the selfish genes that created us, body and mind, even though, as is increasingly recognized, sometimes those genes perform their work by "altruistically" benefiting other individuals, offspring not least.
Halibut and hickory trees don't know what they're doing, or why. Human beings do. "Man is but a reed, the most feeble thing in nature, but he is a thinking reed," wrote the French mathematical genius, religious mystic, and precursor of existentialism, Blaise Pascal. "A vapor, a drop of water, is enough to kill a human being.
"But even if the universe should crush him, man would still be more noble than that which destroys him, because he knows that he dies, and he realizes the advantage that the universe possesses over him; the universe knows nothing of this."
Thanks to evolutionary insights, people are acquiring a new knowledge: what their genes are up to, i.e., their evolutionary "purpose." An important benefit of evolutionary wisdom is that, by giving us the kind of knowledge about the universe that Pascal so admired, it leaves us free to pursue our own, chosen purposes. Sometimes those purposes involve a conscious decision to refrain from, say, reproducing—something unimaginable in any other species. At other times (all too rarely), they might involve deciding to extend an ethic of caretaking to include other human beings to whom we are not immediately related, or even to include other species, with whom we share comparatively few genes.
But Pascal also prefigured existential thought when he wrote that "the silence of these infinite spaces frightens me." Such fear was understandable, since the comfortable sense of human specialness that characterized the pre-Copernican world was being replaced in Pascal's day by a vast universe of astronomic distances, no longer centered on Homo sapiens. The great, empty spaces of evolutionary time and possibility—as well as human kinship with "lower" life forms that they demand—have frightened and repelled many observers of evolutionary biology as well (although so far as I can tell, it hasn't deeply troubled any scientists).
Many nonscientists, especially when first exposed to evolutionary thinking, are also chilled by the focus—characteristic of both existentialism and modern evolutionary biology—on the smallest possible unit of analysis.
The Danish philosopher and existentialist pioneer Søren Kierkegaard asked that only this should be written below his name on his gravestone: "The Individual." And in his masterful Man in the Modern Age, the existential psychiatrist and philosopher Karl Jaspers, although rejecting the label of existentialism, focused on the struggle of individuals to achieve an authentic life in the face of pressures for mass conformity.
In a parallel track, much of the intellectual impetus of evolutionary biology has come from abandoning comfortable but outmoded group-level arguments. Although the public still tends to think that evolution acts, as it's commonly put, "for the good of the species," evolutionary biologists are essentially unanimous that natural selection acts most strongly at the smallest level: individuals. Actually, the process goes farther yet, focusing when possible on individual genes. Species-wide effects are simply the arithmetic summation of these micro-impacts.
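Barash's arithmetic claim is easy to see in a toy model. What follows is a minimal simulation sketch, my own illustration rather than anything from the essay: a haploid population in which selection weights only individual genes, with the "species-wide" result read off as a simple tally at the end. The population size and fitness values are arbitrary assumptions.

import random

# A minimal sketch (assumed for illustration, not from Barash's essay):
# selection acts on individual genes; the population-level change is
# just the arithmetic tally of those individual events.
POP_SIZE = 1000
GENERATIONS = 50
FITNESS = {"A": 1.05, "a": 1.00}  # assumed values: allele A gets a 5% edge

def next_generation(pop):
    # Each gene's chance of being copied is weighted by its own fitness;
    # nothing here operates "for the good of the species."
    weights = [FITNESS[allele] for allele in pop]
    return random.choices(pop, weights=weights, k=POP_SIZE)

pop = ["A"] * (POP_SIZE // 2) + ["a"] * (POP_SIZE // 2)
for _ in range(GENERATIONS):
    pop = next_generation(pop)

# The species-wide effect is nothing more than the sum of the
# individual outcomes above.
print("Frequency of A:", pop.count("A") / POP_SIZE)

Run it a few times and the frequency of A climbs toward 1.0, generation by generation, even though no line of the code ever considers the population as a whole.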
That individual, gene-centered perspective has given rise to criticism that sociobiology—the application of evolutionary insights to complex social behavior, including that of our own species—is inherently cynical, promoting a gloomy, egocentric Weltansicht. The same, of course, has been said of existentialism, whose stereotypical practitioner is the anguished, angst-ridden loner, wearing a black turtleneck and obsessing, Hamlet-like, about the meaninglessness of life.
Let's look more closely at that critique by taking an extreme position and granting, if only for the sake of argument, that human beings, like other living things, are merely survival machines for their genes, organic robots whose biologically mandated purpose is neither more nor less than the promulgation of those genes. And let's grant that existentialists are very much occupied with the meaninglessness of life and the consequent need for people to assert their own meaning, to define themselves against an absurd universe. Furthermore, let's consider the less-well-known fact that, although evolutionary biology makes no claim that it or what it produces is inherently good, it also teaches that life is absurd.
Evolutionists, after all, might well look at all living things—human beings not least—as playing a vast existential roulette game. No one can ever beat the house. There is no option to cash in one's chips and walk away a winner. The only goal is to keep playing, and indeed, some genes and phyletic lineages manage to stay in the game longer than others. But where, I ask you, is the meaning in a game whose goal is simply to keep on playing, a game that can never be won, but only lost? And for which we did not even get to write the rules?
There is, accordingly, no intrinsic, evolutionary meaning to being alive. We simply are, having been produced when one of our father's sperm connected with one of our mother's eggs, each contributing genes that combined to become a new person. Those genes, too, simply are, because their antecedents avoided being eliminated.
We have simply been, as Martin Heidegger (another precursor of existentialism, who particularly influenced Sartre) put it, "thrown into the world." None of us, after all, was consulted beforehand. Biologically, our genes did it; or rather, our parents' genes. And their parents' before them.
At this point, some critics say that if evolutionary biology reveals that life is without intrinsic meaning, then biology is mistaken. Not at all. From the perspective of natural science generally, there is no inherent reason that anything—a rock, a waterfall, a halibut, a human being—is of itself meaningful. As existentialists have long pointed out, the key to life's meaning is not aliveness itself, but what we attach to it.
At one point in Douglas Adams's hilarious The Hitchhiker's Guide to the Galaxy, as a sperm whale plummets toward the planet Magrathea, it wonders: "Why am I here? What is my purpose in life?" The appealing but doomed creature has just been "thrown into" its world, which happens to be several miles above the planet's surface; the creature exists as a whale because it had inexplicably been transformed from a nuclear missile, directed at our heroes' spaceship, into a briefly airborne cetacean when the occupants of said spaceship activated their Infinite Improbability Drive.
Evolution, too, is an improbability generator, although its outcomes are considerably more finite. After being called into existence by that particular improbability generator called natural selection, we have no more purpose in life than Adams's naïve and ill-fated whale, whose blubber was soon to bespatter the Magrathean landscape.
Think back, now, to Pascal and his successors, whether atheist (Nietzsche, Sartre) or religious (Kierkegaard, Jaspers), for whom there are many ways that human beings can and do say no to their genes. Sartre, for example, encouraged rebellion against the pressures of conformity and the lack of authenticity inherent in denying one's freedom, just as Camus urged his readers to reject any complicity in lethal violence, to be "neither victims nor executioners." By the same token, Kierkegaard led the way for the "truly religious" to take deep and often personal responsibility for their spiritual lives.
As descendants of both existential and evolutionary perspectives, we have the opportunity to assert ourselves as creative rebels. We may elect intentional childlessness. We may choose to be less selfish and more genuinely altruistic than our genes might like. We may decide to groom our sons to be nurses and our daughters to be corporate executives. I would go farther, and suggest that we must do such sorts of things—deny aspects of our own biological heritage—if we want to be fully human. The alternative—to let biology carry us where it will—is to forgo the responsibility of being human, and to be as helpless and abandoned as a (briefly) airborne Magrathean whale.
Going with the flow of our biologically generated inclinations is very close to what Sartre called "bad faith," whereby people pretend—to themselves and others—that they are not free. That is not to claim that human beings are perfectly free. When the early-20th-century philosopher José Ortega y Gasset observed that "man has no nature, only a history," he neglected to add that this includes an evolutionary history, as a result of which we are constrained as well as impelled in certain ways and directions. We cannot assume the lifestyle of honeybees, or Portuguese men-of-war. But such restrictions are trivial and beside the point: Within a remarkable range, our evolutionary bequeathal is wildly permissive.
This uniquely human potential to resist our own genes might help explain why people expend so much effort trying to induce others, especially the young and impressionable, to practice what is widely seen as the cardinal virtue: obedience. To recast Freud's argument about incest restraints, if we were naturally obedient, we probably wouldn't need so much urging. And yet, on balance, it seems that far more harm has been done throughout human history by obedience—to Hitler's Final Solution, Stalin's elimination of opponents real and imagined, Mao's Cultural Revolution, Pol Pot's genocide—than by disobedience.
On the basis of evolutionary existentialism, I would therefore like to suggest the heretical and admittedly paradoxical notion that, in fact, we need to teach more disobedience. Not only disobedience to political and social authority but especially disobedience to some of our troublesome genetic inclinations.
Along with a capacity for altruism, we also appear to have been endowed with occasional tendencies to ill-treat stepchildren (who are, of course, unrelated to one's self), to give free rein to any number of violent tendencies, to discriminate against others who appear different from ourselves, to value short-term successes over long-term consequences. It is a good thing that we are not marionettes, dancing at the end of strings pulled by our DNA. It is also a good thing that we can identify any such tendencies, and decide whether to defy our inclinations or go along with them. It is largely when we act in ignorance of our biology that we are most vulnerable to it.
As Albert Camus wrote, reconfiguring Descartes's cogito, "I rebel, therefore we exist." Or, as André Malraux put it, "The greatest mystery is not that we have been flung at random among the profusion of the earth and the galaxy of the stars, but that in this prison we can fashion images of ourselves sufficiently powerful to deny our nothingness."
In that denial lies not only a great mystery but also a thrilling hope.
What Grand Bargain?
Grand Bargaining
by Paul Krugman
Like others, Greg Sargent has been pleading with “centrist” pundits to acknowledge an obvious truth: Barack Obama has actually proposed the mix of spending cuts and revenue increases they want, while Republicans are unwilling to make so much as a dime of compromise. Greg looks at yesterday’s talk shows and finds this confirmed openly by GOP leaders: there is no ratio of spending cuts to revenues that will reconcile them to any tax hike whatsoever.
Greg presumably hopes that this admission will finally cure pundits of the habit of blaming both sides for the failure to reach a Grand Bargain — or, in practice, devoting most of their criticism to Obama.
Silly Greg. As Scott Lemieux explains, the centrist view is that all Obama has to do is have the leadership to lead, with leadership. Nothing Republicans say or do can shake this conviction.
Sunday, March 17, 2013
Jan-Philipp Sendker - The Art of Hearing Heartbeats
Some books you read. Some books you enter into. You enter into the created world of the book and it doesn't let go even after you've finished reading. This book is one of the latter.
This novel was written by a German. I gather it was popular in Europe before getting translated into English.
A Burmese man immigrates to the U.S., marries an American woman, becomes an accomplished lawyer, has two kids, and one day, after 35 years, disappears without telling his family anything. His daughter finds a love letter to a woman in Burma named Mi Mi and realizes her father has returned to Burma to this woman. She goes seeking her father in that Asian country to find out why he left his family, and therein is the story.
The story is compelling and touching. It is a love story but also a story about a clash of cultures. The Burmese culture is not our own.
It's a great story and I could hardly put it down. Great stuff!
Friday, March 15, 2013
More About Nagel
A Darwinist Mob Goes After a Serious Philosopher
BY LEON WIESELTIER
Is there a greater gesture of intellectual contempt than the notion that a tweet constitutes an adequate intervention in a serious discussion? But when Thomas Nagel’s formidable book Mind and Cosmos recently appeared, in which he has the impudence to suggest that “the materialist neo-Darwinian conception of nature is almost certainly false,” and to offer thoughtful reasons to believe that the non-material dimensions of life—consciousness, reason, moral value, subjective experience—cannot be reduced to, or explained as having evolved tidily from, its material dimensions, Steven Pinker took to Twitter and haughtily ruled that it was “the shoddy reasoning of a once-great thinker.” Fuck him, he explained.
Here was a signal to the Darwinist dittoheads that a mob needed to be formed. In an earlier book Nagel had dared to complain of “Darwinist imperialism,” though in his scrupulous way he added that “there is really no reason to assume that the only alternative to an evolutionary explanation of everything is a religious one.” He is not, God forbid, a theist. But he went on to warn that “this may not be comforting enough” for the materialist establishment, which may find it impossible to tolerate also “any cosmic order of which mind is an irreducible and non-accidental part.” For the bargain-basement atheism of our day, it is not enough that there be no God: there must be only matter. Now Nagel’s new book fulfills his old warning. A mob is indeed forming, a mob of materialists, of free-thinking inquisitors. “In the present climate of a dominant scientific naturalism, heavily dependent on speculative Darwinian explanations of practically everything, and armed to the teeth against religion,” Nagel calmly writes, “... I would like to extend the boundaries of what is not regarded as unthinkable, in light of how little we really understand about the world.” This cannot be allowed! And so the Sacred Congregation for the Doctrine of the Secular Faith sprang into action. “If there were a philosophical Vatican,” Simon Blackburn declared in the New Statesman, “the book would be a good candidate for going on to the Index.” I hope that one day he regrets that sentence. It is not what Bruno, Galileo, Bacon, Descartes, Voltaire, Hume, Locke, Kant, and the other victims of the anti-philosophical Vatican had in mind.
I understand that nobody is going to burn Nagel’s book or ban it. These inquisitors are just more professors. But he is being denounced not merely for being wrong. He is being denounced also for being heretical. I thought heresy was heroic. I guess it is heroic only when it dissents from a doctrine with which I disagree. Actually, the defense of heresy has nothing to do with its content and everything to do with its right. Tolerance is not a refutation of heresy, but a retirement of the concept. I am not suggesting that there is anything outrageous about the criticism of Nagel’s theory of the explanatory limitations of Darwinism. He aimed to provoke and he provoked. His troublemaking book has sparked the most exciting disputation in many years, because no question is more primary than the question of whether materialism (which Nagel defines as “the view that only the physical world is irreducibly real”) is true or false.
And so scientists are busily animadverting on Nagel’s account of science. They like to note condescendingly that he calls himself a “layman.” Yet too many of Nagel’s interlocutors have been scientists, because Mind and Cosmos is not a work of science. It is a work of philosophy; and it is entirely typical of the scientistic tyranny in American intellectual life that scientists have been invited to do the work of philosophers. The problem of the limits of science is not a scientific problem. It is also pertinent to note that the history of science is a history of mistakes, and so the dogmatism of scientists is especially rich. A few of Nagel’s scientific critics have been respectful: in The New York Review of Books, H. Allen Orr has the decency to concede that it is not at all obvious how consciousness could have originated out of matter. But he then proceeds to an almost comic evasion. Finally, he says, we must suffice with “the mysteriousness of consciousness.” A Darwinian mysterium tremendum! He then cites Colin McGinn’s entirely unironic suggestion that our “cognitive limitations” may prevent us from grasping the evolution of mind from matter: “even if matter does give rise to mind, we might not be able to understand how.” Students of religion will recognize the dodge—it used to be called fideism, and atheists gleefully ridiculed it; and the expedient suspension of rational argument; and the double standard. What once vitiated godfulness now vindicates godlessness.
The most shabby aspect of the attack on Nagel’s heterodoxy has been its political motive. His book will be “an instrument of mischief,” it will “lend comfort (and sell a lot of copies) to the religious enemies of Darwinism,” and so on. It is bad for the left’s own culture war. Whose side is he on, anyway? Almost taunting the materialist left, which teaches skepticism but not self-skepticism, Nagel, who does not subscribe to intelligent design, describes some of its proponents as “iconoclasts” who “do not deserve the scorn with which they are commonly met.” I find this delicious, because it defies the prevailing regimentation of opinion and exemplifies a rebellious willingness to go wherever the reasoning mind leads. Cui bono? is not the first question that an intellectual should ask. The provenance of an idea reveals nothing about its veracity. “Accept the truth from whoever utters it,” said the rabbis, those poor benighted souls who had the misfortune to have lived so many centuries before Dennett and Dawkins. I like Nagel’s mind and I like Nagel’s cosmos. He thinks strictly but not imperiously, and in grateful view of the full tremendousness of existence; and he denies matter nothing except the subjection of mind; and he speaks, by example, for the soulfulness of reason.
It Oughta Be Interesting
YESTERDAY was the first day of the rest of my life. I started early. Today I've got to mend that fence in the Lower 40 and I've got a sick mule to tend to. The rest of my life oughta be interesting.
Thursday, March 14, 2013
I am Not Riding the Bus to Ft. Smith
The South I grew up with is still here in abundance. I'm talking about the PWT and the RNs, the people who mow yards, hook up your cable, and frequent the thrift stores and brag about the bargains they find there. They keep the low-end fast food places in business and splurge once a year on a $20 entree. Some of them still ride the bus to Ft. Smith. I'm right there with 'em, though I'd have to do some serious thinking before I would ride the bus to Ft. Smith.
Obama vs. Ryan
Why Obama and Paul Ryan Will Never, Ever Agree
By Jonathan Chait
The continued inability of President Obama and House Republicans to reach some sort of compromise in the budget wars has inspired an endless stream of advice, pleas, psychoanalysis, and befuddlement. Why oh why can’t the two sides meet in the middle somewhere?
The impasse has many causes, but the most fundamental of them may be that the entire left-right conception may be misplaced. If the goals of the two parties can be plotted along a left-right axis, then a sane negotiation ought to locate some sort of midpoint — a bit more tax here, a bit less spending there. But the release of the budget documents, first by House Republicans and then by Senate Democrats, suggests that the disagreements don’t lie along a simple left-right axis. The parties can’t find a middle ground because a middle ground does not exist.
Begin with President Obama’s goals. He has three:
1. As economic adviser and Bob Woodward threatener Gene Sperling recently explained, Obama believes that the long-term rise in spending on retirement programs crowds out the government’s ability to spend money on other things, especially investments in research and infrastructure. Of course, part of the solution is to raise taxes and spend more money on both investments and retirement programs, but raising taxes is politically painful and therefore justifies cutting retirement programs below the preferred level in order to get the revenue.
2. Obama believes the deficit is entirely a long-term problem, and that reductions should be delayed so as not to hamper the recovery. Ideally, long-term deficit cuts would be paired with an immediate, temporary stimulus.
3. Obama’s highest priority is that the long-term fiscal correction doesn’t worsen income inequality, the long-term rise of which he regards as a significant social problem. That principle dictates that cuts to retirement programs both be limited in scope, so as not to undercut their basic social insurance role, and balanced with higher taxes on the affluent. Most important to Obama, deficit reduction should spare benefits for the most desperately poor and sick Americans.
The distinction with the Republican approach as embodied by Paul Ryan’s budget is sharp. Consider where Ryan stands on these three points.
1. Obama wants to clear out more budget headroom for public investment, but Ryan proposes to subject domestic discretionary spending, which is already slated to fall to the lowest level as a proportion of the economy in four decades, to what the Bipartisan Policy Center calls “enormous reductions” of over a trillion dollars. Moreover, Obama is willing to offer up more cuts to Medicare, but as influential Ryan adviser Yuval Levin explains, Ryan doesn’t want to find more savings in the traditional Medicare program. He is focused on what he considers the transformative power of voucherizing the system. And even though Ryan’s plan would continue traditional Medicare for anybody over 55 years old — that is, leaving it in place for five decades — Ryan considers it pointless to spend less money on it. Indeed, worse than pointless. Since Obama’s proposals to require more efficiency out of traditional Medicare require “government bureaucracy,” this is ideologically alien to Ryan and thus, writes Levin, “steps in the wrong direction.” A more cynical observer than Levin might note that the more success we have in making traditional Medicare more efficient, the weaker the case will be to phase it out and replace it with some form of vouchers.
This is a crucial problem. The thing that Obama regards as his biggest potential concession to Republicans is greeted by Republicans as not a concession at all.
2. Rather than postponing his cuts to avoid economic contraction, Ryan would implement his immediately. Indeed, he rejects the entire economic principle that immediate spending cuts can reduce economic demand as, in a term he used in an interview yesterday, “sugar high economics.” Or, as fellow Republican Billy Long told Politico, Obama “said the short-term debt’s not that bad, and we think it is that bad.”
3. Where Obama insists that deficit reduction not widen income inequality, Ryan views it primarily as an opportunity to widen income inequality. Indeed, the inequality-widening characteristics of his budget are far more concrete than its deficit-reducing characteristics. Ryan’s plan reserves its largest cuts for programs targeted to the desperately poor and sick. He would cut the budget for Medicaid and Children’s Health Insurance by more than half, increasing the uninsured population by 40 to 50 million. He proposes a “goal” of cutting the top tax rate to 25 percent, offset by unspecified reductions to tax expenditures. The most generous reading of this tax proposal is that he would pour every dollar of savings from tax reform into rate cuts that would shift the tax burden down from the very rich, who benefit the most from lower rates, onto the middle class. The less generous reading is that Ryan wants to cut tax rates without paying for it, which is the policy he supported under President Bush.
In either case, Ryan views budget reforms as primarily an opportunity to defend the makers from the takers, and only secondarily as an exercise in fiscal balancing.
Viewed in this light, the impasse looks very hard to bridge. Obama and Ryan aren’t proposing different solutions to the same problem. They are attacking completely different problems, and the things each of them defines as a problem, the other side does not. The only path to agreement would seem to involve Obama going around Ryan and picking off a minority of Republicans unable to stomach sequestration and willing to compromise. Consummating a deal with Ryan seems ideologically impossible.
By Jonathan Chait
The continued inability of President Obama and House Republicans to reach some sort of compromise in the budget wars has inspired an endless stream of advice, pleas, psychoanalysis, and befuddlement. Why oh why can’t the two sides meet in the middle somewhere?
The impasse has many causes, but the most fundamental of them may be that the entire left-right conception may be misplaced. If the goals of the two parties can be plotted along a left-right axis, then a sane negotiation ought to locate some sort of midpoint — a bit more tax here, a bit less spending there. But the release of the budget documents, first by House Republicans and then by Senate Democrats, suggests that the disagreements don’t lie along a simple left-right axis. The parties can’t find a middle ground because a middle ground does not exist.
Begin with President Obama’s goals. He has three:
1. As economic adviser and Bob Woodward threatener Gene Sperling recently explained, Obama believes that the long-term rise in spending on retirement programs crowds out the government’s ability to spend money on other things, especially investments in research and infrastructure. Of course, part of the solution is to raise taxes and spend more money on both investments and retirement programs, but raising taxes is politically painful and therefore justifies cutting retirement programs below the preferred level in order to get the revenue.
2. Obama believes the deficit is entirely a long-term problem, and that reductions should be delayed so as not to hamper the recovery. Ideally, long-term deficit cuts would be paired with an immediate, temporary stimulus.
3. Obama’s highest priority is that the long-term fiscal correction doesn’t worsen income inequality, the long-term rise of which he regards as a significant social problem. That principle dictates that cuts to retirement programs both be limited in scope, so as not to undercut their basic social insurance role, and balanced with higher taxes on the affluent. Most important to Obama, deficit reduction should spare benefits for the most desperately poor and sick Americans.
The distinction with the Republican approach as embodied by Paul Ryan’s budget is sharp. Consider where Ryan stands on these three points.
1. Obama wants to clear out more budget headroom for public investment, but Ryan proposes to subject domestic discretionary spending, which is already slated to fall to its lowest level as a proportion of the economy in four decades, to what the Bipartisan Policy Center calls “enormous reductions” of over a trillion dollars. Moreover, Obama is willing to offer up more cuts to Medicare, but as influential Ryan adviser Yuval Levin explains, Ryan doesn’t want to find more savings in the traditional Medicare program. He is focused on what he considers the transformative power of voucherizing the system. And even though Ryan’s plan would continue traditional Medicare for anybody over 55 years old — that is, leaving it in place for five decades — Ryan considers it pointless to spend less money on it. Indeed, worse than pointless: since Obama’s proposals to wring more efficiency out of traditional Medicare rely on “government bureaucracy,” they are ideologically alien to Ryan and thus, writes Levin, “steps in the wrong direction.” A more cynical observer than Levin might note that the more success we have in making traditional Medicare more efficient, the weaker the case will be for phasing it out and replacing it with some form of vouchers.
This is a crucial problem. The thing that Obama regards as his biggest potential concession to Republicans is greeted by Republicans as not a concession at all.
2. Rather than postponing his cuts to avoid economic contraction, Ryan would implement them immediately. Indeed, he rejects the entire economic principle that immediate spending cuts can reduce economic demand, dismissing it in an interview yesterday as “sugar high economics.” Or, as fellow Republican Billy Long told Politico, Obama “said the short-term debt’s not that bad, and we think it is that bad.”
3. Where Obama insists that deficit reduction not widen income inequality, Ryan views it primarily as an opportunity to widen income inequality. Indeed, the inequality-widening characteristics of his budget are far more concrete than its deficit-reducing characteristics. Ryan’s plan reserves its largest cuts for programs targeted to the desperately poor and sick. He would cut the budget for Medicaid and Children’s Health Insurance by more than half, increasing the uninsured population by 40 to 50 million. He proposes a “goal” of cutting the top tax rate to 25 percent, offset by unspecified reductions to tax expenditures. The most generous reading of this tax proposal is that he would pour every dollar of savings from tax reform into rate cuts, shifting the tax burden down from the very rich, who benefit the most from lower rates, onto the middle class. The less generous reading is that Ryan wants to cut tax rates without paying for it, which is the policy he supported under President Bush.
In either case, Ryan views budget reforms as primarily an opportunity to defend the makers from the takers, and only secondarily as an exercise in fiscal balancing.
Viewed in this light, the impasse looks very hard to bridge. Obama and Ryan aren’t proposing different solutions to the same problem. They are attacking completely different problems, and what each of them defines as a problem, the other side does not recognize as one. The only path to agreement would seem to involve Obama going around Ryan and picking off a minority of Republicans unable to stomach sequestration and willing to compromise. Consummating a deal with Ryan seems ideologically impossible.
Conservatism and Race
by Corey Robin
So here's a fascinating moment of right-wing self-revelation.
Last month, Sam Tanenhaus wrote a piece in The New Republic saying that American conservatives since the Fifties have been in thrall to John C. Calhoun. According to Tanenhaus, the southern slaveholder and inspiration of the Confederate cause is the founding theoretician of the postwar conservative movement.
When the intellectual authors of the modern right created its doctrines in the 1950s, they drew on nineteenth-century political thought, borrowing explicitly from the great apologists for slavery, above all, the intellectually fierce South Carolinian John C. Calhoun.
Progress, if you ask me: Tanenhaus never even mentioned Calhoun in his last book on American conservatism, which came out in 2009—though I do know of another book on conservatism that came out since then that makes a great deal of Calhoun's ideas and their structuring presence on the right. That book, just out in paperback, got panned by the New York Times Book Review, of which Tanenhaus is the editor. Thus advanceth the dialectic. But I digress.
Writing in the National Review, Jonah Goldberg and Ramesh Ponnuru naturally take great umbrage at being tarred with the Calhoun brush. No one wants to be connected, by however many degrees of separation (Tanenhaus counts two, maybe three, I couldn't quite tell), with a slaveholder and a racist.
But notice how they take umbrage:
Now Tanenhaus doesn’t want you to think he is saying that today’s conservatives are just a bunch of racists. Certainly not. He is up to something much more subtle than that. "This is not to say conservatives today share Calhoun’s ideas about race. It is to say instead that the Calhoun revival, based on his complex theories of constitutional democracy, became the justification for conservative politicians to resist, ignore, or even overturn the will of the electoral majority." With that to-be-sure throat-clearing out of the way, Tanenhaus continues with an essay that makes sense only as an attempt to identify racism as the core of conservatism.
In the worldview of the contemporary American right it is a grievous sin—or at least bad PR—to be called a racist. But the accusation that you wish "to resist, ignore, or even overturn the will of the electoral majority"—that is, that you are resolutely opposed, if not downright hostile, to the basic norms of democracy—can be passed over as if it were a grocery store circular. Hating democracy, apparently, is so anodyne a passion that it hardly needs to be addressed, much less explained. Indeed, Goldberg and Ponnuru treat the charge as Tanenhaus's way of covering his ass, a form of exculpatory "throat-clearing" designed to make it seem as if he's not making the truly heinous accusation of racism that he is in fact making.
So, that's where we are. It's 2013, and the American right thinks racism is bad, and contempt for democracy is...what? Okay? Not worthy of remark? Perhaps mitigating?
Corey Robin
March 14, 2013 at 12:14 am