Friday, May 31, 2013
Peter Onuf - The Mind of Thomas Jefferson
Peter Onuf holds "The Jefferson Chair" at UVA. Before him there were Merrill Peterson and Dumas Malone — a long line of distinguished Jeffersonian scholars. But if this work is any indication, Onuf is boring. The book focuses on Jefferson's so-called democratic political philosophy. Onuf is one Jefferson scholar you have to be aware of, but he is not my cup of tea.
Wednesday, May 29, 2013
Thomas Fleming - Why We Fought the Civil War
Thomas Fleming is supposed to be a respected historian, and I suppose he is, but I see his writing as being rather soft. He has his footnotes and bibliography yet everything seems squishy.
This book purports to provide a new understanding of why we fought the Civil War. Must we trudge over this ground again? Probably not.
Northern abolitionists irrationally feared the "Slave Power." Really? Irrational? I am not so sure.
There is the influence of John Brown's raid. I still don't fully understand the effects of this traumatic incident.
The book is "character driven," meaning that Fleming delves into the personal histories of some people I've never heard of. I am not wild about this kind of history. Jack Rakove of Stanford writes the same kind of American history.
Extremists on both sides led us to a war that could not be stopped by any other means. What peaceful solution could have ended slavery, as the 13th Amendment finally did? I know of no such peaceful way.
This book is light and easy history but leaves no lasting impression in my mind.
Montaigne and the Essay
The Stone May 26, 2013, 3:00 pm
The Essayification of Everything
By CHRISTY WAMPOLE
The Stone is a forum for contemporary philosophers on issues both timely and timeless.
Lately, you may have noticed the spate of articles and books that take interest in the essay as a flexible and very human literary form. These include “The Wayward Essay” and Phillip Lopate’s reflections on the relationship between essay and doubt, and books such as “How to Live,” Sarah Bakewell’s elegant portrait of Montaigne, the 16th-century patriarch of the genre, and an edited volume by Carl H. Klaus and Ned Stuckey-French called “Essayists on the Essay: Montaigne to Our Time.”
It seems that, even in the proliferation of new forms of writing and communication before us, the essay has become a talisman of our times. What is behind our attraction to it? Is it the essay’s therapeutic properties? Because it brings miniature joys to its writer and its reader? Because it is small enough to fit in our pocket, portable like our own experiences?
I believe that the essay owes its longevity today mainly to this fact: the genre and its spirit provide an alternative to the dogmatic thinking that dominates much of social and political life in contemporary America. In fact, I would advocate a conscious and more reflective deployment of the essay’s spirit in all aspects of life as a resistance against the zealous closed-endedness of the rigid mind. I’ll call this deployment “the essayification of everything.”
What do I mean by this lofty expression?
Let’s start with form’s beginning. The word Michel de Montaigne chose to describe his prose ruminations published in 1580 was “Essais,” which, at the time, meant merely “Attempts,” as no such genre had yet been codified. This etymology is significant, as it points toward the experimental nature of essayistic writing: it involves the nuanced process of trying something out. Later on, at the end of the 16th century, Francis Bacon imported the French term into English as a title for his more boxy and solemn prose. The deal was thus sealed: essays they were and essays they would stay. There was just one problem: the discrepancy in style and substance between the texts of Michel and Francis was, like the English Channel that separated them, deep enough to drown in. I’ve always been on Team Michel, that guy who would probably show you his rash, tell you some dirty jokes, and ask you what you thought about death. I imagine, perhaps erroneously, that Team Francis tends to attract a more cocksure, buttoned-up fan base, what with all the “He that hath wife and children hath given hostages to fortune; for they are impediments to great enterprises,” and whatnot.
[Image caption: Francis Bacon painted by Paul van Somer, circa 1600.]
With such divergent progenitors, the essay has never recovered from this chronic undecidability. As a genre that emerged to accommodate the expressive needs of the Renaissance Man, the essay necessarily keeps all tools and skills at its disposal. The essayist samples more than a D.J.: a loop of the epic here, a little lyric replay there, a polyvocal break and citations from greatnesses past, all with a signature scratch on top.
There is certainly disagreement on the wobbly matter of what counts as an essay and what does not. I have generally found that for every rule I could establish about the essay, a dozen exceptions scuttle up. I recently taught a graduate seminar on the topic and, at the end of the course, to the question “What can we say of the essay with absolute certainty?,” all of us, armed with our panoply of canonical essay theories and our own conjectures, had to admit that the answer is: “Almost nothing.” But this is the force of the essay: it impels you to face the undecidable. It asks you to get comfortable with ambivalence.
When I say “essay,” I mean short nonfiction prose with a meditative subject at its center and a tendency away from certitude. Much of the writing encountered today that is labeled as “essay” or “essay-like” is anything but. These texts include the kind of writing expected on the SAT, in seminar papers, dissertations, professional criticism or other scholarly writing; politically engaged texts or other forms of peremptory writing that insist upon their theses and leave no room for uncertainty; or other short prose forms in which the author’s subjectivity is purposely erased or disguised. What these texts often have in common is, first, their self-conscious hiding of the “I” under a shroud of objectivity. One has to pretend that one’s opinions or findings have emanated from some office of higher truth where rigor and science are the managers on duty.
Second, these texts are untentative: they know what they want to argue before they begin, stealthily making their case, anticipating any objections, aiming for air-tightness. These texts are not attempts; they are obstinacies. They are fortresses. Leaving the reader uninvited to this textual engagement, the writer makes it clear he or she would rather drink alone.
What is perhaps most interesting about the essay is what happens when it cannot be contained by its generic borders, leaking outside the short prose form into other formats such as the essayistic novel, the essay-film, the photo-essay, and life itself. In his unfinished novel “The Man Without Qualities,” the early 20th-century Austrian writer Robert Musil coined a term for this leakage. He called it “essayism” (Essayismus in German) and he called those who live by it “possibilitarians” (Möglichkeitsmenschen). This mode is defined by contingency and trying things out digressively, following this or that forking path, feeling around life without a specific ambition: not for discovery’s sake, not for conquest’s sake, not for proof’s sake, but simply for the sake of trying.
The possibilitarian is a virtuoso of the hypothetical. One of my dissertation advisers, Thomas Harrison, wrote a handsome book on the topic called “Essayism: Conrad, Musil, and Pirandello,” in which he argues that the essayism Musil sought to describe was a “solution in the absence of a solution,” a fuzzy response to Europe’s precarity during the years he worked on his unfinishable masterpiece. I would argue that many of us in contemporary America these days are prone to essayism, in various guises, but always in the spirit of open-endedness and with serious reservations about committing to any one thing.
Essayism consists in a self-absorbed subject feeling around life, exercising what Theodor Adorno called the “essay’s groping intention,” approaching everything tentatively and with short attention, drawing analogies between the particular and the universal. Banal, everyday phenomena — what we eat, things upon which we stumble, things that Pinterest us — rub elbows implicitly with the Big Questions: What are the implications of the human experience? What is the meaning of life? Why something rather than nothing? Like the Father of the Essay, we let the mind and body flit from thing to thing, clicking around from mental hyperlink to mental hyperlink: if Montaigne were alive today, maybe he too would be diagnosed with A.D.H.D.
The essayist is interested in thinking about himself thinking about things. We believe our opinions on everything from politics to pizza parlors to be of great import. This explains our generosity in volunteering them to complete strangers. And as D.I.Y. culture finds its own language today, we can recognize in it Arthur Benson’s dictum from 1922 that, “An essay is a thing which someone does himself.”
In Italian, the word for essay is “saggio” and contains the same root as the term “assaggiare,” which means to sample, taste or nibble food. Today, we like to sample, taste or nibble experiences: Internet dating, speed dating, online shopping and buy-and-try consumerism, mash-ups and digital sampling, the money-back guarantee, the temporary tattoo, the test-drive, shareware. If you are not satisfied with your product, your writing, your husband, you may return/delete/divorce it. The essay, like many of us, is notoriously noncommittal.
I certainly don’t argue that no one is committing these days; it only takes a few moments of exposure to contemporary American political discourse to realize the extent of dogmatic commitment to this or that party, to this or that platform. However, for many, the certainty with which the dogmatists make their pronouncements feels increasingly like a bothersome vestige of the past. We can either cling rigidly to dissolving categories or we can let ambivalence wash over us, allowing its tide to carry us toward new life configurations that were inconceivable even 20 years ago. Essayism, when imagined as a constructive approach to existence, is a blanket of possibilities draped consciously on the world.
--------------------------------------------------------------------------------
Essayism is predicated on at least three things: personal stability, technocratic stability and societal instability.
[Image caption: Michel de Montaigne.]
Montaigne certainly possessed the first. He grew up in a privileged family, spoke Latin before French, had the educational, financial and social means to lead a life of civic engagement and writing. While most of us probably didn’t know fluent Latin as children (and never will) and aren’t in a position to become high-ranking civil servants, we have a relatively high literacy rate and unprecedented access to technologies of communication and reserves of knowledge. Furthermore, as a counter-narrative to our supposed busy-ness, there’s lots of evidence that we have plenty of idle time on our hands. Despite our search for distractions in any form, these empty hours give us time to contemplate the hardships of contemporary life. The thoughts just creep in if given the means.
Regarding technocracy, the maturation of print culture during the Renaissance meant that the great texts of Antiquity and newer philosophical, literary and scientific materials could reach a wider audience, albeit mainly composed of people of privilege. The experts of science and technology at that time siphoned some of the power that had been monopolized by the church and the crown. We could draw a similar analogy today: Silicon Valley and the technocratic business class still force the church and the state to share much of their cultural power. The essay thrives under these conditions.
As for societal instability, life outside Montaigne’s château was not rosy: the Wars of Religion between Catholics and Protestants raged in France starting in the 1560s. Turmoil and uncertainty, dogmatism and blood: such circumstances make one reflect on the meaning of life, but it is sometimes too hard to look such a question right in the face. Instead, one asks it obliquely by wondering about those smallnesses that make up the human experience. Today, unresolved issues of class, race, gender, sexual orientation, political affiliation and other categories have created a volatile social dynamic, and, with our current economic instability to boot, it is no wonder that throwing oneself wholeheartedly toward any particular idea or endeavor seems a risky proposition to many of us. Finally, the bloody wars of religion and ideology continue to rage on in our time. In the early 20th century, when the French writer André Malraux predicted that the 21st century would be a century of renewed mysticism, he perhaps did not imagine that the pursuit of God would take such a politically volatile form.
Essayism, as an expressive mode and as a way of life, accommodates our insecurities, our self-absorption, our simple pleasures, our unnerving questions and the need to compare and share our experiences with other humans. I would argue that the weakest component in today’s nontextual essayism is its meditative deficiency. Without the meditative aspect, essayism tends toward empty egotism and an unwillingness or incapacity to commit, a timid deferral of the moment of choice. Our often unreflective quickness means that little time is spent interrogating things we’ve touched upon. The experiences are simply had and then abandoned. The true essayist prefers a more cumulative approach; nothing is ever really left behind, only put aside temporarily until her digressive mind summons it up again, turning it this way and that in a different light, seeing what sense it makes. She offers a model of humanism that isn’t about profit or progress and does not propose a solution to life but rather puts endless questions to it.
We need a cogent response to the renewed dogmatism of today’s political and social landscape and our intuitive attraction to the essay could be pointing us toward this genre and its spirit as a provisional solution. Today’s essayistic tendency — a series of often superficial attempts relatively devoid of thought — doesn’t live up to this potential in its current iteration, but a more meditative and measured version à la Montaigne would nudge us toward a calm taking into account of life without the knee-jerk reflex to be unshakeably right. The essayification of everything means turning life itself into a protracted attempt.
The essay, like this one, is a form for trying out the heretofore untried. Its spirit resists closed-ended, hierarchical thinking and encourages both writer and reader to postpone their verdict on life. It is an invitation to maintain the elasticity of mind and to get comfortable with the world’s inherent ambivalence. And, most importantly, it is an imaginative rehearsal of what isn’t but could be.
RELATED: “How to Live Without Irony” by Christy Wampole.
--------------------------------------------------------------------------------
Christy Wampole is an assistant professor of French at Princeton University. Her research focuses primarily on 20th- and 21st-century French and Italian literature and thought.
The Unfraying of the US
A Nation, Its Seams Fraying: ‘The Unwinding,’ by George Packer
By DWIGHT GARNER
Published: May 28, 2013
If you were to take apart George Packer’s ambitious new book, “The Unwinding,” as if it were a car’s engine, and spread the parts across your garage, you’d essentially be looking at five large pieces and 10 small ones — the nuts and bolts and cotter pins.
THE UNWINDING
An Inner History of the New America
By George Packer
434 pages. Farrar, Straus & Giroux. $27.
The large pieces are profiles: portraits of a Reagan Republican turned biodiesel entrepreneur; a thoughtful and disappointed longtime Joe Biden staffer; a female factory worker in Youngstown, Ohio, who becomes a community organizer; Peter Thiel, the libertarian Silicon Valley venture capitalist; and, finally, the City of Tampa in Florida, which had problems before the foreclosure crisis and seems like hell on earth now.
The small pieces are critical riffs, often acidic, on especially influential Americans of the past few decades. I’ll list them here in reverse order of Mr. Packer’s esteem for what each has brought to the commonweal: Sam Walton, Newt Gingrich, Robert E. Rubin, Andrew Breitbart, Colin L. Powell, Jay-Z, Oprah Winfrey, Alice Waters, Raymond Carver and Elizabeth Warren.
Some of the large pieces, which are chopped up and welded onto the rest in roughly 20-page blocks, began as articles in The New Yorker, where Mr. Packer is a staff writer. Other material is new.
It is Mr. Packer’s achievement in “The Unwinding” that these pieces, freshly shuffled and assembled, have speed and power to burn. This book hums — with sorrow, with outrage and with compassion for those who are caught in the gears of America’s increasingly complicated (and increasingly poorly calibrated) financial machinery.
“The Unwinding” begins like a horror novel, which in some regards it is. “No one can say when the unwinding began,” Mr. Packer writes, “when the coil that held Americans together in its secure and sometimes stifling grip first gave way.”
If you were born after 1960, Mr. Packer suggests, you have spent much of your life watching structures long in place collapsing — things like farms, factories, subdivisions and public schools on the one hand, and “ways and means in Washington caucus rooms, taboos on New York trading desks” and “manners and morals everywhere” on the other.
What has replaced them, he says, is organized money, as well as a society in which “winners win bigger than ever, floating away like bloated dirigibles, and losers have a long way to fall before they hit bottom, and sometimes they never do.”
If a solitary fact can stand in for Mr. Packer’s arguments in “The Unwinding,” it is probably this one, about the heirs to Walton’s Walmart fortune: “Eventually six of the surviving Waltons,” the author writes, “would have as much money as the bottom 30 percent of Americans.”
It was only after Walton’s death, Mr. Packer says, “that the country began to understand what his company had done.” He writes: “Over the years, America had become more like Walmart. It had gotten cheap. Prices were lower, and wages were lower. There were fewer union factory jobs, and more part-time jobs as store greeters.” He adds: “The hollowing out of the heartland was good for the company’s bottom line.”
“The Unwinding” contains many sweeping, wide-angle views of American life. Its portraits of Youngstown, Ohio; Tampa; Silicon Valley; Washington; and Wall Street are rich, complex and interlocking. Mr. Packer’s gifts are Steinbeckian in the best sense of that term.
Amid this narrative push are many small, memorable moments. The assessment of Mr. Biden is complicated and sometimes positive, but it includes these sentences, from one Biden insider to another: “Jeff, don’t take this personally. Biden disappoints everyone. He’s an equal-opportunity disappointer.”
Mr. Packer, whose previous books include “The Assassins’ Gate: America in Iraq” (2005), describes how Mr. Gingrich’s rhetoric, when he came to power in the late 1980s, forever changed the way elected leaders spoke to one another: “He gave them mustard gas, and they used it on every conceivable enemy, including him.”
He has a few complimentary things to say about Ms. Winfrey, but his section about her amounts to a comprehensive takedown. About her audience he maintains: “They had things that she didn’t — children, debts, spare time. They consumed the products that she advertised but would never buy — Maybelline, Jenny Craig, Little Caesars, Ikea. As their financial troubles grew, she would thrill them by selecting one of them and wiping out her debts on the air.”
He goes on: “Being instructed in Oprah’s magical thinking (vaccinations cause autism; positive thoughts lead to wealth, love, and success), and watching Oprah always doing more, owning more, not all her viewers began to live their best life.” It gets harsher from there.
Barack Obama’s presidency hovers at the margins of this book, largely as a somewhat disappointing work in progress. We do hear from a man who shakes the president’s hand and thinks: “It was the softest of any man he’d ever shaken hands with. It told him that Obama had never done a lick of physical work in his life.”
“The Unwinding” is a painful book to read. It made me feel ill, as if I’d contracted a three-day flu. Perhaps Mr. Packer put this thought in my mind. He frequently refers to what’s happening to America in medical terms — as an illness, a new virus, a plague, a bacterial infection.
Among this book’s few heroes is Ms. Warren, the former Harvard Law School professor and bankruptcy expert who is now the senior United States Senator from Massachusetts. “The Unwinding” is largely about how banks have become unchecked and unholy forces in American life, and part of what Mr. Packer likes about Ms. Warren, a Democrat, is that banks fear her.
His book specializes in plain talk, and in Ms. Warren he spies a rare politician with a gift for the same quality. Mr. Packer describes one of her appearances about banking this way:
“She seemed to have walked into the hearing room and taken her seat at the dais out of the past, from the era when the American prairie raised angry and eloquent champions of the common people, William Jennings Bryan and Robert La Follette, George Norris and Hubert Humphrey. Her very presence made insiders uneasy because it reminded them of the cozy corruption that had become the normal way of doing business around Capitol Hill. And that was unforgivable.”
At one point in “The Unwinding” we meet a talented reporter in Florida who is writing about the foreclosure mess. This reporter, we read, “believed that there were two kinds of journalists — the ones who told stories, and the ones who uncovered wrongdoing.”
Mr. Packer is both, and he’s written something close to a nonfiction masterpiece.
Rename Those Southern Forts?
More South-Bashing!
by Michael Tomasky May 28, 2013 2:58 PM EDT
Did you all see Jamie Malanowski's provocative op-ed in the Times over the weekend arguing that we should rename the 10 US military facilities currently named after Confederate generals? After all, he writes, they were traitors to the US of A:
Fort Lee, in Virginia, is of course named for Robert E. Lee, a man widely respected for his integrity and his military skills. Yet, as the documentarian Ken Burns has noted, he was responsible for the deaths of more Army soldiers than Hitler and Tojo. John Bell Hood, for whom Fort Hood, Tex., is named, led a hard-fighting brigade known for ferocious straight-on assaults. During these attacks, Hood lost the use of an arm at Gettysburg and a leg at Chickamauga, but he delivered victories, at least for a while. Later, when the gallant but tactically inflexible Hood launched such assaults at Nashville and Franklin, Tenn., his armies were smashed.
This base-naming is part of a much larger problem, of course, which is the North's (and Lincoln's) overly forgiving posture, the insistence on the idea that we must become brothers again. Now, it's certainly true that the North occupied the South. And you had the carpetbaggers and all that, but as occupations go, it wasn't so brutal. In important ways, Southerners were welcomed back into the union.
More than that, the North gave the South the post-war narrative, as historian David Blight has shown, so that throughout the 1880s and 1890s and into the 20th century, the rebs were able to propagate all that Lost Cause nonsense that still really continues down there today. I was reading Josh Marshall earlier today, and he got an email from a guy who, having been raised down South, didn't even realize he'd been on the wrong side of the Civil War until he got to college. Not entirely clear whether by "wrong" he meant losing or morally wrong, but of course it was both anyway.
Lee and Longstreet and the others were traitors pure and simple. And worse than that, they were traitors in defense of slavery. Just let that thought marinate for a second. Not only did Robert E. Lee accept training from America's premier military academy and then turn around and use that training to kill loyal Americans. He did all that in defense of slavery.
I have some friends who think the whole lot of them, including the cabinet of the CSA, should have been hanged. I wouldn't go that far. That only would have given the aggrieved losers a few beloved martyrs. Always deny an aggrieved people their martyrs. History teaches this clearly, I think. A living, doddering Lee was far less useful to the pitchfork crowd than a hanged, virile Lee would have been. So no, no hangings. I guess that makes me a moderate on the question!
And for the record, I suppose it's probably too late to change these names. At Pentagon procurement prices, the cost of switching the stationery alone would be astronomical. So I guess we just have to, uh, soldier on.
It could perhaps be done, but the paradox is that it could be accomplished only by a president who would never propose it (a conservative Southerner). Could you imagine if Obama tried to float these renamings? We'd have another Civil War.
Tuesday, May 28, 2013
Henry Wiencek - Master of the Mountain
This book could have been called "Monster of the Mountain," for this is a no-holds-barred attack on the wicked Thomas Jefferson by an "independent" scholar who actually lives in Charlottesville. I can't imagine the author is very popular in Mr. Jefferson's town.
Published last October, this book has attracted lots of comments, pro and con. No matter what anyone says, Jefferson will always have his defenders and his attackers.
Some scholars will NEVER forgive Jefferson for his record on slavery. The author of the Declaration of Independence was one of the biggest slave holders in Virginia. He freed a few of his slaves before he died, but most were auctioned off after his passing to pay down his debts. Jefferson died insolvent.
It is hard to come to a definitive opinion of Mr. Jefferson. I would say at this point that it is impossible.
Somehow the defenders of Thomas Jefferson always find a way to exonerate him. This author says no way. The problem is the larger one of coming to grips with this country's history of slavery. I do not see how we informed Americans can ever excuse or come to rational grips with our history of slavery. We can't fool ourselves anymore. Where do we go from here?
Heroes of Uncertainty
Just think: Once I wanted to be a clinical psychologist. I dodged a bullet there!
FLH
Heroes of Uncertainty
By DAVID BROOKS
Published: May 27, 2013
We’re living in an empirical age. The most impressive intellectual feats have been achieved by physicists and biologists, and these fields have established a distinctive model of credibility.
To be an authoritative figure, you want to be coolly scientific. You want to possess an arcane body of technical expertise. You want your mind to be a neutral instrument capable of processing complex quantifiable data.
The people in the human sciences have tried to piggyback on this authority model. For example, the American Psychiatric Association has just released the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders. It is the basic handbook of the field. It defines the known mental diseases. It creates stable standards, so that insurance companies can recognize various diagnoses and be comfortable with the medications prescribed to treat them.
The recent editions of this manual exude an impressive aura of scientific authority. They treat mental diseases like diseases of the heart and liver. They leave the impression that you should go to your psychiatrist because she has a vast body of technical knowledge that will allow her to solve your problems. With their austere neutrality, they leave a distinct impression: Psychiatrists are methodically treating symptoms, not people.
The problem is that the behavioral sciences like psychiatry are not really sciences; they are semi-sciences. The underlying reality they describe is just not as regularized as the underlying reality of, say, a solar system.
As the handbook’s many critics have noted, psychiatrists use terms like “mental disorder” and “normal behavior,” but there is no agreement on what these concepts mean. When you look at the definitions psychiatrists habitually use to define various ailments, you see that they contain vague words that wouldn’t pass muster in any actual scientific analysis: “excessive,” “binge,” “anxious.”
Mental diseases are not really understood the way, say, liver diseases are understood, as a pathology of the body and its tissues and cells. Researchers understand the underlying structure of very few mental ailments. What psychiatrists call a disease is usually just a label for a group of symptoms. As the eminent psychiatrist Allen Frances writes in his book, “Saving Normal,” a word like schizophrenia is a useful construct, not a disease: “It is a description of a particular set of psychiatric problems, not an explanation of their cause.”
Furthermore, psychiatric phenomena are notoriously protean in nature. Medicines seem to work but then stop. Because the mind is an irregular cosmos, psychiatry hasn’t been able to make the rapid progress that has become normal in physics and biology. As Martin Seligman, a past president of the American Psychological Association, put it in The Washington Post early this year, “I have found that drugs and therapy offer disappointingly little additional help for the mentally ill than they did 25 years ago — despite billions of dollars in funding.”
All of this is not to damn people in the mental health fields. On the contrary, they are heroes who alleviate the most elusive of all suffering, even though they are overmatched by the complexity and variability of the problems that confront them. I just wish they would portray themselves as they really are. Psychiatrists are not heroes of science. They are heroes of uncertainty, using improvisation, knowledge and artistry to improve people’s lives.
The field of psychiatry is better in practice than it is in theory. The best psychiatrists are not austerely technical, like the official handbook’s approach; they combine technical expertise with personal knowledge. They are daring adapters, perpetually adjusting in ways more imaginative than scientific rigor.
The best psychiatrists are not coming up with abstract rules that homogenize treatments. They are combining an awareness of common patterns with an acute attention to the specific circumstances of a unique human being. They certainly are not inventing new diseases in order to medicalize the moderate ailments of the worried well.
If the authors of the psychiatry manual want to invent a new disease, they should put Physics Envy in their handbook. The desire to be more like the hard sciences has distorted economics, education, political science, psychiatry and other behavioral fields. It’s led practitioners to claim more knowledge than they can possibly have. It’s devalued a certain sort of hybrid mentality that is better suited to these realms, the mentality that has one foot in the world of science and one in the liberal arts, that involves bringing multiple vantage points to human behavior.
Hippocrates once observed, “It’s more important to know what sort of person has a disease than to know what sort of disease a person has.” That’s certainly true in the behavioral sciences and in policy making generally, though these days it is often a neglected truth.
Monday, May 27, 2013
Obamacare Shock Coming?
The Obamacare Shock
By PAUL KRUGMAN
Published: May 26, 2013
The Affordable Care Act, a k a Obamacare, goes fully into effect at the beginning of next year, and predictions of disaster are being heard far and wide. There will be an administrative “train wreck,” we’re told; consumers will face a terrible shock. Republicans, one hears, are already counting on the law’s troubles to give them a big electoral advantage.
No doubt there will be problems, as there are with any large new government initiative, and in this case, we have the added complication that many Republican governors and legislators are doing all they can to sabotage reform. Yet important new evidence — especially from California, the law’s most important test case — suggests that the real Obamacare shock will be one of unexpected success.
Before I can explain what the news means, I need to make a crucial point: Obamacare is a deeply conservative reform, not in a political sense (although it was originally a Republican proposal) but in terms of leaving most people’s health care unaffected. Americans who receive health insurance from their employers, Medicare or Medicaid — which is to say, the vast majority of those who have any kind of health insurance at all — will see almost no changes when the law goes into effect.
There are, however, millions of Americans who don’t receive insurance either from their employers or from government programs. They can get insurance only by buying it on their own, and many of them are effectively shut out of that market. In some states, like California, insurers reject applicants with past medical problems. In others, like New York, insurers can’t reject applicants, and must offer similar coverage regardless of personal medical history (“community rating”); unfortunately, this leads to a situation in which premiums are very high because only those with current health problems sign up, while healthy people take the risk of going uninsured.
Obamacare closes this gap with a three-part approach. First, community rating everywhere — no more exclusion based on pre-existing conditions. Second, the “mandate” — you must buy insurance even if you’re currently healthy. Third, subsidies to make insurance affordable for those with lower incomes.
Massachusetts has had essentially this system since 2006; as a result, nearly all residents have health insurance, and the program remains very popular. So we know that Obamacare — or, as some of us call it, ObamaRomneyCare — can work.
Skeptics argued, however, that Massachusetts was special: it had relatively few uninsured residents even before the reform, and it already had community rating. What would happen elsewhere? In particular, what would happen in California, where more than a fifth of the nonelderly population is uninsured, and the individual insurance market is largely unregulated? Would there be “sticker shock” as the price of individual policies soared?
Well, the California bids are in — that is, insurers have submitted the prices at which they are willing to offer coverage on the state’s newly created Obamacare exchange. And the prices, it turns out, are surprisingly low. A handful of healthy people may find themselves paying more for coverage, but it looks as if Obamacare’s first year in California is going to be an overwhelmingly positive experience.
What can still go wrong? Well, Obamacare is a complicated program, basically because simpler options, like Medicare for all, weren’t considered politically feasible. So there will probably be a lot of administrative confusion as the law goes into effect, again especially in states where Republicans have been doing their best to sabotage the process.
Also, some people are too poor to afford coverage even with the subsidies. These Americans were supposed to be covered by a federally financed expansion of Medicaid, but in states where Republicans have blocked Medicaid expansion, such unfortunates will be left out in the cold.
Still, here’s what it seems is about to happen: millions of Americans will suddenly gain health coverage, and millions more will feel much more secure knowing that such coverage is available if they lose their jobs or suffer other misfortunes. Only a relative handful of people will be hurt at all. And as contrasts emerge between the experience of states like California that are making the most of the new policy and that of states like Texas whose politicians are doing their best to undermine it, the sheer meanspiritedness of the Obamacare opponents will become ever more obvious.
So yes, it does look as if there’s an Obamacare shock coming: the shock of learning that a public program designed to help a lot of people can, strange to say, end up helping a lot of people — especially when government officials actually try to make it work.
Sunday, May 26, 2013
The Burke Biography
Reviewed: Edmund Burke: Philosopher, Politician, Prophet by Jesse Norman
History has no author.
By John Gray Published 16 May 2013 9:04
Jesse Norman
William Collins, 320pp, £20
Citing Edmund Burke’s view according to which “The temper of the people amongst whom he presides ought to be the first study of a statesman,” Jesse Norman comments: “This is a thought utterly foreign to contemporary notions of leadership, which focus on forward planning, motivating ideology, great programmes of legislation, decisive action and the vigour of a leader’s personal will.” These lines were written before Margaret Thatcher died, but it would be impossible to read them now without thinking of her.
No doubt the economic transformation commonly attributed to Thatcher by her friends and enemies is much exaggerated. Britain’s deindustrialisation began long before she came to power. Ongoing globalisation would have demolished the old industries along with the communities they supported and while she accelerated the process, the upshot might not have been too different had she never existed. At the same time, Thatcher did change British society and did so quite deliberately. Far from preserving “the temper of the people”, she altered it profoundly.
The results were far from those she expected and in some ways the opposite of what she wanted. The Tory England she inherited, which even the turbulence of the 1970s hadn’t greatly shaken, no longer exists. Patterns of deference that had survived the postwar Labour settlement are now barely memories. No institution – the BBC, the Church of England, universities, the police – has anything like the authority Thatcher took for granted (and in some cases fiercely resented).
As a consequence of her leadership, the Conservative Party is in some ways weaker than it has ever been. Turning it into an instrument of her personal will, she triggered a coup that has left every subsequent Tory leader on permanent probation. Alienating Scotland, she virtually wiped out her party north of the border and planted a large question mark over the Union. Within England, her indifference to the human costs of deindustrialisation deepened the north-south divide. The result is a hollowed-out and shrunken party that faces huge obstacles in ever again forming a government. For someone who has been described as the greatest Conservative leader since Churchill, it’s quite a list of achievements. If you wanted to shake up Britain and change it beyond recognition, Thatcher was, of all postwar leaders, the one most likely to have this effect.
Thatcher’s career illustrates the paradoxical pattern of democratic politics over the past 30 years. Society has been revolutionised by parties of the right, while those of the left have tagged along behind; but the impact of this right-wing revolution has been highly destabilising and the economic regime that the right put in place is presently in the throes of a major crisis. No one has any very clear ideas as to what to do next and the temptation is to turn for guidance to great thinkers of the past. Since the crash, the Keynes-Hayek debate of the 1930s has been rehashed time and again but this looks more like a symptom of intellectual fatigue than anything else. How can anyone imagine that debates waged over 70 years ago could resolve the dilemmas that an utterly different world confronts today?
Turning to Edmund Burke – who was born in 1729 – seems, on the face of it, even more perverse. But if Norman fails to show how Burke can lead us out of our current impasse, he presents an intriguing and illuminating picture of the thinker who more than any other exemplifies the contradictions of conservatism.
Dividing the book into two parts, one on Burke’s life and the other on his thought, could be problematical with a thinker whose ideas were so closely intertwined with the politics of his day. Some have argued that Burke’s thought was not much more than a weapon in conflicts within the late-18th-century English political elite – an idea supported by the historian Lewis Namier’s view of the politics of the period as being (as Norman puts it) “at root a matter not of grand parties and high principles but of personal self-interest expressed via an ever-shifting kaleidoscope of political factions”. Applying this view, it is possible to conclude that Burke – at times deeply in debt and heavily dependent on political patronage – was simply a stooge for powerful interests but Norman does a good job defending him against this accusation. Never entirely accepted in English society, the Irish-born writer and parliamentarian was too impassioned and wayward a character to be simply a hack.
Showing that Burke developed a coherent body of ideas is a harder task. Summarising what he sees as Burke’s chief themes, Norman writes: “He is effectively making a series of rather sophisticated and challenging philosophical points: that absolute consistency, however desirable in mathematics and logic, is neither available nor desirable in the conduct of human affairs; that universal principles are never sufficient in themselves to guide practical deliberation; and that it is a deep error to apply concepts from the exact sciences willy-nilly to the messy business of life.” There is nothing particularly original in any of this. Aristotle said much the same when he observed that it’s a mistake to look for a greater degree of precision in a subject than the nature of the subject allows. Where Burke is distinctive is in the political conclusions he draws from this insight.
While theorists such as Thomas Hobbes, John Locke and, later, Jean-Jacques Rousseau thought social institutions could be rebuilt on the basis of a set of principles, for Burke, institutions are the basis of our knowledge of society. His key insight was not that applying principles with strict consistency is destructive in politics, though he believed this to be the case. For him, principles were abstractions constructed from practical life, which meant participation in institutions. Giving priority to abstractions is inherently destructive because it gets things the wrong way round: principles have no authority aside from practice, he believed.
This wasn’t to say that reform is impossible or unnecessary. Burke was an active reformer, attacking British rule in India for damaging Indian traditions and impeaching the first governor general of Bengal, Warren Hastings, for corruption in a long but ultimately unsuccessful trial. However, for Burke, reform involved using standards that were already embedded in institutions. If he was a reformer who hated revolution, it was because he was first of all a traditionalist.
Burke’s view of reform as a type of immanent criticism has clear affinities with the ideas of later conservative thinkers such as Michael Oakeshott (1901-1990). Both were sharp critics of political rationalism – the view of politics in which it consists of projects aiming to reconstruct society on some kind of ideal model. These parallels are acknowledged by Norman, who comments that Oakeshott may have taken more from Burke than he admitted.
Oakeshott didn’t acknowledge such a debt – he mentions Burke only rarely in his writings, usually in negative terms, and in conversation was dismissive of Burke as a thinker. The two were at odds on some fundamental issues. Whereas Burke was a lifelong practising Anglican and a firm religious believer, Oakeshott was a religious sceptic – a difference with wide-ranging implications for how they understood politics. Burke viewed history in Whig terms as the steady advance of liberty and believed human progress was divine providence at work in human affairs. Oakeshott shared the view of Burke’s more perceptive contemporary David Hume, who saw the rise of liberty as a succession of accidents. For Oakeshott, as for Hume, history couldn’t be the story of liberty, for history had no author and no plot.
Burke was horrified by the French Revolution because the victory of what he regarded as, in essence, malign and regressive forces challenged his faith in providence. Curiously, religion is almost absent from Norman’s account of Burke’s thinking. Towards the end of the book, there is a brief discussion of the utility of religion in countering the spread of anomie and promoting an ethic of community. Yet for Burke, religion wasn’t something to be evaluated in terms of its benefits to society – it supplied the categories through which he understood the world. Without providence, there might still be moral advance in particular societies; but history would have no overall significance. It’s a result that Oakeshott was happy to accept but few conservatives today share his sangfroid.
The central role of religion in Burke’s thought tends to undercut some of the more extravagant claims Norman makes on his behalf. He writes that Burke is not only the “hinge or pivot of political modernity, the thinker on whose shoulders much of the Anglo-American tradition of representative government still rests”, but also “the earliest postmodern political thinker, the first and greatest critic of the modern age, and of what has been called liberal individualism, a set of basic assumptions about human nature and human well-being that arose in the 19th century, long after Burke’s death, in reflection on the Enlightenment, and that govern the lives of millions, nay billions, of people today”.
It’s true that Burke anticipated some of the pathologies of individualism and (while being in many ways himself a product of the Enlightenment) identified important weaknesses in Enlightenment thinking – but the earliest postmodern political thinker? Come off it. The grand narrative of human progress that Burke inherited along with the idea of providence and, despite the French Revolution, never renounced clearly rules him out. If you are looking for the first postmodern philosopher, the sceptical Michel de Montaigne is a much better candidate.
The irony of Burke’s conservatism is that it has worked against the type of politics he favoured. Thatcher is not mentioned in Norman’s book, even though, more than any other 20th-century prime minister, she promoted the liberal individualist philosophy whose corrosive impact on society Burke presciently diagnosed. Norman has been an active promoter of “compassionate conservatism”. Portraying Burke as a critic of liberal individualism may be a way of writing Thatcher out of Conservative history. As a political strategy, it has its attractions – though David Cameron has wavered in applying it.
The contradictions in Burke and in conservatism remain unresolved – and irresolvable. Thatcher was a professed admirer of Hayek and Hayek an admirer of Burke; but Hayek wrote a postscript to his major work The Constitution of Liberty entitled “Why I Am Not a Conservative” and it was Burke the progressive Whig, not Burke the Tory defender of institutions, whom Hayek revered.
Like Burke, Thatcher had a vision of a social order in which individual and society were melded harmoniously together. She never understood that this vision was incompatible with the economic ethos she preached. This isn’t because that ethos promoted selfishness, as has so often been asserted. What Thatcher did was subtler and more enduring in its effects. By insisting that economic progress must come before anything else, she turned social institutions into more or less efficient means of achieving whatever is presently desired. Institutions ceased to be places in which people could find meaning and became mere tools. The result is the situation that exists today in Britain, where no institution is “fit for purpose”.
Unwittingly, Thatcher practised a revolutionary mode of politics of the kind Burke derided. At the same time, she came to see the settlement she put in place as a chapter in a Burkean grand narrative of liberty. Unsurprisingly, this settlement has now collapsed. The contradictions of conservatism are inherent in Burke’s thinking and looking back to this over-praised worthy won’t help anyone discern the way ahead.
John Gray is the New Statesman’s lead reviewer. His latest book is “The Silence of Animals: On Progress and Other Modern Myths” (Allen Lane, £18.99)
By John Gray Published 16 May 2013 9:04
Jesse Norman
William Collins, 320pp, £20
Citing Edmund Burke’s view according to which “The temper of the people amongst whom he presides ought to be the first study of a statesman,” Jesse Norman comments: “This is a thought utterly foreign to contemporary notions of leadership, which focus on forward planning, motivating ideology, great programmes of legislation, decisive action and the vigour of a leader’s personal will.” They were written before she died but it would be impossible to read these lines without thinking of Margaret Thatcher.
No doubt the economic transformation commonly attributed to Thatcher by her friends and enemies is much exaggerated. Britain’s deindustrialisation began long before she came to power. Ongoing globalisation would have demolished the old industries along with the communities they supported and while she accelerated the process, the upshot might not have been too different had she never existed. At the same time, Thatcher did change British society and did so quite deliberately. Far from preserving “the temper of the people”, she altered it profoundly.
The results were far from those she expected and in some ways the opposite of what she wanted. The Tory England she inherited, which even the turbulence of the 1970s hadn’t greatly shaken, no longer exists. Patterns of deference that had survived the postwar Labour settlement are now barely memories. No institution – the BBC, the Church of England, universities, the police – has anything like the authority Thatcher took for granted (and in some cases fiercely resented).
As a consequence of her leadership, the Conservative Party is in some ways weaker than it has ever been. Turning it into an instrument of her personal will, she triggered a coup that has left every subsequent Tory leader on permanent probation. Alienating Scotland, she virtually wiped out her party north of the border and planted a large question mark over the Union. Within England, her indifference to the human costs of de - industrialisation deepened the north-south divide. The result is a hollowed-out and shrunken party that faces huge obstacles in ever again forming a government. For someone who has been described as the greatest Conservative leader since Churchill, it’s quite a list of achievements. If you wanted to shake up Britain and change it beyond recognition, Thatcher was, of all postwar leaders, the one mostly likely to have this effect.
Thatcher’s career illustrates the paradoxical pattern of democratic politics over the past 30 years. Society has been revolutionised by parties of the right, while those of the left have tagged along behind; but the impact of this right-wing revolution has been highly destabilising and the economic regime that the right put in place is presently in the throes of a major crisis. No one has any very clear ideas as to what to do next and the temptation is to turn for guidance to great thinkers of the past. Since the crash, the Keynes-Hayek debate of the 1930s has been rehashed time and again but this looks more like a symptom of intellectual fatigue than anything else. How can anyone imagine that debates waged over 70 years ago could resolve the dilemmas that an utterly different world confronts today?
Turning to Edmund Burke –who was born in 1729 – seems, on the face of it, even more perverse. But if Norman fails to show how Burke can lead us out of our current impasse, he presents an intriguing and illuminating picture of the thinker who more than any other exemplifies the contradictions of conservatism.
Dividing the book into two parts, one on Burke’s life and the other on his thought, could be problematical with a thinker whose ideas were so closely intertwined with the politics of his day. Some have argued that Burke’s thought was not much more than a weapon in conflicts within the late-18thcentury English political elite – an idea supported by the historian Lewis Namier’s view of the politics of the period as being (as Norman puts it) “at root a matter not of grand parties and high principles but of personal self-interest expressed via an ever-shifting kaleidoscope of political factions”. Applying this view, it is possible to conclude that Burke – at times deeply in debt and heavily dependent on political patronage – was simply a stooge for powerful interests but Norman does a good job defending him against this accusation. Never entirely accepted in English society, the Irish-born writer and parliamentarian was too impassioned and wayward a character to be simply a hack.
Showing that Burke developed a coherent body of ideas is a harder task. Summarising what he sees as Burke’s chief themes, Norman writes: “He is effectively making a series of rather sophisticated and challenging philosophical points: that absolute consistency, however desirable in mathematics and logic, is neither available nor desirable in the conduct of human affairs; that universal principles are never sufficient in themselves to guide practical deliberation; and that it is a deep error to apply concepts from the exact sciences willy-nilly to the messy business of life.” There is nothing particularly original in any of this. Aristotle said much the same when he observed that it’s a mistake to look for a greater degree of precision in a subject than the nature of the subject allows. Where Burke is distinctive is in the political conclusions he draws from this insight.
While theorists such as Thomas Hobbes, John Locke and, later, Jean-Jacques Rousseau thought social institutions could be rebuilt on the basis of a set of principles, for Burke, institutions are the basis of our knowledge of society. His key insight was not that applying principles with strict consistency is destructive in politics, though he believed this to be the case. For him, principles were abstractions constructed from practical life, which meant participation in institutions. Giving priority to abstractions is inherently destructive because it gets things the wrong way round: principles have no authority aside from practice, he believed.
This wasn’t to say that reform is impossible or unnecessary. Burke was an active reformer, attacking British rule in India for damaging Indian traditions and impeaching the first governor general of Bengal, Warren Hastings, for corruption in a long but ultimately unsuccessful trial. However, for Burke, reform involved using standards that were already embedded in institutions. If he was a reformer who hated revolution, it was because he was first of all a traditionalist.
Burke’s view of reform as a type of immanent criticism has clear affinities with the ideas of later conservative thinkers such as Michael Oakeshott (1901-1990). Both were sharp critics of political rationalism – the view of politics in which it consists of projects aiming to reconstruct society on some kind of ideal model. These parallels are acknowledged by Norman, who comments that Oake - shott may have taken more from Burke than he admitted.
Oakeshott didn’t acknowledge such a debt – he mentions Burke only rarely in his writings, usually in negative terms, and in conversation was dismissive of Burke as a thinker. The two were at odds on some fundamental issues. Whereas Burke was a lifelong practising Anglican and a firm religious believer, Oakeshott was a religious sceptic – a difference with wide-ranging implications for how they understood politics. Burke viewed history in Whig terms as the steady advance of liberty and believed human pro - gress was divine providence at work in human affairs. Oakeshott shared the view of Burke’s more perceptive contemporary David Hume, who saw the rise of liberty as a succession of accidents. For Oakeshott, as for Hume, history couldn’t be the story of liberty, for history had no author and no plot.
Burke was horrified by the French Revolution because the victory of what he regarded as, in essence, malign and regressive forces challenged his faith in providence. Curiously, religion is almost absent from Norman’s account of Burke’s thinking. Towards the end of the book, there is a brief discussion of the utility of religion in countering the spread of anomie and promoting an ethic of community. Yet for Burke, religion wasn’t something to be evaluated in terms of its benefits to society – it supplied the categories through which he understood the world. Without providence, there might still be moral advance in particular societies; but history would have no overall significance. It’s a result that Oakeshott was happy to accept but few conservatives today share his sangfroid.
The central role of religion in Burke’s thought tends to undercut some of the more extravagant claims Norman makes on his behalf. He writes that Burke is not only the “hinge or pivot of political modernity, the thinker on whose shoulders much of the Anglo-American tradition of representative government still rests”, but also “the earliest postmodern political thinker, the first and greatest critic of the modern age, and of what has been called liberal individualism, a set of basic assumptions about human nature and human well-being that arose in the 19th century, long after Burke’s death, in reflection on the Enlightenment, and that govern the lives of millions, nay billions, of people today”.
It’s true that Burke anticipated some of the pathologies of individualism and (while being in many ways himself a product of the Enlightenment) identified important weaknesses in Enlightenment thinking – but the earliest postmodern political thinker? Come off it. The grand narrative of human progress that Burke inherited along with the idea of providence and, despite the French Revolution, never renounced clearly rules him out. If you are looking for the first postmodern philosopher, the sceptical Michel de Montaigne is a much better candidate.
The irony of Burke’s conservatism is that it has worked against the type of politics he favoured. Thatcher is not mentioned in Norman’s book, even though, more than any other 20th-century prime minister, she promoted the liberal individualist philosophy whose corrosive impact on society Burke presciently diagnosed. Norman has been an active promoter of “compassionate conservatism”. Portraying Burke as a critic of liberal individualism may be a way of writing Thatcher out of Conservative history. As a political strategy, it has its attractions – though David Cameron has wavered in applying it.
The contradictions in Burke and in conservatism remain unresolved – and irresolvable. Thatcher was a professed admirer of Hayek and Hayek an admirer of Burke; but Hayek wrote a postscript to his major work The Constitution of Liberty entitled “Why I Am Not a Conservative” and it was Burke the progressive Whig, not Burke the Tory defender of institutions, whom Hayek revered.
Like Burke, Thatcher had a vision of a social order in which individual and society were melded harmoniously together. She never understood that this vision was incompatible with the economic ethos she preached. This isn’t because that ethos promoted selfishness, as has so often been asserted. What Thatcher did was subtler and more enduring in its effects. By insisting that economic progress must come before anything else, she turned social institutions into more or less efficient means of achieving whatever is presently desired. Institutions ceased to be places in which people could find meaning and became mere tools. The result is the situation that exists today in Britain, where no institution is “fit for purpose”.
Unwittingly, Thatcher practised a revolutionary mode of politics of the kind Burke derided. At the same time, she came to see the settlement she put in place as a chapter in a Burkean grand narrative of liberty. Unsurprisingly, this settlement has now collapsed. The contradictions of conservatism are inherent in Burke’s thinking and looking back to this over-praised worthy won’t help anyone discern the way ahead.
John Gray is the New Statesman’s lead reviewer. His latest book is “The Silence of Animals: On Progress and Other Modern Myths” (Allen Lane, £18.99).
Where Nagel Went Wrong?
The Chronicle Review
5/13/13
Where Thomas Nagel Went Wrong
The philosopher's critique of evolution wasn't shocking. So why is he being raked over the coals?
By Michael Chorost
Thomas Nagel is a leading figure in philosophy, now enjoying the title of university professor at New York University, a testament to the scope and influence of his work. His 1974 essay "What Is It Like to Be a Bat?" has been read by legions of undergraduates, with its argument that the inner experience of a brain is truly knowable only to that brain. Since then he has published 11 books, on philosophy of mind, ethics, and epistemology.
But Nagel's academic golden years are less peaceful than he might have wished. His latest book, Mind and Cosmos (Oxford University Press, 2012), has been greeted by a storm of rebuttals, ripostes, and pure snark. "The shoddy reasoning of a once-great thinker," Steven Pinker tweeted. The Weekly Standard quoted the philosopher Daniel Dennett calling Nagel a member of a "retrograde gang" whose work "isn't worth anything—it's cute and it's clever and it's not worth a damn."
The critics have focused much of their ire on what Nagel calls "natural teleology," the hypothesis that the universe has an internal logic that inevitably drives matter from nonliving to living, from simple to complex, from chemistry to consciousness, from instinctual to intellectual.
This internal logic isn't God, Nagel is careful to say. It is not to be found in religion. Still, the critics haven't been mollified. According to orthodox Darwinism, nature has no goals, no direction, no inevitable outcomes. Jerry Coyne, an evolutionary biologist at the University of Chicago, is among those who took umbrage. When I asked him to comment for this article, he wrote, "Nagel is a teleologist, and although not an explicit creationist, his views are pretty much anti-science and not worth highlighting. However, that's The Chronicle's decision: If they want an article on astrology (which is the equivalent of what Nagel is saying), well, fine and good."
The odd thing is, however, that for all of this academic high dudgeon, there actually are scientists—respected ones, Nobel Prize-winning ones—who are saying exactly what Nagel said, and have been saying it for decades. Strangely enough, Nagel doesn't mention them. Neither have his critics. This whole imbroglio about the philosophy of science has left out the science.
Nagel didn't help his cause by (a) being a philosopher opining on science; (b) being alarmingly nice to intelligent-design theorists; and (c) writing in a convoluted style that made him sound unconvinced of his own ideas.
The Florida State University philosopher of science Michael Ruse, who has written extensively about arguments over Darwinian theory, says Nagel is a horse who broke into the zebra pen. Evolutionary biologists don't like it when philosophers try to tell them their business: "When you've got a leader of a professional field who comes in and says, as a philosopher, 'I want to tell you all that Darwinian evolutionary theory is full of it,' then of course it's a rather different kettle of fish."
Joan Roughgarden, an ecologist and evolutionary biologist at the Hawaii Institute of Marine Biology, agrees that evolutionary biologists can be nasty when crossed. "I mean, these guys are impervious to contrary evidence and alternative formulations," she says. "What we see in evolution is stasis—conceptual stasis, in my view—where people are ardently defending their formulations from the early 70s."
Nagel really got their noses out of joint by sympathizing with theorists of intelligent design. "They do not deserve the scorn with which they are commonly met," he wrote. "It is manifestly unfair." To be sure, he was not agreeing with them. He notes several times that he is an atheist and has no truck with supernatural gods. He views the ID crowd the way a broad-minded capitalist would sum up Marx: right in his critique, wrong in his solutions. But ID, he says, does contain criticisms of evolutionary theory that should be taken seriously.
Whatever the validity of this stance, its timing was certainly bad. The war between New Atheists and believers has become savage, with Richard Dawkins writing sentences like, "I have described atonement, the central doctrine of Christianity, as vicious, sadomasochistic, and repellent. We should also dismiss it as barking mad. ..." In that climate, saying anything nice at all about religion is a tactical error.
And Nagel is diffident about his ideas. Take this sentence, which packs four negatives into 25 words: "I am not confident that this Aristotelian idea of teleology without intention makes sense, but I do not at the moment see why it doesn't." Mind and Cosmos is full of such negatively phrased assertions. If you're going to make a controversial claim, it helps to do so positively. It also helps to enlist distinguished allies. Nagel has done nothing of the sort. Which is strange, because he has plenty of allies to choose from.
Natural teleology is unorthodox, but it has a long and honorable history. For example, in 1953 the evolutionary biologist Julian Huxley argued that it's in the nature of nature to get more advanced over time. "If we take a snapshot view, improvement eludes us," he wrote. "But as soon as we introduce time, we see trends of improvement."
More recently, Kevin Kelly, founding editor of Wired, made the case for teleology as clearly as could be in his book What Technology Wants: "Evolution ... has an inherent direction, shaped by the nature of matter and energy." That is, there may be laws of nature that push the universe toward the creation of life and mind. Not a supernatural god, but laws as basic and fundamental as those of thermodynamics. Robert Wright said much the same in Nonzero: The Logic of Human Destiny: "This book is a full-throated argument for destiny in the sense of direction." Those books prompted discussion among the literati but little backlash from evolutionary biologists. Ruse thinks that's because the authors are science writers, not scientists: "At a certain level, it's their job either to give the science or to put forward provocative hypotheses, and nobody takes it personally."
But highly regarded scientists have made similar arguments. "Life is almost bound to arise, in a molecular form not very different from its form on Earth," wrote Christian de Duve, a Nobel laureate in physiology or medicine, in 1995. Robert Hazen, a mineralogist and biogeologist at the Carnegie Institution for Science, struck a similar note in 2007: "With autotrophy, biochemistry is wired into the universe. The self-made cell emerges from geochemistry as inevitably as basalt or granite." Harold J. Morowitz, a biophysicist at George Mason University, argued that evolution has an arrow built into it: "We start with observations, and if the evolving cosmos has an observed direction, rejecting that view is clearly nonempirical. There need not necessarily be a knowable end point, but there may be an arrow."
Nagel discusses none of that work. He asserts only that evolution is directional, without making a case for it. That has left him open to a number of obvious rebuttals. The biologist H. Allen Orr, at the University of Rochester, pointed out that some species become less complex—parasites, for example, after learning how to steal resources from their hosts. And many species, such as sharks, have been happy to stay just the way they are for millions of years. Only one species—us—has bothered to reach sentience. "The point is," Orr wrote in The New York Review of Books, "that if nature has goals, it certainly seems to have many and consciousness would appear to be fairly far down on the list."
Indeed, biologists usually argue that when you do get progress, it came about by accident.
When you have millions of species taking random walks through the wilds of genetic variation and natural selection, some will, by the luck of the draw, become more complex and more capable. That is, when there is an overall increase in variance, some of the variants will be more complex and capable than their ancestors. Biologists say that such ascents in complexity happen "passively."
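The "passive" mechanism described above can be sketched in a toy simulation, which I hedge as a pure illustration with made-up parameters: lineages take unbiased random walks in complexity, bounded below by a minimum viable complexity. No step favors increase, yet the maximum drifts upward while most lineages stay near the floor.

```python
import random

def simulate(lineages=1000, steps=500, floor=1.0, seed=42):
    """Unbiased random walks in 'complexity', reflected at a lower bound.

    Each lineage gains or loses a unit of complexity at random; nothing
    pushes any lineage upward, but the floor means the distribution can
    only spread in one direction.
    """
    rng = random.Random(seed)
    complexity = [floor] * lineages
    for _ in range(steps):
        for i in range(lineages):
            complexity[i] = max(floor, complexity[i] + rng.choice([-1, 1]))
    return complexity

walks = simulate()
# The maximum rises well above the floor even though each step is
# unbiased, while most lineages remain simple -- a "passive" ascent.
print(min(walks), max(walks), sum(walks) / len(walks))
```

The design point is that the only asymmetry in the model is the floor itself; the rising maximum is "the luck of the draw," exactly the biologists' passive story.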
Yet some scientists think that increases in complexity also happen "actively," that is, driven by physical laws that directly favor increases in complexity. As a group, these scientists have no sympathy for intelligent design. However, they do see reasons to think that seen as a whole, life does go from simple to complex, from instinctual to intellectual. And they are asking if there are fundamental laws of nature that make it happen.
Perhaps the best known of these scientists is Stuart Kauffman, of the Santa Fe Institute, who argues that the universe gives us "order for free." Kauffman has spent decades on origin-of-life research, aiming to show that the transition from chemistry to metabolism is as inevitable as a ball rolling down a slope. Molecules on the early earth, he suggests, inevitably began to catalyze themselves in self-sustaining reactions ("autocatalytic networks"), converting energy and raw materials into increasingly complex structures that eventually crossed the boundary between nonliving and living. Nagel mentions his work once, briefly, in a footnote.
Kauffman has plenty of company. The paleontologist Simon Conway Morris, at the University of Cambridge, has argued that natural structures such as eyes, neurons, brains, and hands are so beneficial that they will get invented over and over again. They are, in effect, attractors in an abstract biological space that pull life in their direction. Contingency and catastrophe will delay them but cannot stop them. Conway Morris sees this as evidence that not only life but human life, and humanlike minds, will emerge naturally from the cosmos: "If we humans had not evolved, then something more or less identical would have emerged sooner or later."
Other biologists are proposing laws that would explain evolutionary ascent in fundamental terms. Daniel McShea and Robert Brandon, a biologist and a philosopher of science, respectively, at Duke University, have argued for what they call a "zero-force evolutionary law," which posits that diversity and complexity will necessarily increase even without environmental change. The chemist Addy Pross, at Ben-Gurion University of the Negev, in Israel, argues that life exhibits "dynamic kinetic stability," in which self-replicating systems become more stable through becoming more complex—and are therefore inherently driven to do so.
Still other scientists have asked how we could measure increases in complexity without being biased by our human-centric perspective. Robert Hazen, working with the Nobel Prize winner Jack Szostak, has proposed a metric he calls "functional information," which measures the number of functions and relationships an organism has relative to its environment. The Harvard astrophysicist Eric Chaisson has proposed measuring a quantity that he calls "energy-rate density": how much energy flows through one gram of a system per second. He argues that when he plots energy-rate density against the emergence of new species, the clear result is an overall increase in complexity over time.
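Chaisson's metric is simple to state: energy flow in erg per second, divided by mass in grams. A back-of-envelope sketch (the Sun's luminosity and mass are standard textbook values; the ~100 W human metabolic rate is a common rough figure, used here only for illustration):

```python
def energy_rate_density(power_watts, mass_grams):
    """Chaisson's 'energy-rate density' in erg/s/g.

    1 watt = 1e7 erg/s, so convert power to erg/s before dividing by mass.
    """
    return power_watts * 1e7 / mass_grams

# The Sun: ~3.8e26 W radiated by ~2e33 g of mass.
sun = energy_rate_density(3.8e26, 2.0e33)

# A human: ~100 W metabolic rate sustained by ~7e4 g (70 kg) of body.
human = energy_rate_density(100, 7.0e4)

# A living body processes thousands of times more energy per gram than
# a star -- the kind of ordering Chaisson reads as rising complexity.
print(f"sun:   {sun:.1f} erg/s/g")
print(f"human: {human:.0f} erg/s/g")
```

On these rough inputs the Sun comes out near 2 erg/s/g and a human body around 10,000, which is the counterintuitive ranking that motivates the metric: per gram, life outprocesses stars.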
While the jury is most definitely out on whether these proposed laws and measures are right or wrong (and if right, whether they are profound or trivial), this is a body of work that Nagel could have drawn upon in making his argument. He apparently felt it was acceptable to ignore the science. "Philosophy cannot generate such explanations," he wrote; "it can only point out the gaping lack of them." But there is no gaping lack of attempts to supply them. "He's done so little serious homework," says Michael Ruse. "He just dismisses origin-of-life studies without any indication that he's done any work on it whatsoever."
In short, Mind and Cosmos is not only negative but underpowered, as if Nagel had brought a knife to a shootout. (He declines to comment, telling me by e-mail, "I have a longstanding aversion to interviews.")
But Nagel's goal was valid: to point out that fundamental questions of origins, evolution, and intelligence remain unanswered, and to question whether current ways of thinking are up to the task. A really good book on this subject would need to be both scientific and philosophical: scientific to show what is known, philosophical to show how to go beyond what is known. (A better term might be "metascientific," that is, talking about the science and about how to make new sciences.)
The pieces of this book are scattered about the landscape, in a thousand scraps of ideas from biologists, physicists, physicians, chemists, mathematicians, journalists, public intellectuals, and philosophers. But no book has yet emerged that is mighty enough to shove aside the current order, persuading scientists and nonscientists alike, sparking new experiments, changing syllabi, rejiggering budget priorities, spawning new departments, and changing human language and ways of thought forever. On the Origin of Species did it in 1859. We await the next Darwin.
Michael Chorost is the author of Rebuilt: How Becoming Part Computer Made Me More Human (Houghton Mifflin, 2005) and World Wide Mind: The Coming Integration of Humanity, Machines, and the Internet (Free Press, 2011).
Saturday, May 25, 2013
The Closing of the Conservative Mind
As Paul Krugman says, to be a conservative today in good standing you must believe or pretend to believe things that are demonstrably not true.
--------------------------------------------------------------------------------
May 25, 2013, 10:53 am
The Closing of the Conservative Mind
Jonathan Chait has an interesting portrait of Josh Barro; Mike Konczal, citing this and also a longer discussion of “reformish” conservatives by Ryan Cooper, argues that there really isn’t much to see here. I agree, and have been trying to pin down what I mean by that.
Start with the proposition that there is a legitimate left-right divide in U.S. politics, built around a real issue: how extensive should we make our social safety net, and (hence) how much do we need to raise in taxes? This is ultimately a values issue, with no right answer.
There are, however, a lot of largely empirical questions whose answers need not, in principle, be associated with one’s position on this left-right divide but, in practice, are. A partial list:
1. The existence of anthropogenic climate change
2. The effects of fiscal stimulus/austerity
3. The effects of monetary expansion, and the risks of inflation
4. The revenue effects of tax cuts
5. The workability of universal health care
I’ve deliberately chosen a list here where the evidence is, in each case, pretty much overwhelming. There is a real scientific consensus on 1; the evidence of the past few years has been very strong on 2 and 3; there are no serious studies supporting the view that we’re on the wrong side of the Laffer curve; one form or another of UHC operates all across the advanced world, with lower costs than the US system.
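The Laffer-curve point can be made concrete with a toy model, which I stress is an illustration and not an empirical estimate: revenue is zero at a 0% and a 100% tax rate and peaks somewhere in between. Being on the "wrong side" of the curve means rates past the peak, where cuts would raise revenue; left of the peak, cuts simply lose revenue.

```python
def revenue(rate):
    """Toy Laffer curve: revenue = rate * (1 - rate), zero at 0% and
    100% and peaked at 50%. The functional form and peak location are
    modeling assumptions chosen for illustration only.
    """
    return rate * (1 - rate)

# Left of the peak, a rate cut loses revenue:
print(revenue(0.25), revenue(0.35))  # lower rate yields less revenue here
# Only past the peak would a rate cut raise revenue:
print(revenue(0.90), revenue(0.80))  # cutting from 90% to 80% gains revenue
```

Krugman's claim is that no serious study places actual U.S. rates on the far side of the peak, so the "tax cuts raise revenue" position corresponds to the second case, not the first.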
So? You could, as I said, take the “liberal” position on each of these issues while still being conservative in the sense that you want a smaller government. But what the “reformish” conservatives Ryan Cooper lists do, in almost all cases, is either (a) to follow the party line on these issues or (b) to hint at some flexibility – and thereby cultivate an image of being open-minded — as long as the issues don’t get close to an actual policy decision, but to always find a way to support the Republican position whenever it actually matters.
But aren’t there people like Bruce Bartlett or Josh Barro who really do break with the party line on some or all of these issues? Yes, but they are then immediately branded as “no longer conservatives”, in a sort of inverted version of the none-dare-call-it-treason effect.
The point is that there remains essentially no room for independent thinking within the conservative movement.
Could you say the same thing about liberals? I don’t think so. A few decades ago, you might have been able to draw up a somewhat similar list for the other side, involving things like the superiority of tradeable emission permits to command-and-control pollution regulation, the general undesirability of rent control, the benefits of airline deregulation, the absence of a usable long-run tradeoff between unemployment and inflation (and hence the impossibility of setting a 4 percent target for unemployment). But many liberals eventually conceded the point in each of these cases (maybe even conceded too far in a couple), without being declared no longer liberal. The point is that being a good liberal doesn’t require that you believe, or pretend to believe, lots of things that almost certainly aren’t true; being a good conservative does.
And like Mike Konczal, I see no sign that any of this is changing.
--------------------------------------------------------------------------------
May 25, 2013, 10:53 am 75 Comments
The Closing of the Conservative Mind
Jonathan Chait has an interesting portrait of Josh Barro; Mike Konczal, citing this and also a longer discussion of “reformish” conservatives by Ryan Cooper, argues that there really isn’t much to see here. I agree, and have been trying to pin down what I mean by that.
Start with the proposition that there is a legitimate left-right divide in U.S. politics, built around a real issue: how extensive should be make our social safety net, and (hence) how much do we need to raise in taxes? This is ultimately a values issue, with no right answer.
There are, however, a lot of largely empirical questions whose answers need not, in principle, be associated with one’s position on this left-right divide but, in practice, are. A partial list:
1. The existence of anthropogenic climate change
2. The effects of fiscal stimulus/austerity
3. The effects of monetary expansion, and the risks of inflation
4. The revenue effects of tax cuts
5. The workability of universal health care
I’ve deliberately chosen a list here where the evidence is, in each case, pretty much overwhelming. There is a real scientific consensus on 1; the evidence of the past few years has been very strong on 2 and 3; there are no serious studies supporting the view that we’re on the wrong side of the Laffer curve; one form or another of UHC operates all across the advanced world, with lower costs than the US system.
So? You could, as I said, take the “liberal” position on each of these issues while still being conservative in the sense that you want a smaller government. But what the “reformish” conservatives Ryan Cooper lists do, in almost all cases, is either (a) to follow the party line on these issues or (b) to hint at some flexibility – and thereby cultivate an image of being open-minded — as long as the issues don’t get close to an actual policy decision, but to always find a way to support the Republican position whenever it actually matters.
But aren’t there people like Bruce Bartlett or Josh Barro who really do break with the party line on some or all of these issues? Yes, but they are then immediately branded as “no longer conservatives”, in a sort of inverted version of the none-dare-call-it-treason effect.
The point is that there remains essentially no room for independent thinking within the conservative movement.
Could you say the same thing about liberals? I don’t think so. A few decades ago, you might have been able to draw up a somewhat similar list for the other side, involving things like the superiority of tradeable emission permits to command-and-control pollution regulation, the general undesirability of rent control, the benefits of airline deregulation, the absence of a usable long-run tradeoff between unemployment and inflation (and hence the impossibility of setting a 4 percent target for unemployment). But many liberals eventually conceded the point in each of these cases (maybe even conceded too far in a couple), without being declared no longer liberal. The point is that being a good liberal doesn’t require that you believe, or pretend to believe, lots of things that almost certainly aren’t true; being a good conservative does.
And like Mike Konczal, I see no sign that any of this is changing.
To the Graduates Etc.
To the graduates of 2013: Congratulations and best of luck as you go forth. You'll need it. Please remember your manners and send thank you notes to those who grudgingly favored you with graduation presents. You'll face trials & tribulations as you proceed down life's post-graduation road. Tribulations are better than trials because legal bills are so darn high in these litigious days. Yes, ...who you know can be important, but what you know will ultimately pay the bills especially if you expect to buy a house before you're 45. You are not bullet-proof. If you ever find yourself in the wrong place at the wrong time, well, good luck is all I can say. Sneak out quietly and go home. You'll muddle your way thru most things and learn as you go especially as you're on your 4th job in three years starting out. Your parents love you but they're probably already thinking about redoing your room so don't even think about returning home. A final word to the wise: stay off Facebook late at night. Nothing good happens on Facebook after midnight or if you're in a drunken stupor.
Comments:
Kimberley Stewart: Wise advice Fred!
Tom Tidwell: Entitled: Extemporaneous Commencement Address by Non-invitee
Fred Hudson: The post was not extemporaneous. I made it up as I went along.
Diann O'Mary Harbin Barrett: Well said, Fred!!
To Moyna: I got a hot rod Ford and a two dollar bill, and I know a place just over the hill. Hey good lookin! What you got cookin? How's about cookin something up with me?
Comments:
Fred Hudson: You're on!
Josie LoGiudice: You guys are too cute!!!
Lucy Horsley Lewis: You're all dressed up for the dance!
Lynda Kirkpatrick: Get a room.
Jefferson
It seems that I am preoccupied with reading about Thomas Jefferson. The scholarship on our third President is so vast I realize that it will be impossible to ever come to a conclusive opinion about this man.
Tuesday, May 21, 2013
The Benefits of Optimism Are Real
By Emily Esfahani Smith
The Atlantic
1 March 2013
One of the most memorable scenes of the Oscar-nominated film Silver Linings Playbook revolves around Ernest Hemingway's A Farewell to Arms, a novel that does not end well, to put it mildly.
Patrizio Solitano Jr. (Bradley Cooper) has come home after an eight-month stint being treated for bipolar disorder at a psychiatric hospital, where he was sentenced to go after he nearly beat his wife's lover to death. Home from the hospital, living under his parents' charge, Pat has lost his wife, his job, and his house. But he tries to put the pieces of his life back together. He exercises, maintains an upbeat lifestyle, and tries to better his mind by reading through the novels that his estranged wife Nikki, a high school English teacher, assigns her students.
Pat takes up a personal motto, excelsior -- Latin for "ever upward." He tells his state-appointed therapist, "I hate my illness and I want to control it. This is what I believe to be true: You have to do everything you can and if you stay positive you have a shot at a silver lining."
Which is why the Hemingway novel, which is part of Nikki's syllabus, is such a buzz kill. When he gets to the last pages, and discovers that it ends grimly with death, he slams the book shut, throws it through a glass window of his parents' house, and storms into their room in the middle of the night, saying:
This whole time you're rooting for this Hemingway guy to survive the war and to be with the woman that he loves, Catherine Barkley... And he does, he does, he survives the war after getting blown up. He survives it and he escapes to Switzerland with Catherine. You think he ends it there? No! She dies, dad! I mean, the world's hard enough as it is, guys. Can't someone say, hey let's be positive? Let's have a good ending to the story?
Another best picture nominee, Life of Pi, employs a similar device. Pi finds himself aboard a lifeboat with a ferocious Bengal tiger in the aftermath of a shipwreck that has killed his entire family. Lost at sea in the Pacific Ocean for 227 days -- starved, desperate, and forced into a game of survival with the tiger -- Pi pushes forward, even though he, like Pat, has lost everything. Pi says, "You might think I lost all hope at that point. I did. And as a result I perked up and felt much better."
Pi's resilience is incredible once you realize what happens on board the lifeboat and how Pi copes with the tragedy that he witnesses and endures. There's more to the story than the boy and the tiger. Though what really happened is terrible, Pi chooses to tell a different story. His story parallels what really happened, but is beautiful, not bleak; transcendent, not nihilistic.
"Which story do you prefer?" he asks at the end.
This question turns out to matter a great deal if you are trying to figure out who grows after trauma and who gets swallowed up by it, a question that each movie addresses and that psychologists have been grappling with for years. Think back to the last time you experienced a loss, setback, or hardship. Did you respond by venting, ruminating, and dwelling on the disappointment, or did you look for a faint flash of meaning through all of the darkness -- a silver lining of some sort? How quickly did you bounce back -- how resilient are you?
The New Yorker's Richard Brody criticized Silver Linings Playbook for its sentimentality and "faith-based view of mental illness and, overall, of emotional redemption." The New York Times' A.O. Scott made a similar, if predictable, criticism of Life of Pi: "The novelist and the older Pi are eager...to repress the darker implications of the story, as if the presence of cruelty and senseless death might be too much for anyone to handle...Insisting on the benevolence of the universe in the way that Life of Pi does can feel more like a result of delusion or deceit than of earnest devotion."
But these criticisms miss the point. First, they fail to understand why these two strange and idiosyncratic movies, both based on novels, resonated with so many millions of people. Their themes of resilience speak to each of us -- and there is a reason for that. The key insight of each movie is, whether their creators realized it or not, grounded in a growing body of scientific research, which Brody and Scott overlook.
Far from being delusional or faith-based, having a positive outlook in difficult circumstances is not only an important predictor of resilience -- how quickly people recover from adversity -- but it is the most important predictor of it. People who are resilient tend to be more positive and optimistic compared to less-resilient folks; they are better able to regulate their emotions; and they are able to maintain their optimism through the most trying circumstances.
This is what Dr. Dennis Charney, the dean of Mount Sinai School of Medicine, found when he examined approximately 750 Vietnam War veterans who were held as prisoners of war for six to eight years. Tortured and kept in solitary confinement, these 750 men were remarkably resilient. Unlike many fellow veterans, they did not develop depression or posttraumatic stress disorder after their release, even though they endured extreme stress. What was their secret? After extensive interviews and tests, Charney found ten characteristics that set them apart. The top one was optimism. The second was altruism. Humor and having meaning in life -- or something to live for -- were also important.
For many years, psychologists, following Freud, thought that people simply needed to express their anger and anxiety -- blow off some steam -- to be happier. But this is wrong. Researchers, for example, asked people who were mildly-to-moderately depressed to dwell on their depression for eight minutes. The researchers found that such ruminating caused the depressed people to become significantly more depressed and for a longer period of time than people who simply distracted themselves thinking about something else. Senseless suffering -- suffering that lacks a silver lining -- viciously leads to more depression.
Counter-intuitively, another study found that facing down adversity by venting -- hitting a punching bag or being vengeful toward someone who makes you angry -- actually leads to people feeling far worse, not better. Actually, doing nothing at all in response to anger was more effective than expressing the anger in these destructive ways.
Even more effective than doing nothing is channeling your depression toward a productive, positive goal, as Pat and Pi do. James Pennebaker, a psychological researcher at the University of Texas in Austin, has found that people who find meaning in adversity are ultimately healthier in the long run than those who do not. In a study, he asked people to write about the darkest, most traumatic experience of their lives for four days in a row for a period of 15 minutes each day.
Analyzing their writing, Pennebaker noticed that the people who benefited most from the exercise were trying to derive meaning from the trauma. They were probing into the causes and consequences of the adversity and, as a result, eventually grew wiser about it. A year later, their medical records showed that the meaning-makers went to the doctor and hospital fewer times than people in the control condition, who wrote about a non-traumatic event. People who used the exercise to vent, by contrast, received no health benefits. Interestingly, when Pennebaker had other research subjects express their emotions through song or dance, the health benefits did not appear. There was something unique and special about the stories people told themselves. Those stories helped people find a silver lining in their adversity.
Barbara Fredrickson, a psychological researcher at the University of North Carolina at Chapel Hill, has looked more closely at the relationship between being positive and resilience. Her research shows how important one is for the other.
For starters, having a positive mood makes people more resilient physically. In one study, research subjects were outfitted with a device that measured their heart activity. After their baseline heart activity was recorded, they were presented with a stressful task: Each was asked to quickly prepare and deliver a speech on why he or she is a good friend. They were told that the speech would be videotaped and evaluated.
Heart rates rapidly increased. Arteries constricted. Blood pressure shot up.
Then, participants were shown a short video clip that evoked either negative emotions (like sadness), positive emotions (like happiness), or no emotions at all (a neutral condition). The participants were also told that if they were shown a video clip, "by chance," they were off the hook: They did not have to give the speech after all. That meant that their anxiety would start to subside as the video clips started.
Here was the interesting finding: The heart activity of the participants who viewed the positive clips returned to normal much quicker than their peers who were shown the negative or neutral clips. Positive emotions can, the researchers concluded, undo the effects of a stressful negative experience. The researchers found that the most resilient people were also more positive in day-to-day life.
It turns out that resilient people are good at transforming negative feelings into positive ones. For instance, one of the major findings of Fredrickson's studies was that resilient people took a different attitude toward the speech task than non-resilient people. They viewed the task as a challenge and opportunity for growth rather than as a threat. In other words, they found the silver lining.
With that in mind, the researchers wondered if they could inject some positivity into the non-resilient people to make them more resilient. They primed both types of people to approach the task either positively or negatively: The researchers told some people to see the task as a threat and others to see it as a challenge. What they found is good news for resilient and non-resilient people alike.
Resilient people who saw the task as a challenge did fine, as predicted. So did, interestingly, resilient people who were told to view the task as a threat. Resilient people, no matter how they approached the task, had the same cardiovascular recovery rate.
The people who benefited from the priming were the non-resilient people. Those who were told to approach the task as an opportunity rather than a threat suddenly started looking like highly resilient people in their cardiovascular measures. They bounced back quicker than they otherwise would have.
Resilient people are good at bouncing back because they are emotionally complex. In each of Fredrickson's studies, resilient people experienced the same level of frustration and anxiety as the less resilient participants. Their physiological and emotional spikes were equally high. This is important. It reveals that resilient people are not Pollyannas, deluding themselves with positivity. They just let go of the negativity, worry less, and shift their attention to the positive more quickly.
Resilient people also respond to adversity by appealing to a wider range of emotions. In another study, for instance, participants were asked to write short essays about the most important problem that they were facing in their lives. While resilient people reported the same amount of anxiety as less resilient people in the essays, they also revealed more happiness, interest, and eagerness toward the problem. For resilient people, high levels of positive emotions exist side-by-side with negative emotions. Think of how Pi responds to his seemingly hopeless situation aboard the boat: "I tell you, if you were in such dire straits as I was, you too would elevate your thoughts. The lower you are, the higher your mind will want to soar."
When your mind starts soaring, you notice more and more positive things. This unleashes an upward spiral of positive emotions that opens people up to new ways of thinking and seeing the world -- to new ways forward. This is yet another reason why positive people are resilient. They see opportunities that negative people don't. Negativity, for adaptive reasons, puts you in defense mode, narrows your field of vision, and shuts you off to new possibilities since they're seen as risks.
This calls to mind one of the best scenes from Silver Linings Playbook, in which a bad situation nearly consumes Pat. He is at a diner with Tiffany (Jennifer Lawrence) when he hears "My Cherie Amour" playing in his head -- the song that was playing when he found his estranged wife naked in the shower with another man -- and has a traumatic flashback.
Tiffany helps him work past the episode: "You gonna go your whole life scared of that song? It's just a song. Don't make it a monster... There's no song playing. There's no song. Breathe, count backwards from ten. That's it." He recovers and their interaction sets the stage for the rest of the movie.
Like Life of Pi, Silver Linings Playbook is about how we can tame our inner demons with hope and a positive outlook on life. By finding meaning and love in terrible circumstances, as Pi and Pat do, they overcome their suffering and, in the process, reveal how uplifting silver linings can be.
The Atlantic
1 March 2013
One of the most memorable scenes of the Oscar-nominated film Silver Linings Playbook revolves around Ernest Hemingway's A Farewell to Arms, a novel that does not end well, to put it mildly.
Patrizio Solitano Jr. (Bradley Cooper) has come home after an eight-month stint being treated for bipolar disorder at a psychiatric hospital, where he was sentenced to go after he nearly beat his wife's lover to death. Home from the hospital, living under his parents' charge, Pat has lost his wife, his job, and his house. But he tries to put the pieces of his life back together. He exercises, maintains an upbeat lifestyle, and tries to better his mind by reading through the novels that his estranged wife Nikki, a high school English teacher, assigns her students.
Pat takes up a personal motto, excelsior -- Latin for "ever upward." He tells his state-appointed therapist, "I hate my illness and I want to control it. This is what I believe to be true: You have to do everything you can and if you stay positive you have a shot at a silver lining."
Which is why the Hemingway novel, which is part of Nikki's syllabus, is such a buzz kill. When he gets to the last pages, and discovers that it ends grimly with death, he slams the book shut, throws it through a glass window of his parents' house, and storms into their room in the middle of the night, saying:
This whole time you're rooting for this Hemingway guy to survive the war and to be with the woman that he loves, Catherine Barkley... And he does, he does, he survives the war after getting blown up. He survives it and he escapes to Switzerland with Catherine. You think he ends it there? No! She dies, dad! I mean, the world's hard enough as it is, guys. Can't someone say, hey let's be positive? Let's have a good ending to the story?
Another best picture nominee, Life of Pi, employs a similar device. Pi finds himself aboard a lifeboat with a ferocious Bengal tiger in the aftermath of a shipwreck that has his entire family. Lost at sea in the Pacific Ocean for 227 days -- starved, desperate, and forced into a game of survival with the tiger -- Pi pushes forward, even though he, like Pat, has lost everything. Pi says, "You might think I lost all hope at that point. I did. And as a result I perked up and felt much better."
Pi's resilience is incredible once you realize what happens on board the lifeboat and how Pi copes with the tragedy that he witnesses and endures. There's more to the story than the boy and the tiger. Though what really happened is terrible, Pi chooses to tell a different story. His parallels what really happened, but is beautiful not bleak, transcendent not nihilistic.
"Which story do you prefer?" he asks at the end.
***
The New Yorker's Richard Brody criticized Silver Linings Playbook for its sentimentality and "faith-based view of mental illness and, overall, of emotional redemption." The New York Times' A.O. Scott made a similar, if predictable, criticism of Life of Pi: "The novelist and the older Pi are eager...to repress the darker implications of the story, as if the presence of cruelty and senseless death might be too much for anyone to handle...Insisting on the benevolence of the universe in the way that Life of Pi does can feel more like a result of delusion or deceit than of earnest devotion."
But these criticisms miss the point. First, they fail to understand why these two strange and idiosyncratic movies, both based on novels, resonated with so many millions of people. Their themes of resilience speak to each of us -- and there is a reason for that. The key insight of each movie is, whether their creators realized it or not, grounded in a growing body of scientific research, which Brody and Scott overlook.
Far from being delusional or faith-based, having a positive outlook in difficult circumstances is not only an important predictor of resilience -- how quickly people recover from adversity -- but it is the most important predictor of it. People who are resilient tend to be more positive and optimistic compared to less-resilient folks; they are better able to regulate their emotions; and they are able to maintain their optimism through the most trying circumstances.
This is what Dr. Dennis Charney, the dean of Mount Sinai School of Medicine, found when he examined approximately 750 Vietnam war veterans who were held as prisoners of war for six to eight years. Tortured and kept in solitary confinement, these 750 men were remarkably resilient. Unlike many fellow veterans, they did not develop depression or posttraumatic stress disorder after their release, even though they endured extreme stress. What was their secret? After extensive interviews and tests, Charney found ten characteristics that set them apart. The top one was optimism. The second was altruism. Humor and having a meaning in life -- or something to live for -- were also important.
For many years, psychologists, following Freud, thought that people simply needed to express their anger and anxiety -- blow off some steam -- to be happier. But this is wrong. Researchers, for example, asked people who were mildly-to-moderately depressed to dwell on their depression for eight minutes. The researchers found that such ruminating caused the depressed people to become significantly more depressed and for a longer period of time than people who simply distracted themselves thinking about something else. Senseless suffering -- suffering that lacks a silver lining -- viciously leads to more depression.
Counter-intuitively, another study found that facing down adversity by venting -- hitting a punching bag or being vengeful toward someone who makes you angry -- actually leads to people feeling far worse, not better. Actually, doing nothing at all in response to anger was more effective than expressing the anger in these destructive ways.
Even more effective than doing nothing is channeling your depression toward a productive, positive goal, as Pat and Pi do. James Pennebaker, a psychological researcher at the University of Texas in Austin, has found that people who find meaning in adversity are ultimately healthier in the long run than those who do not. In a study, he asked people to write about the darkest, most traumatic experience of their lives for four days in a row for a period of 15 minutes each day.
Analyzing their writing, Pennebaker noticed that the people who benefited most from the exercise were trying to derive meaning from the trauma. They were probing into the causes and consequences of the adversity and, as a result, eventually grew wiser about it. A year later, their medical records showed that the meaning-makers went to the doctor and hospital fewer times than people in the control condition, who wrote about a non-traumatic event. People who used the exercise to vent, by contrast, received no health benefits. Interestingly, when Pennebaker had other research subjects express their emotions through song or dance, the health benefits did not appear. There was something unique and special about the stories people told themselves. Those stories helped people find a silver lining in their adversity.
***
For starters, having a positive mood makes people more resilient physically. In one study, research subjects were outfitted with a device that measured their heart activity. After their baseline heart activity was recorded, they were presented with a stressful task: Each was asked to quickly prepare and deliver a speech on why he or she is a good friend. They were told that the speech would be videotaped and evaluated.
Heart rates rapidly increased. Arteries constricted. Blood pressure shot up.
Then, participants were shown a short video clip that either evoked negative emotions (like sadness), positive emotions (like happiness), or a neutral condition of no emotions. The participants were also told that if they were shown a video clip "by chance" that they were off the hook: They did not have to give the speech after all. That meant that their anxiety would start to subside as the video clips started.
Here was the interesting finding: The heart activity of the participants who viewed the positive clips returned to normal much quicker than their peers who were shown the negative or neutral clips. Positive emotions can, the researchers concluded, undo the effects of a stressful negative experience. The researchers found that the most resilient people were also more positive in day-to-day life.
It turns out that resilient people are good at transforming negative feelings into positive ones. For instance, one of the major findings of Fredrickson's studies was that resilient people took a different attitude toward the speech task than non-resilient people. They viewed the task as a challenge and opportunity for growth rather than as a threat. In other words, they found the silver lining.
With that in mind, the researchers wondered if they could inject some positivity into the non-resilient people to make them more resilient. They primed both types of people to approach the task either positive or negatively. The researchers told some people to see the task as a threat and they told others to see it as a challenge. What they found is good news for resilient and non-resilient people alike.
Resilient people who saw the task as a challenge did fine, as predicted. So did, interestingly, resilient people who were told to view the task as a threat. Resilient people, no matter how they approached the task, had the same cardiovascular recovery rate.
The people who benefitted from the priming were non-resilient people. Those who were told to approach the task as an opportunity rather than a threat suddenly started looking like high resilient people in their cardiovascular measures. They bounced back quicker than they otherwise would have.
Resilient people are good at bouncing back because they are emotionally complex. In each of Fredrickson's studies, resilient people experienced the same level of frustration and anxiety as the less resilient participants. Their physiological and emotional spikes were equally high. This is important. It reveals that resilient people are not Pollyannas, deluding themselves with positivity. They just let go of the negativity, worry less, and shift their attention to the positive more quickly.
Resilient people also respond to adversity by appealing to a wider range of emotions. In another study, for instance, participants were asked to write short essays about the most important problem that they were facing in their lives. While resilient people reported the same amount of anxiety as less resilient people in the essays, they also revealed more happiness, interest, and eagerness toward the problem. For resilient people, high levels of positive emotions exist side-by-side with negative emotions. Think of how Pi responds to his seemingly hopeless situation aboard the boat: "I tell you, if you were in such dire straits as I was, you too would elevate your thoughts. The lower you are, the higher your mind will want to soar."
When your mind starts soaring, you notice more and more positive things. This unleashes an upward spiral of positive emotions that opens people up to new ways of thinking and seeing the world -- to new ways forward. This is yet another reason why positive people are resilient. They see opportunities that negative people don't. Negativity, for adaptive reasons, puts you in defense mode, narrows your field of vision, and shuts you off to new possibilities since they're seen as risks.
This calls to mind one of the best scenes from Silver Linings Playbook, in which a bad situation nearly consumes Pat. He is at a diner with Tiffany (Jennifer Lawrence) when he hears "My Cherie Amour" playing in his head -- the song that was playing when he found his estranged wife naked in the shower with another man -- and has a traumatic flashback.
Tiffany helps him work past the episode: "You gonna go your whole life scared of that song? It's just a song. Don't make it a monster... There's no song playing. There's no song. Breathe, count backwards from ten. That's it." He recovers and their interaction sets the stage for the rest of the movie.
Like Life of Pi, Silver Linings Playbook is about how we can tame our inner demons with hope and a positive outlook on life. By finding meaning and love in terrible circumstances, as Pi and Pat do, they overcome their suffering and, in the process, reveal how uplifting silver linings can be.
Monday, May 20, 2013
Yeats 2013
Yeats 2013. Things fall apart. The center cannot hold. People are rushing to purchase Powerball lottery tickets (but please hold on to your grocery money). Another Gatsby movie is out (Lord have mercy). Another Dan Brown book is published (Heaven help us). Texas justice reigns (Stay out of that state). Another American Idol is crowned (The Republic survives once again). I am waiting for my yard to green up real good (Climate change).
The best lack all conviction while the worst are full of passionate intensity. Mere anarchy is loosed upon the world. What now?
Saturday, May 18, 2013
Gatsby Controversy is Nothing New
This Week in 'Nation' History: Reviewers Have Argued About 'Gatsby' Since 1925
Katrina vanden Heuvel on May 18, 2013 - 10:00 AM ET
With its Jay-Z soundtrack, bizarre 3-D effects and commitment of Nick Carraway to a mental institution, Baz Luhrmann’s The Great Gatsby has seemed to some critics insufficiently deferential to a precious cultural totem. But long before it won silver in Modern Library’s list of the 100 best English-language novels, writers in The Nation offered drastically different assessments on both the book’s meaning and its legitimate place in the literary pantheon.
Carl Van Vechten, a writer and photographer who later served as Gertrude Stein’s literary executor, reviewed The Great Gatsby for The Nation in the issue of May 20, 1925, just a month after the book’s publication:
Mr. Fitzgerald is a born story-teller…[H]is work is imbued with that rare and beneficent essence we hail as charm. He is by no means lacking in power, as several passages in the current opus abundantly testify, and he commands a quite uncanny gift for hitting off character or presenting a concept in a striking or memorable manner…
Up to date, Mr. Fitzgerald has occupied himself almost exclusively with the aspects and operations of the coeval flapper and cake-eater. No one else, perhaps, has delineated these mundane creatures quite as skillfully as he, and his achievement in this direction has been awarded authoritative recognition. He controls, moreover, the necessary magic to make his most vapid and rotterish characters interesting and even, on occasion, charming, in spite of (or possibly because of) the fact that they are almost invariably presented in advanced stages of intoxication…
In “The Great Gatsby,” there are several of Mr. Fitzgerald’s typical flappers who behave in the manner he has conceived as typical of contemporary flapperdom. There is again a gargantuan drinking-party, conceived in a rowdy, hilarious, and highly titillating spirit. There is also, in this novel…something else. There is the character of Gatsby…
But in a review the following year of a stage production of Gatsby, The Nation’s theater critic, Joseph Wood Krutch, mocked Fitzgerald’s blurring of the line between spectator and spectated, satirist and satirized. Almost ignoring the theatrical production entirely, Krutch instead skewered the book:
F. Scott Fitzgerald was born into the flapper age with exactly the qualities and defects which would enable him to become its accredited historian. Though granted just enough detachment to make him undertake the task of describing, he is by temperament too much a part of the things described to view them with any very penetratingly critical eye and he sees flappers, male and female, much as they see themselves. Sharing to a very considerable extent in their psychological processes, he romanticizes their puerilities in much the same fashion as they do; and when he pictures the manners of the fraternity house or the Long Island villa he pictures them less as they are than as their practitioners like to imagine them. He makes cocktails and kisses seem thrillingly wicked; he flatters the younger generation with the solemn warning that it is leading the world straight to the devil; and as a result he writes The Flapper’s Own History of Flapperism. Thus he becomes less the genuine historian of a phase of social development than one of the characteristic phenomena of that development itself, and his books are seen to be little more than documents for the study of the thing which they purport to treat.
The book, Krutch added, was “preposterously maudlin.”
The Nation has had only distaste for both screen adaptations of Gatsby reviewed in its pages. Painter and film critic Manny Farber panned the 1949 Paramount adaptation as “a limp translation,” writing that the film’s purposefully antiquated style “takes on the heavy, washed-out, inaccurate dedication-to-the-past quality of a Radio City mural.” Farber also said that the actress Betty Field failed as Daisy because she was “no more marked by Southern aristocracy than a cheese blintz.”
“Respectful work and appalling” were the choice words Robert Hatch, a longtime executive editor of the magazine, had for the 1974 adaptation starring Robert Redford as Gatsby, Sam Waterston as Nick and Mia Farrow as Daisy, with a screenplay by Francis Ford Coppola. “When it sticks to the original, it adds nothing; when it deviates, it puts a heavy foot into Fitzgerald’s magic,” Hatch wrote. “Overall, its most conspicuous weakness is that it cannot handle vulgarity or ostentation without becoming vulgar or ostentatious”—precisely the same complaint Krutch expressed about the book itself in The Nation almost fifty years earlier.
Other Nation articles about Fitzgerald include a 1945 essay by Lionel Trilling—putting him in the same category with Shakespeare, Dickens, Voltaire, Balzac and Goethe—and a 1996 appreciation by friend-of-the-magazine E.L. Doctorow, who wrote that “in its few pages” Gatsby “arcs the American continent and gives us a perfect structural allegory of our deadly class-ridden longings.” And as many have argued, the release of “Gatsby” should also be an occasion for renewed discussion of inequality in America.
Read more: http://www.thenation.com/blog/174416/week-nation-history-reviewers-have-argued-about-gatsby-1925#ixzz2Th5NvRve
Friday, May 17, 2013
Republicans are Fools Waiting for the Facts
From Jonathan Chait: "Republicans Shouldn’t Let the Facts Speak for Themselves"
As the investigative phase of the Obama presidency commences in earnest, Republicans are promising that their overriding goal is to proceed cautiously and let the facts speak for themselves. “We have stuff here that’s real, so you don’t need the distraction of politics to give people an excuse to say we’re being silly,” a House Republican leadership aide involved in the investigations tells Politico. “Everyone is keenly aware of the overreach risk.” Likewise, Charles Boustany Jr., who is helping lead the IRS investigation on the House Ways and Means Committee, tells the New York Times, “I’m being very cautious not to overplay my hand.”
Part of this is advertising — if Republicans want the media to take them seriously, and not as crazed partisan witch-hunters, they have to assure the media they are serious and not crazed partisan witch-hunters. But there also seems to be an element of genuine calculation here. Republicans do recall the 1998 midterm election blowback they suffered for impeachment mania. They think a slow, patient investigative process will produce fruitful results.
I happen to think they’re wrong about this. While I don’t have much sympathy for their goals, as a pure strategic calculation — and my analysis here is completely value-free — I think caution is the wrong play for the Republicans here. They should probably let their freak flag fly.
The explicit assumption of the slow-careful strategy, which is also the implicit assumption of the stories reporting on it, is that more digging will produce harmful news about the Obama administration. “This is just the beginning. I want to emphasize that. We have a lot of work left to do in getting to the root of this,” Boustany tells the Times.
That might be true. But what if it’s not? What if we’ve already gotten to the root of it?
Indeed, what we’ve seen so far is that the stories looked most damaging when they were first reported, and subsequent revelations have made them look less, not more, scandalous. The idea that there is a series of “Obama scandals” took its root last week when ABC reporter Jonathan Karl misleadingly claimed to have seen incriminating White House e-mails, which turned out to have been doctored by House Republicans. An independent report of the IRS found no political direction at all led to the agency’s use of a one-sided search program to flag partisan tax-free groups.
Once journalists start to think of an issue as a “scandal,” then they assume it will necessarily lead to progressively stronger evidence of wrongdoing. That assumption isn’t necessarily true. And the sequence of events that made everybody start to think of a few disconnected stories as “Obama scandals” was mainly an odd and somewhat shaky confluence of events.
If Republicans do manage to unearth some significant misdeeds, then playing it cool and rational will help them build the case to force resignations, impeach the president, or wherever they want to take this. The more likely scenario is that they won’t find anything groundbreaking. And then they have to ask themselves how they want to continue to keep the scandal narrative going.
Endless hearings that produce little news won’t do. A constant drumbeat of impeachment talk, and browbeating reporters for failing to promote it, is more likely to succeed. The accusations from Republicans are what make the story newsworthy. And they reestablish the boundaries of opinion, so that impeaching Obama becomes defined as the irresponsible right-wing position, but “there’s nothing here” becomes the irresponsible left-wing position. The respectable centrist thing to say is that there’s definitely something fishy in the administration, even though impeachment seems premature.
You may think that screaming bloody murder over a non-scandal will utterly backfire. I invite you to study volumes I to V of the Wall Street Journal editorial page’s collection of wild denunciations of massive, unprecedented criminality in the “Whitewater scandal.” The scandal, in fact, amounted to nothing in the end. But it did successfully implant an aura of sleaze and wrongdoing. If you’re looking to foment a scandal, having the facts on your side is obviously helpful, but it’s not necessary. (Republicans should probably stay away from actual impeachment — that part of the lesson of 1998 seems clear enough.)
I think Republicans made a huge strategic miscalculation on how to fight Obama in the first term. They assumed his policy agenda, and the economic devastation they figured it would bring, would be so unpopular they could oppose him on policy grounds alone. They made little effort to undermine Obama as a political figure. That reservoir of trust has helped Obama enjoy strong personal favorability ratings. If they’re smart, they’ll get to work on creating a narrative of wrongdoing and sleaze. If they wait for the facts to make the case for them, they may blow their chance altogether.