This book was entertaining. It chronicles, through journal entries, a mother's experience in the first year of her son's life. The mother is author Anne Lamott, and this is her first child.
What I like is that the book is about a real person under real circumstances. I have never cared for fantasy; I prefer reading about life and the human condition. Lamott delivers that. She is honest about being a mother. Her perspective is not a stereotypical portrayal of motherhood as only a blessing and a joy. It is that, but she also tells us of her fears, anxieties, and worries. She wonders what kind of mother she will be. She wonders what kind of world her son will live in. She wishes that she had a companion, a father, to raise Sam with her. The honesty with which she describes being a parent is what makes this book appealing.
My only complaint is that it is too long; it could lose fifty pages. I grew tired of reading about the mundane goings-on of Sam and how wonderful, or not so wonderful, he is.
Saturday, January 28, 2012
Daniel Farber - Lincoln's Constitution
This book is a history of the Constitution and Lincoln's relation to it during the Civil War. If I were smart enough, I would read everything I could on the Constitution. The history and interpretation of our Constitution is of abiding interest to me.
Debates about the constitutionality of secession do not interest me. It happened; the war ended the discussion. The constitutionality of secession is a moot point.
Debates about Lincoln's record on civil liberties during the war do not interest me either. Though an important topic, it just isn't something I care about for some reason.
The author discusses the political science question of sovereignty. I take it that sovereignty is a technical term meaning power in political terms. Where is the source of American sovereignty? The author concludes that there is no clear historical answer to this question.
The critical question comes down to this: are we a nation of people collectively, or are we a compact of states? Like the nationalists down through the years, such as Hamilton, Lincoln, and FDR, I say we are a nation of people, not a confederation of states. After all, the Constitution begins with "We the people."
To be continued.
Friday, January 27, 2012
Children of Nietzsche
Fulford: Carving a Nietzsche
Robert Fulford Jan 24, 2012 – 8:00 AM ET | Last Updated: Jan 24, 2012 11:44 AM ET
Friedrich Nietzsche is one of those philosophers you just can’t kill.
He’s been in his grave since 1900, having been silenced by insanity many years before. In 1898, The New York Times ran an article headed, “Interesting Revolutionary Theories from a Writer Now in the Madhouse.” He’s read, as he was then, only by a small minority, many of whom it would be flattering to call eccentric.
Nevertheless, he runs through our social bloodstream. Francis Fukuyama’s remark has the sound of truth: Whether we like it or not, “We continue to live within the intellectual shadow cast by Nietzsche.”
Our political leaders are Nietzschean heroes, fuelled by the will to power. In popular fiction and journalism we eternally reinvent the drama of Nietzschean characters who scorn tradition and prove their bravery by setting their own course, as he urged. Defiant originality is sanctified everywhere from art galleries to the business pages. Steve Jobs was perhaps the world’s most renowned Nietzschean character type.
Nietzsche might recognize the tone of current American politics. In the Republican primaries politicians struggle against inherited dogma (big government) while Democrats pledge to fight the ideology they fear (capitalism). But of course both parties maintain a respect for Christianity that would make Nietzsche decide he had lived in vain.
We don’t know it but Nietzsche scripted many of our conversations, putting words in our mouths. When we talk about culture (the culture of this, the culture of that) we echo him. Anyone who discusses “values” (instead of, say, ethics) is talking Nietzsche-talk.
People who claim to be in a state of “becoming” are Nietzscheans, knowingly or otherwise. He believed (now everyone believes) that we are all constantly reconstructing ourselves. In Nietzsche there’s no such thing as a permanently stable personality.
He was the original culture warrior. He laid the foundation for the struggle between traditionalism and modernism, an enduring battle. The more important a tradition, the more he wanted to see it challenged.
Through his books and articles Nietzsche’s ideas conquered America just as American influence was conquering the world. How this event in intellectual history happened is the concern of _American Nietzsche: A History of an Icon and His Ideas_ (University of Chicago Press), a first-class academic book by Jennifer Ratner-Rosenhagen, a University of Wisconsin historian.
Nietzsche seems formidable from a distance but turns out to be surprisingly easy to read. That’s deceptive, because understanding him is hard. He’s endlessly, infuriatingly contradictory. One day he leaves us in despair about the future of humanity. On another he says the potential for liberated humanity is limitless. His tone ranges from insistent to hysterical.
Not everyone likes it. In fact, he’s as often despised as adored. Casual cruelty runs through his work, above all in his belief that most people don’t count. He callously described the common “herd” of humanity.
Fascists liked him. Decades after Nietzsche’s death Hitler claimed him as a chum even though Nietzsche maintained that anti-Semitism was a stupid German fantasy. Sadly, his sister, Elisabeth Förster-Nietzsche, having inherited control of his reputation, let the Nazis use his name.
Nietzsche placed his biggest bet on the “higher man,” the overman or Übermensch, the superior being, often translated into English as Superman. That was the word Jerry Siegel and Joe Shuster borrowed in the 1930s for their comic-book creation, Superman.
Allan Bloom, the author of The Closing of the American Mind, blamed Nietzsche for the emotional deadness and intellectual sterility of university students. They learned, too easily and too soon, that God was dead and in fact they could set aside all intellectual traditions as calcified dogma, without bothering to understand them. In Bloom’s view Americans didn’t grasp the context and ended up accepting “nihilism with a happy ending.” If God is dead, relax.
Nietzsche’s great champion on this continent was H.L. Mencken, who at the age of 27 wrote the first book on Nietzsche in English. He loved the way his hero “hurled his javelin” at the authority of God and that he “broke from the crowd” of thinkers. After becoming the most famous American intellectual of the 1920s, Mencken admitted that his ideas were based on Nietzsche. “Without him, I’d never have come to them.”
The young Walter Lippmann, on his way to being the prince of political commentators, used Nietzsche’s example as a way of separating Americans from outworn dogma in his 1913 book, _A Preface to Politics_. Isadora Duncan, the founding genius of modern dance, claimed that “the seduction of Nietzsche’s philosophy ravished my being.” Emma Goldman, the legendary anarchist, said Nietzsche’s work took her to “undreamed-of heights.”
Jack London and Eugene O’Neill were among the major American writers who considered Nietzsche’s Thus Spoke Zarathustra their Bible. (“I will make company with creators, with harvesters, with rejoicers: I will show them the rainbow and the stairway to the Superman. I love him whose soul is lavish.”)
Ratner-Rosenhagen deals at affectionate length with the figure who most inspired Nietzsche, Ralph Waldo Emerson (1803-1882), the American philosopher, leader of the Transcendentalist movement, champion of individualism. Beginning at the age of 17, Nietzsche was a devoted reader of Emerson.
In Nietzsche’s library four volumes of Emerson essays were worn ragged, margins frequently filled with comments. Nietzsche saw Emerson as a sovereign among intellectuals who rejected inherited ideals and developed a new conception of individualism.
Ideas have pedigrees, even if they seem commonplace, even if they sound as if your grandfather might have invented them. Inevitably, however, the pedigree becomes obscure once the idea is accepted.
Consider that Emerson wrote, “Every evil to which we do not succumb is a benefactor.” Nietzsche liked that. He underscored it and later wrote a version of it that has been endlessly quoted: “What does not kill me makes me stronger.” I’ve heard people say that without having any idea that it comes from a German philosopher. Later, a thousand magazine articles developed the idea that our defeats, by teaching us, eventually produce victories.
In the 1970s, when the magazine where I worked went out of business, my father-in-law consoled me with four words of Nietzsche-inspired street English. He said I should remember what sensible people say when failure happens to them: “Every knock, a boost.”
So there we were, doing what someone else is no doubt doing at this moment, recovering from disaster under Nietzsche’s inescapable shadow.
Wednesday, January 25, 2012
Should Robert E. Lee Have Been Tried for Treason?
Facebook is not the best vehicle for this type of discussion. I defend Lee ONLY to the extent of saying that he should not have been tried for treason. My reasons:

1) General Grant said no. That in itself is good enough for me. Grant was THERE during those years between the end of the war and Lee's death in 1870. We have the luxury of looking back over almost 150 years; Grant didn't have that luxury.

2) Trying Lee and possibly hanging him would have made him a martyr. In my opinion, this would have been far worse than the cult of Lee that arose amongst our Neo-Confederates. The Lost Cause would have been a lot more lost if Lee had become a martyr to the lost South.

3) In his second inaugural, Mr. Lincoln spoke eloquently of binding up the nation's wounds with "malice toward none and charity for all." Given that purpose, and who can argue with it, the prosecution of Lee would have made things much worse.

4) I have read all of the in-print biographies of Lee, and he was not an evil man. He was a tragic figure on the wrong side of history. You can argue that he made the wrong choice, and it's easy to say that, but we weren't there. He was a blatant racist by today's standards, but so was Lincoln. I can criticize Lee with the best of you, but I simply say it would have been a mistake for the country to prosecute him for treason.
Saturday, January 21, 2012
About Freud
Freud: the last great Enlightenment thinker
John Gray 14th December 2011 — Issue 190
Sigmund Freud is out of fashion. The reason? His heroic refusal to flatter humankind
Writing to Albert Einstein in the early 1930s, Sigmund Freud suggested that “man has in him an active instinct for hatred and destruction.” Freud went on to contrast this “instinct to destroy and kill” with one he called erotic—an instinct “to conserve and unify,” an instinct for love.
Without speculating too much, Freud continued, one might suppose that these instincts function in every living being, with what he called “the death instinct”—thanatos—acting “to work its ruin and reduce life to its primal state of inert matter.” The death instinct provided “the biological justification for all those vile, pernicious propensities [to war] which we are now combating.”
To be sure, Freud concluded, all this talk of eros and thanatos might give Einstein the impression that psychoanalytic theory amounted to a “species of mythology, and a gloomy one at that.” But if so, Freud was unabashed, asking Einstein: “Does not every natural science lead ultimately to this—a sort of mythology? Is it otherwise today with your physical sciences?”
Today the idea that psychoanalysis is not a science is commonplace, but no part of Freud’s inheritance is more suspect than the theory of the death instinct. The very idea of instinct is viewed with suspicion. Talk of human instincts, or indeed of human nature, is dismissed as a form of intellectual atavism: human behaviour is seen as far more complex and at the same time more amenable to rational control than Freud believed or implied. Theories of human instinct only serve to block those impulses to progress and rationality that (for all the scorn that is directed against the very idea of human nature) are considered to be quintessentially human.
Freud’s ideas are today not simply rejected as false. They are repudiated as being dangerous or immoral; the “gloomy mythology” of warring instincts is condemned as a kind of slander on the species, the fundamental nobility of which it is sacrilege to deny. To be sure, righteous indignation has informed the response to Freud’s thought from the beginning. But its new strength helps explain one of the more remarkable features of intellectual life at the start of the 21st century, a time that in its own eyes is more enlightened than any other: the intense unpopularity of Freud, the last great Enlightenment thinker.
Born in Austria-Hungary in 1856 and dying in London in 1939, Freud is commonly known as the originator of the idea of the unconscious mind. However, the idea can be found in a number of earlier thinkers, notably the philosopher Arthur Schopenhauer. It would be more accurate to describe Freud as aiming to make the unconscious mind an object of scientific investigation—a prototypically Enlightenment project of extending the scientific method into previously unexplored regions. Many other 20th century thinkers aimed to examine and influence human life through science and reason, the common pursuit of the quarrelling family of intellectual movements, appearing from the 17th century onwards, that formed the Enlightenment. But by applying the Enlightenment project to forbidden regions of the human mind Freud, more than anyone else, revealed the project’s limits.
Starting with research into hysteria, where he concluded that hysterical symptoms often reflected the persisting influence of repressed memories, Freud developed psychoanalysis—a body of thought in which the idea that much of our mental life is repressed and inaccessible to conscious awareness was central.
The practice of psychotherapy that Freud began—the so-called “talking cure”—had the effect of promoting the idea that psychological conflict can be overcome by the sufferer gaining insight into the early experiences from which it may have originated. Later thinkers would attack Freud’s emphasis on early experience and the claims attributed to him regarding the therapeutic value of psychoanalysis. Yet several generations of intellectuals were in no doubt that he was a thinker of major importance. It is only recently that his ideas have been widely disparaged and dismissed. Initially rejected because of the central importance they gave to sexuality in the formation of personality, Freud’s ideas are rejected today because they imply that the human animal is ineradicably flawed. It is not Freud’s insistence on sexuality that is the source of scandal, but the claim that humans are afflicted by a destructive impulse.
The opprobrium that surrounds Freud is all the more intriguing given that the idea that humankind might be possessed by an impulse to destruction was never confined to him alone. Many thinkers entertained similar thoughts around the start of the last century, including one who was largely forgotten until an early part of her life story caught the eye of the filmmaker David Cronenberg. Sabina Spielrein, the pivotal figure in A Dangerous Method (to be released on 10th February 2012), appears in the film as a hysterical young woman, exhibiting a predilection for sadomasochistic sex following abuse by her father; after being confined in a mental institution she receives treatment from Jung, who then becomes her lover.
The story of the film seems not far from what actually happened. Spielrein did experience a variety of personal difficulties, and was for a time confined in an institution. Whether she and Jung were lovers is not known; but the consensus among those who have studied the episode is that what happened between them went beyond what can be properly expected, then or now, in a professional relationship. Where Spielrein has been remembered, it is as a minor figure in the developing conflict between the two psychoanalytic founders.
This is a pity, for she was much more than that. Spielrein trained and practised as a psychotherapist (the developmental psychologist Jean Piaget being one of her patients) and made important contributions to psychoanalytic theory, some aspects of which are echoed in Freud’s later work. Coming from a Russian-Jewish family of doctors and psychologists, she moved to the Soviet Union in the early 1920s, where she married and had children and worked with the neurologist Alexander Luria, among others. Information about her life and work after this point is sketchy. What is known is that Spielrein’s husband and several members of her family fell victim to Stalin’s terror, while Spielrein herself was shot, along with her children and the rest of the Jewish population of her native city Rostov, after being marched through the main street by the SS in 1942. She was then buried in a mass grave along with thousands of others.
If Spielrein’s life was blighted, it was not by her encounter with Jung (though she may have regretted the relationship). She emerged from the experience to produce some of the most interesting ideas of the early years of psychoanalysis. Her paper “Destruction as a cause of coming into being,” given as a lecture to a meeting of the Vienna Psychoanalytic Society chaired by Freud in 1911, prefigures Freud’s claim that human beings are ruled by two opposing instincts. Spielrein suggested that humans are driven by two basic impulses, one impelling them to independence and survival, the other to propagation and thereby (she suggested) to the loss of individuality.
Spielrein’s account differs from Freud’s in some ways—notably the link she makes between the impulse of procreation and the destruction of the individual. These differences point to the influence of Schopenhauer, who shaped much of the central European intelligentsia’s thinking at the start of the 20th century. Schopenhauer’s impact on fin-de-siècle European culture can hardly be exaggerated. His view that human intelligence is the blind servant of unconscious will informs the writings of Tolstoy, Conrad, Hardy and Proust. Schopenhauer’s most lasting impact, however, was in questioning the prevailing view of the human mind—a view that had shaped western thought at least since Aristotle, continued to be formative throughout the Christian era and underpinned the European Enlightenment.
Schopenhauer posed a major challenge to the prevailing Enlightenment worldview. In much of the western tradition, consciousness and thought were treated as being virtually one and the same; the possibility that thought might be unconscious was excluded almost by definition. But for Schopenhauer the conscious part of the human mind was only the visible surface of inner life, which obeyed the non-rational imperatives of bodily desire rather than conscious deliberation. It was Schopenhauer who, in a celebrated chapter on “The Metaphysics of Sexual Love” in The World as Will and Idea, affirmed the primary importance of sexuality in human life, suggesting that the sexual impulse operates independently of the choices and intentions of individuals, without regard for—and often at the expense of—their freedom and well-being. Schopenhauer also examined the meaning of dreams and the role of slips of the tongue in revealing repressed thoughts and emotions, ideas that Freud would make his own. Though Freud rarely mentions him, there can be little doubt that he read the philosopher closely. So most likely did Spielrein, whose account of sexuality as a threat to individual autonomy resembles Schopenhauer’s more even than does Freud’s.
From one point of view, Freud’s work was an attempt to transplant the idea of the unconscious mind posited in Schopenhauer’s philosophy into the domain of science. When Freud originated psychoanalysis, he wanted it to be a science. One reason was because achieving scientific standing for his ideas would enable them to overcome the opposition of moralising critics who objected to the central place of sexuality in psychoanalysis. Another was that, for most of his life, Freud never doubted that science was the only true repository of human knowledge. Here he revealed the influence of Ernst Mach (1838-1916), an Austrian physicist and philosopher whose ideas were pervasive in Freud’s Vienna. For Mach, science was not a mirror of nature but a method for ordering human sensations, continuing and refining the picture of the world that has been evolved in the human organism. If we perceived things as they are we would see chaos, since much of the order we perceive in the world is projected into it by the human mind.
Here Mach—like Schopenhauer—was developing the philosophy of Kant, who believed that the world we perceive is shaped by human categories. As is generally recognised, Kant is one of the greatest philosophers of the Enlightenment, who saw his task as rescuing human knowledge from the near-destruction that it had suffered under the assaults of David Hume, an Enlightenment philosopher of equal stature. What is less commonly understood is that Kant’s impact was to reinforce the scepticism he aimed to resist. Taking his point of departure from Kant, Schopenhauer came to the view that the world as understood by science was an illusion, while for Mach it was a human construction. It was against this background that Freud took for granted that science was the only source of knowledge, while at the same time accepting that science could not reveal the nature of things.
***
John Gray 14th December 2011 — Issue 190
Sigmund Freud is out of fashion. The reason? His heroic refusal to flatter humankind
Writing to Albert Einstein in the early 1930s, Sigmund Freud suggested that “man has in him an active instinct for hatred and destruction.” Freud went on to contrast this “instinct to destroy and kill” with one he called erotic—an instinct “to conserve and unify,” an instinct for love.
Without speculating too much, Freud continued, one might suppose that these instincts function in every living being, with what he called “the death instinct”—thanatos—acting “to work its ruin and reduce life to its primal state of inert matter.” The death instinct provided “the biological justification for all those vile, pernicious propensities [to war] which we are now combating.”
To be sure, Freud concluded, all this talk of eros and thanatos might give Einstein the impression that psychoanalytic theory amounted to a “species of mythology, and a gloomy one at that.” But if so, Freud was unabashed, asking Einstein: “Does not every natural science lead ultimately to this—a sort of mythology? Is it otherwise today with your physical sciences?”
Today the idea that psychoanalysis is not a science is commonplace, but no part of Freud’s inheritance is more suspect than the theory of the death instinct. The very idea of instinct is viewed with suspicion. Talk of human instincts, or indeed of human nature, is dismissed as a form of intellectual atavism: human behaviour is seen as far more complex and at the same time more amenable to rational control than Freud believed or implied. Theories of human instinct only serve to block those impulses to progress and rationality that (for all the scorn that is directed against the very idea of human nature) are considered to be quintessentially human.
Freud’s ideas are today not simply rejected as false. They are repudiated as being dangerous or immoral; the “gloomy mythology” of warring instincts is condemned as a kind of slander on the species, the fundamental nobility of which it is sacrilege to deny. To be sure, righteous indignation has informed the response to Freud’s thought from the beginning. But its new strength helps explain one of the more remarkable features of intellectual life at the start of the 21st century, a time that in its own eyes is more enlightened than any other: the intense unpopularity of Freud, the last great Enlightenment thinker.
Born in Austria-Hungary in 1856 and dying in London in 1939, Freud is commonly known as the originator of the idea of the unconscious mind. However, the idea can be found in a number of earlier thinkers, notably the philosopher Arthur Schopenhauer. It would be more accurate to describe Freud as aiming to make the unconscious mind an object of scientific investigation—a prototypically Enlightenment project of extending the scientific method into previously unexplored regions. Many other 20th century thinkers aimed to examine and influence human life through science and reason, the common pursuit of the quarrelling family of intellectual movements, appearing from the 17th century onwards, that formed the Enlightenment. But by applying the Enlightenment project to forbidden regions of the human mind Freud, more than anyone else, revealed the project’s limits.
Starting with research into hysteria, where he concluded that hysterical symptoms often reflected the persisting influence of repressed memories, Freud developed psychoanalysis—a body of thought in which the idea that much of our mental life is repressed and inaccessible to conscious awareness was central.
The practice of psychotherapy that Freud began—the so-called “talking cure”—had the effect of promoting the idea that psychological conflict can be overcome by the sufferer gaining insight into the early experiences from which it may have originated. Later thinkers would attack Freud’s emphasis on early experience and the claims attributed to him regarding the therapeutic value of psychoanalysis. Yet several generations of intellectuals were in no doubt that he was a thinker of major importance. It is only recently that his ideas have been widely disparaged and dismissed. Initially rejected because of the central importance they gave to sexuality in the formation of personality, Freud’s ideas are rejected today because they imply that the human animal is ineradicably flawed. It is not Freud’s insistence on sexuality that is the source of scandal, but the claim that humans are afflicted by a destructive impulse.
The opprobrium that surrounds Freud is all the more intriguing given that the idea that humankind might be possessed by an impulse to destruction was never confined to him alone. Many thinkers entertained similar thoughts around the start of the last century, including one who was largely forgotten until an early part of her life story caught the eye of the filmmaker David Cronenberg. Sabina Spielrein, the pivotal figure in A Dangerous Method (to be released on 10th February 2012), appears in the film as a hysterical young woman, exhibiting a predilection for sadomasochistic sex following abuse by her father. After being confined in a mental institution she receives treatment from Jung, who then becomes her lover.
The story of the film seems not far from what actually happened. Spielrein did experience a variety of personal difficulties, and was for a time confined in an institution. Whether she and Jung were lovers is not known; but the consensus among those who have studied the episode is that what happened between them went beyond what can be properly expected, then or now, in a professional relationship. Where Spielrein has been remembered, it is as a minor figure in the developing conflict between the two psychoanalytic founders.
This is a pity, for she was much more than that. Spielrein trained and practised as a psychotherapist (the developmental psychologist Jean Piaget being one of her patients) and made important contributions to psychoanalytic theory, some aspects of which are echoed in Freud’s later work. Coming from a Russian-Jewish family of doctors and psychologists, she moved to the Soviet Union in the early 1920s, where she married and had children and worked with the neurologist Alexander Luria, among others. Information about her life and work after this point is sketchy. What is known is that Spielrein’s husband and several members of her family fell victim to Stalin’s terror, while Spielrein herself was shot, along with her children and the rest of the Jewish population of her native city Rostov, after being marched through the main street by the SS in 1942. She was then buried in a mass grave along with thousands of others.
If Spielrein’s life was blighted, it was not by her encounter with Jung (though she may have regretted the relationship). She emerged from the experience to produce some of the most interesting ideas of the early years of psychoanalysis. Her paper “Destruction as a cause of coming into being,” given as a lecture to a meeting of the Vienna Psychoanalytic Society chaired by Freud in 1911, prefigures Freud’s claim that human beings are ruled by two opposing instincts. Spielrein suggested that humans are driven by two basic impulses, one impelling them to independence and survival, the other to propagation and thereby (she suggested) to the loss of individuality.
Spielrein’s account differs from Freud’s in some ways—notably the link she makes between the impulse of procreation and the destruction of the individual. These differences point to the influence of Schopenhauer, who shaped much of the central European intelligentsia’s thinking at the start of the 20th century. Schopenhauer’s impact on fin-de-siècle European culture can hardly be exaggerated. His view that human intelligence is the blind servant of unconscious will informs the writings of Tolstoy, Conrad, Hardy and Proust. Schopenhauer’s most lasting impact, however, was in questioning the prevailing view of the human mind—a view that had shaped western thought at least since Aristotle, continued to be formative throughout the Christian era and underpinned the European Enlightenment.
Schopenhauer posed a major challenge to the prevailing Enlightenment worldview. In much of the western tradition, consciousness and thought were treated as being virtually one and the same; the possibility that thought might be unconscious was excluded almost by definition. But for Schopenhauer the conscious part of the human mind was only the visible surface of inner life, which obeyed the non-rational imperatives of bodily desire rather than conscious deliberation. It was Schopenhauer who, in a celebrated chapter on “The Metaphysics of Sexual Love” in The World as Will and Idea, affirmed the primary importance of sexuality in human life, suggesting that the sexual impulse operates independently of the choices and intentions of individuals, without regard for—and often at the expense of—their freedom and well-being. Schopenhauer also examined the meaning of dreams and the role of slips of the tongue in revealing repressed thoughts and emotions, ideas that Freud would make his own. Though Freud rarely mentions him, there can be little doubt that he read the philosopher closely. So most likely did Spielrein, whose account of sexuality as a threat to individual autonomy resembles Schopenhauer’s more even than does Freud’s.
From one point of view, Freud’s work was an attempt to transplant the idea of the unconscious mind posited in Schopenhauer’s philosophy into the domain of science. When Freud originated psychoanalysis, he wanted it to be a science. One reason was that achieving scientific standing for his ideas would enable them to overcome the opposition of moralising critics who objected to the central place of sexuality in psychoanalysis. Another was that, for most of his life, Freud never doubted that science was the only true repository of human knowledge. Here he revealed the influence of Ernst Mach (1838-1916), an Austrian physicist and philosopher whose ideas were pervasive in Freud’s Vienna. For Mach, science was not a mirror of nature but a method for ordering human sensations, continuing and refining the picture of the world that has been evolved in the human organism. If we perceived things as they are we would see chaos, since much of the order we perceive in the world is projected into it by the human mind.
Here Mach—like Schopenhauer—was developing the philosophy of Kant, who believed that the world we perceive is shaped by human categories. As is generally recognised, Kant is one of the greatest philosophers of the Enlightenment, who saw his task as rescuing human knowledge from the near-destruction that it had suffered under the assaults of David Hume, an Enlightenment philosopher of equal stature. What is less commonly understood is that Kant’s impact was to reinforce the scepticism he aimed to resist. Taking his point of departure from Kant, Schopenhauer came to the view that the world as understood by science was an illusion, while for Mach it was a human construction. It was against this background that Freud took for granted that science was the only source of knowledge, while at the same time accepting that science could not reveal the nature of things.
***
It is a paradoxical position, as the development of Freud’s thought illustrates. If science is a system of human constructions, useful for practical purposes but not a literal account of reality, what makes it superior to other modes of thinking? If science is also a sort of mythology—as Freud suggested in his correspondence with Einstein—what becomes of the Enlightenment project of dispelling myth through scientific inquiry? These were questions that Freud faced, and in some measure resolved, in the account of religion he developed towards the end of his life. In The Future of an Illusion (1927), he had interpreted religion largely in the standard Enlightenment fashion that has been revived in recent years, and is now so wearisomely familiar: religion was an error born of ignorance, which was bound to retreat as knowledge advanced. Never placing too much trust in reason, Freud did not expect religion to vanish; but at this point he seemed convinced that the diminishing role of religion in human life would be an altogether good thing.
The account of religion he presented ten years later in Moses and Monotheism (1937) was more complex. In the earlier book he had recognised that, answering to enduring human needs—particularly the need for consolation—religious beliefs were not scientific theories; but neither were they necessarily false. While religions might be illusions, illusions were not just errors—they could contain truth. In Moses and Monotheism, Freud went further, arguing that religion had played an essential role in the development of human inquiry. The Jewish belief in an unseen God was not a relic of ignorance without any positive value. By affirming a hidden reality, the idea of an invisible deity had encouraged inquiry into what lay behind the world that is disclosed to the senses. More, the belief in an unseen god had allowed a new kind of self-examination to develop—one that aimed to explore the inner world by looking beneath the surface of conscious awareness. Freud’s attempt to gain insight into the invisible workings of the mind may have been an extension of scientific method into new areas; but this advance was possible, Freud came to think, only because religion had prepared the ground. Without ever surrendering his uncompromising atheism, Freud acknowledged that psychoanalysis owed its existence to faith.
In accepting that illusion could be productive, Freud was retracing the steps of Schopenhauer’s errant disciple Nietzsche. At the same time Freud was making a decisive break with a dominant strand of Enlightenment thinking. According to Alasdair MacIntyre, who developed the idea in his book After Virtue (1981), Nietzsche brought the Enlightenment to a close by showing that the project of a morality that rested solely on human will was self-defeating. MacIntyre’s argument has the merit of recognising that Nietzsche was an Enlightenment thinker—rather than the crazed irrationalist of vulgar intellectual history—as well as one of the Enlightenment’s more formidable critics. It was Freud, however, who made the more radical break with Enlightenment thinking. Even if he confines its scope to the absurd figure of the Übermensch, Nietzsche remains a militant partisan of human autonomy. Freud, by contrast, despite almost everything that has been written about him, aimed as much to mark the limits of human autonomy as to extend it. His words of advice to a patient indicate how much his thinking diverged from the view of open-ended human possibilities that is asserted adamantly today: “I do not doubt that it would be easier for fate to take away your suffering than it would be for me. But you will see for yourself how much has been gained if we succeed in turning your hysterical misery into common unhappiness. Having restored your inner life, you will be better able to arm yourself against that unhappiness.” The tone of this injunction—with its use of the language of fate, prohibited among progressive right-thinking people—could not be further from contemporary ways of feeling and thinking.
In some respects Freud’s conception of psychoanalysis has more in common with the ancient Stoic art of life than with any modern way of thinking. As Philip Rieff argued in Freud: the Mind of the Moralist (1959), which remains the most penetrating study of the subject, there are good reasons for thinking Freud was formulating a new version of Stoic ethics. The goal of the Stoics was self-mastery through the acceptance of a personal fate, a condition that was supposed to go with tranquillity of mind. In looking back to infancy and childhood, Freud was pointing to the fact that the choosing self—one of the central fictions of liberal humanism—is itself unchosen, formed in a state of helplessness and bearing the traces of that experience forever after. It was this beleaguered self that Freud aimed to fortify: by gaining insight into the early experiences that shape our habits of feeling, he believed, we can in some measure reorder our response to the world. This is the respect in which Freud was proposing a version of Stoic ethics. But his Stoicism differed from the ancients in at least two important ways.
In the Meditations of the Roman emperor Marcus Aurelius, self-mastery is achieved by identifying the self with the cosmos, a semi-divine order of things that is intrinsically rational. At bottom an uncompromisingly modern thinker, Freud had no such mystical faith in logic as the essence of the universe. The self-mastery he advocated—and practised—was not premised on the redemptive power of reason. Instead, it required accepting chaos as an ultimate fact. Here a second difference with ancient Stoicism appears: Freud never held out the hope of tranquillity. Rather, he aimed to reconcile those who entered psychoanalysis to a state of perpetual unrest. As has been argued by Adam Phillips, Freud’s most creative contemporary interpreter, psychoanalysis does not so much promise inner peace as open up a possibility of release from the fantasy that inner conflict will end. In this Freud also differed fundamentally from Schopenhauer, who never ceased to cling to a tormenting dream of salvation.
It may now be clearer, perhaps, why Freud’s thought is once again an object of scandal. His assault on the innocent verities of rationalism does not come from an avowed enemy of the Enlightenment—like that of Joseph de Maistre, say, whose attacks on reason were done in the service of revealed truth—but from one of its most resolute protagonists. An intrepid partisan of reason, Freud devoted his life to exploring reason’s limits. He was ready to accept that psychoanalysis could never be the science he had once wanted it to be. At the same time he came to accept that science might be superior to other modes of thinking only in limited ways. The myth-making impulse, which functions as the bogeyman of infantile rationalism, could not be eradicated from the human mind or from science.
Freud’s thought is a vital corrective to the scientific triumphalism that is making so much noise at the present time. But more than any other feature of his thinking, it is his acceptance of the flawed nature of human beings that is offensive today. Freud’s unforgivable sin was in locating the source of human disorder within human beings themselves. The painful conflicts in which humans have been entangled throughout their history and pre-history do not come only from oppression, poverty, inequality or lack of education. They originate in permanent flaws of the human animal. Of course Freud was not the first Enlightenment thinker to accept this fact. So did Thomas Hobbes. Like Hobbes, Freud belongs in a tradition of Enlightenment thinking that aims to understand rather than to edify. Both aimed to reduce needless conflict; but neither of them imagined that the sources of such conflict could be eliminated by any increase in human knowledge. Even more than Hobbes, Freud was clear that destructive conflict goes with being human. This, in the final analysis, is why Freud is so unpopular today.
In a well-known passage at the end of Civilization and Its Discontents (1930), Freud declared: “I have not the courage to rise up before my fellow-men as a prophet, and I bow to their reproach that I can offer them no consolation…” What is most in demand at the start of the 21st century, in contrast, is consolation and nothing else. Enlightenment fundamentalism—the insistence by writers such as Christopher Hitchens and Richard Dawkins that our salvation lies in affirming a highly selective set of “Enlightenment values”—serves this emotional need for meaning rather than any imperative of understanding. Like the religions they disparage, but with less profundity and little evident effect, the varieties of Enlightenment thinking on offer today are balm for the uneasy soul. The scientific-sounding formulae with which they appease their anxiety—the end of history, the flat world, the inexorable but forever delayed process of secularisation—are more fantastical than anything in Freud’s “gloomy mythology.”
The incessant ranting uplift and adamant certainty of latter-day partisans of Enlightenment are symptoms of a loss of nerve. Baffled and rattled by the unfolding scene, requiring constant reassurance if they are not to fall into mawkish despair, these evangelists of reason are engaged—no doubt unconsciously—in a kind of collective therapy. Inevitably, they find Freud an intensely discomforting figure. Among many of his followers, the practice of self-inquiry that Freud invented has been turned into a technique of psychological adjustment—the opposite, in many ways, of what he intended. In this respect, at least, contemporary hostility to Freud expresses a sound intuition. What Freud offers is a way of thinking in which the experience of being human can be seen to be more intractably difficult, and at the same time more interesting and worthwhile, than anything imagined in the cheap little gospels of progress and self-improvement that are being hawked today.
If Freud has been misunderstood, neglected or repudiated, he would have expected nothing else. He is rejected now for the same reason that he was rejected in fin-de-siècle Vienna: his heroic refusal to flatter humankind. As his correspondence with Einstein confirms, he did not share the hope that reason could deliver humankind from the “active instinct for hatred and destruction,” which was clearly at work in Europe at the time. When he left Nazi-occupied Austria to spend the last year of his life in Britain, he knew that the destruction that lay ahead could not by then be prevented. But fate could still be mocked, and so defied. When leaving Austria, Freud was required to sign a document testifying that he had been well and fairly treated. He did so, adding in his own hand: “I can most highly recommend the Gestapo to everyone.”
Friday, January 20, 2012
The Perfect Republican Candidate for President
It seems to me that if you could combine Gingrich and Romney you'd have the perfect Republican candidate. Consider this:
With Gingrich you'd have 1) the old Southern white male, Confederate to the core 2) a race baiter 3) someone who knows how to cry (like Boehner and Glenn Beck) and 4) the perfect moral hypocrite who talks about family values while shredding family values in his private life.
With Romney you'd have 1) the quintessential country club rich Republican 2) a candidate who flips on issues continually 3) and who is evasive about his taxes and finances.
Newt Romney would be perfect!
Thursday, January 19, 2012
The Conservative Reaction
The Conservative Reaction / The Chronicle Review
By Corey Robin
It's been a rotten few months for the nation's wealthiest 1 percent. From the senatorial candidacy of Elizabeth Warren to Occupy Wall Street, economic elites have faced a concerted attack on their riches and power, their arrogant and unaccountable ways. And you can hear it in their voices, or at least the voices of their spokesmen. House Majority Leader Eric Cantor declared, "I, for one, am increasingly concerned about the growing mobs occupying Wall Street and the other cities across the country." Mitt Romney told an audience in Florida that "I think it's dangerous—this class warfare." So rattled is George Will that he's been forced to pull out a playbook from an older time. All but calling Warren a Communist, he accused the Oklahoma-born scholarship kid of believing that the government "is entitled to socialize—i.e., conscript—whatever portion" of an individual's property "it considers its share."
After decades of "compassionate conservatism," "a thousand points of light," and "Morning in America," dark talk of class warfare on the right can seem like a strange throwback. So accustomed are we to the sunny Reagan and the populist Tea Party that we've forgotten a basic truth about conservatism: It is a reaction to democratic movements from below, movements like Occupy Wall Street that threaten to reorder society from the bottom up, redistributing power and resources from those who have much to those who have not so much. With the roar against the ruling classes growing ever louder, the right seems to be reverting to type. It thus behooves us to take a second look at the conservative tradition, not just its current incarnation but also across time, for that tradition provides us with an understanding of why the conservative responds to Occupy Wall Street as he does.
Since the modern era began, men and women in subordinate positions have marched against their superiors. They have gathered under different banners—the labor movement, feminism, abolition, socialism—and shouted different slogans: freedom, equality, democracy, revolution. In virtually every instance, their superiors have resisted them. That march and démarche of democracy is one of the main stories of modern politics. And it is the second half of that story, the démarche, that drives the development of ideas we call conservative. For that is what conservatism is: a meditation on, and theoretical rendition of, the felt experience of having power, seeing it threatened, and trying to win it back.
Despite the very real differences among them, workers in a factory are like secretaries in an office, peasants on a manor, slaves on a plantation—even wives in a marriage—in that they live and labor in conditions of unequal power. They submit and obey, heeding the demands of their managers and masters, husbands and lords. Sometimes their lot is freely chosen—workers contract with their employers, wives with their husbands—but its entailments seldom are. What contract, after all, could ever itemize the ins and outs, the daily pains and continuing sufferance, of a job or a marriage? Throughout American history, in fact, the contract has served as a conduit to unforeseen coercion and constraint. Employment and marriage contracts have been interpreted by judges to contain all sorts of unwritten and unwanted provisions of servitude to which wives and workers tacitly consent, even when they have no knowledge of such provisions or wish to stipulate otherwise.
Until 1980, for example, it was legal in every state for a husband to rape his wife. The justification for this dates back to a 1736 treatise by the British jurist Matthew Hale. When a woman marries, he argued, she implicitly agrees to give "up herself in this kind [sexually] unto her husband." Hers is a tacit, if unknowing, consent, "which she cannot retract" for the duration of their union. Having once said yes, she can never say no. As recently as 1957, a standard legal treatise could state, "A man does not commit rape by having sexual intercourse with his lawful wife, even if he does so by force and against her will." If someone tried to write into the marriage contract a requirement that express consent had to be given in order for sex to proceed, judges were bound by common law to ignore or override it. Implicit consent was a structural feature of the contract that neither party could alter. Through that contract, women were doomed to be the sexual servants of their husbands.
Every once in a while, however, the subordinates of this world contest their fates. They protest their conditions, join movements, make demands. Their goals may be minimal and discrete, but in voicing them, they raise the specter of a more fundamental change in power. They cease to be servants or supplicants and become agents, speaking and acting on their own behalf. More than the reforms themselves, it is this assertion of agency that vexes their superiors.
American labor history is filled with complaints from employers and government officials that unionized workers are independent and self-organizing. Indeed, so potent is their self-organization that it threatens to render superfluous the employer and the state. During the Great Upheaval of 1877, striking railroad workers in St. Louis took to running the trains themselves. Fearful that the public might conclude the workers were capable of managing the railroad, the owners tried to stop them, starting a strike of their own in order to prove it was the owners, and only the owners, who could make the trains run on time. During the Seattle general strike of 1919, workers went to great lengths to provide basic government services, including law and order. So successful were they that the mayor concluded it was the workers' ability to limit violence and anarchy that posed the greatest threat to the established order:
The so-called sympathetic Seattle strike was an attempted revolution. ... True, there were no flashing guns, no bombs, no killings. Revolution, I repeat, doesn't need violence. The general strike, as practiced in Seattle, is of itself the weapon of revolution, all the more dangerous because quiet. ... That is to say, it puts the government out of operation.
Conservatism is the theoretical voice of this animus against the agency of the subordinate classes. It provides the most consistent and profound argument for why the lower orders should not be allowed to exercise their independent will, to govern themselves or the polity. Submission is their first duty; agency, the prerogative of elites. Such was the threat Edmund Burke saw in the French Revolution: not merely an expropriation of property or explosion of violence but an inversion of the obligations of deference and command. "The levelers," he claimed, "only change and pervert the natural order of things."
The occupation of an hair-dresser, or of a working tallow-chandler, cannot be a matter of honour to any person—to say nothing of a number of other more servile employments. Such descriptions of men ought not to suffer oppression from the state; but the state suffers oppression, if such as they, either individually or collectively, are permitted to rule.
By virtue of membership in a polity, Burke allowed, men had certain rights—to the fruits of their labor, their inheritance, education, and more. But the one right he refused to concede to all men was a "share of power, authority, and direction" they might think they ought to have "in the management of the state."
One of the reasons the subordinate's exercise of agency agitates the conservative imagination is that it takes place in an intimate setting. Every great political blast—from the storming of the Bastille to the March on Washington—is set off by a private fuse: the contest for rights and standing in the family, the factory, and the field. Politicians and parties talk of constitution and amendment, natural rights and inherited privileges. But the real subject of their deliberations is the private life of power. "Here is the secret of the opposition to woman's equality in the state," Elizabeth Cady Stanton wrote. "Men are not ready to recognize it in the home." Behind the riot in the street or debate in Parliament is the maid talking back to her mistress, the worker disobeying his boss. That is why our political arguments—not only about the family but also the welfare state, civil rights, and much else—can be so explosive: They touch upon the most personal relations of power.
When the conservative looks upon a democratic movement from below, this is what he sees: a terrible disturbance in the private life of power. "The real object" of the French Revolution, Burke told Parliament in 1790, is "to break all those connexions, natural and civil, that regulate and hold together the community by a chain of subordination; to raise soldiers against their officers; servants against their masters; tradesmen against their customers; artificers against their employers; tenants against their landlords; curates against their bishops; and children against their parents." Nothing to the Jacobins, he declared at the end of his life, was worthy "of the name of the publick virtue, unless it indicates violence on the private."
Historically, the conservative has sought to forestall the march of democracy in both the public and the private spheres, on the assumption that advances in the one necessarily spur advances in the other. Still, the more profound and prophetic stance on the right has been to cede the field of the public, if he must, but stand fast in the private. Allow men and women to become democratic citizens of the state; make sure they remain feudal subjects in the family, the factory, and the field.
No simple defense of one's own place and privileges, the conservative position stems from a genuine conviction that a world thus emancipated will be ugly, brutish, and dull. It will lack the excellence of a world where the better man commands the worse. This vision of the connection between excellence and rule is what brings together in postwar America that unlikely alliance of the capitalist, with his vision of the employer's untrammeled power in the workplace; the traditionalist, with his vision of the father's rule at home; and the statist, with his vision of a heroic leader pressing his hand upon the face of the earth. Each in his way subscribes to this statement, from the 19th century, of the conservative creed: "To obey a real superior ... is one of the most important of all virtues—a virtue absolutely essential to the attainment of anything great and lasting."
The notion that conservative ideas are a mode of reactionary practice is likely to raise some hackles. It has long been an axiom on the left that the defense of power and privilege is an enterprise devoid of ideas, that right-wing politics is an emotional swamp rather than a movement of considered opinion. Thomas Paine called counterrevolution "an obliteration of knowledge"; Lionel Trilling described American conservatism as a mélange of "irritable mental gestures which seek to resemble ideas."
Conservatives, for their part, have tended to agree. Playing the part of the dull-witted country squire, conservatives have embraced the position of the historian F.J.C. Hearnshaw that "it is commonly sufficient for practical purposes if conservatives, without saying anything, just sit and think, or even if they merely sit." While the aristocratic overtones of that discourse no longer resonate, the conservative still holds on to the label of the untutored and the unlettered; it's part of his populist charm and demotic appeal. Yet nothing could be further from the truth. Conservatism is an idea-driven praxis, and no amount of preening from the right or polemic from the left can reduce or efface the catalog of mind one finds there.
Others will be put off by this argument for a different reason: It threatens the purity and profundity of conservative ideas. For many, the word "reaction" connotes an unthinking, lowly grab for power. But reaction is not reflex. It begins from a position of principle—that some are fit, and thus ought, to rule others—and then recalibrates that principle in light of a challenge from below. This recalibration is no easy task, for such challenges tend by their very nature to disprove the principle. After all, if a ruling class is truly fit to rule, why and how has it allowed a challenge to its power to emerge? What does the emergence of the one say about the fitness of the other?
The conservative faces an additional hurdle: how to defend a principle of rule in a world where nothing is solid, all is in flux. From the moment conservatism came onto the scene as an intellectual movement, it has had to contend with the decline of ancient and medieval ideas of an orderly universe, in which permanent hierarchies of power reflected the eternal structure of the cosmos. The overthrow of the old regime reveals not only the weakness and incompetence of its leaders but also a larger truth about the lack of design in the world. Reconstructing the old regime in the face of a declining faith in permanent hierarchies has proven to be a difficult feat. Not surprisingly, it also has produced some of the most remarkable works of modern thought.
There is another reason to be wary of the effort to dismiss the reactionary thrust of conservatism, and that is the testimony of the tradition itself. From Burke's claim that he and his ilk had been "alarmed into reflexion" by the French Revolution to Russell Kirk's admission that conservatism is a "system of ideas" that "has sustained men ... in their resistance against radical theories and social transformation," the conservative has consistently affirmed that his is a knowledge produced in response to the left. Sometimes that affirmation has been explicit. Lord Salisbury, three times prime minister of Britain, wrote in 1859 that "hostility to Radicalism, incessant, implacable hostility, is the essential definition of Conservatism." In his classic The Conservative Intellectual Movement in America Since 1945, George Nash defined conservatism as "resistance to certain forces perceived to be leftist, revolutionary, and profoundly subversive of what conservatives at the time deemed worth cherishing, defending, and perhaps dying for." More recently, the Harvard political theorist Harvey Mansfield has declared, "I understand conservatism as a reaction to liberalism. It isn't a position that one takes up from the beginning but only when one is threatened by people who want to take away or harm things that deserve to be conserved."
Those are the explicit professions of the counterrevolutionary creed. More interesting are the implicit statements, where antipathy to radicalism and reform is embedded in the very syntax of the argument. Take Michael Oakeshott's famous definition in his essay "On Being Conservative":
To be conservative, then, is to prefer the familiar to the unknown, to prefer the tried to the untried, fact to mystery, the actual to the possible, the limited to the unbounded, the near to the distant, the sufficient to the superabundant, the convenient to the perfect, present laughter to utopian bliss.
One cannot, it seems, enjoy fact and mystery, near and distant, laughter and bliss. One must choose. Far from affirming a simple hierarchy of preferences, Oakeshott's either/or signals that we are on existential ground, where the choice is between not something and its opposite but something and its negation. The conservative would enjoy familiar things in the absence of forces seeking their destruction, Oakeshott concedes, but his enjoyment "will be strongest when" it "is combined with evident risk of loss." And while Oakeshott suggests that such losses can be engineered by a variety of forces, the engineers invariably seem to work on the left. Marx and Engels are "the authors of the most stupendous of our political rationalisms," he writes elsewhere. "Nothing ... can compare with" their abstract utopianism.
There is more to this antagonistic structure of argument than the simple antinomies of partisan politics. As Karl Mannheim argued, what distinguishes conservatism from traditionalism—the universal "vegetative" tendency to remain attached to things as they are—is that conservatism is a deliberate, conscious effort to preserve or recall "those forms of experience which can no longer be had in an authentic way." Conservatism "becomes conscious and reflective when other ways of life and thought appear on the scene, against which it is compelled to take up arms in the ideological struggle."
Where the traditionalist takes the objects of his desire for granted, the conservative cannot. He seeks to enjoy them precisely as they are being—or have been—taken away. If he hopes to enjoy them again, he must fight for them in the public realm. He must speak of them in a language that is politically serviceable and intelligible. But as soon as those objects enter the medium of political speech, they cease to be items of lived experience and become incidents of an ideology. They get wrapped in a narrative of loss—in which the revolutionary or reformist plays a necessary part—and presented in a program of recovery. What was tacit becomes articulate, what was practice becomes polemic.
In defending hierarchical orders, the conservative invariably launches a counterrevolution, often requiring an overhaul of the very regime he is defending. "If we want things to stay as they are," in Lampedusa's classic formulation, "things will have to change." This program entails far more than clichés about preservation through renovation would suggest: Often it requires the most radical measures on the regime's behalf.
Indeed, some of the stuffiest partisans of order have been more than happy, when it has suited their purposes, to indulge in a bit of mayhem and madness. Kirk, the self-styled Burkean, wished to "espouse conservatism with the vehemence of a radical. The thinking conservative, in truth, must take on some of the outward characteristics of the radical, today: he must poke about the roots of society, in the hope of restoring vigor to an old tree half strangled in the rank undergrowth of modern passions." In God and Man at Yale, William F. Buckley declared conservatives "the new radicals."
There's a fairly simple reason for the embrace of radicalism on the right, and it has to do with the reactionary imperative that lies at the core of conservative doctrine. The conservative not only opposes the left; he also believes that the left has been in the driver's seat since, depending on who's counting, the French Revolution or the Reformation. If he is to preserve what he values, the conservative must declare war against the culture as it is. Though the spirit of militant opposition pervades the entirety of conservative discourse, Dinesh D'Souza has put the case most clearly:
Typically, the conservative attempts to conserve, to hold on to the values of the existing society. But ... what if the existing society is inherently hostile to conservative beliefs? It is foolish for a conservative to attempt to conserve that culture. Rather, he must seek to undermine it, to thwart it, to destroy it at the root level. This means that the conservative must ... be philosophically conservative but temperamentally radical.
By now it should be clear that it is not the style or pace of change that the conservative opposes. Burkean theorists like to draw a distinction between evolutionary reform and radical change. The first is slow, incremental, and adaptive; the second is fast, comprehensive, and by design. But that distinction, so dear to Burke and his followers, is often less clear in practice than the theorist allows. In the name of slow, organic, adaptive change, self-declared conservatives opposed the New Deal (Robert Nisbet, Kirk, and Whittaker Chambers) and endorsed the New Deal (Peter Viereck, Clinton Rossiter, and Whittaker Chambers). "Even Fabian Socialists," Nash tartly observes, "who believed in 'the inevitability of gradualness' might be labeled conservatives."
More often the blurriness of the distinction has allowed the conservative to oppose reform on the grounds that it either will lead to revolution or is revolution. Any demand from or on behalf of the lower orders, no matter how tepid or tardy, is too much, too soon, too fast. Reform is revolution, improvement is insurrection. "It may be good or bad," a gloomy Lord Carnarvon wrote of the Second Reform Act of 1867—a bill 20 years in the making that tripled the size of the British electorate—"but it is a revolution."
Today's conservative may have made his peace with some emancipations past. Others, like labor unions and reproductive freedom, he still contests. But that does not alter the fact that when those emancipations first arose as issues, his predecessor was in all likelihood against them. Michael Gerson, a former speechwriter for George W. Bush, is one of today's few conservatives who acknowledge the history of conservative opposition to emancipation. Where other conservatives like to lay claim to the abolitionist or civil-rights mantle, Gerson admits that "honesty requires the recognition that many conservatives, in other times, have been hostile to religiously motivated reform," and that "the conservative habit of mind once opposed most of these changes." Indeed, as Samuel Huntington suggested a half-century ago, saying no to such movements in real time may be what makes someone a conservative throughout time.
Given the reactionary thrust of conservatism, Occupy Wall Street may turn out to be the best thing that ever happened to the right. Thoughtful conservatives have long understood the symbiotic relationship between the right's intellectual—and ultimately political—vitality and insurgencies from the left. Friedrich Hayek accurately observed that the political theory of capitalism "became stationary when it was most influential" and "progressed" only when it was "on the defensive." Frank Meyer, intellectual architect of the fusion strategy that brought together the libertarian and traditionalist wings of the Republican Party, noted that it was "ironic, though not historically unprecedented," that bursts "of creative energy" on the right "should occur simultaneously with a continuing spread of the influence of liberalism in the practical political sphere."
Conversely, conservative writers like David Frum and Andrew Sullivan have worried of late about the intellectual flabbiness of the contemporary right: A movement that once seemed the emblem of heterodoxy has succumbed to stale thinking and rote incantations. But if Occupy Wall Street turns out to be a movement rather than a moment—if it has real staying power; if it moves from public squares to private institutions; if it starts to divest the elite of their privileges and powers, not just in their offshore accounts but in their backyards and board rooms—it could provide the kind of creative provocation that once produced a Burke or a Hayek. The metaphor of occupation is threatening enough; one can only imagine what might happen were it made real. And while the mavens of the right would probably prefer four more years to four good books, they might want to rethink that. They wouldn't be in the position they're in—when, even out of power, they still govern the country—had their predecessors made the same choice.
Corey Robin is an associate professor of political science at Brooklyn College of the City University of New York and CUNY's Graduate Center. He blogs at coreyrobin.com. This essay is adapted from his book The Reactionary Mind: Conservatism From Edmund Burke to Sarah Palin, published by Oxford University Press.
The notion that conservative ideas are a mode of reactionary practice is likely to raise some hackles. It has long been an axiom on the left that the defense of power and privilege is an enterprise devoid of ideas, that right-wing politics is an emotional swamp rather than a movement of considered opinion. Thomas Paine called counterrevolution "an obliteration of knowledge"; Lionel Trilling described American conservatism as a mélange of "irritable mental gestures which seek to resemble ideas."
Conservatives, for their part, have tended to agree. Playing the part of the dull-witted country squire, conservatives have embraced the position of the historian F.J.C. Hearnshaw that "it is commonly sufficient for practical purposes if conservatives, without saying anything, just sit and think, or even if they merely sit." While the aristocratic overtones of that discourse no longer resonate, the conservative still holds on to the label of the untutored and the unlettered; it's part of his populist charm and demotic appeal. Yet nothing could be further from the truth. Conservatism is an idea-driven praxis, and no amount of preening from the right or polemic from the left can reduce or efface the catalog of mind one finds there.
Others will be put off by this argument for a different reason: It threatens the purity and profundity of conservative ideas. For many, the word "reaction" connotes an unthinking, lowly grab for power. But reaction is not reflex. It begins from a position of principle—that some are fit, and thus ought, to rule others—and then recalibrates that principle in light of a challenge from below. This recalibration is no easy task, for such challenges tend by their very nature to disprove the principle. After all, if a ruling class is truly fit to rule, why and how has it allowed a challenge to its power to emerge? What does the emergence of the one say about the fitness of the other?
The conservative faces an additional hurdle: how to defend a principle of rule in a world where nothing is solid, all is in flux. From the moment conservatism came onto the scene as an intellectual movement, it has had to contend with the decline of ancient and medieval ideas of an orderly universe, in which permanent hierarchies of power reflected the eternal structure of the cosmos. The overthrow of the old regime reveals not only the weakness and incompetence of its leaders but also a larger truth about the lack of design in the world. Reconstructing the old regime in the face of a declining faith in permanent hierarchies has proven to be a difficult feat. Not surprisingly, it also has produced some of the most remarkable works of modern thought.
There is another reason to be wary of the effort to dismiss the reactionary thrust of conservatism, and that is the testimony of the tradition itself. From Burke's claim that he and his ilk had been "alarmed into reflexion" by the French Revolution to Russell Kirk's admission that conservatism is a "system of ideas" that "has sustained men ... in their resistance against radical theories and social transformation," the conservative has consistently affirmed that his is a knowledge produced in response to the left. Sometimes that affirmation has been explicit. Lord Salisbury, three times prime minister of Britain, wrote in 1859 that "hostility to Radicalism, incessant, implacable hostility, is the essential definition of Conservatism." In his classic The Conservative Intellectual Movement in America Since 1945, George Nash defined conservatism as "resistance to certain forces perceived to be leftist, revolutionary, and profoundly subversive of what conservatives at the time deemed worth cherishing, defending, and perhaps dying for." More recently, the Harvard political theorist Harvey Mansfield has declared, "I understand conservatism as a reaction to liberalism. It isn't a position that one takes up from the beginning but only when one is threatened by people who want to take away or harm things that deserve to be conserved."
Those are the explicit professions of the counterrevolutionary creed. More interesting are the implicit statements, where antipathy to radicalism and reform is embedded in the very syntax of the argument. Take Michael Oakeshott's famous definition in his essay "On Being Conservative":
To be conservative, then, is to prefer the familiar to the unknown, to prefer the tried to the untried, fact to mystery, the actual to the possible, the limited to the unbounded, the near to the distant, the sufficient to the superabundant, the convenient to the perfect, present laughter to utopian bliss.
One cannot, it seems, enjoy fact and mystery, near and distant, laughter and bliss. One must choose. Far from affirming a simple hierarchy of preferences, Oakeshott's either/or signals that we are on existential ground, where the choice is between not something and its opposite but something and its negation. The conservative would enjoy familiar things in the absence of forces seeking their destruction, Oakeshott concedes, but his enjoyment "will be strongest when" it "is combined with evident risk of loss." And while Oakeshott suggests that such losses can be engineered by a variety of forces, the engineers invariably seem to work on the left. Marx and Engels are "the authors of the most stupendous of our political rationalisms," he writes elsewhere. "Nothing ... can compare with" their abstract utopianism.
There is more to this antagonistic structure of argument than the simple antinomies of partisan politics. As Karl Mannheim argued, what distinguishes conservatism from traditionalism—the universal "vegetative" tendency to remain attached to things as they are—is that conservatism is a deliberate, conscious effort to preserve or recall "those forms of experience which can no longer be had in an authentic way." Conservatism "becomes conscious and reflective when other ways of life and thought appear on the scene, against which it is compelled to take up arms in the ideological struggle."
Where the traditionalist takes the objects of his desire for granted, the conservative cannot. He seeks to enjoy them precisely as they are being—or have been—taken away. If he hopes to enjoy them again, he must fight for them in the public realm. He must speak of them in a language that is politically serviceable and intelligible. But as soon as those objects enter the medium of political speech, they cease to be items of lived experience and become incidents of an ideology. They get wrapped in a narrative of loss—in which the revolutionary or reformist plays a necessary part—and presented in a program of recovery. What was tacit becomes articulate, what was practice becomes polemic.
In defending hierarchical orders, the conservative invariably launches a counterrevolution, often requiring an overhaul of the very regime he is defending. "If we want things to stay as they are," in Lampedusa's classic formulation, "things will have to change." This program entails far more than clichés about preservation through renovation would suggest: Often it requires the most radical measures on the regime's behalf.
Indeed, some of the stuffiest partisans of order have been more than happy, when it has suited their purposes, to indulge in a bit of mayhem and madness. Kirk, the self-styled Burkean, wished to "espouse conservatism with the vehemence of a radical. The thinking conservative, in truth, must take on some of the outward characteristics of the radical, today: he must poke about the roots of society, in the hope of restoring vigor to an old tree half strangled in the rank undergrowth of modern passions." In God and Man at Yale, William F. Buckley declared conservatives "the new radicals."
There's a fairly simple reason for the embrace of radicalism on the right, and it has to do with the reactionary imperative that lies at the core of conservative doctrine. The conservative not only opposes the left; he also believes that the left has been in the driver's seat since, depending on who's counting, the French Revolution or the Reformation. If he is to preserve what he values, the conservative must declare war against the culture as it is. Though the spirit of militant opposition pervades the entirety of conservative discourse, Dinesh D'Souza has put the case most clearly:
Typically, the conservative attempts to conserve, to hold on to the values of the existing society. But ... what if the existing society is inherently hostile to conservative beliefs? It is foolish for a conservative to attempt to conserve that culture. Rather, he must seek to undermine it, to thwart it, to destroy it at the root level. This means that the conservative must ... be philosophically conservative but temperamentally radical.
By now it should be clear that it is not the style or pace of change that the conservative opposes. Burkean theorists like to draw a distinction between evolutionary reform and radical change. The first is slow, incremental, and adaptive; the second is fast, comprehensive, and by design. But that distinction, so dear to Burke and his followers, is often less clear in practice than the theorist allows. In the name of slow, organic, adaptive change, self-declared conservatives opposed the New Deal (Robert Nisbet, Kirk, and Whittaker Chambers) and endorsed the New Deal (Peter Viereck, Clinton Rossiter, and Whittaker Chambers). "Even Fabian Socialists," Nash tartly observes, "who believed in 'the inevitability of gradualness' might be labeled conservatives."
More often the blurriness of the distinction has allowed the conservative to oppose reform on the grounds that it either will lead to revolution or is revolution. Any demand from or on behalf of the lower orders, no matter how tepid or tardy, is too much, too soon, too fast. Reform is revolution, improvement is insurrection. "It may be good or bad," a gloomy Lord Carnarvon wrote of the Second Reform Act of 1867—a bill 20 years in the making that tripled the size of the British electorate—"but it is a revolution."
Today's conservative may have made his peace with some emancipations past. Others, like labor unions and reproductive freedom, he still contests. But that does not alter the fact that when those emancipations first arose as issues, his predecessor was in all likelihood against them. Michael Gerson, a former speechwriter for George W. Bush, is one of today's few conservatives who acknowledge the history of conservative opposition to emancipation. Where other conservatives like to lay claim to the abolitionist or civil-rights mantle, Gerson admits that "honesty requires the recognition that many conservatives, in other times, have been hostile to religiously motivated reform," and that "the conservative habit of mind once opposed most of these changes." Indeed, as Samuel Huntington suggested a half-century ago, saying no to such movements in real time may be what makes someone a conservative throughout time.
Given the reactionary thrust of conservatism, Occupy Wall Street may turn out to be the best thing that ever happened to the right. Thoughtful conservatives have long understood the symbiotic relationship between the right's intellectual—and ultimately political—vitality and insurgencies from the left. Friedrich Hayek accurately observed that the political theory of capitalism "became stationary when it was most influential" and "progressed" only when it was "on the defensive." Frank Meyer, intellectual architect of the fusion strategy that brought together the libertarian and traditionalist wings of the Republican Party, noted that it was "ironic, though not historically unprecedented," that bursts "of creative energy" on the right "should occur simultaneously with a continuing spread of the influence of liberalism in the practical political sphere."
Conversely, conservative writers like David Frum and Andrew Sullivan have worried of late about the intellectual flabbiness of the contemporary right: A movement that once seemed the emblem of heterodoxy has succumbed to stale thinking and rote incantations. But if Occupy Wall Street turns out to be a movement rather than a moment—if it has real staying power; if it moves from public squares to private institutions; if it starts to divest the elite of their privileges and powers, not just in their offshore accounts but in their backyards and board rooms—it could provide the kind of creative provocation that once produced a Burke or a Hayek. The metaphor of occupation is threatening enough; one can only imagine what might happen were it made real. And while the mavens of the right would probably prefer four more years to four good books, they might want to rethink that. They wouldn't be in the position they're in—when, even out of power, they still govern the country—had their predecessors made the same choice.
Corey Robin is an associate professor of political science at Brooklyn College of the City University of New York and CUNY's Graduate Center. He blogs at coreyrobin.com. This essay is adapted from his book The Reactionary Mind: Conservatism From Edmund Burke to Sarah Palin, published by Oxford University Press.
More on the Corey Robin Book (2)
by Alan Wolfe
October 27, 2011
The Reactionary Mind: Conservatism from Edmund Burke to Sarah Palin
by Corey Robin
Oxford University Press, 304 pp., $36.75
AS THE REPUBLICAN Party lurches toward nominating a presidential candidate to run against Barack Obama, we are likely to hear talk of deep splits within the conservative movement. Tea Party activists, who hate state intervention into the economy, will be distinguished from social conservatives, who love state intervention into matters of sex. Ayn Rand’s militant atheism, so attractive to one half of the party leadership, will be contrasted to the equally warlike Christianity that appeals to the right’s other half. Pundits will discover that aggressive interventionists touched by neoconservatism are not the same thing as America-first nationalists influenced by isolationism. Some liberals will cheer. Long accustomed to divisions within their own ranks, they will for once take glee in the splits and bitter exchanges of their antagonists.
Don’t be fooled by any of this, argues Corey Robin. Against nearly all other leftists writing about rightists, Robin believes that there is only one kind of conservatism. Whether expressed in the lofty words of Burke or the rambling ravings of Palin, conservatism is always and everywhere a resentful attack on those who seek to make the world more fair. Take away the left and you destroy the rationale for the right. It is only because the modern world takes justice seriously, at least in theory, that we have thinkers and activists determined to put their bodies on the gears to stop the machinery from moving forward.
Robin treats conservatives as activists rather than as stand-patters. “Conservatism,” he writes, “has been a forward movement of restless and relentless change, partial to risk taking and ideological adventurism, militant in posture and populist in its bearings, friendly to upstarts and insurgents, outsiders and newcomers alike.” Burke, in Robin’s view, began this tradition, and figures such as de Maistre, de Bonald, and Sorel carried it forward. If we take all of them as the genuine articles, there is no need to draw a line between conservatives and reactionaries: all conservatives are reactionary. Conservatives are unified, and united in their rage. Their most passionate hate is directed at those they believe were assigned by God or nature to second-class status but still insist on their full rights as human beings.
For Robin, what began in the late eighteenth century has reached a kind of culmination in the early twenty-first century. Republicans in love with Ayn Rand express the same romantic protest against modern complexity as evangelical Christians lamenting for families of yore. Whatever their differences, both movements are counter-cultural, even counter-revolutionary. That is why they are the rightful heirs of all the European thinkers whom Robin evokes. Everything about these contemporary right-wing activists—their militant theatrics, their artificial populism, their refusal to compromise—was anticipated two centuries ago. “Far from being a recent innovation of the Christian Right or the Tea Party movement, reactionary populism runs like a red thread throughout conservative discourse from the very beginning.”
Robin adds a distinctive wrinkle to the common claim of Burke’s responsibility for modern conservatism. He says that it was not his Reflections on the Revolution in France but his Philosophical Enquiry into Our Ideas of the Sublime and the Beautiful that deserves the most attention. Power, as Robin summarizes Burke, “should never aspire to be—and can never actually be—beautiful. What great power needs is sublimity.” Owing to this emphasis on the sublime, Burke ought not to be read as a defender of the old regime. Not only had the Bourbons lost both their beauty and their sublimity, they had also become pathetic and decadent, lacking the capacity to justify themselves (and thus requiring thinkers such as Burke to carry out the thankless task).
Conservatives, says Robin, long for an imagined world too rarified ever to survive; they are theorists of loss. That is why, no matter how small the circle of privilege they defend, they have a certain appeal to the much larger collection of ordinary people whom they otherwise hold in contempt. Who has not experienced loss? Who would not want to return to an ideal world? The sacred is always more appealing than the profane. Try to make the world a more just place and you eliminate the sublime from it.
“The sublime,” Burke wrote, “is the sensation we feel in the face of extreme pain, danger or terror.” For all the emphasis on stability and tradition, conservatives admire revolutionaries because the terror they unleash gives us a glimpse of precisely such wonders. As Robin correctly points out, de Maistre preferred zealous if misguided Jacobins to lazy and self-satisfied nobles. Owing to its militancy, conservatism is zealously promoted by outsiders: Burke was Irish, de Maistre a Savoyard, Disraeli a Jew, Hamilton a West Indian. The same tendency can be witnessed today. It was not WASPs who revived the contemporary right but Jews and, downplayed by Robin, Catholics, who “helped transform the Republican Party from a cocktail party in Darien into the party of Scalia, d’Souza [sic] Gonzalez, and Yoo.”
Just as de Maistre could barely hide his Jacobin sympathies, the contemporary American right, in Robin’s account, is lock, stock, and barrel a product of the 1960s. “It’s time for God’s people to come out of the closet,” a Texas evangelist declares in Robin’s pages—a near perfect expression of the extent to which reaction against the gains of the 1960s could only be expressed in the language of the movement being denounced. Abbie Hoffman prepared the way for Michele Bachmann. Mere economic protest does not get you the characters that constitute the Republican base today. For that you need people who genuinely believe that the world is coming to an end.
No other contemporary American figure captures this conservative combination of resentment and activism better than Antonin Scalia, the subject of one of Robin’s most interesting chapters. Despite talk of being faithful to texts, Robin argues, Scalia uses his power on the court to impose on the country the classic conservative mantra: the world is falling apart, and so only the obedience to rules, no matter how seemingly arbitrary and unfair, can save it from doom. “No Plato for him,” Robin writes of this intemperate and deeply reactionary judge. “He’s with Nietzsche all the way.” This at first does not seem quite right: Nietzsche is hardly a theorist of obedience to rules. But once we realize that for Scalia rule-following is only for the masses, while those on top get to do all the rule-writing, Robin’s take on the man strikes me as warranted. There are times when Scalia goes out of his way to remind us of how cruel the world can be—and how helpless we are in the face of these very cruelties. Scalia has buried himself deep inside the right-wing counterculture where winners, calling themselves victims, are given rights, while losers are instructed never to complain even as their rights are stripped from them.
I confess to being one of those who like to divide conservatives into their parts as opposed to treating them as a whole. Robin makes a vigorous case that I am wrong, and I am tempted by his analysis—as far as it goes. To be sure, Robin exaggerates, and all too easily dismisses exceptions to his generalizations: he quotes Michael Oakeshott, and a bit too frequently, yet finally he has no choice but to throw him off the conservative bus. The very existence of such a thinker suggests that conservatism need not always be either as reactionary or as angry as Robin claims. Still, at least as regards reactionaries such as Scalia and Palin, a little rhetorical provocation seems justified. Robin is an engaging writer, and just the kind of broad-ranging public intellectual all too often missing in academic political science.
The real problem of persuasion lies elsewhere. In this book, Robin has chosen to republish essays, albeit with a comprehensive introduction, rather than to make a sustained argument. I cannot blame him for that; I have done the same myself. But at least one of the essays is so out of date that Robin repudiates it, and the entire second half of the book, while containing interesting asides on terror in Latin America or reactions to September 11, is only marginally related to the first half. Thus was lost an opportunity to develop an arresting theme, shape it with original and fresh examples, acknowledge its limits, and then make it part of our national conversation. Robin’s arguments deserve widespread attention. But the way he has presented them almost ensures that they will not get it.
One Right The Power Lover The Visitor October 27, 2011 | 12:00 am Print
The Reactionary Mind: Conservatism from Edmund Burke to Sarah Palin
by Corey Robin
Oxford University Press, 304 pp., $36.75
AS THE REPUBLICAN Party lurches toward nominating a presidential candidate to run against Barack Obama, we are likely to hear talk of deep splits within the conservative movement. Tea Party activists, who hate state intervention into the economy, will be distinguished from social conservatives, who love state intervention into matters of sex. Ayn Rand’s militant atheism, so attractive to one half of the party leadership, will be contrasted to the equally warlike Christianity that appeals to the right’s other half. Pundits will discover that aggressive interventionists touched by neoconservatism are not the same thing as America-first nationalists influenced by isolationism. Some liberals will cheer. Long accustomed to divisions within their own ranks, they will for once take glee in the splits and bitter exchanges of their antagonists.
Don’t be fooled by any of this, argues Corey Robin. Against nearly all other leftists writing about rightists, Robin believes that there is only one kind of conservatism. Whether expressed in the lofty words of Burke or the rambling ravings of Palin, conservatism is always and everywhere a resentful attack on those who seek to make the world more fair. Take away the left and you destroy the rationale for the right. It is only because the modern world takes justice seriously, at least in theory, that we have thinkers and activists determined to put their bodies on the gears to stop the machinery from moving forward.
Robin treats conservatives as activists rather than as stand-patters. “Conservatism,” he writes, “has been a forward movement of restless and relentless change, partial to risk taking and ideological adventurism, militant in posture and populist in its bearings, friendly to upstarts and insurgents, outsiders and newcomers alike.” Burke, in Robin’s view, began this tradition, and figures such as de Maistre, de Bonald, and Sorel carried it forward. If we take all of them as the genuine articles, there is no need to draw a line between conservatives and reactionaries: all conservatives are reactionary. Conservatives are unified, and united in their rage. Their most passionate hate is directed at those they believe were assigned by God or nature to second-class status but still insist on their full rights as human beings.
For Robin, what began in the late eighteenth century has reached a kind of culmination in the early twenty-first century. Republicans in love with Ayn Rand express the same romantic protest against modern complexity as evangelical Christians lamenting for families of yore. Whatever their differences, both movements are counter-cultural, even counter-revolutionary. That is why they are the rightful heirs of all the European thinkers whom Robin evokes. Everything about these contemporary right-wing activists—their militant theatrics, their artificial populism, their refusal to compromise—was anticipated two centuries ago. “Far from being a recent innovation of the Christian Right or the Tea Party movement, reactionary populism runs like a red thread throughout conservative discourse from the very beginning.”
Robin adds a distinctive wrinkle to the common claim of Burke’s responsibility for modern conservatism. He says that it was not his Reflections on the Revolution in France but his Philosophical Enquiry into Our Ideas of the Sublime and the Beautiful that deserves the most attention. Power, as Robin summarizes Burke, “should never aspire to be—and can never actually be—beautiful. What great power needs is sublimity.” Owing to this emphasis on the sublime, Burke ought not to be read as a defender of the old regime. Not only had the Bourbons lost both their beauty and their sublimity, they had also become pathetic and decadent, lacking the capacity to justify themselves (and thus requiring thinkers such as Burke to carry out the thankless task).
Conservatives, says Robin, long for an imagined world too rarified ever to survive; they are theorists of loss. That is why, no matter how small the circle of privilege they defend, they have a certain appeal to the much larger collection of ordinary people whom they otherwise hold in contempt. Who has not experienced loss? Who would not want to return to an ideal world? The sacred is always more appealing than the profane. Try to make the world a more just place and you eliminate the sublime from it.
“The sublime,” Burke wrote, “is the sensation we feel in the face of extreme pain, danger or terror.” For all the emphasis on stability and tradition, conservatives admire revolutionaries because the terror they unleash gives us a glimpse of precisely such wonders. As Robin correctly points out, de Maistre preferred zealous if misguided Jacobins to lazy and self-satisfied nobles. Owing to its militancy, conservatism is zealously promoted by outsiders: Burke was Irish, de Maistre a Savoyard, Disraeli a Jew, Hamilton a West Indian. The same tendency can be witnessed today. It was not WASPs who revived the contemporary right but Jews and, downplayed by Robin, Catholics, who “helped transform the Republican Party from a cocktail party in Darien into the party of Scalia, d’Souza [sic] Gonzalez, and Yoo.”
Just as de Maistre could barely hide his Jacobin sympathies, the contemporary American right, in Robin’s account, is lock, stock, and barrel a product of the 1960s. “It’s time for God’s people to come out of the closet,” a Texas evangelist declares in Robin’s pages—a near perfect expression of the extent to which reaction against the gains of the 1960s could only be expressed in the language of the movement being denounced. Abby Hoffman prepared the way for Michele Bachmann. Mere economic protest does not get you the characters that constitute the Republican base today. For that you need people who genuinely believe that the world is coming to an end.
No other contemporary American figure captures this conservative combination of resentment and activism better than Antonin Scalia, the subject of one of Robin’s most interesting chapters. Despite talk of being faithful to texts, Robin argues, Scalia uses his power on the court to impose on the country the classic conservative mantra: the world is falling apart, and so only the obedience to rules, no matter how seemingly arbitrary and unfair, can save it from doom. “No Plato for him,” Robin writes of this intemperate and deeply reactionary judge. “He’s with Nietzsche all the way.” This at first does not seem quite right: Nietzsche is hardly a theorist of obedience to rules. But once we realize that for Scalia rule-following is only for the masses, while those on top get to do all the rule-writing, Robin’s take on the man strikes me as warranted. There are times when Scalia goes out of his way to remind us of how cruel the world can be—and how helpless we are in the face of these very cruelties. Scalia has buried himself deep inside the right-wing counterculture where winners, calling themselves victims, are given rights, while losers are instructed never to complain even as their rights are stripped from them.
I confess to being one of those who likes to divide conservatives into their parts as opposed to treating them as a whole. Robin makes a vigorous case that I am wrong, and I am tempted by his analysis—as far as it goes. To be sure, Robin exaggerates, and all too easily dismisses exceptions to his generalizations: he quotes Michael Oakeshott, and a bit too frequently, yet finally he has no choice but to throw him off the conservative bus. The very existence of such a thinker suggests that conservatism need not always be either as reactionary or as angry as Robin claims. Still, at least as regards reactionaries such as Scalia and Palin, a little rhetorical provocation seems justified. Robin is an engaging writer, and just the kind of broad-ranging public intellectual all too often missing in academic political science.
The real problem of persuasion lies elsewhere. In this book, Robin has chosen to republish essays, albeit with a comprehensive introduction, rather than to make a sustained argument. I cannot blame him for that; I have done the same myself. But at least one of the essays is so out of date that Robin repudiates it, and the entire second half of the book, while containing interesting asides on terror in Latin America or reactions to September 11, is only marginally related to the first half. Thus was lost an opportunity to develop an arresting theme, shape it with original and fresh examples, acknowledge its limits, and then make it part of our national conversation. Robin’s arguments deserve widespread attention. But the way he has presented them almost ensures that they will not get it.
More on the Corey Robin Book
By JENNIFER SCHUESSLER
Published: January 18, 2012
For Corey Robin the author it’s been a bruising few months. Shortly after his essay collection “The Reactionary Mind: Conservatism From Edmund Burke to Sarah Palin” appeared last fall, The New York Times Book Review published a review by Sheri Berman dismissing the book as “a diatribe that preaches to the converted,” “so filled with exaggeration and invective that the reader’s eyes roll.” Then in late December, The New York Review of Books ran a withering assessment by Mark Lilla, who dismissed the book as “history as W.P.A. mural,” if not the left-wing scholarly equivalent of Glenn Beck’s blackboard scribblings.
For Corey Robin the blogger, however, the past few months have been quite excellent. Since starting CoreyRobin.com in June, Mr. Robin, an associate professor of political science at Brooklyn College, has established himself as a lively and combative online presence. He has racked up links from prominent bloggers and this month won the 2011 “best writer” award from Cliopatra, the blog of the History News Network, which called him “the quintessential public intellectual for the digital age.”
So when Mr. Lilla’s review hit the newsstands, Mr. Robin’s online admirers were ready to pounce, setting off a cycle of learned (and often lengthy) commentary and counterreviews that trickled up from smaller blogs like U.S. Intellectual History to big ones like Crooked Timber.
As one commenter on U.S. Intellectual History wrote, “Bashing Lilla’s review of Robin’s book seems to be the newest Internet meme.”
If Mr. Robin seems to be enjoying the online tumult, filing regular updates on his blog, he professes to remain puzzled by the hostile reviews that touched it all off.
“I don’t know what is driving the critics,” Mr. Robin, 44, said in a recent interview at his apartment in Park Slope, Brooklyn. “The argument itself just bothers them, and I don’t know why.”
“The Reactionary Mind” certainly cuts hard against the common view that the radical populist conservatism epitomized by Sarah Palin represents a sharp break with the cautious, reasonable, moderate, pragmatic conservatism inaugurated by the 18th-century British statesman Edmund Burke. For Mr. Robin even Burke, that great critic of the French Revolution, wasn’t a Burkean moderate, but a reactionary who celebrated the sublimity of violence and denounced the inability of flabby traditional elites to defend the existing order.
This counterrevolutionary spirit, Mr. Robin argues, animates every conservative, from the Southern slaveholders to Ayn Rand to Antonin Scalia, to name just a few of the figures he pulls into his often slashing analysis. Commitment to a limited government, devotion to the free market, or a wariness of change, Mr. Robin writes, are not the essence of conservatism but mere “byproducts” of one essential idea — “that some are fit, and thus ought, to rule others.”
These are fighting words, and to some of Mr. Robin’s readers they serve a useful purpose.
“By the standards of intellectual history it may be found wanting,” the political scientist Alan Wolfe, who gave “The Reactionary Mind” an appreciative if mixed review on The New Republic’s Web site, said in an interview. “But the argument is valuable at this moment because Robin’s analysis helps explain why there is so much fury and resentment in our politics.”
But to Mr. Lilla, a professor of humanities at Columbia, what he sees as the incoherent Manicheanism of Mr. Robin’s vision is more a symptom of our polarized politics than an explanation of it.
“He is interested in an anthropological, or maybe entomological, way of looking at how these little bugs are behaving or changing,” Mr. Lilla said. “But he can’t take conservative ideas seriously as ideas. Everything is just positioning.”
Mr. Robin counters that his own arguments are the ones that aren’t being taken seriously — or even really being read. The true subject of “The Reactionary Mind,” he said, isn’t the eternal sameness of conservatism but the way it transforms itself in response to threats to existing hierarchies, often by borrowing from the very movements it seeks to oppose.
“We see the left initiating a politics, whether it’s the French Revolution or abolition,” he said. “What’s fascinating to me is how the right reacts to that, how it learns from the left a whole capacity for political agency.”
Mr. Robin, who was the lead organizer for the graduate student union campaign at Yale while getting his doctorate there in the 1990s, dates his fascination with the right to 2000, when he got a magazine assignment to write about former free-marketeers who had become sharp critics of capitalism.
That assignment yielded some juicy sound bites, as when William F. Buckley (who was not one of the apostates) told Mr. Robin that conservative fixation on the market was as boring and repetitious as sex. But it also opened his eyes to what he calls “the agonies and ecstasies of the conservative mind,” a deep political romanticism that colleagues on the left often fail to appreciate.
Take, for example, the war in Iraq, which Mr. Robin argues was less about oil than the neoconservative longing for a project of national greatness more noble than simply making money.
“When I said that the neocon project was not about defending oil, that it was much more a Kulturkampf that goes to the heart of conservatism’s deep ambivalence about the free market, people on the left didn’t buy it,” he said. “To them it’s all just the pursuit of economic interest.”
As for his argument with Mr. Lilla, the two do find at least one point of agreement: There are few if any true Burkean political actors in American history, and certainly none anywhere near the Republican presidential primaries.
Mr. Lilla does see real Burkeans in Europe. But to Mr. Robin there is no actually existing Burkeanism anywhere, making those who cite the ideal of a reasonable, pragmatic, nonreactionary conservatism guilty of the kind of utopianism the left is more commonly faulted for.
“Their whole claim to credibility is, as William F. Buckley put it, ‘We are the politics of reality,’ ” Mr. Robin said. “But if you can only find two examples across two centuries, it’s not a political theory anymore.”
Tuesday, January 17, 2012
Rich Man Romney
Romney Caricatures Himself
by Jonathan Chait
Mitt Romney, in all likelihood, is going to walk away with his party’s presidential nomination. Yet from the standpoint of positioning himself for the general election, the primary season has been a disaster for him. Romney’s campaign has worked hard to avoid taking any substantive positions that would unduly burden him in a race against President Obama – no (additional, post-2007) policy flip-flops and an avoidance of any positions more right-wing than necessary to skate through the primary. That part has worked reasonably well. The utter failure is that Romney has come to be defined, through a recurring series of off-the-cuff gaffes, as a callous, out-of-touch rich man.
The latest is Romney’s response to questions about his tax returns. Romney declared that he would wait until April to release his returns, but previewed the event by predicting he pays a 15 percent federal tax rate, making him the beneficiary of conservatives’ favorite tax breaks, the capital gains preference and the carried-interest loophole, both of which allow very rich investors to pay a lower tax rate on their income than many people who make a fraction of their income. Romney compounded his problems by noting, as an aside, that he gets “not very much” income from speakers’ fees, a sum that turns out to be, um, $374,327.62.
Romney declared “I like to be able to fire people who provide services to me.” He described concern about rising inequality as “envy,” suggested only people who are independently wealthy should run for office, suggested inequality should be discussed only in “quiet rooms,” laid down a $10,000 bet in a debate with Rick Perry, deemed corporations to be people, and jokingly referred to himself as “unemployed.” He has done the work of an opposition researcher on himself.
Now, Romney does not deserve to be pilloried for all these gaffes. He was right about corporations consisting of people, and his professed love of firing people was an ode to the benefits of market competition, not of Burns-esque revelry in abusing his underlings. (A service provider you can’t replace has no incentive to provide better service, as any customer of the old cable monopoly can attest.) On the other hand, he clearly does deserve whatever grief he endures for the other statements, especially his dismissal of inequality.
Whatever the merits, the total self-portrait Romney has helped craft is utterly devastating: the scion of a wealthy executive, who helped create, and benefited from, revolutions in both the market economy and in public policy in the last three decades that favored the rich over the middle class, and who appears blithe about the gap between his privilege and the lot of most Americans.
As I’ve said before, Romney has been positively associated with “electability” because he is more electable than most of his rivals. But he is the one-eyed man in the land of the politically blind. Romney, by normal standards, is a terrible candidate. He is nowhere near as formidable as John McCain was four years before. The latest poll from PPP has his favorability rating at a miserable 35 percent positive, 53 percent negative. He may win – he probably will win if the economy dips back into recession – but he is a weak candidate who in many ways embodies the public’s distrust of his party.
Monday, January 16, 2012
A View of Dickens
The Whirling Sound of Planet Dickens
By VERLYN KLINKENBORG
Published: January 14, 2012
In death, Charles Dickens still keeps his greatest secret to himself — the essence of his energy. None of the physical relics he left behind betray it. The manuscripts of his novels — like “Our Mutual Friend” at the Morgan Library — look no more fevered or hectic than the manuscripts left behind by other novelists.
Two memorable characters from Charles Dickens, Micawber and the young Copperfield.
The handwritten words on the page, round and legible in blue ink, are the marks of a mind that has already settled itself to composition.
Dickens, who was born 200 years ago, wrote a long shelf of novels, 14 in all, not counting “The Mystery of Edwin Drood,” which lay half-finished at his death. They sit plump and bursting with life, spilling over with the chaos of existence itself. It’s easy to imagine writers working the way Dickens’s prolific contemporary, Anthony Trollope, did — steadily, routinely, knocking off his 2,000 words a day until, by the end of his life, he had written 47 novels. But this is not how Dickens wrote.
Find the tumultuous heart of your favorite Dickens novel, the place where 19th-century London seems to be seething, smoking, overcrowded, in a state of vulgar contradiction. Then imagine Dickens working in the midst of it — a small, brisk figure rushing past you on a dark and dirty street. He is lost in a kind of mental ventriloquism, calling up his emotions and studying them. Every night he walked a dozen miles, without which, he said, “I should just explode and perish.”
Under the pseudonym Boz, he wrote, “There is nothing we enjoy more than a little amateur vagrancy,” walking through London as though “the whole were an unknown region to our wandering mind.” Yet there was nothing remotely solitary about Dickens. One person who saw him in the highest spirits at a family party wrote that he “happily sang two or three songs, one the patter song, ‘The Dog’s Meat Man,’ and gave several successful imitations of the most distinguished actors of the day.”
It’s a wonder Dickens didn’t explode and perish long before his death in 1870, at age 58. Quite apart from the act of composing his novels, he was a whirlwind, living a life that is nearly unmatched in its vigor. He had one entire career as a magazine editor, another as an actor and manager of theatrical productions, still another as a philanthropist and social reformer. The record of his private engagements alone — dinners, outings, peregrinations with his entourage of family and friends — is exhausting to read. The novels stand out against the backdrop of hundreds of other compositions, all of them written against tight deadlines.
Dickens’s energy, which he made no effort to husband until he was nearly dead, was inexplicable. Call it metabolic if you like. Perhaps it was a reaction to the uncertainties of his childhood and the shame of his days as a child laborer, when he knew that as a precocious young entertainer he was already a spectacle well worth observing.
He was driven by gargantuan emotions, and the ferocious will needed to keep them in check, to release them in the creation of characters he loved more than some of his children. He could drive himself to anguished tears while writing the death of Little Nell, in “The Old Curiosity Shop.” And yet he could also coldly disown anyone who sided with his wife, Catherine, when they separated, including his namesake son.
Even Dickens didn’t understand his energy. He grasped that there was a wildness in him, and so did nearly everyone who knew him. When Dostoevsky met Dickens in 1862 — a meeting that is hard to imagine — Dickens explained that there were two people inside him, “one who feels as he ought to feel and one who feels the opposite.”
Out of these two people he constructed his universe of characters, good and evil. Dostoevsky’s comment is laconic and ambiguous. “Only two people?” he asked. Dickens’s public readings, which began in 1858, drew tens of thousands of people in England and America. They came not only to see the author himself but also the people who inhabited him — Scrooge and Pickwick, Micawber and Mrs. Gamp.
Those characters, and dozens more, still live with all their old vitality. And though we feel the unevenness of Dickens’s novels more plainly than when they were appearing in monthly parts, it’s easier now to see that the unevenness in most of them is symptomatic of his overpowering energy.
The man himself was uneven and could not be beaten into consistency any more than he could beat every one of his novels into perfection. The fact is that Charles Dickens was as Dickensian as the most outrageous of his characters, and he was happy to think so, too. Soon after the publication of “A Christmas Carol” in 1843, he wrote of himself to a close friend: “two and thirty years ago, the planet Dick appeared on the horizon. To the great admiration, wonder and delight of all who live, and the unspeakable happiness of mankind.” Planet Dickens feels as real as it does to us because he stalked the world around him.
And when he finally settled at his desk, he was still driving himself through a world of his own invention, peopled by characters waiting, as he said, to come “ready made to the point of the pen.”
Jean Edward Smith - Grant
This is the best current biography of Ulysses S. Grant of which I am aware. The book makes me realize more than ever that we owe our country to Lincoln, Sherman, and Grant, and maybe to Grant more than anybody else. Without these three men, the Confederacy would have succeeded and the United States as we know it today would not exist.
Jean Edward Smith is one of the foremost historical biographers of our time. A few years ago I read his biography of FDR and I see that he has a forthcoming work on Eisenhower.
It's good to see that in recent years there has been a scholarly reevaluation of Grant's presidency. For decades he had been viewed as a bad President. The scandals that occurred during his administration are still there, but how much blame can be ascribed to Grant? He was loyal to a fault to his friends, and that's the worst thing that can be said about him.
Grant was apparently a failure at everything in life except soldiering. He had a genius for leading men into battle, and he has to be considered one of the great military leaders of all time. No doubt the North would have lost the war were it not for Ulysses S. Grant.
One big thing I learned in this biography is how progressive Grant was. He fought for Reconstruction and by the end of his administrations, he alone was fighting for the rights of the freedmen in the South. He wanted peace with Native Americans rather than war. If Grant were alive today, he'd be a Democrat.
I also learned that he might have been the Republican presidential nominee in 1880 were it not for the ham-handed mistakes of his backer Roscoe Conkling. What difference would this have made in American history? I do not know.
Grant lived his last days in unending pain from throat and mouth cancer. He finished his acclaimed memoirs literally the day before he died. Having lost all of his money when his investments failed through fraud, he was forced to finish his memoirs in order to provide financially for his family.
Sunday, January 15, 2012
The New Debate About Capitalism
by Gary Hart, President, Hart International, Ltd.
Posted: 1/12/12 07:14 PM ET
The current Republican nomination contest has revealed serious confusion over the nature of our economic system. Very conservative candidates are attacking Governor Romney because his experience at Bain Capital involved buying companies with borrowed money, firing their employees, then selling them for a profit. Many, though not all, of these companies then went bankrupt, and Mr. Romney and his partners made millions. Sounding very much like Democrats, Mr. Romney's opponents are highly critical of this kind of capitalism.
Traditional capitalism was based on the idea that business people would invest money, usually borrowed, to establish a business, hire people, and make and sell things. Risk was involved, but jobs were created and profits were made. Traditional Republicans, including those now attacking Mr. Romney, still believe, as do most Americans, that this is the way our economic system should work.
The confusion within the ranks of capitalism is stimulated by the shift in our economy from making and selling things to manipulation of money. The rise of the money culture began a couple of decades or more ago and involves mergers and acquisitions, venture capitalism, leveraged buyouts, workouts and turnarounds, currency speculation, and arbitrage. While the money culture was booming, traditional capitalism based on manufacturing was declining. The national government had to intervene to save what was left of the American auto industry.
The money culture led directly to the housing bubble and subsequent economic collapse in 2008, from which we are still struggling to emerge. Unlike traditional capitalism, it is oriented toward the short term, not the long term, and toward quick profit, not productivity. If tens of millions of dollars can be made in bundling high-risk mortgages and collateralized debt obligations in a few months, why go to the trouble of building a factory, hiring and training workers, and producing a product that is competitive in world markets with profits emerging sometime down the road?
The Tea Party blames our government for this situation, and Occupy Wall Street focuses on the money culture (Wall Street). The government did not create the money culture. And public deficits, so much the focus of the Tea Party, arise from insufficient revenues to pay for the programs, including Social Security and Medicare, that most Americans, including Tea Party members, want. The money manipulators, now the focus of Republican candidates, manage to find intriguing ways, including off-shore bank accounts, to avoid paying fair taxes.
All of this is pretty well-known. What is surprising is that it is now being discovered and strongly criticized by Republican leaders who, up to now, have been silent on this transformation of American capitalism. While struggling to convince the American public that Barack Obama is a socialist, it is very easy to overlook the drift of our market economy away from its traditional roots into something resembling a high-risk, high-rollers', fast-shuffle casino that produces nothing except massive incomes for one-percent insiders. Republican leaders are now welcome to the struggle to return our market economy to its true purpose and original intent.
Saturday, January 14, 2012
Tinker, Tailor, Soldier, Spy---The Movie
Having tried three times to read the book without being able to finish it, I was glad to see the movie. Spy novels are not my forte. I finally decided that the book's densely layered plot wasn't worth the struggle to finish it. Overall the movie is pretty good.
The movie is completely plot driven with no special effects and minimal music. Hurray!
I already knew who the mole was so the ending was anti-climactic. Still I enjoyed it.
Friday, January 13, 2012
"Lefty" From a Strange Source
Gingrich, Romney, and the Morality of Capitalism
by Jonathan Cohn
I finally got around to watching “When Mitt Romney Came to Town,” the propaganda film about Romney’s work at Bain Capital. It’s even more remarkable than advertised. The film, paid for by the Super PAC supporting Newt Gingrich, doesn't merely stake out territory that's to the left of the Republican mainstream. It also stakes out territory that's to the left of the Democratic mainstream.
At the risk of simplifying things a bit, a rough consensus about the economy has prevailed among both Democrats and Republicans for some time. It's the idea that, fundamentally, capitalism works – that, except in cases of obvious market failure like health care or environmental degradation, we’re all better off if the free market is allowed to operate without substantial interference. Capital should flow to the most profitable investments, labor should be flexible to allow the greatest efficiency, and so on. That’s the way to create the most wealth for society and, ultimately, creating the most wealth will benefit the most people – even if, in the process, some people lose their jobs or struggle in other ways.
Within this consensus, the major debate has been over treatment of the victims of this “creative destruction.” Should government intervene extensively, as Democrats prefer, by redistributing income through the tax code and building a strong safety net? Or should government do the absolute minimum, as Republicans prefer, on the theory that taxes and public programs only hold back the market's wealth-creating abilities. In other words, the focus was on dealing with the by-products of capitalism, rather than tinkering with the machinery itself.
“When Mitt Romney Came to Town” pretends, very early on, to affirm that consensus: Its first line is "Capitalism made America great." The implication is that the film is an attack on Romney's work at Bain, not capitalism itself. But the rest of the movie telegraphs a rather different message. It never grapples with the question of whether Bain’s actions made the economy more efficient (which, by the way, they may have) because it doesn't appear to consider that question relevant. Instead, it focuses on a purely moral issue: Is it fair to make huge profits by downsizing and outsourcing, as Romney and Bain frequently did?
The film's answer is apparent in many places, among them a montage of quotes from displaced workers:
“It all ends up back at greed.”
“No matter how much they already had, they could never get enough money.”
“What do you get out of treating people like this?”
“He is there for the almighty dollar.”
Once upon a time, arguments like that were pretty common in conventional American politics. Liberals, particularly those in and around the labor movement, openly questioned the morality of business practices. Although (mostly) they didn't want to end capitalism, they were not shy about trying to control and redirect it. But with the decline of the labor movement and centrist reorientation of the Democratic Party, that language and that message have largely fallen out of style, with only outliers like Michael Moore making it loudly and consistently.
Perhaps that's starting to change, thanks in no small part to Occupy Wall Street and, now, supporters of Newt Gingrich. Yes, that's an ironic development. It's also an overdue one. The lefty critique may have flaws, but it also improves our political conversation.
Romney's Lies
by Paul Krugman
--------------------------------------------------------------------------------
January 13, 2012, 9:00 am
Untruths, Wholly Untrue, And Nothing But Untruths
I was deeply radicalized by the 2000 election. At first I couldn’t believe that then-candidate George W. Bush was saying so many clearly, provably false things; then I couldn’t believe that nobody in the news media was willing to point out the lies. (At the time, the Times actually told me that I couldn’t use the l-word either). That was when I formulated my “views differ on shape of planet” motto.
Now, however, Mitt Romney seems determined to rehabilitate Bush’s reputation, by running a campaign so dishonest that it makes Bush look like a model of truth-telling.
I mean, is there anything at all in Romney’s stump speech that’s true? It’s all based on attacking Obama for apologizing for America, which he didn’t, on making deep cuts in defense, which he also didn’t, and on being a radical redistributionist who wants equality of outcomes, which he isn’t. When the issue turns to jobs, Romney makes false assertions both about Obama’s record and about his own. I can’t find a single true assertion anywhere.
And he keeps finding new frontiers of falsehood. The good people at CBPP find him asserting, with regard to programs aiding low-income Americans, that
What unfortunately happens is with all the multiplicity of federal programs, you have massive overhead, with government bureaucrats in Washington administering all these programs, very little of the money that’s actually needed by those that really need help, those that can’t care for themselves, actually reaches them.
which is utterly, totally untrue. Administrative costs are actually quite small, and between 91 and 99 percent of spending, depending on the program, does in fact go to beneficiaries.
At this rate, Romney will soon start lying about his own name. Oh, wait.
Sunday, January 8, 2012
The Anti-Obama Cult
Sunday, Jan 8, 2012 8:00 AM CST
The anti-Obama cult
In the GOP’s hatred of the president, the rote ravings of True Believers
By Gary Kamiya
On Wednesday morning, I opened the New York Times to read that President Hu Jin-Tao had denounced the West for launching a culture war against China. “We must clearly see that international hostile forces are intensifying the strategic plot of westernizing and dividing China, and ideological and cultural fields are the focal areas of their long-term infiltration,” Hu pronounced in “Seeking Truth,” a Communist Party magazine. “We should deeply understand the seriousness and complexity of the ideological struggle, always sound the alarms and remain vigilant, and take forceful measures to be on guard and respond.”
I didn’t know whether to laugh or cry. Was it really possible that such wooden slogans were still being used by the leaders of the country with the most dynamic economy on earth? “We should deeply understand”? “Always sound the alarms”? Those antique phrases sounded like they’d been torn from a poster that had been pasted up during the Cultural Revolution and somehow never taken down. It seemed that not that much had changed since soon-to-be-Chairman Mao was writing tomes rejoicing in titles like “To Be Attacked by the Enemy Is Not a Bad Thing but a Good Thing” and urging the members of the party to cut off the head of imperialist snakes. A belief system as nutty as Maoism took a long time to get out of a nation’s system. I pitied the poor 1.3 billion Chinese, living in a country so insecure, so adolescent, so in thrall to authoritarian nationalism, that its politicians felt impelled to keep the cult alive. Thank God I’m an American, I told myself. We have plenty of cults, but at least they don’t get involved with our national politics.
Then I watched Michele Bachmann’s withdrawal speech.
Bachmann’s speech was a religious testimony, informing us that on the evening of March 21, 2010, she had a divine revelation. OK, she didn’t use the word “divine,” but that was basically the idea. You see, her holy revelation started with the Founding Fathers. And for Bachmann, Washington and Jefferson, if not literally angels, are flying around in their neighborhood.
“Entrusted to every American is their responsibility to watch over our Republic,” she began her speech. “You can look back from the time of the Pilgrims to the time of William Penn, to the time of our Founding Fathers. All we have to do is look around because very clearly we are encompassed with a great cloud of witnesses that bear witness to the sacrifices that were made to establish the U.S. and the precious principle of freedom that has made it the greatest force for good that has ever been seen on the planet.”
The “great cloud of witnesses” is a biblical term. By invoking it, Bachmann moved the Founding Fathers into the company of the prophets. And then she related her own humble journey to join the saved souls atop that great cloud – an epic quest that was spurred by the near-miraculous intercession of a painting of the Founding Fathers signing the Constitution.
“Every schoolchild is familiar with this painting,” Bachmann said. “But I’ve been privileged to see it on a regular basis, doing my duties in Congress. But never were the painting’s poignant reminder more evident than on the evening of March 21, 2010. That was the evening that Obamacare was passed and staring out from the painting are the faces of the founders, and in particular the face of Ben Franklin, who served as a constant reminder of the fragile Republic that he and the founders gave to us. That day served as the inspiration for my run for the President of the United States, because I believed firmly that what Congress had done and what President Obama had done in passing Obamacare endangered the very survival of the United States of America, our Republic.”
Bachmann closed her sermon by saying, “I look forward to the next chapter in God’s plan.”
Of such blinding revelations, religions are made. And cults.
The Republican hatred of Obama has become a cult. It is typically dressed up with the trappings of Christianity, but the cult does not reflect the teachings of that Jewish heretic known as Jesus of Nazareth — unless you believe, as Bachmann appears to, that defeating “Obamacare” is an essential part of the Lord’s master plan for the universe. (Personally, I would have thought that the great soul who reached out to the poor, the sick and the despised would have preferred universal healthcare over a system devoted to swelling the profits of those modern-day money-changers known as insurance companies, but what do I know?) But that is not to say that the version of Christianity embraced by many members of the anti-Obama cult does not play a key role in the movement, in ways we shall presently explore.
The anti-Obama cult is based on an irrational, grossly excessive fear and hatred of something the cult members call “big government” or “socialism,” and an equally irrational worship of something they call “freedom” or “liberty.” The fear and hatred of big government is irrational and excessive because Obama’s innocuous heathcare bill, the passage of which cult members like Bachmann see as the beginning of the end for America, is far less momentous as a piece of “social engineering” than Social Security, Medicare, welfare or progressive taxation.
We already live in a world where government intrudes on our freedom in a multitude of ways. Moreover, other enormous, impersonal forces, mainly corporate ones, constrain our liberty even more directly. Many of the “average Americans” the cult members claim to be speaking for lost their life savings when the bubble caused by an orgy of unregulated financial speculation burst – a far greater infringement on their “freedom” than being required to carry health insurance.
As for Obama himself, he is a bland left-leaning centrist, a slightly more liberal clone of moderate Republicans like Dwight D. Eisenhower, and his “socialist” policies are part of a long American tradition that goes back to FDR.
Why, then, did the anti-Obama cult suddenly take over the entire Republican Party?
The main reason, I believe, is that the American right was backed into a corner and had no other card to play. The disastrous presidency of George W. Bush revealed the complete bankruptcy (literally) of the two core right-wing nostrums, “freedom” (good) and “big government” (bad). “Freedom” had led to the biggest meltdown since the Great Depression. And big government – which was greatly expanded by Bush, to the deafening silence of the soon-to-be-anti-Obama fanatics – had done nothing to prevent it. In the wreckage left by Bush, there was nothing for the right to do, if it wanted to live to fight another day, except deny causality (and reality) and demonize Obama. By naively reaching out to Republicans, Obama let them get away with this, and squandered a teachable moment that could have changed the face of American politics.
The right survived. But defending this indefensible position squeezed its core beliefs into a kind of black hole, a blank spot of pure resentment, devoid of content, where the laws of logic did not apply. (According to Wikipedia, “Black holes of stellar mass are expected to form when very massive stars collapse at the end of their life cycle.”) As a result, “freedom” and its evil twin, “big government,” became metaphysical concepts, so elastic and amorphous that they could mean anything or nothing. They have come to play the same role in right-wing discourse as “the bourgeoisie” and “the workers” do in Marxism – they’re catchalls that can be plugged into any situation.
Thus, “big government” mostly means “giving money to undeserving people with dark skin” – a core GOP belief, central to the party since Nixon’s Southern Strategy, that Rick Santorum was rash enough to articulate. But it also has a cultural dimension in which it means pointy-headed elites who look down on “real Americans.” And trickiest of all, it also has a personal dimension in which it means anything that limits individual freedom — which explains the appeal, to those Republicans and independents who are genuine and consistent libertarians, of Ron Paul. (It is because “freedom” does not actually mean anything in the orthodox right-wing universe that non-libertarian conservatives like Romney, Bachmann, Santorum and the rest can advocate for intrusive drug laws, anti-gay laws and massive military budgets, while wrapping themselves in the mantle of “liberty.”)
Because “big government” does not have a fixed meaning, attacking it can simultaneously serve as a rallying cry for racial resentment, an impassioned demand for personal liberation and a marker of class- and region-based solidarity. This is why when the Republican candidates inveigh against big government, which they do approximately every time they open their mouths, their rants have all the weird, malevolent imprecision of a Stalinist attack on “running dog lackeys of the bourgeoisie.” They are the ravings of True Believers, of cult members.
Also lurking in that black hole was the one right-wing card that Bush did not destroy, because it is indestructible — the “culture war.” The far right’s free-floating hatred of America’s liberal, secular culture waxes and wanes, but it never goes away, and it is responsible for the rise of Rick Santorum, the GOP’s latest Dispose-a-Candidate. For Santorum, sinful modern life is to blame for everything, and it is our duty to always sound the alarms and remain vigilant against it. Thus, when the Catholic Church’s pedophilia scandal broke, Santorum blamed, not the church that covered it up or the individual priests who disgraced themselves and abused their position, but – Boston.
He wrote:
The anti-Obama cult
In the GOP’s hatred of the president, the rote ravings of True Believers
By Gary Kamiya
On Wednesday morning, I opened the New York Times to read that President Hu Jintao had denounced the West for launching a culture war against China. “We must clearly see that international hostile forces are intensifying the strategic plot of westernizing and dividing China, and ideological and cultural fields are the focal areas of their long-term infiltration,” Hu pronounced in “Seeking Truth,” a Communist Party magazine. “We should deeply understand the seriousness and complexity of the ideological struggle, always sound the alarms and remain vigilant, and take forceful measures to be on guard and respond.”
I didn’t know whether to laugh or cry. Was it really possible that such wooden slogans were still being used by the leaders of the country with the most dynamic economy on earth? “We should deeply understand”? “Always sound the alarms”? Those antique phrases sounded like they’d been torn from a poster that had been pasted up during the Cultural Revolution and somehow never taken down. It seemed that not that much had changed since soon-to-be-Chairman Mao was writing tomes rejoicing in titles like “To Be Attacked by the Enemy Is Not a Bad Thing but a Good Thing” and urging the members of the party to cut off the heads of imperialist snakes. A belief system as nutty as Maoism takes a long time to get out of a nation’s system. I pitied the poor 1.3 billion Chinese, living in a country so insecure, so adolescent, so in thrall to authoritarian nationalism, that its politicians felt impelled to keep the cult alive. Thank God I’m an American, I told myself. We have plenty of cults, but at least they don’t get involved with our national politics.
Then I watched Michele Bachmann’s withdrawal speech.
Bachmann’s speech was a religious testimony, informing us that on the evening of March 21, 2010, she had a divine revelation. OK, she didn’t use the word “divine,” but that was basically the idea. You see, her holy revelation started with the Founding Fathers. And for Bachmann, Washington and Jefferson, if not literally angels, are flying around in their neighborhood.
“Entrusted to every American is their responsibility to watch over our Republic,” she began her speech. “You can look back from the time of the Pilgrims to the time of William Penn, to the time of our Founding Fathers. All we have to do is look around because very clearly we are encompassed with a great cloud of witnesses that bear witness to the sacrifices that were made to establish the U.S. and the precious principle of freedom that has made it the greatest force for good that has ever been seen on the planet.”
The “great cloud of witnesses” is a biblical term. By invoking it, Bachmann moved the Founding Fathers into the company of the prophets. And then she related her own humble journey to join the saved souls atop that great cloud – an epic quest that was spurred by the near-miraculous intercession of a painting of the Founding Fathers signing the Constitution.
“Every schoolchild is familiar with this painting,” Bachmann said. “But I’ve been privileged to see it on a regular basis, doing my duties in Congress. But never were the painting’s poignant reminder more evident than on the evening of March 21, 2010. That was the evening that Obamacare was passed and staring out from the painting are the faces of the founders, and in particular the face of Ben Franklin, who served as a constant reminder of the fragile Republic that he and the founders gave to us. That day served as the inspiration for my run for the President of the United States, because I believed firmly that what Congress had done and what President Obama had done in passing Obamacare endangered the very survival of the United States of America, our Republic.”
Bachmann closed her sermon by saying, “I look forward to the next chapter in God’s plan.”
Of such blinding revelations, religions are made. And cults.
The Republican hatred of Obama has become a cult. It is typically dressed up with the trappings of Christianity, but the cult does not reflect the teachings of that Jewish heretic known as Jesus of Nazareth — unless you believe, as Bachmann appears to, that defeating “Obamacare” is an essential part of the Lord’s master plan for the universe. (Personally, I would have thought that the great soul who reached out to the poor, the sick and the despised would have preferred universal healthcare over a system devoted to swelling the profits of those modern-day money-changers known as insurance companies, but what do I know?) But that is not to say that the version of Christianity embraced by many members of the anti-Obama cult does not play a key role in the movement, in ways we shall presently explore.
The anti-Obama cult is based on an irrational, grossly excessive fear and hatred of something the cult members call “big government” or “socialism,” and an equally irrational worship of something they call “freedom” or “liberty.” The fear and hatred of big government is irrational and excessive because Obama’s innocuous healthcare bill, the passage of which cult members like Bachmann see as the beginning of the end for America, is far less momentous as a piece of “social engineering” than Social Security, Medicare, welfare or progressive taxation.
We already live in a world where government intrudes on our freedom in a multitude of ways. Moreover, other enormous, impersonal forces, mainly corporate ones, constrain our liberty even more directly. Many of the “average Americans” the cult members claim to be speaking for lost their life savings when the bubble caused by an orgy of unregulated financial speculation burst – a far greater infringement on their “freedom” than being required to carry health insurance.
As for Obama himself, he is a bland left-leaning centrist, a slightly more liberal clone of moderate Republicans like Dwight D. Eisenhower, and his “socialist” policies are part of a long American tradition that goes back to FDR.
Why, then, did the anti-Obama cult suddenly take over the entire Republican Party?
The main reason, I believe, is that the American right was backed into a corner and had no other card to play. The disastrous presidency of George W. Bush revealed the complete bankruptcy (literally) of the two core right-wing nostrums, “freedom” (good) and “big government” (bad). “Freedom” had led to the biggest meltdown since the Great Depression. And big government – which was greatly expanded by Bush, to the deafening silence of the soon-to-be-anti-Obama fanatics – had done nothing to prevent it. In the wreckage left by Bush, there was nothing for the right to do, if it wanted to live to fight another day, except deny causality (and reality) and demonize Obama. By naively reaching out to Republicans, Obama let them get away with this, and squandered a teachable moment that could have changed the face of American politics.
The right survived. But defending this indefensible position squeezed its core beliefs into a kind of black hole, a blank spot of pure resentment, devoid of content, where the laws of logic did not apply. (According to Wikipedia, “Black holes of stellar mass are expected to form when very massive stars collapse at the end of their life cycle.”) As a result, “freedom” and its evil twin, “big government,” became metaphysical concepts, so elastic and amorphous that they could mean anything or nothing. They have come to play the same role in right-wing discourse as “the bourgeoisie” and “the workers” do in Marxism – they’re catchalls that can be plugged into any situation.
Thus, “big government” mostly means “giving money to undeserving people with dark skin” – a core GOP belief, central to the party since Nixon’s Southern Strategy, that Rick Santorum was rash enough to articulate. But it also has a cultural dimension in which it means pointy-headed elites who look down on “real Americans.” And trickiest of all, it also has a personal dimension in which it means anything that limits individual freedom — which explains the appeal, to those Republicans and independents who are genuine and consistent libertarians, of Ron Paul. (It is because “freedom” does not actually mean anything in the orthodox right-wing universe that non-libertarian conservatives like Romney, Bachmann, Santorum and the rest can advocate for intrusive drug laws, anti-gay laws and massive military budgets, while wrapping themselves in the mantle of “liberty.”)
Because “big government” does not have a fixed meaning, attacking it can simultaneously serve as a rallying cry for racial resentment, an impassioned demand for personal liberation and a marker of class- and region-based solidarity. This is why when the Republican candidates inveigh against big government, which they do approximately every time they open their mouths, their rants have all the weird, malevolent imprecision of a Stalinist attack on “running dog lackeys of the bourgeoisie.” They are the ravings of True Believers, of cult members.
Also lurking in that black hole was the one right-wing card that Bush did not destroy, because it is indestructible — the “culture war.” The far right’s free-floating hatred of America’s liberal, secular culture waxes and wanes, but it never goes away, and it is responsible for the rise of Rick Santorum, the GOP’s latest Dispose-a-Candidate. For Santorum, sinful modern life is to blame for everything, and it is our duty to always sound the alarms and remain vigilant against it. Thus, when the Catholic Church’s pedophilia scandal broke, Santorum blamed, not the church that covered it up or the individual priests who disgraced themselves and abused their position, but – Boston.
He wrote:
“When the culture is sick, every element in it becomes infected. While it is no excuse for this scandal, it is no surprise that Boston, a seat of academic, political and cultural liberalism in America, lies at the center of the storm. We must clearly see that international hostile forces are intensifying the strategic plot of liberalizing and dividing America, and ideological and cultural fields are the focal areas of their long-term infiltration.”
OK, I borrowed that last sentence from the quote by Comrade Hu, but you have to admit it tracks pretty well with the thoughts of Chairman Santorum.
The implosion of right-wing ideology and the persistence of the culture war toxin might have been enough by itself to create the anti-Obama cult, but two other factors also played a role. The first was his race. For many right-wingers, Obama was a foreign object, whose unexpected entrance into the body politic activated their immune systems – hence the “birther” movement and other bizarre right-wing obsessions. Whether the right’s aversion to Obama constitutes classic racism is a Talmudic question; what is undeniable is that his race activated a horde of (literally) white cells, rushing to expel the invader. Like organisms, cults always delineate themselves by drawing sharp lines between Us and Them.
The second reason involves Christianity. As Michele Bachmann’s speech demonstrated, for many devout right-wing Christians, there is no real difference between politics and religion. If religion is the uppermost thing in one’s life, if Jesus is with one every minute of every day, then it is easy to see how a true believer like Bachmann could come to see preserving her vision of the Republic as a semi-sacred trust, and defeating “Obamacare” as an essential part of that godly mission. Moreover, devoutly literalistic Christians tend to divide the world up into Good and Evil, with the founding dyad of God and the devil lurking in the background; it is not too much of a stretch to say that for many right-wing Christians, Barack Obama is at least of the devil’s party, if not Beelzebub himself.
Let me make it clear that I am not arguing that Christianity itself is a cult, or that Christians (or adherents of any religion) are inherently drawn to cultlike thinking. I am simply making the case that the right wing’s irrational hatred of Obama is cultlike, and that the literalist Christian faith of many right-wing opponents of Obama, including many of the GOP presidential candidates, clearly plays a role in their extreme beliefs.
To be sure, much of the anti-Obama cult is just Machiavellian politics. You hunt where the ducks are, and the ducks in this case are loons. It is extremely unlikely that Mitt Romney stares at a painting of Ben Franklin every day and has celestial visions of turning back Obama’s satanic plan to destroy America — which is precisely why the True Believers can’t stand him. But things have gotten Chairman Mao-y enough in the Republican Party that Romney has been forced to do his best to pretend he is a card-carrying member of the People’s Glorious Tea Party, Determined to Kill All Wriggling Socialist Snakes. Whether a fake cult member will prove more attractive to Republican voters than the genuine article will determine who will face Obama this fall.