Thursday, February 28, 2013

E.L. Doctorow - Ragtime

I first read this novel in July of 1976 when it came out in paperback.  I read it again in the same PB I purchased on 7/2/76.  The book is just as enchanting today as it was then.

This book is unlike anything else I've ever read.  The book truly takes the reader back to the early days of the 20th Century, focusing on the New York area.  It's not so much that you read this book as that you are transported in your mind, immersed in the country of that time.  The book is engrossing, enchanting, and all-consuming.  Nothing like it has been written before or since in my opinion.  You are drawn into this novel as if you are living in America in the early years of the 20th Century.  It's uncanny; it's astonishing; it's provocative.

It is thrilling how the author weaves real people like Harry Houdini and Emma Goldman with his fictional characters.  The book weaves together disparate characters and events and wraps it all up at the end.  The book has a beginning, a middle, and an end.  I like that.

This is one of my favorite novels of all time. 

Wednesday, February 27, 2013

This and That

The Supreme Court heard oral arguments today on the case of Shelby County v. Holder, a challenge from my own county to the Voting Rights Act of 1965.  I am fearful that the Court will gut this historic law, which would open the floodgates to renewed voter suppression.

Sequestration, automatic indiscriminate budget cuts, will likely go into effect Friday.  I guess we'll see what the effects are.

This country is in the process of reversing decades of progress.

Sunday, February 24, 2013

The New Nagel Book

The New York Review of Books

Awaiting a New Darwin
by H. Allen Orr
February 7, 2013

Mind and Cosmos: Why the Materialist Neo-Darwinian Conception of Nature Is Almost Certainly False

by Thomas Nagel

Oxford University Press, 130 pp., $24.95



1.

The history of science is partly the history of an idea that is by now so familiar that it no longer astounds: the universe, including our own existence, can be explained by the interactions of little bits of matter. We scientists are in the business of discovering the laws that characterize this matter. We do so, to some extent at least, by a kind of reduction. The stuff of biology, for instance, can be reduced to chemistry and the stuff of chemistry can be reduced to physics.



Thomas Nagel has never been at ease with this view. Nagel, University Professor of Philosophy and Law at New York University, is one of our most distinguished philosophers. He is perhaps best known for his 1974 paper, “What Is It Like to Be a Bat?,” a modern classic in the philosophy of mind. In that paper, Nagel argued that reductionist, materialist accounts of the mind leave some things unexplained. And one of those things is what it would actually feel like to be, say, a bat, a creature that navigates its environment via the odd (to us) sense of echolocation. To Nagel, then, reductionist attempts to ground everything in matter fail partly for a reason that couldn’t be any nearer to us: subjective experience. While not denying that our conscious experiences have everything to do with brains, neurons, and matter, Nagel finds it hard to see how these experiences can be fully reduced with the conceptual tools of physical science.



In Mind and Cosmos, Nagel continues his attacks on reductionism. Though the book is brief its claims are big. Nagel insists that the mind-body problem “is not just a local problem” but “invades our understanding of the entire cosmos and its history.” If what he calls “materialist naturalism” or just “materialism” can’t explain consciousness, then it can’t fully account for life since consciousness is a feature of life. And if it can’t explain life, then it can’t fully account for the chemical and physical universe since life is a feature of that universe. Subjective experience is not, to Nagel, some detail that materialist science can hand-wave away. It’s a deal breaker. Nagel believes that any future science that grapples seriously with the mind-body problem will be one that is radically reconceived.



As Nagel makes clear in the subtitle of Mind and Cosmos, part of what he thinks must be reconceived is our reigning theory of evolutionary biology, neo-Darwinism. Neo-Darwinism maintains, or at least implies, that the origin and history of life can be explained by materialist means. Once the first life arose on earth, the fate of the resulting evolutionary lineage was, neo-Darwinists argued, shaped by a combination of random mutation and natural selection. Biological types that survive or reproduce better than others will ultimately replace those others. While natural selection ensures that species constantly adapt to the changing environments around them, the process has no foresight: natural selection responds only to the present environment and evolution cannot, therefore, be aiming for any goal. This view, Nagel tells us, is “almost certainly false.”



Before creationists grow too excited, it’s important to see what Nagel is not claiming. He is not claiming that life is six thousand years old, that it did not evolve, or that natural selection played no part in this evolution. He believes that life has a long evolutionary history and that natural selection had a part in it. And while he does believe that intelligent design creationists have asked some incisive questions, Nagel rejects their answers. Indeed he is an atheist. Instead Nagel’s view is that neo-Darwinism, and in fact the whole materialist view elaborated by science since the seventeenth century, is radically incomplete. The materialist laws of nature must, he says, be supplemented by something else if we are to fold ourselves and our minds fully into our science.



His leading contender for this something else is teleology, a tendency of the universe to aim for certain goals as it unfolds through time. Nagel believes that (currently unknown) teleological laws of nature might mean that life and consciousness arise with greater probability than would result from the known laws of physics, chemistry, and biology.



Scientists shouldn’t be shocked by Nagel’s claim that present science might not be up to cracking the mind-brain problem or that a profoundly different science might lie on the horizon. The history of science is filled with such surprising transformations. Nor should we dismiss Nagel’s claims merely because they originate from outside science, from a philosopher. Much the same thing happened when natural theology—the scientific attempt to discern God’s attributes from His biological handiwork—gave way to Darwinism.



It was the philosopher David Hume who began to dismantle important aspects of natural theology. In a devastating set of arguments, Hume identified grievous problems with the argument from design (which claims, roughly, that a designer must exist because organisms show intricate design). Hume was not, however, able to offer an alternative account for the apparent design in organisms. Darwin worked in Hume’s wake and finally provided the required missing theory, natural selection. Nagel, consciously or not, now aspires to play the part of Hume in the demise of neo-Darwinism. He has, he believes, identified serious shortcomings in neo-Darwinism. And while he suspects that teleological laws of nature may exist, he recognizes that he hasn’t provided anything like a full theory. He awaits his Darwin.



Mind and Cosmos is certainly provocative and it reflects the efforts of a fiercely independent mind. In important places, however, I believe that it is wrong. Because Nagel’s book sits at the intersection of philosophy and science it will surely attract the attention of both communities.1 As a biologist, I will perhaps inevitably focus on Nagel’s more scientific claims. But these are, it appears, the claims that are most responsible for the excitement over the book.



I begin by considering the reasons Nagel believes that materialist science, including neo-Darwinism, is false. I then turn to his alternative theory, teleology.



2.

Nagel believes that materialism confronts two classes of problems. One, which is new to Nagel’s thought, concerns purported empirical problems with neo-Darwinism. The other, which is more familiar to philosophers, is the alleged failure of materialism to explain consciousness and allied mental phenomena.



Nagel argues that there are purely “empirical reasons” to be skeptical about reductionism in biology and, in particular, about the plausibility of neo-Darwinism. Nagel’s claims here are so surprising that it’s best to quote them at length:



I would like to defend the untutored reaction of incredulity to the reductionist neo-Darwinian account of the origin and evolution of life. It is prima facie highly implausible that life as we know it is the result of a sequence of physical accidents together with the mechanism of natural selection. We are expected to abandon this naïve response, not in favor of a fully worked out physical/chemical explanation but in favor of an alternative that is really a schema for explanation, supported by some examples. What is lacking, to my knowledge, is a credible argument that the story has a nonnegligible probability of being true. There are two questions. First, given what is known about the chemical basis of biology and genetics, what is the likelihood that self-reproducing life forms should have come into existence spontaneously on the early earth, solely through the operation of the laws of physics and chemistry? The second question is about the sources of variation in the evolutionary process that was set in motion once life began: In the available geological time since the first life forms appeared on earth, what is the likelihood that, as a result of physical accident, a sequence of viable genetic mutations should have occurred that was sufficient to permit natural selection to produce the organisms that actually exist?

Nagel claims that both questions concern “highly specific events over a long historical period in the distant past, the available evidence is very indirect, and general assumptions have to play an important part.” He therefore concludes that “the available scientific evidence, in spite of the consensus of scientific opinion, does not in this matter rationally require us to subordinate the incredulity of common sense.”



This conclusion is remarkable in a couple of ways. For one thing, there’s not much of an argument here. Instead Nagel’s conclusion rests largely on the strength of his intuition. His intuition recoils from the claimed plausibility of neo-Darwinism and that, it seems, is that. (Richard Dawkins has called this sort of move the argument from personal incredulity.) But plenty of scientific truths are counterintuitive (does anyone find it intuitive that we’re hurtling around the sun at 67,000 miles per hour?) and a scientific education is, to a considerable extent, an exercise in taming the authority of one’s intuition. Nagel never explains why his intuition should count for so much here.



As for his claim that evolutionary theory is somewhat schematic and that it concerns events that happened long ago, leaving indirect evidence, this is partly true of any historical science, including any alternative to neo-Darwinism, e.g., the one that Nagel himself suggests. In any case, a good part of the evidence for neo-Darwinism is not indirect but involves experiments in which evolutionary change is monitored in real time.2



More important, Nagel’s conclusions about evolution are almost certainly wrong. The origin of life is admittedly a hard problem and we don’t know exactly how the first self-replicating system arose. But big progress has been made. The discovery of so-called ribozymes in the 1980s plausibly cracked the main principled problem at the heart of the origin of life. Research on life’s origin had always faced a chicken and egg dilemma: DNA, our hereditary material, can’t replicate without the assistance of proteins, but one can’t get the required proteins unless they’re encoded by DNA. So how could the whole system get off the ground?



Answer: the first genetic material was probably RNA, not DNA. This might sound like a distinction without a difference but it isn’t. The point is that RNA molecules can both act as a hereditary material (as DNA does) and catalyze certain chemical reactions (as some proteins do), possibly including their own replication. (An RNA molecule that can catalyze a reaction is called a ribozyme.) Consequently, many researchers into the origins of life now believe in an “RNA world,” in which early life on earth was RNA-based. “Physical accidents” were likely still required to produce the first RNA molecules, but we can now begin to see how these molecules might then self-replicate.



Nagel’s astonishment that a “sequence of viable genetic mutations” has been available to evolution over billions of years is also unfounded.3 His concern appears to be that evolution requires an unbroken chain of viable genetic variants that connect the first living creature to, say, human beings. How could nature ensure that a viable mutation was always available to evolution? The answer is that it didn’t. That’s why species go extinct. Indeed that’s what extinction is. The world changes and a species can’t find a mutation fast enough to let it live. Extinction is the norm in evolution: the vast majority of all species have gone extinct. Nagel has, I think, been led astray by a big survivorship bias: the evolutionary lineage that led to us always found a viable mutation, ergo one must, it seems, always be available. Tyrannosaurus rex would presumably be less impressed by nature’s munificence.4



3.

While Nagel’s worries about neo-Darwinism are misplaced, he’s on somewhat firmer (or at least more familiar) ground when he turns to mental phenomena like consciousness. These are, after all, separate problems. A science might explain the evolution of life but leave consciousness—the subjective experience of the saltiness of popcorn, the shock of cold water, or the sting of pain—unaccounted for. Consciousness is Nagel’s big problem:



Consciousness is the most conspicuous obstacle to a comprehensive naturalism that relies only on the resources of physical science. The existence of consciousness seems to imply that the physical description of the universe, in spite of its richness and explanatory power, is only part of the truth, and that the natural order is far less austere than it would be if physics and chemistry accounted for everything.

Nagel’s story here starts, as it must, with Descartes. As Nagel writes, Descartes posited that matter and mind are “both fully real and irreducibly distinct, though they interact.” Given this, science was, from the outset, concerned solely with matter; mind belonged to a different domain. While scientists happily toiled under Cartesian dualism, giving rise to a recognizably modern science, philosophers often demurred. Instead, thinkers like Berkeley favored various forms of idealism, which maintains that nature is at bottom mind. Under idealism, then, any reductionist program would be in the business of collapsing matter to mind.



Nagel argues that as a result of a rapid shift whose causes are unclear, these idealist philosophies were “largely displaced in later twentieth-century analytic philosophy by attempts at unification in the opposite direction, starting from the physical.” This approach likely seems natural to most of us. But we live with a tension. Though the materialist program of reducing mind to matter would appear the properly “scientific” approach, we haven’t the slightest idea how it would work. And it’s not for lack of trying. Philosophers have, Nagel reminds us, attempted many ways of tying mind to matter: conceptual behaviorism, physical identity theory, causal behaviorism, and functionalism, to name a few. To Nagel all these approaches have failed “for the same old reason”:



Even with the brain added to the picture, they clearly leave out something essential, without which there would be no mind. And what they leave out is just what was deliberately left out of the physical world by Descartes and Galileo in order to form the modern concept of the physical, namely, subjective appearances.

Nagel is deeply skeptical that any species of materialist reductionism can work. Instead, he concludes, progress on consciousness will require an intellectual revolution at least as radical as Einstein’s theory of relativity.



Nagel’s chapter on consciousness is a concise and critical survey of a literature that is both vast and fascinating. He further extends his survey to other mental phenomena, including reason and value, that he also finds recalcitrant to materialism. (Nagel concludes that the existence of objective moral truths is incompatible with materialist evolutionary theory; because he is sure that moral truths exist, he again concludes that evolutionary theory is incomplete.)



Nagel concedes that many philosophers do not share his skepticism about the plausibility of reducing mind to matter. And I can assure readers that most scientists don’t. I, however, share Nagel’s sense of mystery here. Brains and neurons obviously have everything to do with consciousness but how such mere objects can give rise to the eerily different phenomenon of subjective experience seems utterly incomprehensible.



Despite this, I can’t go so far as to conclude that mind poses some insurmountable barrier to materialism. There are two reasons. The first is, frankly, more a sociological observation than an actual argument. Science has, since the seventeenth century, proved remarkably adept at incorporating initially alien ideas (like electromagnetic fields) into its thinking. Yet most people, apparently including Nagel, find the resulting science sufficiently materialist. The unusual way in which physicists understand the weirdness of quantum mechanics might be especially instructive as a crude template for how the consciousness story could play out. Physicists describe quantum mechanics by writing equations. The fact that no one, including them, can quite intuit the meaning of these equations is often deemed beside the point. The solution is the equation. One can imagine a similar course for consciousness research: the solution is X, whether you can intuit X or not. Indeed the fact that you can’t intuit X might say more about you than it does about consciousness.



And this brings me to the second reason. For there might be perfectly good reasons why you can’t imagine a solution to the problem of consciousness. As the philosopher Colin McGinn has emphasized, your very inability to imagine a solution might reflect your cognitive limitations as an evolved creature. The point is that we have no reason to believe that we, as organisms whose brains are evolved and finite, can fathom the answer to every question that we can ask. All other species have cognitive limitations, why not us? So even if matter does give rise to mind, we might not be able to understand how.



To McGinn, then, the mysteriousness of consciousness may not be so much a challenge to neo-Darwinism as a result of it. Nagel obviously draws the opposite conclusion. But the availability of both conclusions gives pause.



4.

Given the problems that Nagel has with materialism, the obvious question is, What’s the alternative? In the most provocative part of Mind and Cosmos, he suggests one, teleology. While we often associate teleology with a God-like mind—events occur because an agent wills them as means to an end—Nagel finds theism unattractive. But he insists that materialism and theism do not exhaust the possibilities.



Instead he proposes a special species of teleology that he calls natural teleology. Natural teleology doesn’t depend on any agent’s intentions; it’s just the way the world is. There are teleological laws of nature that we don’t yet know about and they bias the unfolding of the universe in certain desirable directions, including the formation of complex organisms and consciousness. The existence of teleological laws means that certain physical outcomes “have a significantly higher probability than is entailed by the laws of physics alone—simply because they are on the path toward a certain outcome.”



Nagel intends natural teleology to be, among other things, a biological theory. It would explain not only the “appearance of physical organisms” but the “development of consciousness and ultimately of reason in those organisms.” Teleology would also provide an “account of the existence of the biological possibilities on which natural selection can operate.”



Nagel concedes that his new theory isn’t fully fleshed out. He hopes merely to sketch the outlines of a plausible alternative to materialism. It’s unfortunate, though, that Mind and Cosmos is too brief to allow consideration of problems that attend natural teleology. For it seems to me that there are some, especially where the view confronts biology.



Darwin himself wrestled with attempts to reconcile his theory with teleology and concluded, reluctantly, that it seemed implausible. While Darwin published almost nothing on such philosophical matters they loom large in his correspondence, particularly with Asa Gray, an American champion of evolution and a Christian. Gray, like Nagel, wanted to believe that, while Darwin had identified an important force in the history of life, nature also features teleology. In particular, Gray suggested that the variation provided by nature to natural selection biases the process in desirable directions.



Darwin, though sometimes vacillating, argued that Gray’s reconciliation was implausible. Exercising his uncanny ability to discern deep truths in prosaic facts—in this case the artificial selection of a pigeon breed by a few fanciers—Darwin wrote Gray:



But I grieve to say that I cannot honestly go as far as you do about Design…. You lead me to infer that you believe “that variation has been led along certain beneficial lines”.—I cannot believe this; & I think you would have to believe, that the tail of the Fan-tail was led to vary in the number & direction of its feathers in order to gratify the caprice of a few men.5

Here’s another problem. Nagel’s teleological biology is heavily human-centric or at least animal-centric. Organisms, it seems, are in the business of secreting sentience, reason, and values. Real biology looks little like this and, from the outset, must face the staggering facts of organismal diversity. There are millions of species of fungi and bacteria and nearly 300,000 species of flowering plants. None of these groups is sentient and each is spectacularly successful. Indeed mindless species outnumber us sentient ones by any sensible measure (biomass, number of individuals, or number of species; there are only about 5,500 species of mammals). More fundamentally, each of these species is every bit as much the end product of evolution as we are. The point is that, if nature has goals, it certainly seems to have many and consciousness would appear to be fairly far down on the list.



Similarly, Nagel’s teleological biology is run through with talk about the “higher forms of organization toward which nature tends” and progress toward “more complex systems.” Again, real biology looks little like this. The history of evolutionary lineages is replete with reversals, which often move from greater complexity to less. A lineage will evolve a complex feature (an eye, for example) that later gets dismantled, evolutionarily deconstructed after the species moves into a new environment (dark caves, say). Parasites often begin as “normal” complicated organisms and then lose evolutionarily many of their complex traits after taking up their new parasitic way of life. Such reversals are easily explained under Darwinism but less so under teleology. If nature is trying to get somewhere, why does it keep changing its mind about the destination?6



I’ll be the first to admit that these problems may not be fatal. But they represent the sorts of awkward facts that occur immediately to any biologist. Minimally, they pose serious challenges to teleology, challenges that deserve, but do not receive, consideration in Mind and Cosmos.



5.

I will also be the first to admit that we cannot rule out the formal possibility of teleology in nature. It could turn out that teleological laws affect how the universe unfolds through time. While I suspect some might regard such heterodoxy as a crime against science, Nagel is right that there’s nothing intrinsically unscientific about teleology. If that’s the way nature is, that’s the way it is, and we scientists would need to get on with the business of characterizing these surprising laws. Teleological science is, in fact, more than imaginable. It’s actual, at least historically. Aristotelian science, with its concern for final cause, was thoroughly teleological. And the biological tradition that Darwinism displaced, natural theology, also featured a good deal of teleological thinking.



The question, then, is not whether teleology is formally compatible with the practice of science. The question is whether the practice of science leads to taking teleology seriously. Nagel may find this question unfair. He is, he says, engaging in a “philosophical task,” not the “internal pursuit of science.” But it seems clear that he is doing more than this. He’s emphasizing purported “empirical reasons” for finding neo-Darwinism “almost certainly false” and he’s suggesting the existence of new scientific laws. These represent moves, however halting, into science proper. But science, finally, isn’t about defining the space of all formally possible explanations of nature. It’s about inference to the most likely hypothesis. And on these grounds there’s simply no comparison between neo-Darwinism (for which there is overwhelming evidence) and natural teleology (for which there is none). While one might complain that it’s unfair to stack up the empirical successes of neo-Darwinism with those of a new theory, this, again, gets the history wrong. Teleology is the traditional view; neo-Darwinism is the new kid on the block.



None of this is to suggest that evolutionary biology will not, someday, change radically. Of course it might; any science might. Nor is it to suggest that materialism represents some final unassailable view and that teleology or, for that matter, theism will inevitably be spoken of in the past tense by many scientists. It is to say that the way to any such alternative view will have to acknowledge the full powers of present science. I cannot conclude that Mind and Cosmos does this.



1 Nagel’s work has long attracted the attention of both philosophers and scientists. Indeed the careful reader will notice that I’m mentioned in his new book as a scientist-participant in a workshop that he organized on some of the topics covered in the book; many of the other participants were philosophers. ↩

2 The field of “experimental evolution” is concerned with watching evolution as it occurs. Because of their short generation time, microbes are the focus of much of this work. ↩

3 While I’ve heard this concern before, I must admit that I think I only now understand it. ↩

4 This is not to say that adaptation is rare or that natural selection doesn’t modify the DNA sequences of species. Even species that ultimately go extinct have experienced many previous bouts of successful adaptation. ↩

5 November 26, 1860; see www.darwinproject.ac.uk/entry-2998. Historians of science do not all agree that Darwin wholly banished teleology from his thinking; see the exchange between James G. Lennox (1993, 1994) and Michael T. Ghiselin (1994) in Biology and Philosophy. ↩

6 It’s true that organisms are on average more complex now than they were three billion years ago. But as biologists have long recognized, this doesn’t require any inexorable bias toward complexity. If life starts from a floor of zero complexity, it can on average only get more complicated. ↩






Saturday, February 23, 2013

This is Where I Leave You by Jonathan Tropper

This is an enjoyable book, the most humorous that I have ever read.  It is a family drama centered on Judd Foxman.  Judd has learned about his wife's adultery with his boss, and his father has died.  His family, including his mother and three siblings, are sitting shiva for a week.  However, his family does not get along.  Bickering and long-held resentments are the glue that keeps the Foxmans together.  Judd must deal with spending a week with his family amidst the turmoil in his personal life.  To complicate matters, he has moved out of his house, his adulterous wife is pregnant with his child, and his brother Paul blames him for ruining his baseball career when they were younger.

Through this complicated mess, Judd learns how selfish he was towards Paul, how you have to stop comparing yourself to what you have lost, and how you have to let go and be content with yourself before you can find happiness.

This is an easy book to read, although it is also the most explicit that I have ever read.

Friday, February 22, 2013

I Know So Many Things

I know so many things this morning I hardly know where to start. Should I 1) go to work early 2) hang out with the Pelham police at Dunkin Donuts first or 3) do a few random acts of kindness first?

Kelley Sanford Sharit: Dunkin Donuts!!!! What's there to think about????

Moyna O'Riley Hudson: Pick me up a smoked sausage muffin while you're there!

Jody Britt: You could always just randomly help some cop at Dunkin Donuts, then get to work.

Fred Hudson: Jody---I offered to purchase Officer Smith a coffee and donut, but he said he couldn't accept. Something about regulations. I have never let "regulations" get in my way!

Thursday, February 21, 2013

"Obamacare" is Going to Win

Rick Scott Delivers Death Blow to Obamacare Repeal

By Jonathan Chait

From the moment President Obama set out to reform the health care system, Republican opposition was a Terminator robot driven by boundless, remorseless determination to kill. Every single Republican in Congress opposed the bill, and Republicans who even considered supporting something vaguely like it were ruthlessly purged. Even after it was passed, Republicans ginned up far-fetched legal challenges, held endless votes to repeal it, and vowed not to implement it at the state level. They couldn't be bargained with, couldn't be reasoned with, and felt no pity.



The repeal machine has suffered a series of devastating blows – the Supreme Court upholding the individual mandate, Obama’s reelection, the decision of several Republican governors to accept the program’s expansion of Medicaid – and continued to lurch forward. But Governor Rick Scott’s announcement that he will enroll uninsured Floridians in Medicaid appears to be a real death blow, the moment the cyborg’s head is crushed in a steel press.



From the moment he appeared on the national stage, Scott seemed to be engineered to fight health care reform. The wealthy owner of a vast hospital chain that paid massive fines for overbilling Medicare during his tenure, Scott bankrolled an anti-reform lobby, then ran and won in 2010 on a platform of obsessive opposition to Obamacare. He has steadfastly vowed to turn down federal subsidies to cover his state’s uninsured, and even concocted phony accounting assumptions to justify his stance. Rick Scott really hates health care reform.



But Scott is a vulnerable incumbent in a swing state. And his refusal to accept Medicaid expansions would leave his state’s hospitals on the hook for $2.8 billion when uninsured Floridians show up in emergency rooms, prompting them to lobby Scott to change his mind. And so he has. For an enjoyable sampling of conservative apoplexy, try Philip Klein (“waving the white flag is an accurate description of Scott’s decision,”) Mario Loyola (“the most grievous blow since the Supreme Court’s decision upholding Obamacare last year,”) and Michael Cannon (“will he sell out Florida’s job creators too?”).



Cannon’s outrage in particular is almost poignant. He has served as a health care adviser to Scott in Florida, and as a founder of the “Anti-Universal Coverage Club,” lent Scott the closest link, of all the governors, to the conservative movement’s maniacal hatred for providing health insurance to those too sick or poor to obtain it on their own. The ability of governors to turn down Medicaid funding is the last line of defense against Obamacare, and Scott’s betrayal of the cause – choosing the financial health of his own state’s hospitals over the chance to deny medical care to his own state’s poor – lands a blow of both substantive and symbolic power.



We are not about to enter a new era of peace and health care love. The death struggle between liberals fighting to make health insurance a basic right and conservatives fighting to prevent that is over. What’s replacing it is a more mundane form of trench warfare. The new conservative position will come to revolve around expanding the role and prerogative of private insurance, and the liberal goal will be to strengthen regulation and help the poor and sick.



A glimpse of the new conservative health care line comes from former Romney adviser Avik Roy and conservative think-tank apparatchik Douglas Holtz-Eakin in a joint-bylined column. In it, they point the way toward the future of the health care debate. Gone is the millennial struggle to preserve the dying embers of freedom. They allow that the central architecture of Obamacare – the establishment of subsidized exchanges where individuals can purchase private insurance – is actually an “important concession to the private sector.”



Right! It’s a Republican-designed idea! It might have helped if Republicans had noticed this, instead of screaming about socialism, back when Obama was trying to pass the plan.



In any case, Roy and Holtz-Eakin argue that their discovery that Obamacare consists mainly of a free market health insurance mechanism offers conservatives a wonderful opportunity. Here their thinking grows extremely confused. The problem with Obamacare, they argue, is that the exchanges are regulated. The “community rating” provision, which prevents insurers from charging higher rates to people more likely to get sick, “will dramatically increase premiums for young people.” They propose to get rid of such regulations and turn the exchanges into a free-market paradise “modeled on the Swiss system.”



As a policy guide, this is utterly daft. Health care economist Aaron Carroll fisks the op-ed and concludes that they have no idea at all how the Swiss system works. It’s more regulated than Obamacare, not less. Community rating is needed because that’s how you make insurance affordable to sick people – otherwise, insurers will just sign up healthy customers.



But as a political roadmap, Roy and Holtz-Eakin offer what looks like the most plausible way forward for the GOP. The health insurance industry doesn’t want the government forcing them to sell products to money-losing sick people. Insurers will want to skim the healthiest people from the pool. And conservatives don’t like regulation. That is a perfect match of constituency and ideology.



So the broader struggle will never end. But the conservatives understand that the struggle to preserve “American exceptionalism” in health care – America’s standing as the sole advanced democracy without universal citizen access to medical care – is over.



Wednesday, February 20, 2013

7 More?

Rational Champions: Birmingham lawyer’s new book argues Auburn should claim 7 more national titles

Written by Jeremy Henderson Football, People, Sports Feb 13, 2013


Michael Skotnicki is a Birmingham lawyer with two degrees from Auburn and magna cum laude honors from Samford, and he’s written a bunch of scientific papers and even briefs that have been reviewed by the Supreme Court, and the dude just comes out with it: Auburn has nine teams, not two, that should be claimed and heralded and celebrated by Auburn fans as National Champions.



And he’s not talking some halftime Text Your Answer To The Jumbotron. He wants banners. That’s the way he ends every chapter of his book, “Auburn’s Unclaimed National Championships”—with a demand for banners at Jordan-Hare. And “agree with the premise or not,” as David Housel blurbs on the back cover—our own Van Allen Plexico blurbs, too— you have to respect that.



You respect it even more when you realize Skotnicki isn’t appealing to Auburn fans and our need to keep up with the Joneses and their tacky bumper stickers, but for the players—the bruised and bloodied sepia-toned 20-year-olds who bent themselves to the glory of Auburn 100 years ago as much as Zeke and Burkett and Cam and Nick after them, and in many ways more (the exploits of the 1913 and 1914 teams will drop your jaw).



We interviewed Skotnicki about his motivations for writing the book, how it could affect Auburn’s self-esteem, and just why the university seems intent on celebrating a lack of banners almost as if it was a banner itself.



Why did you attend Auburn? Did you grow up an Auburn fan?



I didn’t grow up an Auburn fan, and perhaps that keyed my interest in learning about Auburn football history when I eventually attended Auburn. I grew up in Ohio and Pennsylvania in the 1960’s and 1970’s and followed Notre Dame, Ohio State, and Penn State football. I decided to attend Auburn for college because at that time I wanted to be a fisheries biologist and Auburn had the best program in the nation. Like many people, I changed majors and ended up getting both a B.S. and M.S. in geology and even taught Geology 102 in 1985 and 1986 when I filled in for a professor on sabbatical. Several years later I went to law school and I have made a career in law ever since.



Why did you write “Auburn’s Unclaimed National Championships”? When did the idea come to you? What was the hardest thing about writing it? The easiest?



I wrote the book because I simply grew tired of the ignorance about the greatness of Auburn football history that you see in both national and in-state sports media, who think the only highlights of Auburn football before Bo Jackson went “Over the Top” in 1982 were the 1957 National Championship, the “Punt Bama Punt” game in 1972, and the fact John Heisman was the head coach for five years in the 1890’s. I wanted to bring to the attention of people that Auburn was a true power in college football with Coach “Iron Mike” Donahue. That period is an important part of Auburn’s football history that the Auburn administration has neglected and most Auburn people thus know very little if anything about.



The idea came to me this past summer when I read that the University of Minnesota had decided to claim a National Championship for 1904 based on the ranking of a recognized retroactive selector, the Billingsley Report. Then I read that Texas A&M University, upon joining the SEC in football, had decided to claim being National Champion for 1919 and 1927, also based on the rankings of retroactive selectors. I had a good idea from my reading of Auburn football history that Auburn could make similar claims and I set about gathering the information to support making such an argument. When I realized there was plenty of material available, I set out to write a book that would tell a concise history of Auburn football, focusing on the seven seasons where, using a strict standard, Auburn could legitimately claim additional National Championships.



The hardest thing about writing the book was actually finding information about Coach Donahue and his teams. I had to scour many out-of-print books about Auburn football and newspaper archives available over the internet just to find the information I provided in the book.



The easiest thing was actually making the case for Auburn to claim a National Championship for the seven seasons discussed in the book. As an appellate brief writer, I often struggle trying to make the case for a client that the trial court’s ruling was in error and should be reversed because there is little evidence or law to support the argument. I had no such problem writing this book. In fact, the issue is a “slam dunk” for some years, such as 1913, 1983, and 2004.



Why has Auburn never claimed some of these championships? Did some of the older teams think of themselves as national champions? 1913? 1914? When did the NCAA officially recognize Auburn as a national champion for 1913?



I don’t know why Auburn decided not to claim these additional National Championships. Certainly the Athletic Department is aware of the possibility, as the Football Media Guide makes note of most of them in fine print no one notices. Many universities in addition to the University of Alabama do so. For example, in 2004, the University of Southern California claimed a National Championship for 1939, when it was the undefeated Pacific Coast Conference Champion.



While the idea of a National Champion was discussed in the early years of college football, there really was no means to determine a national champion and so newspapers would name regional champions. The 1908, 1910, 1913, and 1914 Auburn teams under Coach Donahue were all named “Champions of the South” and it wasn’t until 1936 that the Associated Press developed the first college football ranking poll that named a National Champion. Certainly, the information I found suggests that those Auburn teams, and students and alumni, were very proud of the fact that they were Champions of the South. Auburn football was very important in that era when it was by far the dominant football team in this state and was only challenged for dominance in the entire South, from Louisiana to Virginia, by two other schools.



The NCAA does indeed recognize Auburn as a National Champion for 1913 based on the retroactive rankings of Richard Billingsley, who developed a computer program for rating college football teams based on strength of schedule in 1970. Billingsley’s current program is one of six computer formulas used in creating the current BCS rankings.



I’m not certain when the NCAA began recognizing retroactive National Championships for years before the A.P. poll started in 1936, but I do think that Auburn should recognize the 1913 team as a National Champion if for no other reason than that the NCAA does. Having a stricter standard than the NCAA makes no sense to me.



In recent years, the prevailing “Well, if we counted like Bama…” mentality among Auburn fans seems to have established a refusal to claim Auburn as national champions in at least the more well-known So-Close seasons like 1983 and 2004 almost as a point of pride, another thing to set us apart (“We Don’t Do That”), even to the point of temporarily erasing undefeated seasons from Auburn history. Do you think that would factor into the university’s reluctance to change its official policy (if there is one) toward recognizing these teams as national champions?



I completely understand that point of view that some Auburn people, and apparently some very important Auburn people, have. I simply disagree and believe it is holding the Auburn football program back from being recognized for the greatness that it has earned on the gridiron for over a century through the efforts of these coaches and players. I say why let what Alabama does control what Auburn does? Moreover, it’s not just Alabama. Many universities have done the same. In addition to my mention of Texas A&M, in the SEC, the University of Tennessee claims six National Championships. Just one of those is an A.P. title and another is its 1998 BCS title. The other four are from retroactive selectors just as I propose Auburn should claim for 1913 and other years.



You straight up say “Auburn should do this…” for each team at the end of each chapter. But how likely do you think that they will? How do you imagine it happening? How do you think Auburn fans would feel about it? Alabama fans?



I can’t say how likely it is that Auburn will ever claim a National Championship for any of the seven years discussed in my book. My hope in writing the book was that if I could make the case for each of these seven seasons – and there is a strong case in each instance – the Auburn people would judge for themselves what should be done. If there is enough of a clamor for a change in the Athletic Department’s position then I believe it is possible that Auburn will one day recognize additional National Championships. I don’t believe that Auburn would claim all seven additional National Championships at one time, although there is precedent given that Alabama’s Athletic Department claimed five additional National Championships in 1986. I think it might be done a few years at a time, on the occasion of an anniversary season. Given that this year will be the 100-year anniversary of the 1913 team, there would be no better time than this fall to recognize that 1913 team as a National Champion. It is also the 30th and 20th anniversaries of the 1983 and 1993 teams, so this makes it a great year to add a trifecta of National Championships.





That’s the 1983 Alabama media guide. National championships listed? Six. Three years later, they had 11.

I think Auburn fans, true Auburn fans, would be proud to claim additional National Championships if they understood the strength of the arguments supporting such a claim. I’m sure Alabama fans would find some way to complain or try to belittle Auburn for taking such action. However, given that the standard for Auburn claiming additional National Championships that I set forth in my book cannot be met for at least one of the National Championships claimed by Alabama (1941), Alabama fans would actually have very little to argue about unless they want to start subtracting championships from their own total.



What’s the reaction been to the book so far?



I’ve had a good reaction from those who have read it. Many Auburn people tell me they are surprised to learn how dominant the Auburn football program was under Coach Donahue. A few former players have passed word to me that they appreciate my efforts at getting their teams officially recognized by Auburn as National Champions. The book has begun what I think is an important discussion for Auburn people, and I hope that discussion continues as we move toward the 100th anniversary of the 1913 team.



Michael Skotnicki - Auburn's Unclaimed National Championships

The author of this slim book (which also serves as a brief history of Auburn football) asserts that Auburn should lay claim to 7 more football national championships, in addition to the 1957 and 2010 titles it already claims.  He makes the case that the Tigers deserve to claim national championships in 1910, 1913, 1914, 1958, 1983, 1993, and 2004.

The NCAA does not officially recognize a football national championship.  Over the years and now including the infamous BCS, various organizations and people have selected "national champions."  Some institutions, most notably the U of Alabama, claim a NC when ANY person or group recognizes one of their teams as such.  So you have UAT claiming national championships back in the past when any magazine or group or person gives them that honor.  By Alabama's standards, Auburn could lay claim to 7 more NCs.

The book makes the case that Auburn SHOULD lay claim to 7 more national championships.  Should Auburn do so?  Other schools have done so.  Or should Auburn have higher standards and not play UAT's silly game?

Good question this book raises!  See what you think after reading this book.

Sunday, February 17, 2013

Nice Things

There are people in this world who like "nice things." The problem is that sometimes they don't pay for those "nice things" or they pay for those "nice things" illicitly with other people's money. I've never had access to other people's money to be tempted. I don't really care for "nice things" anyways. But if I had access to other people's money. . . . But that isn't gonna happen. I'll never be tempted so I'm safe.


Saturday, February 16, 2013

Calvin Coolidge - Fraud

The Great Refrainer

‘Coolidge,’ by Amity Shlaes


By JACOB HEILBRUNN

Published: February 14, 2013

This past December the Claremont Institute convened a forum at the Ronald Reagan Building in Washington to discuss the presidential election. The mood among the conservative stalwarts during the reception may have ranged from pensive to glum, but it brightened somewhat when Claremont’s panelists contemplated a return to true Republican principles as advanced by the last president to slash both taxes and the federal budget — Calvin Coolidge. James Ceaser, a political scientist at the University of Virginia and a regular contributor to The Weekly Standard, said it was important to revive the “moral stigma” of debt, and added, “I want to go back to Coolidge and even McKinley.” The Claremont fellow Charles Kesler, author of “I Am the Change,” a recent book denouncing President Obama and liberalism, agreed: “We’re in for a Coolidge revival.”



COOLIDGE



By Amity Shlaes


Indeed we are. Coolidge was a figure of sport in his own era. H. L. Mencken mocked his daily naps — “Nero fiddled, but Coolidge only snored” — and Dorothy Parker reportedly asked, “How could they tell?” when his death was announced. But such quips have only heightened the determination of a growing contingent of Coolidge buffs to resurrect him. They abhor the progressive tradition among Democrats (Woodrow Wilson) and Republicans (Theodore Roosevelt and Herbert Hoover) as hostile to big business and prosperity. Instead, their aim is to spread the austere doctrine of what might be called Republican Calvinism. Their liturgy is based on Coolidge’s remark “If the federal government were to go out of business, the common run of people would not detect the difference.” Coolidge, the new Calvinists say, has been calumniated by liberal intellectuals for his embrace of what amounted to supply-side economics — tax cuts for the wealthy that would pay for themselves. Far from being a hapless president who set the stage for the Great Depression, they argue, he presided over a notable golden age.



This view of Silent Cal as a prophet on the right emerged during the Reagan administration. A freshly inaugurated Reagan banished Harry Truman’s portrait from the Cabinet Room and replaced it with Coolidge’s. As president, Reagan praised Coolidge, read his autobiography and met with Thomas Silver, the author of “Coolidge and the Historians,” a pioneering attack on his liberal detractors. Bouquets to Coolidge proliferated: in 1983 Paul Johnson declared in his popular history “Modern Times” that Coolidge had presided over “the last Arcadia.” Then more than a decade later the prolific business writer Robert Sobel published a tribute called “Coolidge: An American Enigma” with the Regnery press. In it, he bestowed what has become the right’s highest commendation — “the last president who believed in a passive executive branch in times of peace and prosperity.”



Since then a flurry of panegyrists have gone on to contrast that passivity favorably with Obama’s alleged hubris. In a column about the “unsilent Barack,” for example, George F. Will lamented that Coolidge was “the last president with a proper sense of his office’s constitutional proportions.” Sarah Palin has expressed her devotion to Coolidge in “America by Heart,” while Michele Bachmann has urged that his visage be carved on Mount Rushmore. Meanwhile, Arthur C. Brooks, the president of the American Enterprise Institute, singles him out in the recent book “The Road to Freedom.”



No one, however, is offering as silky a defense of Coolidge as Amity Shlaes. Shlaes, whose new biography is blurbed by Representative Paul Ryan as a “must-read,” has always had a deft finger on the conservative pulse. Her previous book — the best seller “The Forgotten Man,” which assailed both Herbert Hoover and Franklin Roosevelt for perpetuating the Great Depression through big-government activism — was described in 2009 by Politico as an essential text for House Republicans, who were, it said, “tearing through . . . ‘The Forgotten Man’ like soccer moms before book club night.”



Now Shlaes, a trustee of the Calvin Coolidge Memorial Foundation and a former editor at The Wall Street Journal, has turned to Coolidge. She has assiduously researched Coolidge’s life, drawing both on his private papers (going so far as to photograph his appointment books) and on contemporary newspaper reports. Her biography depicts him as a paragon of a president, less for what he did than for what he did not do — Coolidge, she says, “is our great refrainer.” But that, for the most part, is as far as Shlaes is willing to go. She has not written a fiery polemic, but a stylistically assured narrative of Coolidge’s life that seeks to nudge the reader imperceptibly into sharing his (and her) views. Wall Street potentates like J. P. Morgan and notorious railroad tycoons like E. H. Harriman are blandly depicted as underdogs, the victims of progressive politicians and intellectuals. She displays a marked disinclination to mention, let alone engage, differing interpretations of key events in Coolidge’s career. It is also the case that she assumes an omniscient insight into his views, playing Edgar Bergen to Coolidge’s Charlie McCarthy, which can make it difficult to distinguish where Coolidge ends and Shlaes begins. Her distinctive approach to Coolidge results in a soothing lullaby about a vanished America, an exercise in nostalgic reverence rather than an authoritative history.



The pity of it is that Coolidge was in fact a more astute politician than the easy scorn of his contemporaries suggested, and historians like David Greenberg, the author of a fine study of Coolidge (which Shlaes scarcely mentions), are starting to offer a more judicious appraisal of him. Coolidge was canny enough to work with several spinmeisters who sought, as far as possible, to mold his public image as a Yankee steward of the Republic through the new media of newsreels and radio. Shlaes doesn’t attempt to scrutinize Coolidge’s image but to burnish it.



What makes Coolidge a fascinating character, however, aren’t his bromidic phrases and vapid homilies, designed to reassure a public unsettled by rapid social and economic change; or his loyalty to his vivacious wife, Grace; or his taciturnity or any of his other personal qualities. Rather, it is that he represented the right’s first sweeping counterrevolution against liberal Republicans in a battle that continues down to the present. What Shlaes’s biography underscores is the fantastic tenacity with which the party still adheres to the ossified pre-New-Deal-era economic doctrines enunciated by Silent Cal.



Coolidge, who was born in Vermont on the Fourth of July in 1872, grew up in the village of Plymouth Notch, where the Yankee virtues of thrift and industry were prized. His father was a farmer, a merchant and a state legislator; at the age of 3, Calvin sat in the governor’s chair, hewed from the timbers of the U.S.S. Constitution. Yet the future president — a sickly, gaunt and solitary boy who suffered great anguish when his mother died on her 39th birthday — seemed himself to be made of unpromising material. Not until he entered Amherst College did he display much potential. All his life Coolidge may have been a shy fellow, but he blossomed when he spoke publicly at Amherst. He also studied under a popular professor named Charles Garman, who, Shlaes writes, mesmerized his disciples with the revelation that “the group was less important than the individual . . . because there was really no such thing as group happiness.”



It was a lesson Coolidge never forgot. He first made it clear that he was at loggerheads with the progressives when he backed the pro-business William Howard Taft in the 1912 election against Theodore Roosevelt, a split that allowed the Democrat Woodrow Wilson to enter the Oval Office. Coolidge could not discern any distinction, however minute, between the welfare of business and individuals. They were identical. In the Massachusetts Senate his “crowning achievement,” Shlaes says, “had been killing a tax on stocks at the last minute by masterfully exercising the Senate president’s privilege to create a tie vote.” Coolidge believed that anyone who constructed a factory was building a temple deserving of “reverence and praise.”



This sentiment reached full bloom when he won praise as Massachusetts governor for breaking a policemen’s strike in Boston in 1919 at a moment when the country was in the midst of the Red Scare, fearful of radicals and immigrants. As Shlaes depicts it, Coolidge acted decisively and heroically to restore the “reign of law” for businesses and private property by sending in the state guard. “Coolidge felt certain of one thing,” she reports. “The progressives could not be met. Conciliation would not work.” Maybe not, but was it really necessary to fire all the policemen in the aftermath? And why had Coolidge not intervened sooner rather than waiting until the strike degenerated into violence?



But Coolidge’s intransigence at a time of internal tumult made him a national hero, and his wealthy backers like Frank Stearns, a Boston department store magnate, and Dwight Morrow, a partner at J. P. Morgan, began to see in him presidential timber. The finishing touches were supplied by the advertising man Bruce Barton, a fellow Amherst graduate and an evangelist for capitalism who himself earned fame for his tract “The Man Nobody Knows,” a best seller that depicted Jesus and his disciples as the pioneers of a business organization that conquered the world. As part of Coolidge’s quest for the presidential nomination in 1920, Barton wrote the first national article about him, in Collier’s, emphasizing his flinty “Yankee” character — Coolidge’s frugality meant that he rented half of a two-family house in Northampton — and portraying him as a “man who kept his own counsel, a novelty when most prominent politicians freely gave advice on everything.” He was, Barton said, “cut from granite.”



What Stearns, Morrow and Barton detected in 1920 was that the Coolidge brand could be effectively marketed to a public weary of World War I and Woodrow Wilson. But Coolidge’s Republican adversaries kept him from receiving the nomination (“Nominate a man who lives in a two-family house?” Senator Henry Cabot Lodge said. “Never!”), and he had to settle for a spot on the ticket as vice president to Warren Harding. After Harding died in August 1923, Lodge told a reporter: “My God! That means Coolidge is president!”



No sooner did Coolidge become president than he went on a budget and tax cutting spree to terminate what he called the “despotic exactions” of the past years. The immediate aim was to enact Harding’s hope to roll back the higher progressive income tax that Wilson had imposed during World War I. Coolidge, for his part, idolized the Treasury secretary Andrew Mellon, whose frosty credo was that “the people generally must become more interested in saving the government’s money than in spending it.” Tax cuts, Shlaes asserts, were “not merely to favor the rich, as many said. The tax rate cuts at the top were designed to favor enterprise. If people got to keep more of their money, they would hire others, Mellon said.” As Mellon saw it, this was “scientific taxation,” a program he detailed in 1924 in his classic statement of supply-side economics, “Taxation: The People’s Business.” But progressive Republicans initially impeded Coolidge and Mellon’s sweeping plans. Coolidge was undaunted. “Cutting rates brought more revenue,” says Shlaes. “So cutting rates even more might bring yet more cash.” All Coolidge could think about was economizing. He was a cheap tipper. He berated the White House housekeeper, Mrs. Jaffray, for favoring specialty shops rather than the new supermarkets. Talking to a group of Jewish philanthropists, he admitted that the budget was “a sort of obsession with me. . . . I regard a good budget as among the noblest monuments of virtue.” When the mayor of Johannesburg, South Africa, sent the Coolidges two lion cubs, the White House named them Tax Reduction and Budget Bureau. In 1926, Coolidge finally got the tax cuts he had always dreamed about, what Shlaes deems nothing less than “a beauty to behold, with its surtax rate topping out at 20 percent.” By the end of his term, only the very wealthiest Americans paid any income tax at all. Frenzied speculation took off. The bubble would soon pop.
But Shlaes suavely dismisses the notion that Coolidge bears responsibility for the Great Depression and suggests his work was “complete, ready as a kind of blessing for another era.”



This is flapdoodle. No, Coolidge was not single-handedly culpable for the economic calamity of the 1930s. But neither can he be safely extracted from the ruin that followed his presidency. Quite the contrary. Coolidge was the pre-eminent cheerleader for the economic nostrums that led to the crash. His opposition to regulation allowed Wall Street and the banks to engage in rampant speculation and insider trading, practices that were not curbed until Joseph Kennedy was appointed head of the new Securities and Exchange Commission by Franklin Roosevelt to ban the very practices he himself had employed. So deep was Coolidge’s antipathy to any form of government action that he even viewed his gifted secretary of commerce and successor Herbert Hoover with a measure of contempt, calling him the “wonder boy” because he fell into the progressive Republican camp.



With yet another tribute about to appear — “Why Coolidge Matters,” by the former Claremont Institute fellow Charles C. Johnson, will be published in March — Coolidge will surely continue to enjoy a comeback on the right. Yet his actual record shows that he was an extraordinarily blinkered and foolish and complacent leader, no less than George W. Bush before the stock market plummeted in 2008. The bogus nostrums that Coolidge touted have directly led either to enormous deficits during the Reagan era or to outright catastrophe during the Bush era. Shlaes never stops to ponder the abundant literature chafing at and exposing the conformity and avarice of the Roaring Twenties, but the prosperity offered by Calvinism has always proved as elusive as the promise of the green light that Jay Gatsby watches at the end of Daisy’s dock. Conservatives may be intent on excavating a hero, but Coolidge is no model for the present. He is a bleak omen from the past.





Jacob Heilbrunn, a regular contributor to the Book Review, is a senior editor at The National Interest.



The Smell of Napalm

Kilgore said in the movie Apocalypse Now, "I love the smell of napalm in the morning." Well, we don't normally experience that thrill in Shelby County. We're stuck with the smell of bacon and cakes on the griddle in the morning. We have to sacrifice in Shelby County it seems.




Freddy Hudson likes this.

Randal Berrows: Mmmm...bacon

Don Waller: According to the AMA you'd be better off smelling Napalm

The Pragmatic Lincoln




By STEVEN B. SMITH

Published: February 14, 2013

There have been many ways to think about Abraham Lincoln, our most enigmatic president, but the image of him as a moral philosopher is not the most obvious. We have “Honest Abe,” the great rail-splitter of American legend, Lincoln the political operative and architect of the Republican Party, and Lincoln the savvy wielder of executive power as portrayed in Steven Spielberg’s recent film.



LINCOLN'S TRAGIC PRAGMATISM



Lincoln, Douglas, and Moral Conflict



By John Burt



814 pp. The Belknap Press/Harvard University Press. $39.95.

Yet several works have put the issue of Lincoln’s language, rhetoric and political thought front and center. Among them, Garry Wills’s “Lincoln at Gettysburg,” Ronald C. White Jr.’s “Lincoln’s Greatest Speech” and Allen Guelzo’s “Abraham Lincoln as a Man of Ideas” all deserve honorable mention. But the first and still best effort to advance a philosophical reading of Lincoln was Harry V. Jaffa’s “Crisis of the House Divided,” published in 1959.



A student of the philosopher Leo Strauss, Jaffa argued that the issue between Lincoln and Douglas during the 1850s was the clash between Lincoln’s doctrine of natural right and Douglas’s doctrine of popular sovereignty. This was, as Jaffa declared, identical to the conflict between Socrates and Thrasymachus in Plato’s “Republic.” Douglas argued that whatever the people of a state or territory wanted made it right for them. For Lincoln, however, only a prior commitment to the moral law could make a free people.



The originality of Jaffa’s book was his ability to make what seemed a purely historical debate address the deepest themes of the Western philosophic tradition. At issue were two contending conceptions of justice. Lincoln’s appeal to the “self-evident truth” of equality in the Declaration of Independence provided the moral touchstone of the American republic. Douglas’s affirmation of popular sovereignty was a statement of sheer power politics in which questions of justice are ultimately decided by the will of the majority. For Jaffa, any falling away from the transcendent doctrine of pure natural law was tantamount to a slide into relativism, historicism and ultimately nihilism.



For the first time in over half a century, Jaffa’s book has a serious rival. John Burt, a professor of English at Brandeis University, has written a work that every serious student of Lincoln will have to read, although its sheer bulk alone — more than 800 pages — as well as the density of its prose may deter all but the most intrepid Lincolnophiles. It is a work of history presented as an argument about moral conflict, and a work of philosophy presented as a rhetorical analysis of Lincoln’s most famous speeches. Unlike Jaffa, who projected Lincoln through the long history of natural law from Plato and Cicero through Aquinas, Locke and the American framers, Burt refracts Lincoln through the philosophy of Kant, Rawls and contemporary liberal political theory. His is very much a Lincoln for our time.



Burt begins from the problem of how to resolve conflict in an open society. Does liberalism presuppose agreement around a common moral core — all men are created equal — or is it merely a modus vivendi for people with different values and interests who consent to work together for purely opportunistic reasons? James Madison, in The Federalist No. 10, thought it was the second. He saw a vast republic of competing factions that would cooperate because none could muster the resources to exercise a permanent dominance over the others. But what happens, as in the case of slavery during the 1850s, when these factions cease to pursue interests that can be negotiated and become wedded to principles central to identity? Compromise over interests is possible; compromise over principles is far more difficult.



The problem of moral compromise is at the center of the Lincoln-Douglas debates. At the time of the founding, as Lincoln told the story, slavery had been treated as a regrettable evil and one that would slowly be put on the path to “ultimate extinction.” There was even a certain shame over slavery, which accounts for why the word is not mentioned in the Constitution. But in the succeeding generations this view radically changed, and what was once seen as merely a peculiar institution came to be regarded by John C. Calhoun and others as a “positive good,” crucial to the Southern way of life. Lincoln’s barb that if slavery is a good, it is a good that no one has ever chosen for himself made no difference.



A welcome aspect of Burt’s study is that it presents the debate between Lincoln and Douglas as a real debate between two principled political actors struggling to make sense of their time. Douglas’s defense of popular sovereignty was not the first step down the slippery slope to nihilism; it was an effort to defuse the slavery issue by returning it to the state and territorial legislatures. Douglas remained loyal to the Madisonian vision of politics that seeks to find some reasonable middle ground in which differences can be accommodated. His claim that he was indifferent as to whether slavery was voted up or down was not simply a piece of callous value neutrality, but an effort to prove to Southern slave owners as well as to Northern anti-abolitionists that he was a man with whom all of them could do business.



Lincoln came to regard slavery as a unique moral evil, something beyond the limits of a consensual society. There are some things — like taxes — that are subject to deal-making, and others — human dignity, for one — that are not. On slavery as an institution, Lincoln was prepared to negotiate; on slavery as a principle, he would not. This is not to say that Lincoln ever crossed into the territory of William Lloyd Garrison and the New England abolitionists, who regarded the Constitution’s compromises on slavery as a treaty with the Devil. This kind of higher-law idealism — think of Thaddeus Stevens as played by Tommy Lee Jones — may be rhetorically attractive but contains its own hidden dangers. The most obvious danger of a politics of conscience is the ever-present threat of violence and war. For those who cannot or will not see things our way, there may be no other recourse but to force of arms.



Lincoln never succumbed to the narcissism of the Emersonian beautiful soul, putting the purity of his own convictions above the law. He retained a statesmanlike ability to treat his opponents not as enemies to be conquered but as rational agents who might be persuaded through reasoned argument. As he told his audience in Peoria, Ill., in 1854: “I think I have no prejudice against the Southern people. They are just what we would be in their situation.” Democracy meant for him more than a Madisonian modus vivendi; it represented a commitment to a structure of fairness that respected the moral autonomy of free men and women.



If Burt’s Lincoln is a Rawlsian liberal seeking something like the basic requirements of justice, he is also someone with a tragic sense of “negative capability.” By this Burt means that our moral concepts remain so deeply embedded in our lives and histories that we can never fully understand what they entail except retrospectively. Our moral commitments unfold over time and cannot be rendered intelligible on the basis of first principles alone.



For example, when, in Peoria, Lincoln called slavery a “monstrous injustice,” could he have imagined that this would later commit him to securing the passage of the 13th Amendment? Or was it conceivable that his position would eventually lead to the election of our first African-American president? Probably not. Burt’s Lincoln sounds like a Hegelian philosopher for whom our moral conceptions become known to us only in the fullness of time and under the force of circumstances that no one — not even a Lincoln — can imagine.



It is at this point that Burt’s reading offers a powerful challenge to Jaffa. For Jaffa, Lincoln was a philosophical rationalist whose commitment to natural law proceeded from almost geometric logic; its consequences can be known to all on the basis of unaided reason. Think of the scene in Spielberg’s “Lincoln” in which he deduces the necessity for equality from the same axiomatic premises he recalls from reading Euclid’s “Elements” as a young man.



But Burt sees Lincoln as a historicist for whom our moral conceptions emerge only over time and in ways that we can never fully comprehend. We are always viewing our lives as through a glass darkly. “The story of democracy, in Lincoln’s view,” Burt writes, “is the story of something with a destiny, but it is a destiny never fully understood either by the founders or by Lincoln himself.” But where will this destiny take us? American democracy remains a work in progress. The unanswered question is whether destiny — the obscure and mysterious workings of fate — will issue in a new birth of freedom or a new dark age.



Burt argues that Lincoln’s decision to pursue a politics of principle over deal-making was ultimately an act of faith, something beyond the limits of reason alone. Like the biblical Abraham, told to sacrifice his only son, he could not possibly have known where the consequences of his acts might lead. This does not mean Lincoln bade farewell to reason, but his decisions to fight a war, emancipate slaves and push for racial equality were choices that only history could make clear.



Burt suggests, but never directly asks, the W.W.L.D. question — what would Lincoln do? What are the conditions for compromise in our own intensely polarized age? He admits that after giving the question long consideration, he has been unable to come up with anyone who managed this as well as Lincoln. Let me only suggest the names of Nelson Mandela and Vaclav Havel.



Lincoln’s example was rare, though not unique. The problem with making Lincoln so absolutely singular is that it puts him outside of history. Those who invoke Lincoln’s legacy today tend to see him either as a Machiavellian wielder of political power or as a secular saint of modern democracy. Each of these views is false. Lincoln reminds us that statecraft requires an attention to both principle and compromise. Principle without compromise is empty; compromise without principle is blind. This is a valuable lesson for our politicians even today.





Steven B. Smith, the Alfred Cowles professor of political science at Yale, is the author, most recently, of “Political Philosophy.”



Tuesday, February 12, 2013

Fat Tuesday

It's Fat Tuesday, as if we needed a reason to be gluttonous and hedonistic. I'm just gonna act natural and let the cheese dip and Oysters Rockefeller fall where they will. Forward!


Monday, February 11, 2013

The White People's Party

AN HISTORICAL INVESTIGATION FEBRUARY 10, 2013

Original Sin

Why the GOP is and will continue to be the party of white people

BY SAM TANENHAUS

With Barack Obama sworn in for a second term—the first president in either party since Ronald Reagan to be elected twice with popular majorities—the GOP is in jeopardy, the gravest since 1964, of ceasing to be a national party. The civil rights pageantry of the inauguration—Abraham Lincoln's Bible and Martin Luther King's, Justice Sonia Sotomayor's swearing in of Joe Biden, Beyoncé's slinky glamor, the verses read by the gay Cuban poet Richard Blanco—seemed not just an assertion of Democratic solidarity, but also a reminder of the GOP's ever-narrowing identity and of how long it has been in the making.



"Who needs Manhattan when we can get the electoral votes of eleven Southern states?" Kevin Phillips, the prophet of "the emerging Republican majority," asked in 1968, when he was piecing together Richard Nixon's electoral map. The eleven states, he meant, of the Old Confederacy. "Put those together with the Farm Belt and the Rocky Mountains, and we don't need the big cities. We don't even want them. Sure, Hubert [Humphrey] will carry Riverside Drive in November. La-de-dah. What will he do in Oklahoma?"



Forty-five years later, the GOP safely has Oklahoma, and Dixie, too. But Phillips's Sunbelt strategy was built for a different time, and a different America. Many have noted Mitt Romney's failure to collect a single vote in 91 precincts in New York City and 59 precincts in Philadelphia. More telling is his defeat in eleven more of the nation's 15 largest cities. Not just Chicago and Columbus, but also Indianapolis, San Diego, Houston, even Dallas—this last a reason the GOP fears that, within a generation, Texas will become a swing state. Remove Texas from the vast, lightly populated Republican expanse west of the Mississippi, and the remaining 13 states yield fewer electoral votes than the West Coast triad of California, Oregon, and Washington. If those trends continue, the GOP could find itself unable to count on a single state that has as many as 20 electoral votes.



It won't do to blame it all on Romney. No doubt he was a weak candidate, but he was the best the party could muster, as the GOP's leaders insisted till the end, many of them convinced he would win, possibly in a landslide. Neither can Romney be blamed for the party's whiter-shade-of-pale legislative Rotary Club: the four Republicans among the record 20 women in the Senate, the absence of Republicans among the 42 African Americans in the House (and the GOP's absence as well among the six new members who are openly gay or lesbian). These are remarkable totals in a two-party system, and they reflect not only a failure of strategy or "outreach," but also a history of long-standing indifference, at times outright hostility, to the nation's diverse constituencies—blacks, women, Latinos, Asians, gays.



But that history, with its repeated instances of racialist political strategy dating back many decades, only partially accounts for the party's electoral woes. The true problem, as yet unaddressed by any Republican standard-bearer, originates in the ideology of modern conservatism. When the intellectual authors of the modern right created its doctrines in the 1950s, they drew on nineteenth-century political thought, borrowing explicitly from the great apologists for slavery, above all, the intellectually fierce South Carolinian John C. Calhoun. This is not to say conservatives today share Calhoun's ideas about race. It is to say instead that the Calhoun revival, based on his complex theories of constitutional democracy, became the justification for conservative politicians to resist, ignore, or even overturn the will of the electoral majority.



This is the politics of nullification, the doctrine, nearly as old as the republic itself, which holds that the states, singly or in concert, can defy federal actions by declaring them invalid or simply ignoring them. We hear the echoes of nullification in the venting of anti-government passions and also in campaigns to "starve government," curtail voter registration, repeal legislation, delegitimize presidents. There is a strong sectionalist bias in these efforts. They flourish in just the places Kevin Phillips identified as Republican strongholds—Plains, Mountain, but mainly Southern states, where change invites suspicion, especially when it seems invasive, and government is seen as an intrusive force. Yet those same resisters—most glaringly, Tea Partiers—cherish the entitlements and benefits provided by "Big Government." Their objections come when outsider groups ask for consideration, too. Even recent immigrants to this country sense the "hidden hand" of Calhoun's style of dissent, the extended lineage of rearguard politics, with its aggrieved call, heard so often today, "to take back America"—that is, to take America back to the "better" place it used to be. Today's conservatives have fully embraced this tradition, enshrining it as their own "Lost cause," redolent with the moral consolations of noble defeat.



In the 1950s, when the civil rights movement began, Republicans helped lead it. The president during this period, Dwight D. Eisenhower, was skeptical about intervening on behalf of black equality and, in his first campaign, courted segregationist officials like James F. Byrnes and Harry F. Byrd. But Eisenhower also "advocated the end of segregation in the armed forces and the District of Columbia and urged the lifting of black voter restrictions," Robert Fredrick Burk writes in The Eisenhower Administration and Black Civil Rights. It was Eisenhower, too, who appointed another Republican, his vanquished rival Earl Warren, to the Supreme Court, resulting in the Brown v. Board of Education decision that outlawed legalized segregation—a bolder step than many in either party were ready for when it came in 1954.


Eisenhower, a Republican president, stood at the forefront of civil rights in the '50s.

Yet the Eisenhower campaign also saw potential advantages in Brown—and a possible route, through the nation's cities, to recapturing the House, which they had lost in 1954. "GOP strategists regard this election as a period of maximum opportunity in their dream of shattering the Roosevelt coalition and regaining the allegiance of the Negroes," James Reston wrote in The New York Times. In 1956, the GOP improved its totals in black precincts by double digits in New York and Chicago, and made gains below the Mason-Dixon Line. Overall, Eisenhower received between 35 and 40 percent of the black vote, about 5 percentage points more than he did in 1952.



In late 1955, the Eisenhower administration began drafting a civil rights bill, with voting rights at its core. Its passage through Congress took almost two years, with intense debate on a provision authorizing federal judges to enforce voter rights from the bench, instead of leaving each case up to local (often all-white) juries. It was killed by Southern Democrats, who formed an alliance that included Senator John F. Kennedy, an avowed liberal eager to appease the Dixie senators who denied him the vice presidential nomination in 1956 and might thwart his presidential plans for 1960. The weakened bill that passed, in September 1957, established the federal Civil Rights Commission and added a Civil Rights Division to the Justice Department—far short of what many hoped for, yet "incomparably the most significant domestic action of any Congress in this century," according to The New York Times editorial page. Not one Republican senator voted against it. All 18 No's came from Democrats. "White southerners viewed the bill as Republican legislation," Joseph Crespino writes in his recent book, Strom Thurmond's America.




Then, within weeks, an authentic crisis arose. Arkansas's governor, Orval Faubus, defied a federal court order to desegregate Little Rock's Central High School, bringing in the National Guard to surround the school and block a group of black pupils from entering, while a shrieking mob threatened violence. Unable to compromise with Faubus, Eisenhower federalized the Guardsmen and also sent in 1,000 paratroopers from the 101st Airborne Division. For the first time since Reconstruction, U.S. government troops, armed with bayonets, "occupied" a state in the old Confederacy.



A Republican president and his party now stood at the forefront of civil rights in America. Yet within a few years, this advantage would be lost and the party would be defined thereafter by its resistance to civil rights. Why did this happen? The reason was a historical coincidence: Just as the civil rights movement became a national concern, movement conservatism was being born.



A cherished myth today, at least on the right, is that National Review (NR) arrived at a moment of widespread hostility toward conservatism. In fact, the opposite was true. Two brutally disruptive decades, the 1930s and 1940s, a time of extremist ideology and world war, had given way to the infant nuclear age and with it a universal longing for a politics of consolidation and stability. "In the thirties it was socialism that many American intellectuals adopted as their paper money. ... [I]n the fifties it seems that notes on conservatism are being printed in inflationary quantities," the British political theorist Bernard Crick wrote in July 1955. The "notes" included ambitious books by the "New Conservatives" Russell Kirk, Peter Viereck, and Clinton Rossiter, as well as important work by Leo Strauss (on "natural right") and the French conservative Bertrand de Jouvenel (on "power"). For most of these writers, conservatism was more a matter of disposition—a belief in order, tradition, the revival of humanist values—than of developing or sharpening a political program.



Some of the right's heroes supported civil rights—for instance, William Knowland, the Senate minority leader who had marshaled the GOP votes for the 1957 bill. NR's favorite politician, Knowland wrote the lead article—an attack on the Geneva summits—in its first issue, published in November 1955. And NR editors diligently promoted him for the presidency. For Knowland, being both anti-communist and pro–civil rights made sense, part of the "hearts and minds" campaign the United States was waging in the Third World. Eisenhower and his secretary of state, John Foster Dulles, had both warned that Little Rock would feed Soviet propaganda mills.



But the intellectuals at NR interpreted all this differently. William F. Buckley Jr. had strong libertarian leanings, as did many of his colleagues. Some had come to NR by way of its predecessor The Freeman, "a fortnightly for individualists." This seemed fertile ground for making the case that Southern blacks were being denied the rudimentary means for self-advancement owing to a state-contrived caste system. Buckley, in fact, supported the first important modern civil rights protest—the Montgomery bus boycott—on the principle that blacks were "exercising, in legitimate fashion, their right to protest whatever laws or customs they deem offensive." Buckley and NR would make the same argument in defense of the student-led boycotts and sit-ins of 1960, "a wholly defensible—we go so far as to say wholly commendable—form of protest [and] a form of social assertiveness which we must understand, and can sympathize with."



This alone was a breakthrough of sorts. Both Buckley's parents came from the South, and he and his nine siblings were raised with "culturally Southern" attitudes. The family and its servants, some of them black, wintered in Camden, South Carolina, on an estate that had once belonged to Mary Chesnut, the great Civil War diarist. Buckley's father, in particular, was an ardent segregationist and was convinced blacks were inferior. Conscious of these attitudes, and of their prevalence on the right, Buckley avoided contact with racist organizations and counseled others to do the same. He also closely guarded NR's reputation. He was incensed when a Q&A with Richard Russell, the Georgia senator leading the fight against civil rights, which read like "a fanatical and highly subjective polemic against miscegenation," made it into print. And when Strom Thurmond, a friend of Buckley's father, jointly praised NR and The American Mercury—a right-wing monthly that included articles like "Quotations from the American Jewish Yearbook" and "Rothschilds and Rockefellers: Dedicated Monopolists"—Buckley beseeched him not to lump the two publications together, since the Mercury had "degenerated into an irresponsible anti-Semitic sheet and has considerably embarrassed conservatives who were once associated with it."



And yet when it came to discussing the concrete realities of race in America, NR had almost nothing to say, and the little NR said did not differ much from what was appearing in the Mercury. Other small-circulation journals, including The New Republic and The Nation, sent reporters to the South, commissioned articles from Southern journalists, and combed the local press, black and white, for up-to-date information on school desegregation campaigns, sit-in strikes, and protests. But none of these were covered or even seriously discussed in the country's most ambitious and high-minded conservative journal.



Its editors were instead clarifying and reiterating two objectives, rolling back both communism abroad and the New Deal at home. Every federal action that hinted of "statism"—Brown, Little Rock, even civil rights legislation—freshly imperiled liberty, even if undertaken on behalf of those who were plainly being denied it in the South. It was a new civil war, a struggle not "between the states" or even between the states and the federal government, but rather between autonomous individuals and a homogenizing liberalism. While many saw the government moving cautiously on civil rights—with the Supreme Court, Congress, and the executive each addressing issues as they emerged—NR's editors saw an interlocking pattern of state-enforced dogma.



"'Integration' and 'Communization' are, after all, pretty closely synonymous," one of the magazine's most eminent contributors, Richard Weaver, a Southern agrarian perched at the University of Chicago, wrote in July 1957, when the civil rights bill was being debated. From this perspective, the Little Rock Nine, far from personifying the hopes of a community, were instead the "pawns and guinea pigs" of liberal social experimenters. The actual conflicts were almost irrelevant. "Segregated schooling, in terms of the larger issues involved, is about as important as Jenkin's Ear," Buckley wrote in 1956. And the judicial enforcement provision in the original 1957 Civil Rights Act, which some saw as a practical necessity, was for NR's editors a potential "extension of unchecked federal power ... without precedent in our history or in that of any Anglo-Saxon nation since the decline of the doctrine of the Divine Right of Kings."



The movement had a voice, however strident. What it lacked was an organizing principle. In America, there was just one place where rigorously conservative theory could be found: the South. In the antebellum period, it had yielded a surprisingly rich and rigorous school of political argument.



The most brilliant figure in this "reactionary enlightenment" was John C. Calhoun, the South Carolina political giant. Vice president under both John Quincy Adams and Andrew Jackson, he became the great philosophical defender of the South. He led the protest against the protective "Tariff of Abominations," which favored the industrial North over the agrarian South. Later, when the states divided bitterly over the issue of expanding slavery into the new western territories, he helped spur the conflict that led to the Civil War. Calhoun, "the Great Nullifier," was "Lincoln's deepest and most intransigent opponent," John Burt writes in his new book, Lincoln's Tragic Pragmatism, "and it was with Calhoun that the issue was joined whether the United States is to be a liberal society, offering civil rights and possibly even political rights to all persons by virtue of their being human, or a merely republican society, offering procedural equality only to a handful of elite players."



Calhoun's innovation was to develop a radical theory of minority-interest democracy based on his mastery of the Constitution's quirky arithmetic, which often subordinated the will of the many to the settled prejudices of the few. At the time of the constitutional convention, the total population of the Union, as reported by the most recent census, was just under 3.5 million; yet, Calhoun pointed out, the four smallest states, "with a population of only 241,490, something more than the fourteenth part of the whole, could have defeated the ratification." In other words, "numerical" or "absolute majorities" were severely limited in the actions they could take—or impose on others—especially on questions that put sectional interests at odds with the "General Government." One of Calhoun's classic arguments, the Fort Hill Address (1831)—written at and named for his home—defended South Carolina's "Ordinance of Nullification" of the tariff on the principle that the Union was a confederation of equally sovereign states, each in effect its own nation, its autonomy codified in the Tenth Amendment. And since the Constitution was itself "a compact, to which each state is a party . . . the several States, or parties, have a right to judge of its infractions" and to exercise it through the "right of interposition" (a term he got from James Madison). "Be it called what it may—State-right, veto, nullification, or by any other name [it is] the fundamental principle of our system. ... [O]n its recognition depend the stability and safety of our political institutions." In sum, each state was free to override the federal government, because local and sectional imperatives outweighed national ones.


John C. Calhoun, the brilliant political thinker who set conservatives on a path to ruin.

Today, Calhoun is often described as a kind of crank—and with some reason. He called slavery "a positive good" and ridiculed the Declaration's "all men are created equal." ("Taking the proposition literally ... there is not a word of truth in it.") But in the early cold war years, when so many intellectuals, left and right, rebelled against the numbing dictates of consensus and conformism, there was a Calhoun revival. He became "the philosophic darling of students of American political thought," Louis Hartz wrote in The Liberal Tradition in America, published in 1955. A liberal like the historian Richard Hofstadter was stimulated by his bold theories on class and labor ("the Marx of the master class"), and conservatives were drawn to his protest against encroaching big government. Calhoun, Russell Kirk wrote in The Conservative Mind (1953), was "the most resolute enemy of national consolidation and of omnicompetent democratic majorities" and had valiantly uncovered "the forbidding problem of the rights of individuals and groups menaced by the will of overbearing majorities." The Calhoun apostle James J. Kilpatrick, the editor of The Richmond News Leader, wrote a defense of segregation, The Sovereign States (1957), that had an epigraph from the Fort Hill Address and exhaustively catalogued examples of "interposition" dating back to the origins of the Republic. Kilpatrick repeated the exercise in an attack on the Little Rock intervention, published in NR.



For NR, Calhoun was the Ur-theorist of a burgeoning but outnumbered conservative movement, "the principal philosopher of the losing side," whose championing of the Tenth Amendment "may have the effect of shaking inchoate states-righters out of their opportunistic stupor" and give rise to a new politics.



In his most notorious editorial, "Why the South Must Prevail," Buckley drew on Calhoun's championing of the "concurrent voice" to defend voting restrictions since "the South is entitled to take such measures as are necessary to prevail, politically and culturally, in areas in which it does not predominate numerically," even if it meant violating the Fourteenth and Fifteenth Amendments. Buckley repeated the argument in his book Up From Liberalism (1959), suggesting that African Americans needed to be properly educated and trained before they were brought up to the level of the enfranchised whites who were holding them down. And just as Calhoun had defended the "positive good" of slavery, so Buckley defended Jim Crow as being born of "custom and tradition ... a whole set of deeply-rooted folkways and mores." As long as the South did "not exploit the fact of Negro backwardness to preserve the Negro as a servile class," segregation was acceptable.



These early writings would be forgotten had they not formed the ideology that shaped a generation of conservative politicians, including Barry Goldwater and Ronald Reagan. Goldwater, the movement's first national leader, "was by no means the obvious man for the job," Rick Perlstein notes in his book Before the Storm. "He had gone to the 1952 convention as an Eisenhower delegate, had voted for a higher minimum wage and to extend Social Security, and had voted for the 1957 and 1960 civil rights bills." But then, partly under the influence of NR, Goldwater had become more ideological, a champion of states' rights, which he defended in terms that echoed the nullifying passions of the antebellum period. In 1959, he electrified an audience in Greenville, South Carolina, when he said the Brown decision, because it was "not based on law," ought "not be enforced by arms"—an overt reference to Little Rock. Goldwater's manifesto, The Conscience of a Conservative (1960), written by Buckley's brother-in-law (and NR columnist) L. Brent Bozell, had chapters on both states' rights and civil rights, elevating the first above the second whenever they came into conflict: "I therefore support all efforts by the States, excluding violence of course, to preserve their rightful powers over education."



In July 1963, Goldwater joined with Dixie senators in attacking the Pentagon's newly announced policy of shunning segregated businesses located near military bases in the South. A year later, he joined the Dixie contingent again when he opposed the Civil Rights Act of 1964, passed with a large bipartisan majority, including 27 out of 33 Senate Republicans. "It is at least conceivable that Goldwater would have welcomed an opportunity to vote with the majority," Richard Rovere wrote, in puzzlement, after the bill was passed. "But for Goldwater the opportunity had been all but foreclosed by Brent Bozell—or some other hand guided by the 'guiding hand'—in The Conscience of a Conservative. In that book, Goldwater allowed himself to be committed to a states'-rights position that Jefferson Davis could hardly have found acceptable."



By this time, Goldwater stood on the verge of the Republican presidential nomination, thanks to the work of campaign strategist F. Clifton White and NR's publisher, William A. Rusher. Together, they plotted a new Southern route to electoral victory—not by explicit race-baiting (which could be left to hard-core racist Democrats), but by high-minded appeals to affluent whites "in the southern cities and suburbs, where the tides of social change are tending to run fastest," as Rusher wrote in NR. The politics of defiance, tinged with nullification, might hold the seeds of an eventual majority.



But Goldwater was only one herald of a new racially driven politics in 1964. Another was the Democrat George Wallace, the Alabama governor who had become the voice of a white-supremacist populism. With his cry of "segregation now; segregation tomorrow; segregation forever" and his promise of "rebel" protest against "communistic amalgamation," Wallace entered Democratic primaries in Indiana, Maryland, and Wisconsin, and did shockingly well, especially in cities where there were inter-ethnic conflicts over schools and "fair housing," and where Wallace's promise to stand tall against what even liberals were calling "the Negro revolution" spoke directly to the anxieties of Northerners. When he ran again in 1968, this time on a third-party ticket (shades of Thurmond's Dixiecrats), he burnished his appeal with the "constitutional" language so favored by nullifiers and adopted by later GOP insurgencies.



Wallace captured five Southern states in 1968, and 13.5 percent of the popular vote, meaning Kevin Phillips's majority was an election away. Yet Phillips presciently saw where it would come from: defecting Democrats. Whites "will desert their party in droves the minute it becomes a black party," he predicted. "Wallace is helping, too—in the long run." The axis of the realignment, based on the politics of nullification, was settling into place. "[W]atch us in [nineteen] seventy-two. Our tabulations and techniques will be perfected by then; we'll have four years to work on them, and all the resources of the federal government. I'd hate to be the opponent in that race." That opponent was George McGovern, who absorbed one of the worst drubbings in history.



With this, Calhounism went into remission. Nixon, like Eisenhower before him, was neither nullifier nor rearguardist. True, he had appeased the right—energetically campaigning for Goldwater in 1964 when liberal Republicans had renounced him. But Nixon was an ambidextrous courter of all sections and factions. He nominated Southern judges to the Supreme Court and at the same time urged Northern unions to recruit black workers. Yet nullification didn't die. It found its new target in Nixon's policy innovations, which seemed to be advancing liberal heresy. Nixon's urban adviser, Daniel Patrick Moynihan, went so far as to say Nixon's intention was not to undo but outdo the Great Society. To an ideologue like Rusher, the GOP itself was now the enemy. Nixon had betrayed conservatives, operating from inside an establishment eager "to 'pay off' their minority-group allies with all sorts of cultural and economic goodies," including "posts in a burgeoning bureaucracy, admissions quotas in elite universities, welfare benefits of assorted kinds, quotas in the job market, etc." Rusher proposed a third party, suggesting as its tribune Ronald Reagan, who had a history of sympathy for Southern nullifiers. Early in his career, he had shared the stage with Faubus and other segregationists, and in 1980, he flew directly from the nominating convention to Philadelphia, Mississippi—where three civil rights workers had been slain in 1964. Reagan dismissed the sitting chairman of the U.S. Commission on Civil Rights, Arthur Flemming, who warned that the Reagan administration's handling of school desegregation cases reflected the doctrine of "separate but equal." There was a protest as well from state agencies. The chairmen of 33 of them signed a letter warning that Reagan had created a "dangerous deterioration in the Federal enforcement of civil rights."



The largest targets in these years were affirmative action programs. They had begun under Nixon with the support of some conservatives, including Buckley, who favored "preferential hiring" by businesses. So did Garry Wills. When he was at NR in the '60s, he had urged Goldwater to advocate such programs even if "some conservatives have cynically borrowed the very egalitarian professions they normally condemn in order to support a faceless 'equality' among those seeking jobs." The attack came from others on the right, the rising faction of neoconservatives, who denounced "affirmative discrimination." Liberal policymakers formed a "new class" of social engineers who had devised a "spoils system" that rigged "outcomes" and stigmatized and perhaps even harmed those who advanced through the system. This argument coincided with a new literature that revived the doctrine of black inferiority, genetic or cultural, and dominated the race debate in the 1980s and 1990s. But efforts by the Reagan administration to roll back affirmative action failed. And later attempts did, too. In 1995, Bob Dole, the GOP Senate leader, introduced legislation to end all federal affirmative action programs, only to drop the issue from his presidential campaign once polls showed that, while voters disliked "quotas" and "preferences," they supported the broader principle of inclusion and diversity, especially when they realized its beneficiaries included not only blacks but also women. Latinos, another growing population, were enjoying the advantages, too.



This was a conservative strategy built for an earlier moment, when a party could prosper by exploiting the anxieties of white America. But many now were adjusting to the reality of a diverse society, multicultural and multiracial. Modernity could not be nullified. Some Republicans recognized this. George W. Bush, for one. Conservatives who were dismayed by Romney's dismal showing with Latinos (27 percent) remember the 40 percent share of the Latino vote Bush won in 2004 and suggest that a more humane immigration policy might close the gap. But Bush's success with Hispanic voters grew out of an established record of sympathy dating back to his Texas governorship, when he proposed an innovative tax plan, including a levy on "professional partnerships" (doctors, lawyers, accountants, and more) that would have increased financing for the state's poorest (in most cases Latino) school districts. The plan was squelched by his own party, just as Bush's attempt at an amnesty program was squelched by it in 2007.



Bush, of course, was unable to build the "permanent Republican majority" envisioned by Karl Rove. Yet, it is startling how little he managed to move the rhetoric and worldview of his party, which remains largely stuck where it was a generation ago or longer. Romney seldom addressed black audiences during the campaign. When he did venture into the inner city, meeting with teachers and administrators at a charter school in Philadelphia, he suggested they instruct students in the virtues of "getting married and having families where there's a mom and a dad together. ... That's critical down the road for those that are already in a setting where they don't have two parents." Paul Ryan said much the same thing: "The best thing to help prevent violent crime in the inner cities is to bring opportunity ... to help teach people good discipline, good character."



Character, he presumably meant, like that exhibited by Republican delegates in Tampa, who thrilled to the refrain "We built it"—with the identity of the "we" all too visible to TV audiences—just as the inimical "they" were being targeted by a spurious campaign to pass voter-identification laws, a throwback to Jim Crow. Romney's disparagements of the "47 percent" and his postmortem assessment that Obama won because of the "gifts" he had lavished on blacks, young people, and women also repeat the dogma of an earlier time.


Mitt Romney received just 6 percent of the black vote.

This remains the perspective of the American right, only today the minority of "concurrent voices" speak in the bitter tones of denial, as modernization and egalitarianism go forward. In retreat, the nullifying spirit has been revived as a form of governance—or, more accurately, anti-governance. Its stronghold is the Tea Party–inflected House of Representatives, whose nullifiers would plunge us all over the "fiscal cliff." We see it too in continuing challenges to "Obamacare," even after it was validated by the Roberts Court. And we see it as well in Senator Rand Paul's promise to "nullify anything the president does" to impose new gun controls. Each is presented not as a practical attempt to find a better answer, but as a "Constitutional" demand for restoration of the nation to its hallowed prior self. It is not a coincidence that the resurgence of nullification is happening while our first African American president is in office.



"American politics," Wills wrote in 1975, "is the South's revenge for the Civil War." He was referring to the rise of Southern and Sunbelt figures—the later ones would include Jimmy Carter, Reagan, Bill Clinton, and the two Bushes—whose dominance of presidential politics ended only with Obama's election in 2008. However, the two parties dealt with race differently. Carter and Clinton had pro–civil rights histories and directly courted black voters. But as the GOP continued remolding itself into a Southern party—led in the '90s by the Georgian Newt Gingrich and by the Texans Dick Armey and Tom DeLay—it resorted to an overtly nullifying politics: The rise of the Senate veto as a routine obstructionist tool, Jesse Helms's warning that Clinton "better have a bodyguard" if he ever traveled to North Carolina, the first protracted clashes over the debt ceiling, Gingrich's threat to withhold disaster relief, the government shutdown, Clinton's impeachment despite public disapproval of the trial. All this, moreover, seemed to reflect, or at least parallel, extremism in the wider culture often saturated in racism: Let's not forget Minutemen and Aryan Nation militias, nor the "anti-government" terrorist Timothy McVeigh, whom the FBI linked to white supremacists. The war on government—and against agencies like the Bureau of Alcohol, Tobacco, Firearms, and Explosives—had become a metaphor for the broader "culture wars," one reason that the GOP's dwindling base is now at odds with the "absolute majority" on issues like gun control and same-sex marriage.



Reformers in the GOP insist that this course can be reversed with more intensive outreach efforts, better recruitment of minority candidates, and an immigration compromise. And a new cast of GOP leaders—Ted Cruz, Nikki Haley, Bobby Jindal, Marco Rubio—have become national favorites. But each remains tethered to movement ideology. At the recent National Review Institute conference in Washington, Cruz even urged a "partial government shutdown," recalling the glory years of the '90s, but downplaying its destructive outcome.



Denial has always been the basis of a nullifying politics. Calhoun, too, knew he was on the losing side. The arithmetic he studied most closely was the growing tally of new free territories. Eventually, they would become states, and there would be sufficient "absolute" numbers in Congress to abolish slavery. A century later, history pushed forward again. Nonetheless, conservatives, giving birth to their movement, chose to ignore these realities and to side with "the South."



Race will always be a complex issue in America. There is no total cleansing of an original sin. But the old polarizing politics is a spent force. The image of the "angry black man" still purveyed by sensationalists such as Ann Coulter and Dinesh D'Souza is anachronistic today, when blacks and even Muslims, the most conspicuous of "outsider" groups, profess optimism about America and their place in it. A politics of frustration and rage remains, but it is most evident within the GOP's dwindling base—its insurgents and anti-government crusaders, its "middle-aged white guys." They now form the party's one solid bloc, its agitated concurrent voice, struggling not only against the facts of demography, but also with the country's developing ideas of democracy and governance. We are left with the profound historical irony that the party of Lincoln—of the Gettysburg Address, with its reiteration of the Declaration's assertion of equality and its vision of a "new birth of freedom"—has found sustenance in Lincoln's principal intellectual and moral antagonist. It has become the party of Calhoun.



Sam Tanenhaus, editor of The New York Times Book Review, is working on a biography of William F. Buckley Jr.