Wednesday, December 31, 2014

Huck Finn's America

In 'Huckleberry Finn,' A History In Echoes

Huck Finn's America
Mark Twain and the Era That Shaped His Masterpiece
Hardcover, 342 pages
Is there anything left to say about Adventures Of Huckleberry Finn?
That is the question that animates big parts of Andrew Levy's Huck Finn's America: Mark Twain And The Era That Shaped His Masterpiece, a richly researched, copiously annotated, fascinating argument that in all the debates over the book's treatment of race and despite its position as both a widely banned book and a widely assigned book, we tend to miss some of the most important things it teaches.
Levy argues that while the 1885 book is certainly reflective of Twain's powerful but complicated feelings about race, it is perhaps even more a book about childhood and especially boyhood, conceived and written at a time when the American mind was preoccupied with panic over dangerous and violent boys and with vigorous debates over the proper role of public education. Levy's bracing thesis is that close attention to both the race aspects and the childhood aspects of both the book and Twain's authorial approach leads to the same conclusion: that things change less than they seem to.
Twain, Levy argues, was tweaking those who worried then (as they do now) that boys were suddenly being turned violent by popular media (then, the dime novel; now, the violent video game). And despite frequent arguments about whether the book is racist or anti-racist, neither Twain nor the novel is easily placed along an arc of steady progress in racial enlightenment — an arc Levy argues is mostly illusory. Through those two lenses, he says, we can see in Huck Finn the way history – as Twain is widely quoted as having said, whether he did or not – may not quite repeat itself, but frequently rhymes.
This theme of recurrence, of nothing new under the sun, repeats as to both small points and large ones: for example, the first time Huck Finn was removed from schools for "passages derogatory to negroes" was not in some newly hypersensitive post-hippie modern day – it was in New York in 1957. (A response in the Harvard Crimson at the time sounds, aside from a few word choices, like it could have been written in the last 12 months were the book removed from a reading list now.)
And the book, which would later become for many an icon of wholesome Americana, had been removed from libraries for other reasons – of crudeness and rebellion, for instance – almost from the moment it was published. For many, Huck wound up as a lovable scamp, but he started as something far more intentionally and genuinely unnerving. (If you read the book again, you'll find that its reputation for idyllic river travel and life philosophizing is overstated, and its reputation for gore understated. In fact, Huck speaks of how you feel "mighty free and easy and comfortable on a raft" just after he witnesses and barely escapes a bloody murder.)
Levy is not engaged in only literary theory, but also in history and biography: Much of Huck Finn's America is taken up with tracing Twain's book tour, known as "Twins Of Genius," undertaken in late 1884 with southern pro-integration writer George Washington Cable. During the tour, Cable published the controversial essay "The Freedman's Case In Equity," which became for a time a bigger focus of attention, and certainly a more overtly charged text about race, than anything Twain was presenting.
The book tracks how Twain, at least sometimes, changed readings he gave for different audiences. Levy points out that, for instance, while Twain may have only used the word "nigger" on stage when reading in character as Huck, when he appeared in front of at least one black audience in Hartford, he chose different readings so that he didn't say it at all.
The inclusion of that story uneasily suggests that while Twain considered his performance of both Huck's voice and Jim's to be perfectly just and undoubtedly didn't consider himself a racist, he may have known exactly what students in schools would say a hundred years later: that his renditions of those voices, however intended, would not sound the same to everyone. That even if he thought he was offering a measure of understanding, he may have been uncomfortable doing so in the presence of people who might receive it as something else. Levy does an exceptional job of not overexplaining these small facts when they appear in the story: they lock into place as pieces of a much larger picture. Like any good biographer, he is adept at arranging information, even not strictly chronologically, so that it adheres naturally to a meaningful frame.
Huck Finn is full of contradictions: Huck comes to appreciate Jim's kindness and ultimately proves willing to "go to hell" to free him – but he treats Jim as exceptional, a worthy person because he's not the way Huck expects a black man to behave. As he puts it after Jim extends compassion to Tom Sawyer, "I knowed he was white inside." On the one hand, it's an extension of respect. On the other, it's equating integrity with whiteness. Its basic racism and its wisp of understanding are both real; they're both sitting right there.
Contradictions about Twain and complexities in the readings he and Cable were doing on tour are there, too:
What startles many readers now is that Twain loved minstrel shows, and that he used racial slurs. This was true 130 years ago, too. But what would have startled (and thrilled, and maddened) his Victorian audience was the sight of respectable men in respectable venues performing blackness – without the burnt cork on their faces.
Those contradictions are not simply curious; they are analytically significant. Levy points up Twain's genuine affection for black culture and the ways it echoes, he says, in performances today from white artists who confront accusations of something like minstrelsy and counter with heartfelt admiration for forms like hip-hop.
In one of the most intriguing parallels between Reconstruction-era culture and culture now, the book is unsparing in its description of Twain's view of black culture and especially dialect as something he had the right to translate and present, from well before Huck Finn:
...he still couldn't quite shake the idea that black culture, if not black individuals, worked for him and people like him: that only Cable and [Uncle Remus writer Joel Chandler] Harris could be "masters" of dialects millions of people of color spoke daily, for instance.
And later:
Imitating carefully the voices of black men and women, telling their stories, then selling that material [on tour] for twenty-five cents to one buck per ticket, and then making overwhelmingly white ticketholders jump out of respect or fear – this mix satisfied his genuine love of black culture, his eye for the main chance, and his sense of justice all at once.
Levy draws a comparison, in fact, between the way people he calls "amateur and professional anthropologists" sometimes saw children and the way they saw black people: as unsophisticated but privy to special, almost magical knowledge that it was the place of "a civilized and mature white person" to make sense of and to be good enough to pass along. It is cultural appropriation as an act of imagined generosity; it is very nearly appropriation as a duty. And while Levy says Twain was not much for playing the adult, he did take it upon himself to present stories and dialects he'd taken from others to curious paying audiences.
Ultimately, for Levy, Adventures Of Huckleberry Finn – and a careful reading of it and the history around it – demonstrate a lot of stasis. By noticing how we still struggle with much of what troubled Twain about race as well as what's troubling in what Twain himself was doing, he argues, we see the limitations of progress. And by noticing that we've always thought the things kids loved would ruin them, and we've always argued over how much the state should meddle in their development (and thus over the size of government and the balance between acting collectively and individually), we see the limitations of any notion of modern decline – particularly decline based on the influence of popular entertainment and the scandalous state of young people.
It's a provocative stance, not only because it declines to try to resolve some of the most common arguments about whether Huck Finn is appropriate to assign to students for reasons of language, but because it suggests that those arguments – while hugely important, deeply felt and entirely legitimate – are incomplete as ways to engage with such a complex text, both as literature and as history. Levy specifically says, in fact, that one temptation with Huck Finn has been to "uncomplicate" it in some direction or another, much as adults tried to uncomplicate young Huck himself. (Perhaps a metaphorical bridge too far, but perhaps also ... a little true.)
Trying to find firm ground from which to analyze culture is not a challenge that arises only with Twain. It's true of many pieces of mass entertainment that while they can't be understood as separate from their historical moments, they also lose value if treated as if they're still inside those moments. A book like Adventures Of Huckleberry Finn comes to readers now having both marked the last 130 years of American history and cultural argument and been marked by them.
And what's unnerving is that every day alters those marks in both directions. Decisions made by one writer before the beginning of the twentieth century become part of 21st-century conversations about language and racism. Modern understandings of traditions like minstrelsy – which Levy argues has a history more complex than the one often credited to it, and pieces of which are often recognized throughout the structure of Huck Finn – reach back in time to continue to shape not just our sense of Huck but our sense of Twain. Both those things are right and inevitable, and both are kind of humbling.
This is what can be frustrating about analyzing popular culture while it's still happening. In theory, parts of what Levy highlights about the book and about Twain – and what countless other scholars across decades have said about both – could have been visible then to a keen eye, although at least in book reviews, many of them were rarely raised. (He says, for instance, that almost none of the book's reviews published at the time saw anything noteworthy or particularly provocative in its discussion of race.) He cites again and again the vast landscape of writing that's already been done on Huck Finn, and the notes and bibliography occupy half the book's heft, so Huck Finn's America perhaps even argues existentially for the validity of cultural writing as a continuing project.
And when it is a continuing project, even Huck Finn finds new – or nominally new – conversations to invade. For instance, as Levy points out, 1885 was before the full flowering of a separate industry of children's literature, and it was more common then than now for authors – and even for individual books – to address adults sometimes and kids sometimes, even within the same volume. Would it have been undignified, then, for an adult to read Adventures Of Huckleberry Finn, into which Twain poured social satire invisible to children, simply because it was about children and written in a young boy's identifiable voice? (As is often the case now, the conversations Twain had about making it a children's book versus an adults' book sound more like marketing discussions than literary ones.)
This is a debate we have had culturally just this year. Even if we were to condemn adults for reading YA fiction for its purely literary excellence, what about reading it for its connection to culture and its relationship to the national and international mood? Might that have been one of the reasons for adults to read Huck Finn? And can merit and relevance, art and currency, really be separated so cleanly?
Whatever Levy is saying about Twain and Huck, he is implicitly saying this, too: there is no particular value in cracking open a piece of culture, agreeing on its value or right interpretation, and freezing it under glass. It's not a new theory, but one always worth a reminder, that it's when we constantly visit and revisit classics and do not allow them to fall to a straightforward and stone-carved notion of What They Mean, perhaps established before we were born, that they bump up usefully against developments they could not have anticipated: the value of fiction that might seem to be for children, or the appropriation connection between Mark Twain and Macklemore.
Of course, of course, we can argue that Great Literary Icon Huck Finn is not Katniss Everdeen for the purposes of something like dignified reading for adults, but in 1885, an awful lot of folks didn't think Huck Finn was Great Literary Icon Huck Finn, either.
As hard as we may try, we cannot substitute scholarship and sophistication for the perspective that time, and interaction with its passage, will place on Katniss – and The Sopranos, and The Simpsons, and Madonna – in 130 years. We will offer the best analysis we can for now about what one book or one show or one film means about The Way Things Are. And the irony, Levy would perhaps argue, is that the more time passes, the less The Way Things Are will seem different from The Way Things Always Were.

Like Nicholson?

I have new reading glasses. My wife says they make me look like Jack Nicholson. Do you think I should take this as a compliment?

Favorite Books of 2014

(In the order I read them)
1. Ian Haney Lopez – Dog Whistle Politics - Dog whistle politics is simply the use of coded words to talk about race and to exploit racial prejudice to advance a political agenda. Dog whistle politics is alive and well on the right wing.
2.  Bruce Alan Murphy – Scalia - This is the definitive biography of the life and medieval mind of Justice Scalia.  This man is truly one of the most dangerous people in the country.  Nino belongs in the Middle Ages with his pre-Vatican II Catholicism, his 18th Century version of the Constitution with his fallacious original intent scheme, and his textualism to mask a right-wing agenda.  Nino knows all, and if you don’t agree with him then you are stupid.  He’s probably hoping for a chance to name a Republican president in 2016 as he did in 2000.
3. Alan T. Nolan - Lee Considered - Remove the halo from Bobby Lee and what you find underneath may not be to your liking.
4. Nicholas Wapshott – The Sphinx - How Franklin Roosevelt masterfully prepared the country for war.
5. Seth Davis – Wooden: A Coach's Life - This is a carefully balanced and provocative biography of Coach John Wooden, the so-called Wizard of Westwood, winner of 10 NCAA basketball championships in a 12-year period at UCLA. Saint John was a great coach but there were warts in the man.
6.  F. Scott Fitzgerald – The Great Gatsby.  This was the 4th time around and each time I have enjoyed this great novel anew.  This time I was struck by the humor of the book, and I still wonder why Gatsby wasted his life on an airhead like Daisy.  What about that enticing green light on Daisy’s dock?  So much symbolism in this slim novel! 
7. Edward E. Baptist – The Half Has Never Been Told - The much talked about author’s bold thesis is that contrary to conventional scholarly belief, slavery was economically efficient due to the lash and that slavery fueled America’s capitalist surge in the 19th Century.  This is the most shocking book of the year.
8. James Tobin – The Man He Became – This is the riveting story of how Franklin Roosevelt overcame polio to become the most consequential US President of the 20th Century. The author's thesis is that FDR became President not IN SPITE of his handicap but BECAUSE of polio.
9.  Rick Bragg – Jerry Lee Lewis: His Own Story.  This is a rollicking biography of Jerry Lee Lewis, an original Son of the South, written by Rick Bragg, one of our great current Southern writers.  Great balls of fire, this book was fun reading!
10.  Roxanne Dunbar-Ortiz - An Indigenous Peoples’ History of the United States.   My understanding of American history will never be the same after reading this Indigenous people’s history of the country.  How does the history of the country look when viewed from the viewpoint of the original inhabitants?  I agree with the author that this country was founded on colonial settler genocide and that the country will never come to terms with it, but the author overstates her case in blaming all of our history of imperialism & militarism on this historical fact.

Tuesday, December 30, 2014

As the Year Turns

The year will end not with a bang but with a whimper. As these last days of December 2014 dissipate, I wish I could stay right here in 2014. Right here in the last days of 2014. Freeze it right here! Whatever will be will be. Just leave me out of it.

Sunday, December 28, 2014

Raining in Dixie

Down South we say "Good Lord willin' and the creeks don't rise." Well, we know the Good Lord is always willin'; we just hope the creeks don't rise today as rain sweeps across Alabama. We also like to say "we need the rain." It's just that we don't need too much of it at one time. We gonna batten down the hatches and be just fine. The term "flash flooding" don't go over well down here.

Saturday, December 27, 2014

On Dawkins

Richard Dawkins: Fundamentalist
If an autobiography can ever contain a true reflection of the author, it is nearly always found in a throwaway sentence. When the world’s most celebrated atheist writes of the discovery of evolution, Richard Dawkins unwittingly reveals his sense of his mission in the world. Toward the end of An Appetite for Wonder, the first installment in what is meant to be a two-volume memoir, Dawkins cites the opening lines of the first chapter of the book that made him famous, The Selfish Gene, published in 1976:
Intelligent life on a planet comes of age when it first works out the reason for its own existence. If superior creatures from space ever visit earth, the first question they will ask, in order to assess the level of our civilisation, is: "Have they discovered evolution yet?" Living organisms had existed on earth, without ever knowing why, for over three thousand million years before the truth finally dawned on one of them. His name was Charles Darwin.
Several of the traits that Dawkins displays in his campaign against religion are on show here. There is his equation of superiority with cleverness: the visiting aliens are more advanced creatures than humans because they are smarter and know more than humans do. The theory of evolution by natural selection is treated not as a fallible theory – the best account we have so far of how life emerged and developed – but as an unalterable truth, which has been revealed to a single individual of transcendent genius. There cannot be much doubt that Dawkins sees himself as a Darwin-like figure, propagating the revelation that came to the Victorian naturalist.
Among these traits, it is Dawkins's identification with Darwin that is most incongruous. No two minds could be less alike than those of the great nineteenth-century scientist and the latter-day evangelist for atheism. Hesitant, doubtful, and often painfully perplexed, Darwin understood science as an empirical investigation in which truth is never self-evident and theories are always provisional. If science, for Darwin, was a method of inquiry that enabled him to edge tentatively and humbly toward the truth, for Dawkins, science is an unquestioned view of the world. The Victorians are often mocked for their supposed certainties, when in fact many of them (Darwin not least) were beset by anxieties and uncertainties. Dawkins, by contrast, seems never to doubt for a moment the capacity of the human mind – his own, at any rate – to resolve questions that previous generations have found insoluble.
Dawkins may not be Victorian, but the figure who emerges from An Appetite for Wonder is in many ways decidedly old-fashioned. Before Dawkins's own story begins, the reader is given a detailed account of the Dawkins family tree – perhaps a natural prelude for one involved so passionately with genes, but slightly eccentric in a twenty-first-century memoir. Dawkins's description of growing up in British colonial Africa, going on to boarding school and then to Oxford, has a similarly archaic flavor and could easily have been written before World War II. The style in which he recounts his early years has a labored jocularity of a sort one associates with some of the stuffier products of that era, who – dimly aware that they lacked any sense of humor – were determined to show they appreciated the lighter side of life.
Born in 1941 in Nairobi, Kenya, and growing up in Nyasaland, now Malawi, Dawkins writes of life in the colonies in glowingly idyllic terms: “We always had a cook, a gardener and several other servants. ... Tea was served on the lawn, with beautiful silver teapot and hot-water jug, and a milk jug under a dainty muslin cover weighted down with periwinkle shells sewn around the edges.” He remembers with special fondness the head servant, Ali, who “loyally accompanied” the family in its travels, and later became Dawkins’s “constant companion and friend.” Unlike the best of the colonial administrators, some of whom were deeply versed in the languages and histories of the peoples they ruled, Dawkins displays no interest in the cultures of the African countries where he lived as a boy. It is the obedient devotion of those who served his family that has remained in his memory. 
Loyal servants turn up at several points in Dawkins's progress through life. When he arrives at Oxford, the porter at Balliol – a college that had demonstrated its intellectual credentials by admitting three members of his family – recalls Dawkins's father and two uncles but mistakes them for Dawkins's brothers. This, Dawkins tells us, showed the "timeless view" characteristic of "that loyal and bowler-hatted profession." He goes on to recount an anecdote about a new recruit to the profession, who recorded in the log-book of his duties how he could hear "rain banging on me bowler hat while I did me rounds." The tone of indulgent superiority is telling. Dawkins is ready to smile on those he regards as beneath him as long as it is clear who is on top.
It is a different matter when those he sees as his intellectual underlings – religious believers and any who stray from the strictest interpretation of Darwinism – refuse to follow his lead. Recalling his years at boarding school, Dawkins winces at the memory of the bullying suffered by a sensitive boy, "a precociously brilliant scholar" who was reduced to "a state of whimpering, abject horror" when he was stripped of his clothing and forced to take cold baths. Today, Dawkins is baffled by the fact that he didn't feel sympathy for the boy. "I don't recall feeling even secret pity for the victim of the bullying," he writes. Dawkins's bafflement at his lack of empathy suggests a deficiency in self-knowledge. As anyone who reads his sermons against religion can attest, his attitude towards believers is one of bullying and contempt reminiscent of the attitude of some of the more obtuse colonial missionaries towards those they aimed to convert.
Exactly how Dawkins became the anti-religious missionary with whom we are familiar will probably never be known. From what he writes here, I doubt he knows himself. Still, there are a few clues. He began his pilgrimage to unbelief at the age of nine, when he learned from his mother “that Christianity was one of many religions and they contradicted each other. They couldn’t all be right, so why believe the one in which, by sheer accident of birth, I happened to be brought up?” But he was not yet ready to embrace atheism, and curiously his teenage passion for Elvis Presley reinforced his vestigial Christianity. Listening to Elvis sing “I Believe,” Dawkins was amazed to discover that the rock star was religious. “I worshipped Elvis,” he recalls, “and I was a strong believer in a non-denominational creator god.” Dawkins confesses to being puzzled as to why he should have been so surprised that Elvis was religious: “He came from an uneducated working-class family in the American South. How could he not have been religious?” By the time he was sixteen, Dawkins had “shed my last vestige of theistic credulity.” As one might expect, the catalyst for his final conversion from theism was Darwinism. “I became increasingly aware that Darwinian evolution was a powerfully available alternative to my creator god as an explanation of the beauty and apparent design of life. ... It wasn’t long then before I became strongly and militantly atheistic.”
What is striking is the commonplace quality of Dawkins's rebellion against religion. In turning away from the milk-and-water Anglicanism in which he had been reared – after his conversion from theism, he "refused to kneel in chapel," he writes proudly – he was doing what tens of thousands of Britain's young people did at the time. Compulsory religious instruction of the kind that exists in British schools, it has often been observed, creates a fertile environment for atheism. Dawkins's career illustrates the soundness of this truism. If there is anything remarkable in his adolescent rebellion, it is that he has remained stuck in it. At no point has Dawkins thrown off his Christian inheritance. Instead, emptying the faith he was taught of its transcendental content, he became a neo-Christian evangelist. A more inquiring mind would have noticed at some point that religion comes in a great many varieties, with belief in a creator god figuring in only a few of the world's faiths and most having no interest in proselytizing. It is only against the background of a certain kind of monotheism that Dawkins's evangelical atheism makes any sense.
Even more remarkable is Dawkins's inveterate literal-mindedness. He tells us that "the Pauline belief that everybody is born in sin, inherited from Adam (whose embarrassing non-existence was unknown to St. Paul), is one of the very nastiest aspects of Christianity." It is true that the idea of original sin has become one with a morbid preoccupation with sexuality, which has been part of Christianity throughout much of its history. Even so, it is an idea that contains a vital truth: evil is not error, a mistake of the mind, a failure of understanding that can be corrected by smarter thinking. It is something deeper and more constitutive of human life itself. The capacity and propensity for destruction goes with being human. One does not have to be religious to acknowledge this dark fact. With his myth or metaphor of the death instinct, thanatos, Freud – a lifelong atheist – recognized that impulses of hatred and cruelty are integral to the human psyche. As an atheist myself, it is a view I find no difficulty in sharing.
Quite apart from the substance of the idea, there is no reason to suppose that the Genesis myth to which Dawkins refers was meant literally. Coarse and tendentious atheists of the Dawkins variety prefer to overlook the vast traditions of figurative and allegorical interpretations with which believers have read Scripture. Both Augustine and before him the Jewish philosopher Philo of Alexandria explicitly cautioned against literalism in interpreting the biblical creation story. Later, in the twelfth century, Maimonides took a similar view. It was only around the time of the Reformation that the idea that the story was a factual account of events became widely held. When he maintains that Darwin’s account of evolution displaced the biblical story, Dawkins is assuming that both are explanatory theoriesone primitive and erroneous, the other more advanced and literally true. In treating religion as a set of factual propositions, Dawkins is mimicking Christianity at its most fundamentalist.
There is an interesting inconsistency between Dawkins's dismissal of religion as being little more than a tissue of falsehood and his adherence to an evolutionary account of human behavior. In the later chapters of An Appetite for Wonder, Dawkins recounts his work on the behavior of blowflies, and later mice, guppy fish, and crickets. As always, what he describes as his "Darwin-obsessed brain" analyzed these behaviors in terms that were meant to be consistent with Darwinian orthodoxy. His best-selling manifesto The Selfish Gene originated in 1973, when strike action by miners led to a "three-day week" in which there were frequent power-cuts. Dawkins needed electricity for his work on crickets, but he could do without it for writing, which he did on a portable typewriter, so he began to write instead. By 1982, we find him "trying to push Universal Darwinism" – the view that genes are not the only replicators in natural selection – a theme he had explored in the last chapter of his best-seller, where he presented his theory of memes. Dawkins's suggestion is that memes "leap from brain to brain, via a process which, in the broad sense, can be called imitation," and it is clear that he sees this process at work throughout human culture, including religion.
There are many difficulties in talk of memes, including how they are to be identified. Is Romanticism a meme? Is the idea of evolution itself a meme, jumping unbidden from brain to brain? My suspicion is that the entire “theory” amounts to not much more than a misplaced metaphor. The larger problem is that a meme-based Darwinian account of religion is at odds with Dawkins’s assault on religion as a type of intellectual error. If Darwinian evolution applies to religion, then religion must have some evolutionary value. But in that case there is a tension between naturalism (the study of humans and other animals as organisms in the natural world) and the rationalist belief that the human mind can rid itself of error and illusion through a process of critical reasoning. To be sure, Dawkins and those who think like him will object that evolutionary theory tells us how we got where we are, but does not preclude our taking charge of ourselves from here on. But who are “we”? In a passage from The Selfish Gene that Dawkins quotes in this memoir, he writes:
They are in you and me; they created us, body and mind; and their preservation is the ultimate rationale for our existence. They have come a long way, these replicators. Now they come by the name of genes, and we are their survival machines.
If we "are" survival machines, it is unclear how "we" can decide anything. The idea of free will, after all, comes from religion and not from science. Science may give us the unvarnished truth – or some of it – about our species. Part of that truth may prove to be that humans are not and can never be rational animals. Religion may be an illusion, but that does not mean science can dispel it. On the contrary, science may well show that religion cannot be eradicated from the human mind. Unsurprisingly, this is a possibility that Dawkins never explores.
For all his fervent enthusiasm for science, Dawkins shows very little interest in asking what scientific knowledge is or how it comes to be possible. There are many philosophies of science. Among them is empiricism, which maintains that scientific knowledge extends only so far as observation and experiment can reach; realism, which holds that science can give an account of parts of the world that can never be observed; irrealism, according to which there is no one truth of things to which scientific theories approximate; and pragmatism, which views scientific theories as useful tools for organizing and controlling experience. If he is aware of these divergent philosophies, Dawkins never discusses them. His attitude to science is that of a practitioner who does not need to bother with philosophical questions.
It is worth noting, therefore, that it is not as a practicing scientist that Dawkins has mounted his assaults on religion. As he makes clear in this memoir, he gave up active research in the 1970s when he left his crickets behind and began to write The Selfish Gene. Ever since, he has written as an ideologue of scientism, the positivistic creed according to which science is the only source of knowledge and the key to human liberation. He writes well: fluently, vividly, and at times with considerable power. But the ideas and the arguments that he presents are in no sense novel or original, and he seems unaware of the critiques of positivism that appeared in its Victorian heyday.
Some of them bear re-reading today. One of the subtlest and most penetrating came from the pen of Arthur Balfour, the Conservative statesman, British foreign secretary, and sometime prime minister. Well over a century ago, Balfour identified a problem with the evolutionary thinking that was gaining ascendancy at the time. If the human mind has evolved in obedience to the imperatives of survival, what reason is there for thinking that it can acquire knowledge of reality, when all that is required in order to reproduce the species is that its errors and illusions are not fatal? A purely naturalistic philosophy cannot account for the knowledge that we believe we possess. As he framed the problem in The Foundations of Belief in 1895, “We have not merely stumbled on truth in spite of error and illusion, which is odd, but because of error and illusion, which is even odder.” Balfour’s solution was that naturalism is self-defeating: humans can gain access to the truth only because the human mind has been shaped by a divine mind. Similar arguments can be found in a number of contemporary philosophers, most notably Alvin Plantinga. Again, one does not need to accept Balfour’s theistic solution to see the force of his argument. A rigorously naturalistic account of the human mind entails a much more skeptical view of human knowledge than is commonly acknowledged.
Balfour’s contributions to the debate about science and religion are nowadays little known, which is compelling testimony to the historical illiteracy of contemporary philosophy. But the comparison with Balfour also shows how shallow, crass, and degraded the debate has become since Victorian times. Unlike most of those who debated then, Dawkins knows practically nothing of the philosophy of science, still less about theology or the history of religion. From his point of view, he has no need to know. He can deduce everything he wants to say from first principles. Religion is a type of supernatural belief, which is irrational, and we will all be better off without it: for all its paraphernalia of evolution and memes, this is the sum total of Dawkins’s argument for atheism. His attack on religion has a crudity that would make a militant Victorian unbeliever such as T.H. Huxley (described by his contemporaries as “Darwin’s bulldog” because he was so fierce in his defense of evolution) blush scarlet with embarrassment.
If religion comes in many varieties, so too does atheism. Dawkins takes for granted that being an atheist goes with having liberal values (with the possible exception of tolerance). But as the Victorians well knew, there are many types of atheism, liberal and illiberal, and many versions of atheist ethics. Again, Dawkins imagines that an atheist is bound to be an enemy of religion. But there is no necessary connection between atheism and hostility to religion, as some of the great Victorian unbelievers understood. More intelligent than their latter-day disciple, the positivists tried to found a new religion of humanity, most notably Auguste Comte (1798–1857), who established a secular church in Paris that for a time found converts in many other parts of the world. The new religion was an absurdity, with rituals based on the pseudo-science of phrenology, but at least the positivists understood that atheism cannot banish human needs that only faith can meet.
One might wager a decent sum of money that it has never occurred to Dawkins that to many people he appears as a comic figure. His default mode is one of rational indignation: a stance of withering patrician disdain for the untutored mind of a kind one might expect in a schoolmaster in a minor public school sometime in the 1930s. He seems to have no suspicion that any of those he despises could find his stilted pose of indignant rationality merely laughable. “I am not a good observer,” he writes modestly. He is referring to his observations of animals and plants, but his weakness applies more obviously in the case of humans. Transfixed in wonderment at the workings of his own mind, Dawkins misses much that is of importance in human beings, himself and others.
To the best of my recollection, I have met Dawkins only once and by chance, when we coincided at some meeting in London. It must have been in late 2001, since conversation at dinner centered around the terrorist attacks of September 11. Most of those at the table were concerned with how the West would respond: would it retaliate, and if so how? Dawkins seemed uninterested. What exercised him was that Tony Blair had invited leaders of the main religions in Britain to Downing Street to discuss the situation, but somehow omitted to ask a leader of atheism (presumably Dawkins himself) to join the gathering. There seemed no question in Dawkins’s mind that atheism as he understood it fell into the same category as the world’s faiths.
In this, Dawkins is surely right. To suppose that science can liberate humankind from ignorance requires considerable credulity. We know how science has been used in the past, not only to alleviate the human lot, but equally to serve tyranny and oppression. The notion that things might be fundamentally different in the future is an act of faith, one as gratuitous as any of the claims of religion, if not more so. Consider Pascal. One of the founders of modern probability theory and the designer of the world’s first mass-transit system, he was far too intelligent to imagine that human reason could resolve perennial questions. His celebrated wager has always seemed to me a rather bad bet. Since we cannot know what gods there may be (if any), why stake our lives on pleasing one of them? But Pascal’s wager was meant as a pedagogical device rather than a demonstrative argument, and he reached faith himself by way of skeptical doubt. In contrast, Dawkins shows not a trace of skepticism anywhere in his writings. In comparison with Pascal, a man of restless intellectual energy, Dawkins is a monument to unthinking certitude.
We must await the second volume of his memoirs to discover how Dawkins envisions his future. But near the end of the present volume, an inadvertent remark hints at what he might want for himself. Darwin was “never Sir Charles,” he writes, “and what an amazing indictment of our honours system that is.” It is hard to resist the thought that the public recognition that in Britain is conferred by a knighthood is Dawkins’s secret dream. A life peerage would be even better. What could be more fitting for this tireless evangelist than to become the country’s officially appointed atheist, seated alongside the bishops in the House of Lords? He may lack their redeeming tolerance and display none of their sense of humor, but there cannot be any reasonable doubt that he belongs in the same profession. 
John Gray is emeritus professor of European thought at the London School of Economics and author of The Silence of Animals: On Progress and Other Modern Myths (Farrar, Straus, and Giroux). 

Friday, December 26, 2014

Post Christmas 2014

After being rebuffed by Ruth's Chris, we procure dinner from Kobe, a Japanese steakhouse on 280 that opens at 4:30.  The place is packed, and I have to wait over an hour for our food, although we all agree it was good.  I suppose it turned out okay. At the restaurant a lady fell to the floor and the paramedics had to be called.  Another man who had been waiting a long time for his order told me congratulations as I left.   Freddy has to return to Little Rock today.  I look forward to the day when we are all living in the same town.

Wednesday, December 24, 2014

Christmas Eve

There will be no chestnuts roasting on an open fire, no Jack Frost nipping at our noses, but the Hudsons are all home for Christmas. We'll sing our own yuletide carols and dress up like Eskimos if need be. Santa is on his way. We'll sing our own Oh, Holy Night and Away in a Manger. We are blessed and thankful. Merry Christmas and Happy Holidays to all from Alabama.

Tuesday, December 23, 2014

When Humans Quit Hunting And Gathering, Their Bones Got Wimpy

BY Nell Greenfieldboyce
NPR
22 December 2014

Compared with other primates and our early human ancestors, we modern humans have skeletons that are relatively lightweight — and scientists say that basically may be because we got lazy.

Biological anthropologist Habiba Chirchir and her colleagues at the Smithsonian's National Museum of Natural History were studying the bones of different primates including humans. When they looked at the ends of bones near the joints, where the inside of the bone looks almost like a sponge, they were struck by how much less dense this spongy bone was in humans compared with chimpanzees or orangutans.

"So the next step was, what about the fossil record? When did this feature evolve?" Chirchir wondered.

Their guess was that the less dense bones showed up a couple of million years ago, about when Homo erectus, a kind of proto-human, left Africa. Having lighter bones would have made it a lot easier to travel long distances, Chirchir speculated.

But after examining a bunch of early human fossils, she realized their guess was wrong. "This was absolutely surprising to us," she says. "The change is occurring much later in our history."

The lightweight bones don't appear until about 12,000 years ago. That's right when humans were becoming less physically active because they were leaving their nomadic hunter-gatherer life behind and settling down to pursue agriculture.

A report on the work appeared Monday in the Proceedings of the National Academy of Sciences, along with a study from a different research group that came to much the same conclusion.

Those researchers looked at the bones of people in more recent history who lived in farming villages nearly 1,000 years ago and compared them with the bones of people who had lived nearby, earlier, as foragers.

The bones of people from the farming communities were less strong and less dense than those of the foragers, whose measured bone strength was comparable to similar-size nonhuman primates.

"We see a similar shift, and we attribute it to lack of mobility and more sedentary populations," says Timothy Ryan, an associate professor of anthropology at Penn State University. "Definitely physical activity and mobility is a critical component in building strong bones."

Study: White Men Most Influential on Capital Juries - and That's Bad News for Black Defendants

BY Brendan Kirby
23 December 2014

Two white women on a jury deciding the fate of convicted murderer Mitchell Hall were expressing doubts about whether he should get the death penalty when an older white man on the panel took control of deliberations.

Before the forewoman had finished her explanation of why she hoped the defendant might one day positively influence his children, the death penalty advocate cut her off.

"I would refute that completely," he said, before launching into a long argument in favor of execution.

By the time he had finished, the holdouts for life-without-parole prison sentences had flipped, and the jury rendered a unanimous verdict in favor of death.

The case was not real, and the jurors were residents of a central California county paid $40 to participate in a study by University of California-Irvine criminology professor Mona Lynch and University of California-Santa Cruz psychology professor Craig Haney. The study, published this month in the Journal of the American Bar Foundation, indicates that emotion plays a central role in death penalty deliberations. It also draws three main conclusions:

  • White men on juries are most likely to argue forcefully for their position - whether it is life in prison or the death penalty - and are more likely to be elected jury foremen.

  • When jurors change their minds during deliberations, it usually is to switch from life to death.

  • Race plays an important, if subtle, role. White male jurors were more likely to empathize with white defendants and more likely to discount mitigating factors in cases involving black defendants.

"In the case of capital jury decision making, our findings suggest that emotions significantly shaped outcomes and the very processes by which they were reached," the study states. "Indeed, they appeared to be inseparable from the critically important judgments that capital jurors were called on to make."

    Devising the study

    The study reaches similar conclusions to previous research, based on interviews with people who had served on capital juries, that demonstrated the influence of race on deliberations.

    Lynch and Haney constructed their study by dividing participants into 100 juries of four to seven members. They told jurors that the defendant in the case they would review already had been convicted of capital murder and that they were to consider one of two options - the death penalty or life in prison without possibility of parole.

    They then watched a videotape of the penalty phase of a trial based on a real California case. In each case, the defendant's name was Mitchell Hall and the victim was named John Emerson. The races of the defendant and victim varied from panel to panel, but the facts were otherwise the same.

    The study found that overall, deliberations produced an 11 percent increase in jurors opting for death compared with pre-deliberation preferences that jurors marked after watching the penalty phase. Among the 134 participants who changed their minds during deliberations, about three quarters moved from life to death. According to the study, several jurors would not even admit to the original positions in favor of life in prison that they had marked on their questionnaires.

    White men were most successful at swaying other jurors to their point of view, according to the study. Also, juries with higher proportions of white men had a greater tendency toward racial disparities in sentencing.

    "In the present analysis, we found that white male jurors often asserted emotional authority in the deliberations in two distinct ways: first in terms of asserting their own emotional responses and, second, by policing the emotional expressions of others," the authors wrote. "Moreover, this authority appeared to be deployed differentially as a function of the defendant's race."

    White men most forceful

    White male jurors were more likely to be forceful advocates for life when the defendant was white - especially when the victim was black. In one case, a white man on a jury distinguished the defendant from other murder defendants.

    "So Mitchell, as compared to your generic gangster, he's a guy that is really going to suffer in prison, this is going to be a tremendous pain to him and it's going to be with him for his whole life," he said.

    On the other hand, white men on the mock juries were more likely to shut down fellow jurors who expressed empathy for black defendants. In some cases, white male jurors viewed a white defendant's children as a mitigating factor in his favor while viewing a black defendant's children as a sign of irresponsibility.

    In one case, a white male juror used a racist slur during deliberations. While that was unusual, the study's authors wrote, "race remained a subtext" that helped shape the discussions in most of the other cases.

    "Finally, our analysis of these deliberations underscores the apparent intractability of racism in capital decision making," Lynch and Haney wrote. "It suggests that racial bias can operate emotionally as well as cognitively."

    Monday, December 22, 2014

    Slow Coming

    It seems like Christmas has been slow coming, but come it will.  It rained today and is supposed to rain much more tomorrow.  Rain, rain, go away.  Come again some other day.

    Sunday, December 21, 2014

    Holiday Suggestions

    Suggestions for a happy holiday season: Tylenol. Avoid political talk with family and friends. Tylenol. No regifting to avoid guilt feelings. Tylenol. That Salvation Army bell tolls for thee. Tylenol. Forget about being politically correct (oh so 90's). Tylenol. It won't hurt you to act happy with that ugly tie gift or something else you will have forgotten by the 26th. Tylenol. Don't blow your cool. Tylenol.

    Why No Resistance?

    Why Have Americans Stopped Resisting Economic Privilege?

    Marx once described high finance as “the Vatican of capitalism,” its diktat to be obeyed without question. Several decades have come and gone during which we’ve learned not to mention Marx in polite company. Our vocabulary went through a kind of linguistic cleansing, exiling suspect and nasty phrases like “class warfare” or “the reserve army of labor” or even something as apparently innocuous as “working class.”
    In times past, however, such language and the ideas they conjured up struck our forebears as useful, even sometimes as accurate depictions of reality. They used them regularly along with words and phrases like “plutocracy,” “robber baron,” and “ruling class” to identify the sources of economic exploitation and inequality that oppressed them, as well as to describe the political disenfranchisement they suffered and the subversion of democracy they experienced. Never before, however, has the Vatican of capitalism captured quite so perfectly the specific nature of the oligarchy that recently ran the country for a long generation and ended up running it into the ground. Even political consultant and pundit James Carville (no Marxist he), confessed as much during the Clinton years, when he said the bond market “intimidates everybody.”
    Southern Labor Archives at Georgia State University
    Occupy Wall Street, even bereft of strategy, program, and specific demands as many lamented when it was a newborn, nonetheless opened up space again for our political imagination by confronting this elemental, determining feature of our society’s predicament. It rediscovered something that, beneath thickets of political verbiage about tax this and cut that, about end-of-the-world deficits and missionary-minded “job creators,” had been hiding in plain sight: namely, what our ancestors once called “the street of torments.” It achieved a giant leap backward, so to speak, summoning up a history of opposition that had mysteriously withered away.
    True turning points in American political history are rare. This might seem counterintuitive once we recognize that for so long society was in a constant uproar. Arguably the country was formed and re-formed in serial acts of violent expropriation. Like the market it has been (and remains) infinitely fungible, living in the perpetually changing present, panting after the future, the next big thing. The demographics of American society are and have always been in permanent upheaval, its racial and ethnic complexion mutating from one generation to the next. Its economic hierarchies exist in a fluid state of dissolution and recrystallization. Social classes go in and out of existence.
    Nonetheless, in the face of this all-sided liquefaction, American politics have tended to flow within very narrow banks from one generation to the next. The capacious, sometimes stultifying embrace of the two-party system has absorbed most of the heat generated by this or that hot-button issue, leaving the fundamentals intact. Only under the most trying circumstances has the political system ruptured or come close. Then the prevailing balance of power and wealth between classes and regions has been called into question; then the political geography and demography of the nation have been reconfigured, sometimes for decades to come; only then have axiomatic beliefs about wealth and work, democracy and elitism, equality and individualism, government and the free market been reformulated or at least opened to serious debate, however briefly.
    A double mystery then is the subject of this book. Speaking generally, one might ask why people submit for so long to various forms of exploitation, oppression, and domination. And then, equally mysterious, why they ever stop giving in. Why acquiesce? Why resist? Looking backward, the indignities and injustices, the hypocrisies and lies, the corruption and cruelty may seem insupportable. Yet they are tolerated. Looking backward, the dangers to life, limb, and livelihood entailed in rebelling may seem too dire to contemplate. Yet in the teeth of all that, rebellion happens. The world is full of recent and long-ago examples of both.
    America’s history is mysterious in just this way. This book is an attempt to explore the enigma of resistance and acquiescence as those experiences unfolded in the late nineteenth and again in the late twentieth century. We have grown accustomed for some years now to referring to America’s two gilded ages. The first one was baptized by Mark Twain in his novel of that same name and has forever after been used to capture the era’s exhibitionist material excess and political corruption. The second, our own, which began sometime during the Reagan era and lasted though the financial meltdown of 2008, like the original, earned a reputation for extravagant self-​indulgence by the rich and famous and for a similar political system of, by, and for the moneyed. So it has been natural to assume that these two gilded ages, however much they have differed in their particulars, were essentially the same. Clearly there is truth in that claim. However, they were fundamentally dissimilar.
    Mark Twain’s Gilded Age has always fascinated and continues to fascinate. The American vernacular is full of references to that era: the “Gay Nineties,” “robber barons,” “how the other half lives,” “cross of gold,” “acres of diamonds,” “conspicuous consumption,” “the leisure class,” “the sweatshop,” “other people’s money,” “social Darwinism and the survival of the fittest,” “the nouveau riche,” “the trust.” What a remarkable cluster of metaphors, so redolent with the era’s social tensions that they have become permanent deposits in the national memory bank.
    We think of the last third of the nineteenth century as a time of great accomplishment, especially of stunning economic growth and technological transformation and the amassing of stupendous wealth. This is the age of the steam engine and transcontinental railroads, of the mechanical reaper and the telephone, of cities of more than a million and steel mills larger than any on earth, of America’s full immersion in the Industrial Revolution. A once underdeveloped, infant nation became a power to be reckoned with.
    For people living back then, however much they were aware of and took pride in these marvels, the Gilded Age was also a time of profound social unease and chronic confrontations. Citizens were worried about how the nation seemed to be verging on cataclysmic divisions of wealth and power. The trauma of the Civil War, so recently concluded, was fresh in everyone’s mind. The abiding fear, spoken aloud again and again, was that a second civil war loomed. Bloody encounters on railroads, in coal mines and steel mills, in city streets and out on the Great Plains made this premonition palpable. This time the war to the death would be between the haves and ­have-​nots, a war of class against class. American society was becoming dangerously, ominously unequal, fracturing into what many at the time called “two nations.”
    Until OWS [Occupy Wall Street] came along, all of this would have seemed utterly strange to those living through America’s second Gilded Age. But why? After all, years before the financial meltdown plenty of observers had noted how unequal American society had become. They compared the skewed distribution of income and wealth at the turn of the twenty-first century with the original Gilded Age and found it as stark or even starker than at any time in American history. Stories about penthouse helipads, McMansions roomy enough to house a regiment, and private island getaways kept whole magazines and TV shows buzzing. “Crony capitalism,” which Twain had great fun skewering in his novel, was very much still alive and well in the age of Jack Abramoff. Substitute those Fifth Avenue castles, Newport beachfront behemoths, and Boss Tweed’s infamous courthouse of a century before and nothing much had changed.
    New York Council Member Ydanis Rodriguez marches with Occupy Wall Street protestors before an attempted re-occupation of a vacant lot beside Duarte Park in New York. December 2011. (AP Photo/John Minchillo)
    Or so it might seem. But in fact times had changed profoundly. Gone missing were the insurrections and all those utopian longings for a world put together differently so as to escape the ravages of industrial capitalism. It was this social chemistry of apocalyptic doom mixed with visionary expectation that had lent the first Gilded Age its distinctive frisson. The absence of all that during the second Gilded Age, despite the obvious similarities it shares with the original, is a reminder that the past is indeed, at least in some respects, a foreign country. Why, until the sudden eruption of OWS — a flare-up that died down rather quickly — was the second Gilded Age one of acquiescence rather than resistance?
    If the first Gilded Age was full of sound and fury, the second seemed to take place in a padded cell. Might that striking contrast originate in the fact that the capitalist society of the Gay Nineties was nothing like the capitalism of our own time? Or to put it another way: Did the utter strangeness of capitalism when it was first taking shape in America — beginning decades before the Gay Nineties — so deeply disturb traditional ways of life that for several generations it seemed intolerable to many of those violently uprooted by its onrush? Did that shattering experience elicit responses, radical yet proportionate to the life-or-death threat to earlier, cherished ways of life and customary beliefs?
    And on the contrary, did a society like our own long ago grow accustomed to all the fundamentals of capitalism, not merely as a way of conducting economic affairs, but as a way of being in the world? Did we come to treat those fundamentals as part of the natural order of things, beyond real challenge, like the weather? What were the mechanisms at work in our own distinctive political economy, in the quotidian experiences of work and family life, in the interior of our imaginations, that produced a sensibility of irony and even cynical disengagement rather than a morally charged universe of utopian yearnings and dystopian forebodings?
    Gilded ages are, by definition, hiding something; what sparkles like gold is not. But what they’re hiding may differ, fundamentally. Industrial capitalism constituted the understructure of the first Gilded Age. The second rested on finance capitalism. Late-nineteenth-century American capitalism gave birth to the “trust” and other forms of corporate consolidation at the expense of smaller businesses. Late-twentieth-century capitalism, notwithstanding its mania for mergers and acquisitions, is known for its “flexibility,” meaning its penchant for off-loading corporate functions to a world of freelancers, contractors, subcontractors, and numberless petty enterprises. The first Gilded Age, despite its glaring inequities, was accompanied by a gradual rise in the standard of living; the second by a gradual erosion.
    Excerpted from The Age of Acquiescence by Steve Fraser. Copyright (c) 2015 by Steve Fraser. Reprinted with permission of Little, Brown and Company.