Sunday, September 29, 2013

Rep. Debbie Wasserman Schultz on the GOP

Referring to a possible government shutdown -

"These people have come unhinged."

Breaking Bad



The World According to Team Walt
By ROSS DOUTHAT

Published: September 28, 2013



ACROSS five seasons of riveting television, the antihero of AMC’s “Breaking Bad,” Walter Hartwell White, has committed enough crimes to earn several life sentences from any reasonable jury. He has cooked crystal meth in bulk, hooking addicts from his native Albuquerque all the way to Prague. He has personally killed at least seven people and is implicated in the deaths of hundreds more. He has poisoned an innocent child, taken out a contract on his longtime partner, and stood by and watched a young woman choke to death.


But one thing he hasn’t done, as this weekend’s series finale looms, is entirely forfeit the sympathies of his audience. As a cultural phenomenon, this is the most striking aspect of “Breaking Bad” — the persistence, after everything he’s done, of a Team Walt that still wants him to prevail.



In the online realms where hit shows are dissected, critics who pass judgment on Walt’s sins find themselves tangling with a multitude of commenters who don’t think he needs forgiveness. And it isn’t just the anonymous hordes who take his side. “You’d think I’d bear Walt some serious ill will considering he sat there and watched Jane die,” the actress who played his vomit-choked victim wrote for New York magazine last week, “but I’m still rooting for everything to work out for the guy.”



On the surface, this sympathy is not surprising, given the long pop culture tradition of rooting for the bad guy. But you don’t usually hear audiences argue insistently that their favorite villains are actually heroic — that a J. R. Ewing or a Francis Underwood is a misunderstood paragon of virtue. And when viewers do make excuses for fictional criminals, it’s usually because those characters inhabit distinctive, hermetic worlds — the Jersey mafia on “The Sopranos,” West Baltimore on “The Wire” — in which becoming a killer is less a decision than an inheritance, which we can root for them to escape from or rise above.



Walter White, though, begins as a perfectly law-abiding citizen — a high school chemistry teacher and family man, who turns to cooking meth after a terminal cancer diagnosis because it promises to make money for his family. He isn’t the product of a lawless environment who never knew another way. He’s a protagonist who made a conscious decision to embrace what society regards as evil, to step permanently outside our civilization’s moral norms.



This means “Breaking Bad” implicitly challenges audiences to get down to bedrock and actually justify those norms. Why is it so wrong to kill strangers — often dangerous strangers! — so that your own family can survive and prosper? Why is it wrong to exploit people you don’t see or care about for the sake of those inside your circle? Why is Walter White’s empire-building — carried out with boldness, brilliance and guile — not an achievement to be admired?



And the fact that so many viewers do seem to end up admiring him — even to the point of despising Walt’s conflicted wife, Skyler, because she doesn’t appreciate him — is a reminder that the answers to these questions aren’t actually as self-evident as our civilization would like to assume.



The allure for Team Walt is not ultimately the pull of nihilism, or the harmless thrill of rooting for a supervillain. It’s the pull of an alternative moral code, neither liberal nor Judeo-Christian, with an internal logic all its own. As James Bowman wrote in The New Atlantis, embracing Walt doesn’t require embracing “individual savagery” and a world without moral rules. It just requires a return to “old rules” — to “the tribal, family-oriented society and the honor culture that actually did precede the Enlightenment’s commitment to universal values.”



Those rules seem cruel by the lights of both cosmopolitanism and Christianity, but they are not irrational or necessarily false. Their Darwinian logic is clear enough, and where the show takes place — in the shadow of cancer, the shadow of death — the kindlier alternatives can seem softheaded, pointless, naïve.



Nor can this tribal morality be refuted in a laboratory. Indeed, by making Walt a chemistry genius, the show offers an implicit rebuke to the persistent modern conceit that a scientific worldview logically implies liberalism, humanism and a widening circle of concern. On “Breaking Bad,” that worldview just makes Walt a better kingpin, and the beautiful equations of chemistry are deployed to addict, poison, decompose.



To be clear, I don’t think the show itself is actually on Walt’s side. I think Team Walt badly misreads the story’s moral arc and vision.



But the pervasiveness of that misreading tells us something significant. It’s comforting to dismiss Walt’s admirers as sickos, idiots, “bad fans.” But they, too, can be moralists — drawn by their sympathy for Walter White into a worldview that still lies percolating, like one of his reactions, just below the surface of every human heart.



Obamacare: A Good Summary

by Atul Gawande


Ours can be an unforgiving country. Paul Sullivan was in his fifties, college-educated, and ran a successful small business in the Houston area. He owned a house and three cars. Then the local economy fell apart. Business dried up. He had savings, but, like more than a million people today in Harris County, Texas, he didn’t have health insurance. “I should have known better,” he says. When an illness put him in the hospital and his doctor found a precancerous lesion that required treatment, the unaffordable medical bills arrived. He had to sell his cars and, eventually, his house. To his shock, he had to move into a homeless shelter, carrying his belongings in a suitcase wherever he went.



This week, the centerpiece of the Affordable Care Act, which provides health-insurance coverage to millions of people like Sullivan, is slated to go into effect. Republican leaders have described the event in apocalyptic terms, as Republican leaders have described proposals to expand health coverage for three-quarters of a century. In 1946, Senator Robert Taft denounced President Harry Truman’s plan for national health insurance as “the most socialistic measure this Congress has ever had before it.” Fifteen years later, Ronald Reagan argued that, if Medicare were to be enacted, “one of these days you and I are going to spend our sunset years telling our children and our children’s children what it once was like in America when men were free.” And now comes Senate Minority Leader Mitch McConnell describing the Affordable Care Act as a “monstrosity,” “a disaster,” and the “single worst piece of legislation passed in the last fifty years.” Lacking the votes to repeal the law, Republican hard-liners want to shut down the federal government unless Democrats agree to halt its implementation.



The law’s actual manifestation, however, is rather anodyne: as of October 1st, healthcare.gov is scheduled to open for business. A Web site where people who don’t have health coverage through an employer or the government can find a range of health plans available to them, it resembles nothing more sinister than an eBay for insurance. Because it’s a marketplace, prices keep falling lower than the Congressional Budget Office predicted, by more than sixteen per cent on average. Federal subsidies trim costs even further, and more people living near the poverty level will qualify for free Medicaid coverage.



How this will unfold, though, depends on where you live. Governors and legislatures in about half the states—from California to New York, Minnesota to Maryland—are working faithfully to implement the law with as few glitches as possible. In the other half—Indiana to Texas, Utah to South Carolina—they are working equally faithfully to obstruct its implementation. Still fundamentally in dispute is whether we as a society have a duty to protect people like Paul Sullivan. Not only do conservatives not think so; they seem to see providing that protection as a threat to America itself.



Obstructionism has taken three forms. The first is a refusal by some states to accept federal funds to expand their Medicaid programs. Under the law, the funds cover a hundred per cent of state costs for three years and no less than ninety per cent thereafter. Every calculation shows substantial savings for state budgets and millions more people covered. Nonetheless, twenty-five states are turning down the assistance. The second is a refusal to operate a state health exchange that would provide individuals with insurance options. In effect, conservatives are choosing to make Washington set up the insurance market, and then complaining about a government takeover. The third form of obstructionism is outright sabotage. Conservative groups are campaigning to persuade young people, in particular, that going without insurance is “better for you”—advice that no responsible parent would ever give to a child. Congress has also tied up funding for the Web site, making delays and snags that much more inevitable.





Some states are going further, passing measures to make it difficult for people to enroll. The health-care-reform act enables local health centers and other organizations to provide “navigators” to help those who have difficulties enrolling, because they are ill, or disabled, or simply overwhelmed by the choices. Medicare has a virtually identical program to help senior citizens sort through their coverage options. No one has had a problem with Medicare navigators. But more than a dozen states have passed measures subjecting health-exchange navigators to strict requirements: licensing exams, heavy licensing fees, insurance bonds. Florida has attempted to ban them from county health departments, where large numbers of uninsured people go for care. Tennessee recently adopted an emergency rule declaring that anyone who could be described as an “enrollment assister” must undergo a criminal background check, fingerprinting, and twelve hours of course work. The hurdles would hamper hospital financial counsellors in the state—and, by some interpretations, ordinary good Samaritans—from simply helping someone get insurance.



This kind of obstructionism has been seen before. After the Supreme Court’s ruling in Brown v. Board of Education, in 1954, Virginia shut down schools in Charlottesville, Norfolk, and Warren County rather than accept black children in white schools. When the courts forced the schools to open, the governor followed a number of other Southern states in instituting hurdles such as “pupil placement” reviews, “freedom of choice” plans that provided nothing of the sort, and incessant legal delays. While in some states meaningful progress occurred rapidly, in others it took many years. We face a similar situation with health-care reform. In some states, Paul Sullivan’s fate will become rare. In others, it will remain a reality for an unconscionable number of people. Of some three thousand counties in the nation, a hundred and fourteen account for half of the uninsured. Sixty-two of those counties are in states that have accepted the key elements of Obamacare, including funding to expand Medicaid. Fifty-two are not.



So far, the health-care-reform law has allowed more than three million people under the age of twenty-six to stay on their parents’ insurance policy. The seventeen million children with preëxisting medical conditions cannot be excluded from insurance eligibility or forced to pay inflated rates. And more than twenty million uninsured will gain protection they didn’t have. It won’t be the thirty-two million hoped for, and it’s becoming clear that the meaning of the plan’s legacy will be fought over not for a few months but for years. Still, state by state, a new norm is coming into being: if you’re a freelancer, or between jobs, or want to start your own business but have a family member with a serious health issue, or if you become injured or ill, you are entitled to basic protection.



Conservatives keep hoping that they can drive the system to collapse. That won’t happen. Enough people, states, and health-care interests are committed to making it work, just as the Massachusetts version has for the past seven years. And people now have a straightforward way to resist the forces of obstruction: sign up for coverage, if they don’t have it, and help others do so as well. ♦



Saturday, September 28, 2013

Government Shutdown

Looks like it's coming.  Who will be hurt?  Which political party will get the blame?  (This should be a no-brainer).  What exactly will happen?  We shall find out.

Richard Reeves - President Kennedy

Amongst the plethora of good books written about John F. Kennedy, this has to be one of the best.  The book was engrossing to me from start to finish.

Jack Kennedy was born into wealth and privilege, the second son of Joseph Kennedy, a Boston Irish Catholic who was a self-made millionaire.  He never had to work a day in his life at a real job.

This book is organized around particular days in Kennedy's Presidency.  In this way Reeves covers all of the major events of JFK's thousand days in the White House.

Reading this book reminds me of how dangerous the world was during Kennedy's tenure.  World War III could have started, really, over Berlin and the Cuban Missile Crisis.  It is due in no small measure to JFK's actions that war was averted.  The world owes this man a debt that can never be repaid.

JFK could not stand to be alone.  He compartmentalized his life: different friends for different purposes.  How did he keep up with all of it?  He was a serial philanderer.  He lived off of a huge trust fund.  Must have been nice.  He cried when the Bay of Pigs invasion failed, as he should have.  He was a pragmatic rather than a feeling liberal.  Everything was political, and issues like the civil rights movement were practical problems to be solved rather than moral issues, despite his famous civil rights speech in the summer of 1963.

Friday, September 27, 2013

Family Debt Ceiling

My spouse is threatening not to extend our debt ceiling at the end of the month. I have been filibustering but she is not listening.


Tuesday, September 24, 2013

Is Poker a Game of Skill? Yes



Posted on September 24, 2013 by Sean Carroll


Is poker a game of skill or chance? A quasi-experimental study

Gerhard Meyer, Marc von Meduna, Tim Brosowski, Tobias Hayer



Due to intensive marketing and the rapid growth of online gambling, poker currently enjoys great popularity among large sections of the population. Although poker is legally a game of chance in most countries, some (particularly operators of private poker web sites) argue that it should be regarded as a game of skill or sport because the outcome of the game primarily depends on individual aptitude and skill. The available findings indicate that skill plays a meaningful role; however, serious methodological weaknesses and the absence of reliable information regarding the relative importance of chance and skill considerably limit the validity of extant research. Adopting a quasi-experimental approach, the present study examined the extent to which the influence of poker playing skill was more important than card distribution. Three average players and three experts sat down at a six-player table and played 60 computer-based hands of the poker variant “Texas Hold’em” for money. In each hand, one of the average players and one expert received (a) better-than-average cards (winner’s box), (b) average cards (neutral box) and (c) worse-than-average cards (loser’s box). The standardized manipulation of the card distribution controlled the factor of chance to determine differences in performance between the average and expert groups. Overall, 150 individuals participated in a “fixed-limit” game variant, and 150 individuals participated in a “no-limit” game variant. ANOVA results showed that experts did not outperform average players in terms of final cash balance…



(It’s a long abstract, I didn’t copy the whole thing.) The question “Is poker a game of skill or chance?” is a very important one, not least for legal reasons, as governments decide how to regulate the activity. However, while it’s an important question, it’s not actually an interesting one, since the answer is completely obvious: while chance is obviously an element, poker is a game of skill.



Note that chance is an element in many acknowledged games of skill, including things like baseball and basketball. (You’ve heard of “batting averages,” right?) But nobody worries about whether baseball is a game of skill, because there are obvious skill-based factors involved, like strength and hand-eye coordination. So let’s confine our attention to “decision games,” where all you do is sit down and make decisions about one thing or another. This includes games without a probabilistic component, like chess or go, but here we’re interested in games in which chance definitely enters, like poker or blackjack or Monopoly. Call these “probabilistic decision games.” (Presumably there is some accepted terminology for all these things, but I’m just making these terms up.)



So, when does a probabilistic decision game qualify as a “game of skill”? I suggest it does when the following criteria are met:



1. There are different possible strategies a player could choose.

2. Some strategies do better than others.

3. The ideal “dominant strategy” is not known.

It seems perfectly obvious to me that any game fitting these criteria necessarily involves an element of skill — what’s the best strategy to use? It’s also obvious that poker certainly qualifies, as would Monopoly. Games like blackjack or craps do not, since the best possible strategy (or “least bad,” since these games are definite losers in the long run) is known. Among players using that strategy, there’s no more room for skill (outside card-counting or other forms of cheating).
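
To make criteria 1 and 2 concrete, here is a toy probabilistic decision game of my own (purely an illustration, nothing from the study): a die is rolled, the player sees it, and then chooses whether to bet one unit at even money that a second roll will come up lower. Chance is unavoidable either way, but the strategy that uses the information clearly beats the one that ignores it. A minimal Python sketch:

import random

def play(strategy, trials=200_000):
    # Average result per round, in betting units, for a given strategy.
    total = 0
    for _ in range(trials):
        first, second = random.randint(1, 6), random.randint(1, 6)
        if strategy(first):
            total += 1 if second < first else -1  # a tie loses the bet
    return total / trials

print("always bet:         ", round(play(lambda roll: True), 3))       # roughly -0.17
print("bet only on 5 or 6: ", round(play(lambda roll: roll >= 5), 3))  # roughly +0.17

Criterion 3 is what separates poker from blackjack: in this toy game, as in blackjack, the best strategy can be worked out exactly, so once everyone knows it there is nothing left for skill to do.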



Nevertheless, people continue to act like this is an interesting question. In the case of this new study, the methodology is pretty crappy, as dissected here. Most obviously, the sample size is laughably small. Each player played only sixty hands; that’s about two hours at a cardroom table, or maybe fifteen minutes or less at a fast online site. And any poker player knows that the variance in the game is quite large, even for the best players; true skill doesn’t show up until a much longer run than that.
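
To put rough numbers on that variance point (my own illustrative figures, not anything from the study): suppose a player's skill is worth about 5 big blinds per 100 hands, with a per-hand standard deviation of roughly 8 big blinds, which is in the range poker players commonly cite. Treating the cumulative result as approximately normal, a quick simulation shows how rarely that edge is visible after only 60 hands:

import math
import random

WIN_RATE = 0.05   # assumed skill edge: 5 big blinds per 100 hands
STD_DEV = 8.0     # assumed per-hand standard deviation (~80 bb per 100 hands)

def fraction_ahead(num_hands, trials=100_000):
    # The total over num_hands is roughly Normal(n * WIN_RATE, STD_DEV * sqrt(n)).
    mu = num_hands * WIN_RATE
    sd = STD_DEV * math.sqrt(num_hands)
    return sum(random.gauss(mu, sd) > 0 for _ in range(trials)) / trials

for n in (60, 6_000, 60_000):
    print(f"{n:>6} hands: skilled player finishes ahead {fraction_ahead(n):.0%} of the time")

With these assumed numbers the better player ends up ahead barely more than half the time over 60 hands, but the large majority of the time over tens of thousands of hands, which is why a 60-hand experiment mostly measures luck.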



More subtly, but worse, the game that was studied wasn’t really poker. If I’m understanding the paper correctly, the cards weren’t dealt randomly, but with pre-determined better-than-average/average/worse-than-average hands. This makes it easy to compare results from different occurrences of the experiment, but it’s not real poker! Crucially, it seems like the players didn’t know about this fake dealing. But one of the crucial elements of skill in poker is understanding the possible distribution of beginning hands. Another element is getting to know your opponents over time, which this experiment doesn’t seem to have allowed for.



On Black Friday in 2011, government officials swept in and locked the accounts of players (including me) on online sites PokerStars and Full Tilt. Part of the reason was alleged corruption on the part of the owners of the sites, but part was because (under certain interpretations of the law) it’s illegal to play poker online in the US. Hopefully someday we’ll grow up and allow adults to place wagers with other adults in the privacy of their own computers.



Monday, September 23, 2013

Vidal on Hemingway

"What other culture could have produced someone like Hemingway and not seen the joke."

-Gore Vidal

Saturday, September 21, 2013

The Feel of Books

The Feel Of Books



Sep 21 2013 @ 2:17pm

The all-digital Bexar County Bibliotech Library opened in Texas this week. Jenny Davis wonders how reading culture will change as digitization becomes more dominant:



A traditional library … has a quite distinct sensory profile. Scents of freshly vacuumed carpets mix with slowly disintegrating paper and the hushed sound of buzzing fluorescent bulbs. The lightly dusted, thickly bound books align row after row, adorned with laminated white stickers with small black letters and numbers, guiding readers to textual treasures organized by genre, topic, author, and title. These sensory stimuli may evoke calm, excitement, comfort, all of these things together. Indeed, being in a library has a feel. To fear the loss of this somatic experience, this “feel,” is a legitimate concern. With a new kind of library, and a new medium for text, a particular sensory experience will, in time, be lost forever.



The new space, constituted by a new medium, will not, however, be without a “feel” of its own. The glowing screen; the smell of plastic mixed with cheap screen cleaners; the sound of softly clacking keys; the visual effect of a slightly warped screen against the faded grey of an old kindle; the anticipation of an hour glass or repeating circle, accompanied by the excitement of a pop-up: “your document has arrived.”



In the course of a couple generations, if digitized libraries become the new norm, new sensory profiles will reshape the somatic nostalgia of an entire population. This historical moment is at once terribly sad and incredibly sociologically interesting. We are at an intersection of somatic transition. Many will experience great and legitimate loss, unable to pass down some of their most meaningful sensory experiences to their children and grandchildren. Perhaps, at some point, losing the sacred spaces in which they get to revisit these sensory experiences themselves. Meanwhile, the young are in the midst of a great construction, building the sights, sounds, and smells of future whimsy.






Dennett to the Scientism Rescue





Let's Start With A Respect For Truth
By Daniel C. Dennett [9.10.13]

Topic: Conversations
Introduction By: John Brockman



Introduction



"The third culture consists of those scientists and other thinkers in the empirical world who, through their work and expository writing, are taking the place of the traditional intellectual in rendering visible the deeper meanings of our lives, redefining who and what we are." (From The Emerging Third Culture", 1991)



Last month, The New Republic published Steven Pinker's article "Science Is Not The Enemy: An impassioned plea to neglected novelists, embattled professors, and tenure-less historians" (August 6, 2013). A link to a 3-minute video attacking the article was inserted in the middle of Pinker's text—"WATCH: Leon Wieseltier's rejoinder: Science doesn't have all the answers". Billed as one of "An irregular video-interview series with New Republic Literary Editor Leon Wieseltier", the video was conveniently ready for posting within minutes of the publication of Pinker's article.



Now, a month later, Wieseltier is back with a 5,650-word attack in the magazine entitled "Crimes Against Humanities: Now science wants to invade the liberal arts. Don't let it happen." (September 3, 2013).



This is not a new debate. In my 1991 essay "The Emerging Third Culture", I wrote:



In the past few years, the playing field of American intellectual life has shifted, and the traditional intellectual has become increasingly marginalized. A 1950s education in Freud, Marx, and modernism is not a sufficient qualification for a thinking person in the 1990s. Indeed, the traditional American intellectuals are, in a sense, increasingly reactionary, and quite often proudly (and perversely) ignorant of many of the truly significant intellectual accomplishments of our time. Their culture, which dismisses science, is often nonempirical. It uses its own jargon and washes its own laundry. It is chiefly characterized by comment on comments, the swelling spiral of commentary eventually reaching the point where the real world gets lost.



In 1959 C.P. Snow published a book titled The Two Cultures. On the one hand, there were the literary intellectuals; on the other, the scientists. He noted with incredulity that during the 1930s the literary intellectuals, while no one was looking, took to referring to themselves as "the intellectuals," as though there were no others. This new definition by the "men of letters" excluded scientists such as the astronomer Edwin Hubble, the mathematician John von Neumann, the cyberneticist Norbert Wiener, and the physicists Albert Einstein, Niels Bohr, and Werner Heisenberg.



How did the literary intellectuals get away with it? ...



In a second edition of "The Two Cultures", published in 1963, Snow added a new essay, "The Two Cultures: A Second Look," in which he optimistically suggested that a new culture, a "third culture," would emerge and close the communications gap between the literary intellectuals and the scientists. In Snow's third culture, the literary intellectuals would be on speaking terms with the scientists. Although I borrow Snow's phrase, it does not describe the third culture he predicted. Literary intellectuals are not communicating with scientists. Scientists are communicating directly with the general public. ...



Given Wieseltier's screed, we can all be thankful that this is happening. His clueless attack is evidence that he doesn't know, and doesn't even know that he doesn't know. It's no accident that Prospect Magazine has scientists (and Edge contributors) Richard Dawkins, Steven Pinker, Daniel Kahneman, and Jared Diamond at, or near, the top of their "World's Greatest Thinkers 2013" poll ("a snapshot of the intellectual trends that dominate our age").




DANIEL C. DENNETT is Austin B. Fletcher Professor of Philosophy and Co-Director of the Center for Cognitive Science at Tufts University.
--------------------------------------------------------------------------------



DENNETT ON WIESELTIER V. PINKER IN THE NEW REPUBLIC



Leon Wieseltier sees that the humanities are in a deep crisis, but his essay, "Crimes against the Humanities," is not a helpful contribution to its resolution. Name-calling and sarcasm are typically the last refuge of somebody who can't think of anything else to say to fend off a challenge he doesn't understand and can't abide. His response to Steven Pinker's proposed conciliation of science and the humanities is neither polite nor fair, and amounts, in the end, to a blustery attempt to lay down the law:



It is not for science to say whether science belongs in morality and politics and art. Those are philosophical matters, and science is not philosophy, even if philosophy has since its beginnings been receptive to science.



This is true enough, if carefully interpreted, but Wieseltier asserts it without argument, showing that he himself is not even trying to be a philosopher, but rather a Wise Divulger of the Undeniable Verities. He knows—take it from him. So this simple passage actually illustrates the very weakness of the humanities today that has encouraged scientists and other conscientious thinkers to try their own hand at answering the philosophical questions that press in on us, venturing beyond the confines of their disciplines to fill the vacuum left by the humanities.



Postmodernism, the school of "thought" that proclaimed "There are no truths, only interpretations" has largely played itself out in absurdity, but it has left behind a generation of academics in the humanities disabled by their distrust of the very idea of truth and their disrespect for evidence, settling for "conversations" in which nobody is wrong and nothing can be confirmed, only asserted with whatever style you can muster. Wieseltier concedes the damage done to the humanities by postmodernism "and other unfortunate hermeneutical fashions of recent decades" but tries to pin this debacle on the "progressivism" the humanities was tempted to borrow from science. "The humanities do not progress linearly, additively, sequentially, like the sciences," he avers, in the face of centuries of scholarship and criticism in the humanities that have corrected, enlarged, illuminated, and advanced the understanding of all its topics and texts. All that accumulated knowledge used to be regarded as the intellectual treasure we humanities professors were dedicated to transmitting to the next generation, and Pinker is encouraging us to return to that project, armed with some new intellectual tools—both thinking tools (theories and methods and models and the like) and data-manipulating tools (computers, optical character recognition, statistics, data banks). Wieseltier wants no part of this, but his alternative is surprisingly reminiscent of the just discredited fads; perhaps he has not completely purged his mind of the germs of postmodernism. Consider, for instance, this obiter dictum from Wieseltier:



It is the irreducible reality of inwardness, and its autonomy as a category of understanding, over which Pinker, in his delirium of empirical research, rides roughshod. The humanities are the study of the many expressions of that inwardness.



In what sense irreducible? What inwardness, exactly, are we discussing? How has its autonomy as a category been established? In short, who says? Wieseltier says, on behalf of the humanities, which thus declares itself authoritative with all the pomposity of a fake pope. And notice the ambiguity: is the study of those many expressions itself a matter governed by the rules of empirical research, or is it just another set of expressions of inwardness, interpretations of interpretations of interpretations?








Philosophical matters are those that demand answers that can stand up to all things considered and hence cannot be addressed without suspending the enabling assumptions of any more specific field of science or inquiry. Wieseltier seems to believe that these matters are the exclusive province of philosophers, professionals who have been licensed to hold forth on them because of some advanced training in the humanities that qualifies them to do this important work. That is a common enough illusion, fostered by the administrative structures of academia, and indeed many (paid, professional, tenured) philosophers cling to it, but the plain fact is that every discipline generates philosophical issues as it advances, and they cannot be responsibly addressed by thinkers ignorant of the facts (the findings, the methods, the problems) encountered in those disciplines.



A philosopher in the sub-discipline of aesthetics who held forth on the topic of beauty in music but who couldn't read music or play an instrument, and who was unfamiliar with many of the varieties of music in the world, would not deserve attention. Nor would an ethicist opining on what we ought to do in Syria who was ignorant of the history, culture, politics and geography of Syria. Those who want to be taken seriously when they launch inquiries about such central philosophical topics as morality, free will, consciousness, meaning, causality, time and space had better know quite a lot that we have learned in recent decades about these topics from a variety of sciences. Unfortunately, many in the humanities think that they can continue to address these matters the old-fashioned way, as armchair theorists in complacent ignorance of new developments.








Pomposity can be amusing, but pomposity sitting like an oversized hat on top of fear is hilarious. Wieseltier is afraid that the humanities are being overrun by thinkers from outside, who dare to tackle their precious problems—or "problematics" to use the, um, technical term favored by many in the humanities. He is right to be afraid. It is true that there is a crowd of often overconfident scientists impatiently addressing the big questions with scant appreciation of the subtleties unearthed by philosophers and others in the humanities, but the way to deal constructively with this awkward influx is to join forces and educate them, not declare them out of bounds. The best of the "scientizers" (and Pinker is one of them) know more philosophy, and argue more cogently and carefully, than many of the humanities professors who dismiss them and their methods on territorial grounds. You can't defend the humanities by declaring it off limits to amateurs. The best way for the humanities to get back their mojo is to learn from the invaders and re-acquire the respect for truth that they used to share with the sciences.



Humanism vs. Scientism



SEPTEMBER 3, 2013

Crimes Against Humanities

Now science wants to invade the liberal arts. Don't let it happen.

BY LEON WIESELTIER

The question of the place of science in knowledge, and in society, and in life, is not a scientific question. Science confers no special authority, it confers no authority at all, for the attempt to answer a nonscientific question. It is not for science to say whether science belongs in morality and politics and art. Those are philosophical matters, and science is not philosophy, even if philosophy has since its beginnings been receptive to science. Nor does science confer any license to extend its categories and its methods beyond its own realms, whose contours are of course a matter of debate. The credibility of physicists and biologists and economists on the subject of the meaning of life—what used to be called the ultimate verities, secularly or religiously constructed—cannot be owed to their work in physics and biology and economics, however distinguished it is. The extrapolation of larger ideas about life from the procedures and the conclusions of various sciences is quite common, but it is not in itself justified; and its justification cannot be made on internally scientific grounds, at least if the intellectual situation is not to be rigged. Science does come with a worldview, but there remains the question of whether it can suffice for the entirety of a human worldview. To have a worldview, Musil once remarked, you must have a view of the world. That is, of the whole of the world. But the reach of the scientific standpoint may not be as considerable or as comprehensive as some of its defenders maintain.



None of these strictures about the limitations of science, about its position in nonscientific or extra-scientific contexts, in any way impugns the integrity or the legitimacy or the necessity or the beauty of science. Science is a regular source of awe and betterment. No humanist in his right mind would believe otherwise. Science is plainly owed this much support, this much reverence. This much—but no more. In recent years, however, this much has been too little for certain scientists and certain scientizers, or propagandists for science as a sufficient approach to the natural universe and the human universe. In a world increasingly organized around the dazzling new breakthroughs in science and technology, they feel oddly besieged.



They claim that science is under attack, and from two sides. The first is the fundamentalist strain of Christianity, which does indeed deny the truth of certain proven scientific findings and more generally prefers the subjective gains of personal rapture to the objective gains of scientific method. Against this line of attack, even those who are skeptical about the scientizing enterprise must stand with the scientists, though it is important to point out that the errors of religious fundamentalism must not be mistaken for the errors of religion. Too many of the defenders of science, and the noisy “new atheists,” shabbily believe that they can refute religion by pointing to its more outlandish manifestations. Only a small minority of believers in any of the scriptural religions, for example, have ever taken scripture literally. When they read, most believers, like most nonbelievers, interpret. When the Bible declares that the world was created in seven days, it broaches the question of what a day might mean. When the Bible declares that God has an arm and a nose, it broaches the question of what an arm and a nose might mean. Since the universe is 13.8 billion years old, a day cannot mean 24 hours, at least not for the intellectually serious believer; and if God exists, which is for philosophy to determine, this arm and this nose cannot refer to God, because that would be stupid.



Interpretation is what ensues when a literal meaning conflicts with what is known to be true from other sources of knowledge. As the ancient rabbis taught, accept the truth from whoever utters it. Religious people, or many of them, are not idiots. They have always availed themselves of many sources of knowledge. They know about philosophical argument and figurative language. Medieval and modern religious thinking often relied upon the science of its day. Rationalist currents flourished alongside anti-rationalist currents, and sometimes became the theological norm. What was Jewish and Christian and Muslim theology without Aristotle? When a dissonance was experienced, the dissonance was honestly explored. So science must be defended against nonsense, but not every disagreement with science, or with the scientific worldview, is nonsense. The alternative to obscurantism is not that science be all there is.



The second line of attack to which the scientizers claim to have fallen victim comes from the humanities. This is a little startling, since it is the humanities that are declining in America, not least as a result of the exaggerated glamour of science. But some scientists and some scientizers feel prickly and self-pitying about the humanistic insistence that there is more to the world than science can disclose. It is not enough for them that the humanities recognize and respect the sciences; they need the humanities to submit to the sciences, and be subsumed by them. The idea of the autonomy of the humanities, the notion that thought, action, experience, and art exceed the confines of scientific understanding, fills them with a profound anxiety. It throws their totalizing mentality into crisis. And so they respond with a strange mixture of defensiveness and aggression. As people used to say about the Soviet Union, they expand because they feel encircled.



A few weeks ago this magazine published a small masterpiece of scientizing apologetics by Steven Pinker, called “Science Is Not Your Enemy.” Pinker utters all kinds of sentimental declarations about the humanities, which “are indispensable to a civilized democracy.” Nobody wants to set himself against sensibility, which is anyway a feature of scientific work, too. Pinker ranges over a wide variety of thinkers and disciplines, scientific and humanistic, and he gives the impression of being a tolerant and cultivated man, which no doubt he is. But the diversity of his analysis stays at the surface. His interest in many things is finally an interest in one thing. He is a foxy hedgehog. His essay, a defense of “scientism,” is a long exercise in assimilating humanistic inquiries into scientific ones. By the time Pinker is finished, the humanities are the handmaiden of the sciences, and dependent upon the sciences for their advance and even their survival.



Pinker tiresomely rehearses the familiar triumphalism of science over religion: “the findings of science entail that the belief systems of all the world’s traditional religions and cultures ... are factually mistaken.” So they are, there on the page; but most of the belief systems of all the world’s traditional religions and cultures have evolved in their factual understandings by means of intellectually responsible exegesis that takes the progress of science into account; and most of the belief systems of all the world’s traditional religions and cultures are not primarily traditions of fact but traditions of value; and the relationship of fact to value in those traditions is complicated enough to enable the values often to survive the facts, as they do also in Aeschylus and Plato and Ovid and Dante and Montaigne and Shakespeare. Is the beauty of ancient art nullified by the falsity of the cosmological ideas that inspired it? I would sooner bless the falsity for the beauty. Factual obsolescence is not philosophical or moral or cultural or spiritual obsolescence. Like many sophisticated people, Pinker is quite content with a collapse of sophistication in the discussion of religion.



Yet the purpose of Pinker’s essay is not chiefly to denounce religion. It is to praise scientism. Rejecting the various definitions of scientism—“it is not an imperialistic drive to occupy the humanities,” it is not “reductionism,” it is not “naïve”—Pinker proposes his own characterization of scientism, which he defends as an attempt “to export to the rest of intellectual life” the two ideals that in his view are the hallmarks of science. The first of those ideals is that “the world is intelligible.” The second of those ideals is that “the acquisition of knowledge is hard.” Intelligibility and difficulty, the exclusive teachings of science? This is either ignorant or tendentious. Plato believed in the intelligibility of the world, and so did Dante, and so did Maimonides and Aquinas and Al-Farabi, and so did Poussin and Bach and Goethe and Austen and Tolstoy and Proust. They all share Pinker’s denial of the opacity of the world, of its impermeability to the mind. They all join in his desire to “explain a complex happening in terms of deeper principles.” They all concur with him that “in making sense of our world, there should be few occasions in which we are forced to concede ‘It just is’ or ‘It’s magic’ or ‘Because I said so.’ ” But of course Pinker is not referring to their ideals of intelligibility. The ideal that he has in mind is a very particular one. It is the ideal of scientific intelligibility, which he disguises, by means of an inoffensive general formulation, as the whole of intelligibility itself.



If Pinker believes that scientific clarity is the only clarity there is, he should make the argument for such a belief. He should also acknowledge its narrowness (though within the realm of science it is very wide), and its straitening effect upon the investigation of human affairs. Instead he simply conflates scientific knowledge with knowledge as such. In his view, anybody who has studied any phenomena that are studied by science has been a scientist. It does not matter that they approached the phenomena with different methods and different vocabularies. If they were interested in the mind, then they were early versions of brain scientists. If they investigated human nature, then they were social psychologists or behavioral economists avant la lettre. Pinker’s essay opens with the absurd, but immensely revealing, contention that Spinoza, Locke, Hume, Rousseau, Kant, and Smith were scientists. It is true that once upon a time a self-respecting intellectual had to be scientifically literate, or even attempt a modest contribution to the study of the natural world. It is also true that Kant, to choose but one of Pinker’s heroes of science, made some astronomical discoveries in his early work; but Kant’s significant contributions to our understanding of mind and morality were plainly philosophical, and philosophy is not, and was certainly not for Kant, a science. Perhaps one can be a scientist without being aware that one is a scientist. What else could these thinkers have been, for Pinker? If they contributed to knowledge, then they must have been scientists, because what other type of knowledge is there? For all its geniality, Pinker’s translation of nonscientific thinking into science is no less strident a constriction than, say, Carnap’s colossally parochial dictum that “there is no question whose answer is in principle unattainable by science.” His ravenous intellectual appetite notwithstanding, Pinker is finally in the same reductionist racket. (The R-word!) He sees many locks but only one key.



The translation of nonscientific discourse into scientific discourse is the central objective of scientism. It is also the source of its intellectual perfunctoriness. Imagine a scientific explanation of a painting—a breakdown of Chardin’s cherries into the pigments that comprise them, and a chemical analysis of how their admixtures produce the subtle and plangent tonalities for which they are celebrated. Such an analysis will explain everything except what most needs explaining: the quality of beauty that is the reason for our contemplation of the painting. Nor can the new “vision science” that Pinker champions give a satisfactory account of aesthetic charisma. The inadequacy of a scientistic explanation does not mean that beauty is therefore a “mystery” or anything similarly occult. It means only that other explanations must be sought, in formal and iconographical and emotional and philosophical terms.



The scientistic reading of literary texts is similarly uninstructive, and often quite risible. I will give two examples. In 1951, Richard von Mises, an Austrian scientist and mathematician who fled the Nazis and found sanctuary at Harvard, published Positivism: A Study in Human Understanding, one of the classics of scientism, in which he argued that “a basic contrast between natural sciences and the humanities, with respect to either method or subject matter, cannot be constructed.” Such a separation, he said, would “require that between the intellectual behavior of a man and his physical organism no direct connection exists.” If we are partially explicable by science, in other words, then we must be totally explicable by science. Von Mises was a devoted reader of Rilke, and assembled an important collection of Rilke materials, which is now in Harvard’s library. In his book he included a discussion of poetry. “What the poet reports ...” he contended, “are experiences about vital interrelations between observable phenomena.” Not only narrative and dramatic verse, but also lyrical or “pure” verse, “expresses only experiences about observable facts.” As his proof-text for this scientistic understanding of poetry, von Mises cites his beloved Rilke: “For the sake of a single verse one must see many cities, men, and things; one must know the animals; one must feel how the birds fly and know the gesture with which the little flowers open up in the morning.” He believed that Rilke, of all writers, was recommending empiricism! Von Mises aridly instructed that “every poem, except in rare extreme cases, contains judgments and implicit propositions and thus becomes subject to logical analysis.” He deserved to be barred from Duino’s door.



In 1997, Jared Diamond published Guns, Germs, and Steel, another scientistic theory of everything. In one of its less charming passages, Diamond proposes “the Anna Karenina principle” for the understanding of the domestication of animals: “domesticable animals are all alike; every undomesticable animal is undomesticable in its own way.” He is mimicking the renowned opening sentence of Tolstoy’s novel: “all happy families are alike; every unhappy family is unhappy in its own way.” The adage is rather overrated, since all happy families are not alike; but here is how Diamond explicates it: “By that sentence, Tolstoy meant that, in order to be happy, a marriage must succeed in many different respects: sexual attraction, agreement about money, child discipline, religion, in-laws, and other vital issues. Failure in any one of those respects can doom a marriage even if it has all the other ingredients needed for happiness.” This is a fine instance of the incomprehension, and the buzzkill, that often attends the extension of the scientistic temperament to literature and art. Of course Tolstoy had no such sociology or self-help in mind. His proposition was a caution against generalizations about the human heart, and a strike against facile illusions of intelligibility, and an affirmation of the incommensurability, the radical particularity, of individual experience. In-laws!



What von Mises and Diamond—and Pinker—deny is that the differences between the various realms of human existence, and between the disciplines that investigate them, are final. For these scientizers, they are not differences in kind; they are differences only in appearance, whereas a deeper explanation, a scientific explanation, will expose the underlying sameness. The underlying sameness is the presumption of scientism. The scientizers do not respect the borders between the realms; they transgress the borders so as to absorb all the realms into a single realm, into their realm. They are not pluralists. With his uniform notion of intelligibility, Pinker rejects the momentous distinction between the study of the natural world and the study of the human world, as it was developed by thinkers from Vico and Dilthey to Isaiah Berlin and Bernard Williams. Here is Dilthey, in 1883: “The impossibility of deriving mental or spiritual facts from those of the mechanical order of nature—an impossibility based on the difference of their sources—does not preclude their inclusion within the system of nature. But there comes a point where the relations among the facts of the world of human spirit show themselves to be incommensurate with the uniformities of natural processes in that the facts of the human world cannot be subordinated to those established by the mechanistic conception of nature. Only then do we witness ... the boundary where knowledge of nature ends and an independent human science, shaped by its own central concerns, begins.” Some of Dilthey’s language is archaic—we no longer think of the natural universe mechanistically, and we would call his “human science” history and the humanities, and we would likely refer to “human spirit” as consciousness—but his cartography of knowledge, and the principles that justify its demarcations, remains valid. The boundary is porous, of course: whatever else we are, we are also animals, and the impact upon us of material causes is indisputable. But we are animals who live in culture; which is to say, the biological or psychological or economic elements of our constitution do not operate in sovereign independence of “the human spirit.” They are inflected and interpreted in meanings and intentions. We do not only receive material causes, we also act upon them. For this reason, we cannot be explained only in terms of our externalities. Not even our externalities can be explained only externally.



It is the irreducible reality of inwardness, and its autonomy as a category of understanding, over which Pinker, in his delirium of empirical research, rides roughshod. The humanities are the study of the many expressions of that inwardness. Pinker’s condescension to the humanities is endless. He proposes for the humanities “a consilience with science,” but the only apparent beneficiary of such an arrangement would be the humanities, since they have nothing much to offer the sciences, which obviously occupy a higher place in the hierarchy of knowledge. Or more precisely, “the humanities would enjoy more of the explanatory depth of the sciences ... [and] the sciences could challenge their theories with the natural experiments and ecologically valid phenomena that have been so richly characterized by humanists.” I am not sure I understand the latter compliment. He seems to be saying that scientists think well and humanists write well. “Consilience” is a word that should get humanists’ backs up: the convergence of which it dreams is not so much a convergence of the sciences with the humanities as a convergence of the sciences upon the humanities. Pinker’s program puts me in mind of the definition of scientism that a British philosopher offered years ago: “the belief that science, especially natural science, is much the most valuable part of human learning” and “the view that it is always good for subjects that do not belong to science to be placed on a scientific footing.” It is a more candid and more accurate definition than Pinker’s casuistry about intelligibility and difficulty. Pinker impugns humanists for inventing a straw man called scientism, and then he goes and covers himself in straw.



Pinker’s condescension to the humanities is nicely illustrated by the second of what he advertises as science’s two virtues—his criterion of difficulty: “the acquisition of knowledge is hard.” He continues: “The world does not go out of its way to reveal its workings, and even if it did, our minds are prone to illusions, fallacies, and superstitions. Most of the traditional causes of belief—faith, revelation, dogma, authority, charisma, conventional wisdom, the invigorating glow of subjective certainty—are generators of error and should be dismissed as sources of knowledge.” Science, by contrast, teaches “skepticism, open debate, formal precision, and empirical tests,” as indeed it does. Pinker seems to be saying that reason is essentially scientific. This is another one of his definitional tricks. Reason is larger than science. Reason is not scientific; science is rational. Moreover, science is not all that is rational. Philosophy and literature and history and critical scholarship also espouse skepticism, open debate, formal precision (though not of the mathematical kind), and—at the higher reaches of humanistic labor—even empirical tests. What is a novel if not the representation of simultaneous non-omniscient perspectives—skepticism in the form of narrative? In literature and the arts, there are ideas, intellectually respectable ideas, about the world, but they are not demonstrated, they are illustrated. They are not argued, they are imagined; and the imagination has rigors of its own. What the imagination imparts in the way of understanding the world should also be called knowledge. Scientists and scientizers are not the only ones working toward truths and trying to get things right.



Pinker’s self-congratulatory suggestion that only science recognizes the complexity and the obscurity of the world—his implication that in the nonscientific disciplines the acquisition of knowledge, if knowledge is even acquired, is easy—is very unimpressive. It betrays a contempt for humanistic exertion, even as he accuses the liberal arts in many universities of “cultivat[ing] a philistine indifference to science that shades into contempt.” The superiority of the sciences to the humanities in Pinker’s account is made clear by his proposed solution to the crisis in the humanities: “an infusion of new ideas,” which turns out to be an infusion of scientific ideas. There is nothing wrong with the humanities that the sciences cannot fix. Pinker is correct to hold the humanities partly complicit in their own decline, referring appositely to “the disaster of postmodernism” and “suffocating political correctness”; but he does not summon the humanities to recover their greatness and their pride. Instead he summons them to a process of scientization. The humanities, he charges, “have failed to define a progressive agenda.” There follows this unforgettable observation: “Several university presidents and provosts have lamented to me that when a scientist comes into their office, it’s to announce some exciting new research opportunity and demand the resources to pursue it. When a humanities scholar drops by, it’s to plead for respect for the way things have always been done.” Why can’t the humanities be more like the sciences, and “appeal to deans and donors”?



How lamentable for the humanities, that Pinker and the other big shots in the faculty club no longer find them sexy. Some of us, by contrast, cannot open a page of Sophocles and Tacitus and Augustine and Milton and Gibbon and Keats and Tocqueville and Emerson and Mill and Dickens and Mann and Stevens and Auerbach and Camus and Panofsky and Miłosz without a quickening in our blood. (We are the same benighted people who receive quite calmly the latest bulletins from the frontiers of neuroscience.) There is something callow about Pinker’s insistence that the humanities get with it, that they learn to keep up. He reminds me of C. P. Snow—we have been here before—and his sneering characterization of “literary intellectuals” as “the traditional culture”: “If the scientists have the future in their bones, then the traditional culture responds by wishing the future did not exist.” Snow did not grasp that “the traditional culture,” or modern literature and thought, was in many respects revolutionary, a grand project of skepticism and subversion, and that “the future” owes a great deal, for better and for worse (but the same may be said of the influence of science), to the direction in which writers and artists and philosophers and historians and critics lead the culture. Pinker is similarly blinkered; for him, too, scientists are the unacknowledged legislators of the world.



It is important to point out, therefore, that it was the imperative to keep up, to be “progressive,” which led to “the disaster of postmodernism” and other unfortunate hermeneutical fashions of recent decades. More importantly, the humanities do not advance the way the sciences advance. Once again Pinker has imposed a scientistic framework upon a nonscientistic discussion. The humanities do not progress linearly, additively, sequentially, like the sciences. The sciences were never riven by a querelle des Anciens et des Modernes, because modern scientists have no need to study ancient scientists, at least for the purposes of scientific work. The history of science is a history of errors corrected and discarded. But the vexations of philosophy and the obsessions of literature are not retired in this way. In these fields, the forward-looking cast backward glances. The history of old art and thought fuels the production of young art and thought. Scientists no longer consult Aristotle’s scientific writings, but philosophers still consult Aristotle’s philosophical writings. In this sense, the humanities, unlike the sciences, constitute a tradition, which is (in Gershom Scholem’s words) “a process that creates productivity through receptivity.” The present has the power of life and death over the past. It can choose to erase vast regions of it. Tradition is what the present calls those regions of the past that it retains, that it cherishes and needs. Contrary to the progressivist caricature, tradition is not the domination of the present by the past. It is the domination of the past by the present—the choice that we make to preserve and to love old things because we have discovered in them resources for contemporary sustenance and up-to-the-minute illumination.



It makes no sense to denounce a tradition for demanding “respect for the way things have always been done.” The humanities should make no apologies for making such a demand. It is not the dreary reactionary intercession that Pinker makes it out to be—unless, of course, there really is nothing more to be learned from the way things have always been done. Where is the man who can honestly say that this is so? Scientizers—and presidents and provosts and deans and donors—may regard such a release from custom as a liberation, but for humanists it represents a calamity, a terrible self-inflicted wound on the self and the culture. There are moments when there is nothing more urgent than the defense of what has already been accomplished. A threat to what one values cannot be met by a desire for something else. In his opposition to postmodernist theories of science, for example, and to other misappropriations of the mantle of science, Pinker is correct to be unswayed by the rustle of the new and to speak for conformity to the established understandings. Sometimes wisdom is conventional. The denigration of conventional wisdom is itself a convention.



In demanding respect for the way things have always been done, one is not demanding an end to new ways of doing things. Tradition is a body of accumulated innovations, some of them evolving smoothly from precedent, some of them more of a rupture with earlier methods and conclusions. The chronicle of the humanities is the chronicle of different techniques for interpreting the humanities. They do not all go together, like the canon itself; and the internecine tensions, which result from the workings of originality in even the most hidebound pursuits, provide many of the thrills of humanistic learning. Pinker concedes that “there can be no replacement for the varieties of close reading, thick description, and deep immersion that erudite scholars can apply to individual works,” but really he is bored by all those old practices and wants the humanities to move on. They need to be saved; they need to be saved by something other than themselves; they need to be saved by science. “A consilience with science offers the humanities countless possibilities for innovation in understanding.” There follows the usual breathless list of contemporary scientific excitements: neuroscience, since “art, culture, and society are products of human brains”; and linguistics, cognitive psychology, behavioral genetics, and evolutionary psychology; and data science, or “big data.”







Pinker justifiably deplores the dogmatic resistance of certain humanists to the encounter of their fields with these sciences. The search for knowledge is catch as catch can; accept the truth from whoever utters it. Yet his examples of particular humanities rescued by particular sciences are rather underwhelming. “Linguistics can illuminate the resources of grammar and discourse that allow authors to manipulate a reader’s imaginary experience.” But those are technical matters, not matters of meaning. How art works is not the most penetrating question that can be asked about it. Many years ago I attended a lecture by Roman Jakobson on patterns of consonant placement in Baudelaire’s “Le Chat,” and it was the least enlightening discussion of a poem I ever heard. “Behavioral genetics can update folk theories of parental influence with discoveries about the effects of genes, peers, and chance, which have profound implications for the interpretation of biography and memoir.” Profound? I think not. Whatever its genetic roots, a man’s experience of his father is his experience of his father, and the representation of that relationship in a biography or a memoir demands empathy and probity more than a hunt for phenotypes. “Evolutionary psychologists can distinguish the obsessions that are universal from those that are exaggerated by a particular culture and can lay out the inherent conflicts and confluences of interest within families, couples, friendships, and rivalries that are the drivers of plot.” How is Madame Bovary more deeply explicated by the suggestion that an interest in adultery by a writer in Croisset in the 1850s was not common to his contemporaries in Africa? Can Pinker really believe that these “drivers of plot” were hidden from readers and scholars until evolutionary psychology came along? Or is it that we did not really know them until our paltry comp-lit intuitions were provided with scientistic foundations? And surely the evolutionary dimensions of Middlemarch and In Search of Lost Time and Herzog are their least significant dimensions—a distraction from the real challenge of such books, which is the exploration of subjectivity and what is lived. What makes “conflicts and confluences” interesting in a work of art is that they are intentions formed by values and desires, not outcomes fixed by chromosomes. The scientific reading of a novel’s plot may thus be both true and marginal. This condemns scientizers who meddle in the humanities to a permanent condition of bafflement.



Pinker concludes his inventory of all the favors that the sciences can do for the humanities with a paean to “an expansive new ‘digital humanities,’ ” in which “the possibilities for theory and discovery are limited only by the imagination and include the origin and spread of ideas, networks of intellectual and artistic influence, the persistence of historical memory, the waxing and waning of themes in literature, and patterns of unofficial censorship and taboo.” Those were all familiar subjects for the pre-digital humanities, though there is no doubt that those subjects are about to enjoy, or endure, a quantitative windfall. The problem, of course, is what to make of the role of quantification. It is certainly the case that the more scholars know, the better for scholarship. But “allatonceness”—McLuhan’s term has been adopted by advocates of the digital humanities—brings its own anxieties. All data points are not equally instructive or equally valuable, intellectually and historically. Judgments will still have to be made; frameworks for evaluation will still have to be readied against the informational deluge. Those judgments and those frameworks will be modified and refined by the data, but they cannot be dictated by the data. Search, search, search, but reflect. If the digital humanities depend heavily on crowd-sourcing, somebody will have to vouch for the reliability of the crowd. Or will the validations also be crowd-sourced? Surely the republic of letters cannot be wiki’ed. Learning is a collective endeavor and requires communities, but I am not prepared to renounce the romance of scholarly solitude or the gains that accrue to erudition from the lone erudit.



The inundation of historical and humanistic scholarship by patterns will also broach the question of the explanatory power of patterns. (The question occurred to me as Jakobson diligently collated all the r’s in Baudelaire’s poem.) As even some partisans of big data have noted, the massive identification of regularities and irregularities can speak to “what” but not to “why”: they cannot recognize causes and reasons, which are essential elements of humanistic research. And in some agitators for the digital humanities I detect a certain mischievousness that does not inspire confidence: they speak exaltedly of “versioning,” for instance, which is “favored over definitive editions and research silos”: “there is space to iterate and test, to create precarious experiments that are speculative, ludic, or even impossible.” This sounds like a lot of fun. I am not sure what it has to do with the expansion of scholarship.



I do not mean to be altogether churlish about the possibilities, or to confine the humanities to ghostly paleographers in the Bodleian reading room. The technological revolution will certainly transform and benefit the humanities, as it has transformed and benefited many disciplines and vocations. But it may also mutilate and damage the humanities, as it has mutilated and damaged many disciplines and vocations. My point is only that shilling for the revolution is not what we need now. The responsibility of the intellectual toward the technologies is no longer (if it ever was) mere enthusiasm. The magnitude of the changes wrought by the new machines calls for the revival of a critical temper. Too much is at stake to make do with that cool vanguard feeling. But Pinker is just another enthusiast, just another cutting-edge man, waxing on like everybody else about how “this is an extraordinary time” because “powerful tools have been developed” and so on. It is more of the general inebriation. We get it, we get it. With his dawn-is-breaking scientistic cheerleading, Pinker shows no trace of the skepticism whose absence he deplores in others. His sunny scientizing blurs distinctions and buries problems. If there was one thing for which the humanities, the old humanities, the wearyingly traditional humanities, could be counted on, it was to introduce us also to the darkness and prepare us also for the worst.



Leon Wieseltier is the literary editor of The New Republic.



Wilson



By KEVIN BAKER

Published: September 19, 2013
No American president was more improbable than Thomas Woodrow Wilson. None better embodied how we like to think of ourselves in the greater world.


WILSON



By A. Scott Berg



Illustrated. 818 pp. G. P. Putnam’s Sons. $40.

A Princeton University president and political economy professor given to making high-minded speeches and advocating a parliamentary system, Wilson held no public office until he was 54 years old. Recruited to run for governor of New Jersey in 1910 by a Democratic machine boss who thought he would be easily controlled, the prof schooled the pro in practical politics, passing a reform agenda that curbed the power of parties and corporations alike. “After dealing with college politicians,” he gibed, “I find that the men with whom I am dealing now seem like amateurs.”



Adroitly riding the progressive wave breaking over the country, Wilson took the presidency two years later, only the second Democrat to capture the White House since the Civil War. He possessed a rare instinct for power and how to use it. Once in Washington he put his theories to the test, audaciously choosing to rule more as a prime minister than a traditional chief executive. Within 10 months he had passed a progressive agenda that had been stalled for a generation, slashing tariff rates that protected monopolies, passing the first permanent federal income tax and creating the Federal Reserve system to end the bank panics that continually ravaged the American economy. More reforms — to bolster antitrust laws, discourage child labor and inaugurate the eight-hour day and workers’ compensation — followed.



Handsome and charismatic, Wilson was our first modern president, holding regular news conferences, complaining about having to live in Washington and delighting in popular distractions like baseball games, detective stories, golf and especially the new moving pictures. He adored women and had remarkably modern partnerships with them, sharing every aspect of his work and his ideas with his wife, Ellen, and, after she died, with his second wife, Edith. He also had a longtime — and apparently platonic — female friend.



A. Scott Berg tells the story of Wilson, the man, very well indeed. The author of four previous prizewinning, best-selling biographies, he has a novelist’s eye for the striking detail, and a vivid prose style.



He is on less sure footing when it comes to Wilson, the statesman. Too often, he relies on shoddy sources that distort the historical record. The Black Death recurred frequently, but it did not last for 400 years. Henry Cabot Lodge was not a right-winger, the Royal Navy did not take “a timorous approach” to German U-boats and Winston Churchill did not believe that “America should have minded its own business and stayed out of the world war.”



Berg gives us little on the vital economic debates of the Progressive Era, and only a perfunctory comparison of Wilson’s “New Freedom” and Teddy Roosevelt’s “New Nationalism.” There is barely a mention of the Pujo committee’s investigations into our financial system, which made many of Wilson’s reforms possible, and no attempt to assess the long-term effects of these reforms.



He does better on issues like women’s rights and especially race. Wilson, a Virginia native steeped in the lore of the “Lost Cause,” stuffed his cabinet full of bigoted Southern mediocrities, who cruelly segregated federal offices, cafeterias and washrooms for the first time. When a black journalist and Wilson supporter, William Monroe Trotter, protested too persistently, the president ordered him out of his office.



Both his temper and his injudicious selection of advisers were indicative of flaws that would come to devour his presidency. Wilson attracted some of the most talented figures in American political history to his administration and his causes — Franklin Roosevelt, Louis Brandeis, Herbert Hoover, Walter Lippmann and Bernard Baruch, among others — but too often he failed to delegate well, routinely writing his own speeches and even typing his own policy papers. Absolute loyalty was valued over candor. Again and again, Wilson broke with his closest associates when he felt they had betrayed him.



Despite these tendencies, he managed much of the war effort brilliantly, delivering a surprisingly effective army of more than two million men to France by the end of 1918. The United States stumbled onto the world stage a full-blown colossus, turning overnight from the world’s largest debtor nation to practically its sole creditor. Arriving in Europe to negotiate the peace, Wilson was greeted with an ecstasy no American president had ever matched, hailed as the savior of mankind.



He was quickly wrestled back to earth by his allies, the French premier Georges Clemenceau and the British prime minister David Lloyd George, and embroiled in endless, debilitating conferences on every detail of reconstructing the world. Wilson’s always fragile constitution began to break down. He suffered repeated cerebral episodes in Paris — probably strokes, perhaps even the early onset of dementia — that drove him into fits of paranoia and incoherence.



Wilson nonetheless carried his main objective back to America, a treaty for a “League of Nations,” intended to prevent future wars. Ratification required support from Republican senators he had needlessly antagonized and cut out of the diplomatic process, and when they demanded changes to the treaty he refused. The Senate, led by the waspish Lodge, responded with a campaign of insult and filibuster. Wilson tried to take his case to the people, embarking on an arduous speaking tour of the West, but there he broke down once and for all. Rushed back to Washington, he suffered a crushing stroke that left him an invalid for the rest of his life.



The government professor now put the Constitution through an acid test. For over a month, Wilson’s contact with the outside world was limited largely to his wife, Edith, and his doctor, and he remained in his bedroom for nearly all of his last year and a half in office. Rumors flew that the president had gone mad, while the country descended into bloody chaos. Corporate America crushed the country’s labor unions, and white mobs attacked black communities. The dark side of Wilson’s war effort had been a series of restrictive laws, censorship decrees and organized vigilantism designed to silence dissent, leaving the country, as Berg states, in “a period of repression as egregious as any in American history.” Now his most abysmal appointment of all, Attorney General A. Mitchell Palmer, used these wartime statutes to raid homes and social clubs throughout the nation — and inject into our political system the hardy plague bacillus of J. Edgar Hoover.



To the end, Wilson refused to compromise even though, as Berg points out, the changes Lodge insisted on were to a large degree cosmetic, and would have preserved the League. Wilson let it die instead, living out the last five years of his life as a shuffling wreck of a man.



Here begins the enduring national legend of Wilson as Christ, the American leader clean of hands and noble at heart, betrayed by perfidious Europe and the “little group of willful men” back home in the Senate. Wilson’s final struggle is indeed a tragedy and Berg plumbs its depths, but once again he elides the broader context.



Nowhere does he address Margaret MacMillan’s arguments in “Paris 1919” that the whole idea of a tragic peace is overstated — that deconstructing the ancient empires leveled by World War I was too complicated a task to have ever gone well, and that there was no conceivable peace the Germans would not have resented.



Yes, we should have joined Wilson’s League. But how much would a deeply isolationist and distracted America have wanted to intervene in the Europe of the 1930s? How much would England and France have allowed us to do so? In short, did Woodrow Wilson’s martyrdom really matter so much in the end . . . or is it more a story we like to tell ourselves?





Kevin Baker’s latest novel is “The Big Crowd.”



Friday, September 20, 2013

Book Banning - 2013

N. Carolina County Bans ‘Invisible Man’


All copies of Ralph Ellison's National Book Award–winning novel Invisible Man will be removed from a North Carolina county's school libraries. The Randolph County Board of Education voted, Monday, to ban the critically acclaimed 1952 book from its reading list. Invisible Man was named one of the top 100 English-language novels of the 20th century by Time magazine in 2010, but according to the parent whose complaint sparked the vote, "This book is filthier, too much for teenagers." The board's chair said he thought the novel was "a hard read," while another board member said he "didn't find any literary value" in it.



The New Civil War - 2013



The Crazy Party

By PAUL KRUGMAN

Published: September 19, 2013



Early this year, Bobby Jindal, the governor of Louisiana, made headlines by telling his fellow Republicans that they needed to stop being the “stupid party.” Unfortunately, Mr. Jindal failed to offer any constructive suggestions about how they might do that. And, in the months that followed, he himself proceeded to say and do a number of things that were, shall we say, not especially smart.


Nonetheless, Republicans did follow his advice. In recent months, the G.O.P. seems to have transitioned from being the stupid party to being the crazy party.



I know, I’m being shrill. But as it grows increasingly hard to see how, in the face of Republican hysteria over health reform, we can avoid a government shutdown — and maybe the even more frightening prospect of a debt default — the time for euphemism is past.



It helps, I think, to understand just how unprecedented today’s political climate really is.



Divided government in itself isn’t unusual and is, in fact, more common than not. Since World War II, there have been 35 Congresses, and in only 13 of those cases did the president’s party fully control the legislature.



Nonetheless, the United States government continued to function. Most of the time divided government led to compromise; sometimes to stalemate. Nobody even considered the possibility that a party might try to achieve its agenda, not through the constitutional process, but through blackmail — by threatening to bring the federal government, and maybe the whole economy, to its knees unless its demands were met.



True, there was the government shutdown of 1995. But this was widely recognized after the fact as both an outrage and a mistake. And that confrontation came just after a sweeping Republican victory in the midterm elections, allowing the G.O.P. to make the case that it had a popular mandate to challenge what it imagined to be a crippled, lame-duck president.



Today, by contrast, Republicans are coming off an election in which they failed to retake the presidency despite a weak economy, failed to retake the Senate even though far more Democratic than Republican seats were at risk, and held the House only through a combination of gerrymandering and the vagaries of districting. Democrats actually won the popular ballot for the House by 1.4 million votes. This is not a party that, by any conceivable standard of legitimacy, has the right to make extreme demands on the president.



Yet, at the moment, it seems highly likely that the Republican Party will refuse to fund the government, forcing a shutdown at the beginning of next month, unless President Obama dismantles the health reform that is the signature achievement of his presidency. Republican leaders realize that this is a bad idea, but, until recently, their notion of preaching moderation was to urge party radicals not to hold America hostage over the federal budget so they could wait a few weeks and hold it hostage over the debt ceiling instead. Now they’ve given up even on that delaying tactic. The latest news is that John Boehner, the speaker of the House, has abandoned his efforts to craft a face-saving climbdown on the budget, which means that we’re all set for shutdown, possibly followed by debt crisis.



How did we get here?



Some pundits insist, even now, that this is somehow Mr. Obama’s fault. Why can’t he sit down with Mr. Boehner the way Ronald Reagan used to sit down with Tip O’Neill? But O’Neill didn’t lead a party whose base demanded that he shut down the government unless Reagan revoked his tax cuts, and O’Neill didn’t face a caucus prepared to depose him as speaker at the first hint of compromise.



No, this story is all about the G.O.P. First came the southern strategy, in which the Republican elite cynically exploited racial backlash to promote economic goals, mainly low taxes for rich people and deregulation. Over time, this gradually morphed into what we might call the crazy strategy, in which the elite turned to exploiting the paranoia that has always been a factor in American politics — Hillary killed Vince Foster! Obama was born in Kenya! Death panels! — to promote the same goals.



But now we’re in a third stage, where the elite has lost control of the Frankenstein-like monster it created.



So now we get to witness the hilarious spectacle of Karl Rove in The Wall Street Journal, pleading with Republicans to recognize the reality that Obamacare can’t be defunded. Why hilarious? Because Mr. Rove and his colleagues have spent decades trying to ensure that the Republican base lives in an alternate reality defined by Rush Limbaugh and Fox News. Can we say “hoist with their own petard”?



Of course, the coming confrontations are likely to damage America as a whole, not just the Republican brand. But, you know, this political moment of truth was going to happen sooner or later. We might as well have it now.



Thursday, September 19, 2013

With James Meredith

I was in Tunica this week. Yesterday, I visited the local public library. I noticed that James Meredith was coming today to lecture on improving public education for our children and to promote his book, A Mission From God: A Memoir and Challenge for America. So, I attended the program this afternoon.

I arrived early enough to chat briefly with Mr. Meredith beforehand. Yesterday, while visiting the library, I met the head librarian, and she introduced me. Mr. Meredith is now 80, and his passion is educating children and uplifting people from poverty.

He said during his lecture that he has written over 20 books. His degree, earned in 1963 when he became the first black graduate of Ole Miss, is in Political Science, History, and French. He went on to earn a law degree from Columbia. He now lives in Jackson, Mississippi.

His overarching point is that the problem of public education is a breakdown of moral character, and that religion can rectify that problem.

Tuesday, September 17, 2013

Depressing Brain Power

Tuesday, Sep 17, 2013 07:43 AM CDT

Scientists’ depressing new discovery about the brain

Forget the dream that education, scientific evidence or reason can help people make good decisions

By Marty Kaplan
Yale Law School professor Dan Kahan’s new research paper is called “Motivated Numeracy and Enlightened Self-Government,” but for me a better title is the headline on science writer Chris Mooney’s piece about it in Grist: “Science Confirms: Politics Wrecks Your Ability to Do Math.”



Kahan conducted some ingenious experiments about the impact of political passion on people’s ability to think clearly. His conclusion, in Mooney’s words: partisanship “can even undermine our very basic reasoning skills…. [People] who are otherwise very good at math may totally flunk a problem that they would otherwise probably be able to solve, simply because giving the right answer goes against their political beliefs.”



In other words, say goodnight to the dream that education, journalism, scientific evidence, media literacy or reason can provide the tools and information that people need in order to make good decisions. It turns out that in the public realm, a lack of information isn’t the real problem. The hurdle is how our minds work, no matter how smart we think we are. We want to believe we’re rational, but reason turns out to be the ex post facto way we rationalize what our emotions already want to believe.



For years my go-to source for downer studies of how our hard-wiring makes democracy hopeless has been Brendan Nyhan, an assistant professor of government at Dartmouth.



Nyhan and his collaborators have been running experiments trying to answer this terrifying question about American voters: Do facts matter?



The answer, basically, is no. When people are misinformed, giving them facts to correct those errors only makes them cling to their beliefs more tenaciously.



Here’s some of what Nyhan found:

•People who thought WMDs were found in Iraq believed that misinformation even more strongly when they were shown a news story correcting it.

•People who thought George W. Bush banned all stem cell research kept thinking he did that even after they were shown an article saying that only some federally funded stem cell work was stopped.

•People who said the economy was the most important issue to them, and who disapproved of Obama’s economic record, were shown a graph of nonfarm employment over the prior year – a rising line, adding about a million jobs. They were asked whether the number of people with jobs had gone up, down or stayed about the same. Many, looking straight at the graph, said down.

•But if, before they were shown the graph, they were asked to write a few sentences about an experience that made them feel good about themselves, a significant number of them changed their minds about the economy. If you spend a few minutes affirming your self-worth, you’re more likely to say that the number of jobs increased.

In Kahan’s experiment, some people were asked to interpret a table of numbers about whether a skin cream reduced rashes, and some people were asked to interpret a different table – containing the same numbers – about whether a law banning private citizens from carrying concealed handguns reduced crime. Kahan found that when the numbers in the table conflicted with people’s positions on gun control, they couldn’t do the math right, though they could when the subject was skin cream. The bleakest finding was that the more advanced that people’s math skills were, the more likely it was that their political views, whether liberal or conservative, made them less able to solve the math problem.
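To make the arithmetic concrete, here is a minimal sketch, in Python, of the kind of 2x2 comparison such a task requires; the counts below are hypothetical stand-ins, not the figures from Kahan’s study. Getting the problem “right” means comparing the rate of improvement within each condition rather than the raw counts, which is exactly the step partisans tend to skip when the labels touch on gun control.

# A hypothetical 2x2 covariance-detection task, loosely modeled on the
# skin-cream version described above. The numbers are illustrative only.

def improvement_rate(improved, worsened):
    """Share of cases in one condition that got better."""
    return improved / (improved + worsened)

# Condition A: used the skin cream; Condition B: did not.
a_improved, a_worsened = 223, 75
b_improved, b_worsened = 107, 21

rate_a = improvement_rate(a_improved, a_worsened)
rate_b = improvement_rate(b_improved, b_worsened)

# The tempting shortcut is to compare raw counts (223 improved vs. 107),
# which suggests the cream works. The correct move is to compare rates.
print(f"Rate of improvement with cream:    {rate_a:.0%}")   # about 75%
print(f"Rate of improvement without cream: {rate_b:.0%}")   # about 84%
print("Cream appears to help" if rate_a > rate_b else "Cream appears not to help")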



I hate what this implies – not only about gun control, but also about other contentious issues, like climate change. I’m not completely ready to give up on the idea that disputes over facts can be resolved by evidence, but you have to admit that things aren’t looking so good for reason. I keep hoping that one more photo of an iceberg the size of Manhattan calving off of Greenland, one more stretch of record-breaking heat and drought and fires, one more graph of how atmospheric carbon dioxide has risen in the past century, will do the trick. But what these studies of how our minds work suggest is that the political judgments we’ve already made are impervious to facts that contradict us.



Maybe climate change denial isn’t the right term; it implies a psychological disorder. Denial is business-as-usual for our brains. More and better facts don’t turn low-information voters into well-equipped citizens; they just make them more committed to their misperceptions. In the entire history of the universe, no Fox News viewer has ever changed their mind because some new data upended their thinking. When there’s a conflict between partisan beliefs and plain evidence, it’s the beliefs that win. The power of emotion over reason isn’t a bug in our human operating systems; it’s a feature.



Sunday, September 15, 2013

The Birmingham Bombings After 50 Years



50 Years Later: The Sixteenth Street Baptist Church Bombing

by Lottie L. Joiner Sep 15, 2013 9:58 AM EDT

Fifty years ago, four little girls were tragically killed during a bombing at the Sixteenth Street Baptist Church. Birmingham native Carolyn McKinstry remembers that day.

It was only a few weeks ago when the nation commemorated the 50th Anniversary of the March on Washington for Jobs and Freedom. It was an opportunity to look at the progress of African Americans over the past five decades and a reminder that the fight for social justice, civil rights and human rights is a constant battle.





Unidentified mourners, who overflowed the church, stand across the street during funeral services for 14-year-old Carol Robertson, Sept. 17, 1963, Birmingham, Ala. The girl was one of four young African Americans killed in a bomb blast the previous Sunday. (AP)



Before addressing the crowd, President Obama joined the King Family—Martin III, Dexter, and Bernice—in ringing a bell that once hung in the Sixteenth Street Baptist Church in Birmingham, Ala. Less than a month after Martin Luther King Jr. made his historic “I Have a Dream” speech, the 16th Street Baptist Church was bombed, killing four black girls: Denise McNair, 11; Carol Robertson, 14; Cynthia Wesley, 14; and Addie Mae Collins, 14.



Sixteenth Street Baptist Church was the first and largest black church in Birmingham. Located in the heart of downtown, it was known to host historic figures such as Thurgood Marshall, W.E.B. DuBois, and later Hillary Clinton and a junior senator from Illinois who would later become America’s first black president. During the 1960s, 16th Street was the hub of the city’s civil rights activities. There, civil rights activists strategized, held mass meetings, sponsored rallies, and planned demonstrations in the fight against segregation.



At the height of the Civil Rights Movement, Birmingham was known as Bombingham. By the fall of 1963, there had been more than 80 unsolved bombings in the city, including the home of A.D. King, Martin Luther King Jr.’s brother.



But September 15, 1963, would go down in history as a day like no other. A bomb, planted in the church’s basement, exploded, killing the four girls and injuring many others. It was “a moment that the world would never forget,” Lonnie Bunch told The Washington Post. Bunch is the director of the Smithsonian’s National Museum of African American History and Culture. Shards of glass from the historic church were recently donated to the museum, which is scheduled to open in 2015.




For the first 14 years after the bombing no one was arrested. Robert Chambliss was eventually convicted of the crime in 1977. And three decades after that horrific explosion, Thomas Blanton Jr. and Bobby Frank Cherry were convicted of murder and sentenced to life in prison. It would be the only bombing solved in Birmingham.



Earlier this year, President Obama signed a bill awarding the Congressional Gold Medal to the victims of the bombing. On Sept. 10, the families of the four girls gathered in Washington, D.C., for the bipartisan ceremony at the U.S. Capitol. The medal will be housed at the Birmingham Civil Rights Institute, located across the street from the Sixteenth Street Baptist Church.



Carolyn McKinstry was the 15-year-old Sunday School secretary of Sixteenth Street Baptist Church when it was bombed 50 years ago and was friends with the four girls who died during the explosion. Terrorism was a way of life for children growing up in segregated Birmingham, said McKinstry, who documented that time in her book, While the World Watched: A Birmingham Bombing Survivor Comes of Age during the Civil Rights Movement. She talks about that tragic day and the impact the death of four little girls had on the Civil Rights Movement.



Joiner: Take me back to that day 50 years ago, September 15, 1963.



McKinstry: At the end of Sunday School, I would get up and make a report. Around 10:15 a.m., I got up to collect the reports. I started upstairs. You had to pass the girls bathroom. I paused at the doorway because they [Addie Mae, Carol, Cynthia, Denise] were all standing there, combing their hair, playing and talking. We were all good friends and we were excited about two things that Sunday. It was Youth Sunday and that meant we got to do everything. We were the choir. We were the ushers, the speakers. The second thing was after church we were going to have a gathering with punch and dancing. I knew my report had to be done at a certain time, so I went on up the stairs. When I got to the office, the phone was ringing. The caller on the other end of the phone said: “Three minutes.” Male caller. But he hung up just as quickly as he said that. I stepped out into the sanctuary to get more reports and I only took about 15 steps into the sanctuary and the bomb exploded.



What did you do?



When the bomb exploded it felt like the building shook. Everything came crashing in, the glass and the windows in the church. I fell on the floor because someone said, “Hit the floor.” We were all on the floor for just a couple of seconds. And then I could hear people getting up and running out. I got up and I went outside. I was looking for my two little brothers. One of the first things we noticed was that the church was already surrounded by policemen. People were in panic mode. They were everywhere looking for their family members.



How did you learn that the four young ladies died?



When I went home that Sunday, I remember one or two people calling my mother looking for their children. One of them was Mrs. Robertson, Carol’s mother. Later, somebody else called and said that the girls in the bathroom never made it out. My heart jumped. I knew who they were talking about. I was shocked. I was numb. The bomb exploded on Sunday at 10:22 a.m. On Monday morning at 8 o’clock, I was sitting in my classroom. No one said anything. No one said, “Let’s have a moment of silent prayer.” No one said, “Let’s have a memorial. Let’s talk about it.” Even in my home we didn’t talk about it. My parents never said, are you okay? Do you miss your friends? Are you afraid? I think the reason we didn’t talk about it primarily was because there was nothing we could do about it.



What stands out most about that day?



The first thing that stands out is the pain of that day. How horrible it was and learning that my friends had died. The second thing that stands out is that no one responded. No one did anything. For the first 14 years after the bombing of the church no one was arrested. Nothing happened. The police and FBI acted as though they didn’t have any evidence or enough evidence. But the police would later say they did not feel they could get a conviction in Birmingham. The mood of the community was such that they did not think white people were going to convict one of their own for the death of black children. But the truth was in Birmingham no one thought that black life was important. It didn’t matter that blacks were killed, that little girls were killed in Sunday School.



How did the bombing of Sixteenth Street impact Birmingham?



It gave us a reputation that we didn’t want. There is nowhere in the world that you can go that people don’t know this story. That’s how horrific it was. And how people saw what we had done. When we finally prosecuted someone 14 years later and then 32 years later, I think it was because we received pressure from the rest of the world. You know how people can shame you? You want to make amends. That one image we could never get rid of: Killing babies in church all in the name of segregation. So I think when we began the prosecuting of the last two men, it was an attempt to say we have changed. We are a different nation.



What impact did the bombing have on the Civil Rights Movement and America?



It softened the heart of the oppressors. What Dr. King said to us was that unmerited suffering was always redemptive. He also said that the blood of these girls might well serve as a redemptive force not only for Birmingham, Ala., but for the rest of the world. We may yet see something very horrible become a force for good. And I think that is what we saw to a large extent. The following year we saw the signing of the civil rights legislation.



How would you describe Birmingham today?



I think that Birmingham is a city that is on its way. We have not arrived yet. We really have made some good strides toward showing that we value all of our citizens. It’s a place that’s openly integrated now in all aspects. In 2000, we had an amendment on the ballot to remove the ban on interracial marriage. So we just did that in 2000. In the meantime, the state still functions under the 1901 Constitution. The governor has put together a commission to re-write the Constitution for Alabama.