Monday, April 29, 2013

R.B. Bernstein -- Thomas Jefferson

This is the best brief biography of Jefferson.  I first read the book when it was published in 2003.  After deciding to do some Jefferson reading, I read it again.  It is a good reference.

The Phoniness of Austerity

The Story of Our Time
By PAUL KRUGMAN

Published: April 28, 2013

Those of us who have spent years arguing against premature fiscal austerity have just had a good two weeks. Academic studies that supposedly justified austerity have lost credibility; hard-liners in the European Commission and elsewhere have softened their rhetoric. The tone of the conversation has definitely changed.


My sense, however, is that many people still don’t understand what this is all about. So this seems like a good time to offer a sort of refresher on the nature of our economic woes, and why this remains a very bad time for spending cuts.



Let’s start with what may be the most crucial thing to understand: the economy is not like an individual family.



Families earn what they can, and spend as much as they think prudent; spending and earning opportunities are two different things. In the economy as a whole, however, income and spending are interdependent: my spending is your income, and your spending is my income. If both of us slash spending at the same time, both of our incomes will fall too.
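To make that interdependence concrete, here is a toy sketch (my own illustration, not Krugman's model) of a two-household economy in which each household's income is simply the other household's spending, and each spends a fixed base amount plus half of whatever income it takes in. All of the numbers are invented.

```python
# A toy two-household economy: each household's income is the other's spending,
# and each spends a fixed "base" amount plus half of whatever income it receives.
# The figures are invented purely for illustration.

def settle(base_a, base_b, propensity=0.5, rounds=200):
    """Iterate the spending/income loop until incomes settle down."""
    income_a = income_b = 0.0
    for _ in range(rounds):
        spend_a = base_a + propensity * income_a
        spend_b = base_b + propensity * income_b
        income_a, income_b = spend_b, spend_a  # my spending is your income
    return round(income_a), round(income_b)

print("Incomes before the cut:", settle(base_a=100, base_b=100))  # (200, 200)
print("Incomes after the cut: ", settle(base_a=80, base_b=80))    # (160, 160)
```

Each household trims its base spending by 20, yet both incomes end up 40 lower, because the cut comes back around through the other household's reduced spending.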



And that’s what happened after the financial crisis of 2008. Many people suddenly cut spending, either because they chose to or because their creditors forced them to; meanwhile, not many people were able or willing to spend more. The result was a plunge in incomes that also caused a plunge in employment, creating the depression that persists to this day.



Why did spending plunge? Mainly because of a burst housing bubble and an overhang of private-sector debt — but if you ask me, people talk too much about what went wrong during the boom years and not enough about what we should be doing now. For no matter how lurid the excesses of the past, there’s no good reason that we should pay for them with year after year of mass unemployment.



So what could we do to reduce unemployment? The answer is, this is a time for above-normal government spending, to sustain the economy until the private sector is willing to spend again. The crucial point is that under current conditions, the government is not, repeat not, in competition with the private sector. Government spending doesn’t divert resources away from private uses; it puts unemployed resources to work. Government borrowing doesn’t crowd out private investment; it mobilizes funds that would otherwise go unused.



Now, just to be clear, this is not a case for more government spending and larger budget deficits under all circumstances — and the claim that people like me always want bigger deficits is just false. For the economy isn’t always like this — in fact, situations like the one we’re in are fairly rare. By all means let’s try to reduce deficits and bring down government indebtedness once normal conditions return and the economy is no longer depressed. But right now we’re still dealing with the aftermath of a once-in-three-generations financial crisis. This is no time for austerity.



O.K., I’ve just given you a story, but why should you believe it? There are, after all, people who insist that the real problem is on the economy’s supply side: that workers lack the skills they need, or that unemployment insurance has destroyed the incentive to work, or that the looming menace of universal health care is preventing hiring, or whatever. How do we know that they’re wrong?



Well, I could go on at length on this topic, but just look at the predictions the two sides in this debate have made. People like me predicted right from the start that large budget deficits would have little effect on interest rates, that large-scale “money printing” by the Fed (not a good description of actual Fed policy, but never mind) wouldn’t be inflationary, that austerity policies would lead to terrible economic downturns. The other side jeered, insisting that interest rates would skyrocket and that austerity would actually lead to economic expansion. Ask bond traders, or the suffering populations of Spain, Portugal and so on, how it actually turned out.



Is the story really that simple, and would it really be that easy to end the scourge of unemployment? Yes — but powerful people don’t want to believe it. Some of them have a visceral sense that suffering is good, that we must pay a price for past sins (even if the sinners then and the sufferers now are very different groups of people). Some of them see the crisis as an opportunity to dismantle the social safety net. And just about everyone in the policy elite takes cues from a wealthy minority that isn’t actually feeling much pain.



What has happened now, however, is that the drive for austerity has lost its intellectual fig leaf, and stands exposed as the expression of prejudice, opportunism and class interest it always was. And maybe, just maybe, that sudden exposure will give us a chance to start doing something about the depression we’re in.



Sunday, April 28, 2013

W the Worst Ever

April 27, 2013, 2:00 pm
by Paul Krugman
The Great Degrader

I’ve been focused on economic policy lately, so I sort of missed the big push to rehabilitate Bush’s image; also, as a premature anti-Bushist who pointed out how terrible a president he was back when everyone else was praising him as a Great Leader, I’m kind of worn out on the subject.



But it does need to be said: he was a terrible president, arguably the worst ever, and not just for the reasons many others are pointing out.



From what I’ve read, most of the pushback against revisionism focuses on just how bad Bush’s policies were, from the disaster in Iraq to the way he destroyed FEMA, from the way he squandered a budget surplus to the way he drove up Medicare’s costs. And all of that is fair.



But I think there was something even bigger, in some ways, than his policy failures: Bush brought an unprecedented level of systematic dishonesty to American political life, and we may never recover.



Think about his two main “achievements”, if you want to call them that: the tax cuts and the Iraq war, both of which continue to cast long shadows over our nation’s destiny. The key thing to remember is that both were sold with lies.



I suppose one could make an argument for the kind of tax cuts Bush rammed through — tax cuts that strongly favored the wealthy and significantly increased inequality. But we shouldn’t forget that Bush never admitted that his tax cuts did, in fact, favor the wealthy. Instead, his administration canceled the practice of making assessments of the distributional effects of tax changes, and in their selling of the cuts offered what amounted to an expert class in how to lie with statistics. Basically, every time the Bushies came out with a report, you knew that it was going to involve some kind of fraud, and the only question was which kind and where.



And no, this wasn’t standard practice before. Politics ain’t beanbag and all that, but the president as con man was a new character in American life.



Even more important, Bush lied us into war. Let’s repeat that: he lied us into war. I know, the apologists will say that “everyone” believed Saddam had WMD, but the truth is that even the category “WMD” was a con game, lumping together chemical weapons with nukes in an illegitimate way. And any appearance of an intelligence consensus before the invasion was manufactured: dissenting voices were suppressed, as anyone who was reading Knight-Ridder (now McClatchy) knew at the time.



Why did the Bush administration want war? There probably wasn’t a single reason, but can we really doubt at this point that it was in part about wagging the dog? And right there you have something that should block Bush from redemption of any kind, ever: he misled us into a war that probably killed hundreds of thousands of people, and he did it in part for political reasons.



There was a time when Americans expected their leaders to be more or less truthful. Nobody expected them to be saints, but we thought we could trust them not to lie about fundamental matters. That time is now behind us — and it was Bush who did it.

Saturday, April 27, 2013

Like, I Distrust

I automatically distrust 1) people with MBAs (the more prestigious the pedigree, the more I distrust them), 2) spreadsheets (a corollary of #1), 3) frank and simple people whose stories hold up, 4) people who constantly say "the research shows," and 5) people who put "like" into every sentence. Like, I distrust lots of other people, but this will do for starters.


The Impossible Decision


Posted by Joshua Rothman
From The New Yorker


Graduate students are always thinking about the pleasures and travails of grad school, and springtime is a period of especially intense reflection. It’s in the spring, often in March and April, that undergraduates receive their acceptance letters. When that happens, they turn to their teachers, many of them graduate students, for advice. They ask the dreaded, complicated, inevitable question: To go, or not to go?



Answering that question is not easy. For graduate students, being consulted about grad school is a little like starring in one of those “Up” documentaries (“28 Up,” ideally; “35 Up,” in some cases). Your students do the work of Michael Apted, the series’s laconic director, asking all sorts of tough, personal questions. They push you to think about the success and failure of your life projects; to decide whether or not you are happy; to guess what the future holds; to consider your life on a decades-long scale. This particular spring, the whole conversation has been enriched by writers from around the Web, who have weighed in on the pros and cons of graduate school, especially in the humanities. In addition to the usual terrifying articles in the advice section of the Chronicle of Higher Education, a pair of pieces in Slate—“Thesis Hatement,” by Rebecca Schuman, and “Thesis Defense” by Katie Roiphe—have sparked many thoughtful responses from bloggers and journalists. It’s as though a virtual symposium has been convened.



I’m a former humanities graduate student myself—I went to grad school in English from 2003 through 2011 before becoming a journalist, and am still working nights on my dissertation—and I’m impressed by the clarity of the opinions these essays express. (Rebecca Schuman: “Don’t do it. Just don’t”; Katie Roiphe: “It gives you a habit of intellectual isolation that is… useful, bracing, that gives you strength and originality.”) I can’t muster up that clarity myself, though. I’m very glad that I went to graduate school—my life would be different, and definitely worse, without it. But when I’m asked to give students advice about what they should do, I’m stumped. Over time, I’ve come to feel that giving good advice about graduate school is impossible. It’s like giving people advice about whether they should have children, or move to New York, or join the Army, or go to seminary.



Maybe I’ve been in school too long; doctoral study has a way of turning your head into a never-ending seminar, and I’m now capable of having complicated, inconclusive thoughts about nearly any subject. But advice helps people when they are making rational decisions, and the decision to go to grad school in English is essentially irrational. In fact, it’s representative of a whole class of decisions that bring you face to face with the basic unknowability and uncertainty of life.



To begin with, the grad-school decision is hard in all sorts of perfectly ordinary ways. One of them is sample bias. If you’re an undergrad, then most of the grad students you know are hopeful about their careers, and all of the professors you know are successful; it’s a biased sample. Read the harrowing collection of letters from current and former grad students published in the Chronicle, and you encounter the same problem: the letters are written by the kinds of people who read the Chronicle, in response to an article about the horrors of grad school. They, too, are writing out of their personal experiences. It’s pretty much impossible to get an impartial opinion.



Last week, one of my college friends, who now manages vast sums at a hedge fund, visited me. He’s the most rational person I know, so I asked him how he would go about deciding whether to go to grad school in a discipline like English or comparative literature. He dealt immediately with the sample bias problem by turning toward statistics. His first step, he said, would be to ignore the stories of individual grad students, both good and bad. Their experiences are too variable and path-dependent, and their stories are too likely to assume an unwarranted weight in our minds. Instead, he said, he would focus on the “base rates”: that is, on the numbers that give you a broad statistical picture of outcomes from graduate school in the humanities. What percentage of graduate students end up with tenure? (About one in four.) How much more unhappy are graduate students than other people? (About fifty-four per cent of graduate students report feeling so depressed they have “a hard time functioning,” as opposed to ten per cent of the general population.) To make a rational decision, he told me, you have to see the big picture, because your experience is likely to be typical, rather than exceptional. “If you take a broader view of the profession,” he told me, “it seems like a terrible idea to go to graduate school.”



Perhaps that’s the rational conclusion, but, if so, it’s beset on all sides by confounding little puzzles; they act like streams that divert and weaken the river of rational thought. Graduate school, for example, is a one-time-only offer. Very few people start doctoral programs later in life. If you pass it up, you pass it up forever. Given that, isn’t walking away actually the rash decision? (This kind of thinking is a subspecies of the habit of mind psychologists call loss aversion: once you have something, it’s very hard to give it up; if you get into grad school, it’s very hard not to go.) And then there’s the fact that graduate school, no matter how bad an idea it might be in the long term, is almost always fulfilling and worthwhile in the short term. As our conversation continued, my friend was struck by this. “How many people get paid to read what they want to read,” he asked, “and study what they want to study?” He paused. “If I got into a really good program, I would probably go.”



Thinking about grad school this way is confusing, but it’s confusing in a mundane, dependable way; you’re still thinking about pros and cons, about arguments for and against a course of action. Continue to think about grad school, though, and you’ll enter the realm of the simply unknowable. The conflicting reports you’ll hear from different graduate students speak to the difficulty, perhaps even the impossibility, of judging lengthy experiences. What does it mean to say that a decade of your life is good or bad? That it was worthwhile, or a waste of time? Barring some Proustian effort of recollection, a long period of years, with its vast range of experiences and incidents, simply can’t be judged all at once. The best we can do is use what psychologists call “heuristics”: mental shortcuts that help us draw conclusions quickly.



One of the more well-understood heuristics is called the “peak-end rule.” We tend to judge long experiences (vacations, say) by averaging, more or less, the most intense moment and the end. So a grad student’s account of grad school might not be truly representative of what went on; it might merely combine the best (or worst) with how it all turned out. The most wonderful students will be averaged with the grind of the dissertation; that glorious summer spent reading Kant will be balanced against the horrors of the job market. Essentially, peak-end is an algorithm; it grades graduate school in the same way a software program grades an essay. Sure, a judgment is produced, but it’s only meaningful in a vague, approximate way. At the same time, it raises an important conceptual question: What makes an experience worthwhile? Is it the quality of the experience as it’s happening, or as it’s remembered? Could the stress and anxiety of grad school fade, leaving only the learning behind? (One hopes that the opposite won’t happen.) Perhaps one might say of graduate school what Aeneas said of his struggles: “A joy it will be one day, perhaps, to remember even this.” Today’s unhappiness might be forgotten later, or judged enriching in other ways.
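As a rough illustration of how such a shortcut behaves, here is a toy sketch of the peak-end rule applied to an invented series of yearly "experience ratings" (my numbers, not anything from the psychology literature):

```python
# The peak-end heuristic: judge a long experience by averaging its most intense
# moment (the peak) with how it ended, ignoring everything in between.
# The ratings below are invented: mostly grim years, one glorious summer (+9),
# and a reasonably happy ending (+5).

def peak_end_score(moments):
    """Average the most intense moment (largest absolute value) with the final one."""
    peak = max(moments, key=abs)
    return (peak + moments[-1]) / 2

def moment_by_moment_average(moments):
    return sum(moments) / len(moments)

grad_school = [-3, 9, -4, -3, -4, -3, 5]

print("Peak-end judgment:       ", peak_end_score(grad_school))                      # 7.0
print("Moment-by-moment average:", round(moment_by_moment_average(grad_school), 2))  # -0.43
```

The heuristic hands back a glowing verdict on a stretch that was, moment by moment, mostly miserable, which is exactly the kind of vague, approximate judgment described above.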



This kind of thinking, in turn, makes you wonder about the larger purpose of graduate school in the humanities—about the role it assumes in one’s life. To some degree, going to graduate school is a career decision. But it’s also a life decision. It may be, therefore, that even older graduate students are too young to offer their opinions on graduate school. Ten years is a long time, but it’s still only part of a whole. The value of grad school hinges, to a large extent, on what comes next. The fact that what comes next is, increasingly, unclear—that many graduate students don’t go into academia, but pursue other jobs—might only mean that a greater proportion of the value of graduate school must be revealed with time. Grad school might be best understood as what George Eliot, at the end of “Middlemarch,” calls a “fragment of a life,” and





the fragment of a life, however typical, is not the sample of an even web: promises may not be kept, and an ardent outset may be followed by declension; latent powers may find their long-waited opportunity; a past error may urge a grand retrieval.



You never know how things will turn out. Experiences accrued in one currency can be changed into another. Ambition today can fund tranquility tomorrow; fear today can be a comfort later on. Or the reverse.



The breadth of grad school, in other words—the sheer number of years it encompasses—makes it hard to think about. But, finally, it’s challenging because of its depth, too. Grad school is a life-changing commitment: less like taking a new job and more like moving, for the entirety of your twenties, to a new country. (That’s true, I think, even for undergraduates: grad school is different from college.) Grad school will shape your schedule, your interests, your reading, your values, your friends. Ultimately, it will shape your identity. That makes it difficult to know, in advance, whether you’ll thrive, and difficult to say, afterward, what you would have been like without it.



The philosopher L. A. Paul, who teaches at the University of North Carolina at Chapel Hill, describes these sorts of big life decisions eloquently in a forthcoming paper; she calls them “epistemically transformative” decisions. Sometimes, you can’t know what something is like until you try it. You can’t know what Vegemite tastes like, for example, until you try Vegemite; you can’t know what having children will be like until you have children. You can guess what these things will be like; you can ask people; you can draw up lists of pros and cons; but, at the end of the day, “without having the experience itself” you “cannot even have an approximate idea as to what it is like to have that experience.” That’s because you won’t just be having the experience; the experience will be changing you. On the other side, you will be a different kind of person. Making such a decision, you will always be uninformed.



We don’t, Paul writes, really have a good way to talk about these kinds of life-changing decisions, but we still make them. It’s hard to say how, exactly, we do it. All she can say is that, in order to make them, we have to do something a little crazy; we have to cast aside “the modern upper middle class conception of self-realization [that] involves the notion that one achieves a kind of maximal self-fulfillment through making rational choices about the sort of person one wants to be.” From this point of view, when you contemplate grad school, you’re like Marlow, in “Heart of Darkness,” when he is travelling up-river to find Kurtz. “Watching a coast as it slips by the ship,” Conrad writes,





is like thinking about an enigma. There it is before you—smiling, frowning, inviting, grand, mean, insipid, or savage, and always mute with an air of whispering, ‘Come and find out.’



We make these decisions, I suspect, not because we’re rational, but because we’re curious. We want to know. This seems especially true about graduate school. It’s designed, after all, for curious people—for people who like knowing things. They are exactly the people most likely to be drawn in by that whispered “Come and find out.”

In a narrow sense, of course, there’s nothing about these skeptical thoughts that should stop me from giving advice about graduate school. And when students ask me, I do have things to say. I point them to data, like the chart published in The Atlantic last week, which shows the declining reliance of universities on tenured faculty. And I tell my own story, which is overwhelmingly positive. I may not have finished (yet), and, like any grad student, I had my moments of panic. But I loved graduate school, and I miss it. In particular, I miss the conversations. Talking with my students, I found and expressed my best self. The office hours I spent in conversation with my professors stand out, even years later, as extraordinary experiences. I wish that everyone I know could have them, too.



But, talking to my students, I’m aware that there are too many unknowns. There are too many ways in which a person can be disappointed or fulfilled. It’s too unclear what happiness is. It’s too uncertain how the study of art, literature, and ideas fits into it all. (I’ve never forgotten the moment, in Saul Bellow’s “Herzog,” when Herzog thinks, “Much of my life has been spent in the effort to live by more coherent ideas. I even know which ones”; Herzog knows everything except how to live and do good. And yet what he knows is so extraordinary. As a grad student, I led a fascinating and, obviously, somewhat ironic discussion of that quote.) And, finally, life is too variable, and subject to too many influences. A person’s life, Eliot writes, also at the end of “Middlemarch,” is





the mixed result of young and noble impulses struggling amidst the conditions of an imperfect social state, in which great feelings will often take the aspect of error, and great faith the aspect of illusion. For there is no creature whose inward being is so strong that it is not greatly determined by what lies outside it.



I’ll give advice about grad school if you ask me to, and I’m happy to share my experiences. But these bigger mysteries make the grad-school decision harder. They take a career conundrum and elevate it into an existential quandary. In the end, I feel just as ignorant as my curious, intelligent, inexperienced students. All I really want to say is, good luck.



Friday, April 26, 2013

Michael Sandel

 Lunch with the FT: Michael Sandel

By Edward Luce

The Harvard philosopher and ‘moral rock star’ on Obama, education’s new frontiers and the shortcomings of markets

A youthful 60, with mildly thinning hair, Michael Sandel is dressed in the garb of the academic: slacks, light blue shirt, drab jacket and no tie. There is little about his slight build and gentle mien to suggest he commands the kind of audiences usually associated with thriller writers or TV anchors. If you had to pick the celebrity from a line-up of scholars, Sandel would get away with it.



I had recently caught a glimpse of the philosopher at the annual literary festival in Jaipur, where he spoke about whether rape should be treated as a special crime after the gruesome murder of a young woman in Delhi last year. Presiding over a sea of colourful saris and tunics, Sandel came across as half-geek, half-guru. For a man who has been dubbed by the US media the world’s first “moral rock star”, it was a modest showing – a mere 5,000 Indians had gathered to hear him. Compared with the 14,000 he had drawn to an open-air sports stadium in South Korea a few weeks before, or the 30m hits he has received for his online lectures in China alone, it was small chapattis. But the lecture, which Sandel staged as a kind of Socratic dialogue with his audience, held everyone spellbound.



Coming from a solid middle-class background and raised in Minnesota and Los Angeles, Sandel studied at Brandeis University and then won a Rhodes scholarship to Oxford, where he discovered his passion for moral philosophy. He never looked back. He has taught at Harvard for most of his adult life and lives with his wife and two sons in Brookline, Massachusetts, just outside Boston.



The philosopher made his reputation in 1982 with his debut book, Liberalism and the Limits of Justice, a powerful critique of John Rawls’s “veil of ignorance”. Rawls was the giant of postwar US liberal philosophy. But it was only in the 1990s, when Sandel started his “Justice” lecture series at Harvard, that he began to acquire a broader following.
Sandel asks audiences to imagine themselves in acute moral dilemmas – facing an oncoming train, for example, or participating in a market for human organs. Then he uses people’s answers to tease out their hidden contradictions. Neither the type of example, nor his method of reasoning, is strikingly original. Instead, it is Sandel’s packed lecture performances and pioneering use of online educational technology that set him apart. Even Sandel’s critics are impressed by how well he can command large audiences, sometimes on the other side of the planet, which he can see only through a large screen.

It takes me a minute or two to adjust to the unassuming – almost mouse-like – professor sitting opposite me. We have taken a booth at Legal Sea Foods, a fish restaurant in Harvard Square, Cambridge, that is a short walk from Sandel’s faculty. It is part of a middle-market chain of fish restaurants. “I wanted you to experience real New England cooking,” says Sandel.



It is a bitingly cold late winter’s day and both of us are craving something hot. We order straight away: Sandel chooses a tuna burger and a small Caesar salad; I order a mug of lobster bisque and fish and chips. I apologise for the Britishness of my choice. “You can’t go wrong with fish and chips,” Sandel says. In spite of the temperature outside, we both stick to the glasses of heavily iced water before us. “Oh, no, no, I won’t,” says Sandel looking mildly perturbed when I ask if he wants a glass of wine with his burger.



I remind Sandel of his lecture at Jaipur. Does that type of reception still have the capacity to surprise him? “The short answer is that it kind of amazes me there would be such interest in books about philosophy around the world,” he says, before pausing a little awkwardly. He seems reluctant to discuss the reasons behind his popularity. I press on, and ask about the high-tech lecture he gave in Korea. Sandel nods enthusiastically. “It was in a beautiful set-up as the sun was setting, and then complete darkness. The upper stadium lit up, and the giant screen had translations running at the top,” he says, using his arms to indicate the scale of the event.



“I threw out questions and the audience’s images were projected because there were many, many cameras and they could see who was responding, and, if I called on another student on the other side to respond, you could see them responding to each other.”

But what is it that draws so many people in such diverse countries? Sandel thinks about it for a minute. We are already on our starters. Sandel seems uninterested in his salad. I drain my cup of soup. “There is an enormous hunger to engage in big questions that matter,” he says finally. “I find this in all these places I’ve been travelling – from India to China, to Japan and Europe and to Brazil – there is a frustration with the terms of public discourse, with a kind of absence of discussion of questions of justice and ethics and of values. My hunch is that part of what this is tapping into – the books, but also the lectures – is that people don’t find their political parties are really addressing these questions.”



I ask him about his latest book, What Money Can’t Buy: The Moral Limits of Markets (Penguin), in which he argues that the US and other countries are turning from market economies into market societies, as Lionel Jospin, the former French prime minister, once put it. Sandel argues that we live in a time of deepening “market faith” in which fewer and fewer exceptions are permitted to the prevailing culture of transaction. The book has infuriated some economists, whom he sees as practitioners of a “spurious science”.



He has been at loggerheads with the profession for many years. In 1997, he enraged economists when he attacked the Kyoto protocol on global warming as having removed “moral stigma” from bad activity by turning the right to pollute into a tradeable permit. Economists said he misunderstood why markets work. Sandel retorts that they know the price of everything and the value of nothing. To judge by his sellout lecture tours, he has clearly tapped into a larger disquiet about the commodification of life.



Which countries are the least receptive to his concerns about market fundamentalism? “China and the US – no question,” he replies instantly. “In other parts of east Asia, in Europe and in the UK and in India and Brazil, it goes without arguing that there are moral limits to markets and the question is where to locate them. In the US and China, there are strong voices who will challenge the whole idea of there being any limits.”



Sandel’s method is to probe audiences for where those might be. In China, he explains, people tend to draw the line at train-ticket scalping (reselling tickets for profit) during the Lunar New Year, when almost everyone travels home to their ancestral village. In the US, people recoil when confronted with the idea of markets intruding on “family issues”, as Sandel puts it, such as surrogate motherhood, or to the prospect of a free market in votes. “It is a question of locating people’s boundaries,” he says.



We are now on our main courses. I have resumed my industrious eating while Sandel makes only occasional prods at his burger. “Right at the heart of market thinking is the idea that if two consenting adults have a deal, there is no need for others to figure out whether they valued that exchange properly,” he continues. “It’s the non-judgmental appeal of market reasoning that I think helped deepen its hold on public life and made it more than just an economic tool; it has elevated it into an unspoken public philosophy of everything.”



Sandel mentions that, in 2005, he took part in a joint lecture series on “globalisation and its critics” with Lawrence Summers, then Harvard’s president, now back there as a professor having served in between as Barack Obama’s senior economic adviser. Sandel and Summers, who are also neighbours, clearly disagreed on where, or whether, to draw the line on economic reasoning. Their debates were a blowout. There was not a spare seat to be found in the Sanders Theatre. “Look, some of my best friends are imperialist economists,” says Sandel, half-smiling. “But they tend to see everything through their own lens.”



Fair enough, I respond. But isn’t it quixotic to suppose the political debate can be “remoralised” in the way Sandel is hoping? And shouldn’t we be careful what we wish for? There is, I say, a thin line between promoting virtue and practising tyranny. “It’s an open question,” Sandel disarmingly concedes. Then he gives me a quick sketch of “the rise of market reasoning”, from the triumphalism of Ronald Reagan and Margaret Thatcher through Bill Clinton and Tony Blair up to the present day. “What Blair and Clinton did – and I’m using them not to blame them but as emblematic of this tendency – was they moderated but also consolidated the assumption that markets are the primary instrument for achieving the good life,” he says. “So we never really had a debate.”



Still gently toying with his burger, Sandel takes on a regretful tone when I mention Obama, who in the philosopher’s view has promised so much and delivered so little. “During the healthcare debate in 2009 there was a long angry summer. I was listening to Obama on C-Span and I heard him make the case for healthcare reform by saying we have to ‘bend the cost curve in the out years’. I cringed because I thought that if that’s the way he’s trying to sell healthcare, he will never succeed.



“Later in the fall, Obama did recover his footing to some degree and he quoted Ted Kennedy, who did make a moral argument and not a technocratic one for healthcare. So he is capable of speaking a larger language but it’s been strikingly muted during his first term.”



. . .



Our plates having been cleared – mine clean, his half-finished – Sandel orders a skimmed latte with chocolate syrup on the side. I go for a double espresso. I confess that I am pretty hazy about the practical implications of what he is saying. Can he give an example of a specific change he would like to see that would put economics in its place? What he had just said about Obama sounded more like a critique of the White House communications strategy than of the policy itself, I add. Again, Sandel starts by politely agreeing with my premise. He says he is worried that in the US the language of values is monopolised by the Christian right, which focuses on personal issues such as abortion. Others, he says, should recapture the language of values and extend it to the economy.



But what kind of change would that lead to? I ask, hoping for something more concrete. “I think you could say that the weakness of my argument is that I’m arguing against an overarching singular way of thinking about all questions – ‘an economic way of looking at life’, as Gary Becker [the Chicago economist and Nobel Prize winner] described it,” Sandel replies. “I’m arguing against that not by putting my own overarching singular philosophy but by saying that is a mistake and we must value goods case by case. So the answer may be one thing on the environment and the right way of dealing with nature, and a different one with education and on whether we should offer financial incentive to kids to do their homework, for example, and different still if we’re arguing against a free market in kidneys and surrogate pregnancy.”



Still not entirely convinced, I ask Sandel whether he does anything in his own life to make the world less money-minded. He begins a couple of answers but peters out. I suggest that he makes all his lectures free online. “Yes, that’s one thing,” he agrees. After our lunch I see that Sandel is listed on Royce Carlton, a speaker’s agency, as one of its big names (without apparent irony, a posting by the agency last year said Sandel was available to lecture “at a reduced fee in conjunction with his new book, What Money Can’t Buy”).



But it is talking about the free stuff that gets him going. Sandel says he was recently approached by a Silicon Valley tech company, which he did not name, that has developed the technology to support interactive global lectures. He recently did a pilot run with simultaneous live audiences in Cambridge, Massachusetts, Rio de Janeiro, New Delhi, Shanghai and Tokyo. Cisco TelePresence charges hundreds of thousands of dollars per session, says Sandel. This new method costs only a couple of thousand. The drop in price could change everything. “We could see them and they could see us. I could call out to a student in Delhi, and ask a student in the fifth row of a theatre in Harvard to reply to someone in São Paulo and someone in Shanghai – and it worked. The technology worked.”



Sandel confesses he would happily go on chatting about education’s new frontiers for as long as I want – but that would be a good while and I am in danger of missing my flight. The bill settled, Sandel kindly accompanies me on a hunt for a taxi outside. As we walk, he tells me of a recent conversation he had with Rahul Gandhi, scion of the Nehru-Gandhi dynasty, and likely future prime minister of India. “Gandhi was really excited about the possibilities of online education,” he says. “He told me: ‘We could never put all these people in universities. The internet is the answer.’ I was struck by how excited he was.”



As I get into the cab, it is Sandel’s enthusiasm that strikes me. The world may be in thrall to Sandel’s “imperialist economists”, I muse en route to the airport. But there is a living to be made in the resistance.

Economics is Not a Morality Play

The 1 Percent’s Solution
By PAUL KRUGMAN

Published: April 25, 2013
Economic debates rarely end with a T.K.O. But the great policy debate of recent years between Keynesians, who advocate sustaining and, indeed, increasing government spending in a depression, and austerians, who demand immediate spending cuts, comes close — at least in the world of ideas. At this point, the austerian position has imploded; not only have its predictions about the real world failed completely, but the academic research invoked to support that position has turned out to be riddled with errors, omissions and dubious statistics.


Yet two big questions remain. First, how did austerity doctrine become so influential in the first place? Second, will policy change at all now that crucial austerian claims have become fodder for late-night comics?



On the first question: the dominance of austerians in influential circles should disturb anyone who likes to believe that policy is based on, or even strongly influenced by, actual evidence. After all, the two main studies providing the alleged intellectual justification for austerity — Alberto Alesina and Silvia Ardagna on “expansionary austerity” and Carmen Reinhart and Kenneth Rogoff on the dangerous debt “threshold” at 90 percent of G.D.P. — faced withering criticism almost as soon as they came out.



And the studies did not hold up under scrutiny. By late 2010, the International Monetary Fund had reworked Alesina-Ardagna with better data and reversed their findings, while many economists raised fundamental questions about Reinhart-Rogoff long before we knew about the famous Excel error. Meanwhile, real-world events — stagnation in Ireland, the original poster child for austerity, falling interest rates in the United States, which was supposed to be facing an imminent fiscal crisis — quickly made nonsense of austerian predictions.



Yet austerity maintained and even strengthened its grip on elite opinion. Why?



Part of the answer surely lies in the widespread desire to see economics as a morality play, to make it a tale of excess and its consequences. We lived beyond our means, the story goes, and now we’re paying the inevitable price. Economists can explain ad nauseam that this is wrong, that the reason we have mass unemployment isn’t that we spent too much in the past but that we’re spending too little now, and that this problem can and should be solved. No matter; many people have a visceral sense that we sinned and must seek redemption through suffering — and neither economic argument nor the observation that the people now suffering aren’t at all the same people who sinned during the bubble years makes much of a dent.



But it’s not just a matter of emotion versus logic. You can’t understand the influence of austerity doctrine without talking about class and inequality.



What, after all, do people want from economic policy? The answer, it turns out, is that it depends on which people you ask — a point documented in a recent research paper by the political scientists Benjamin Page, Larry Bartels and Jason Seawright. The paper compares the policy preferences of ordinary Americans with those of the very wealthy, and the results are eye-opening.



Thus, the average American is somewhat worried about budget deficits, which is no surprise given the constant barrage of deficit scare stories in the news media, but the wealthy, by a large majority, regard deficits as the most important problem we face. And how should the budget deficit be brought down? The wealthy favor cutting federal spending on health care and Social Security — that is, “entitlements” — while the public at large actually wants to see spending on those programs rise.



You get the idea: The austerity agenda looks a lot like a simple expression of upper-class preferences, wrapped in a facade of academic rigor. What the top 1 percent wants becomes what economic science says we must do.



Does a continuing depression actually serve the interests of the wealthy? That’s doubtful, since a booming economy is generally good for almost everyone. What is true, however, is that the years since we turned to austerity have been dismal for workers but not at all bad for the wealthy, who have benefited from surging profits and stock prices even as long-term unemployment festers. The 1 percent may not actually want a weak economy, but they’re doing well enough to indulge their prejudices.



And this makes one wonder how much difference the intellectual collapse of the austerian position will actually make. To the extent that we have policy of the 1 percent, by the 1 percent, for the 1 percent, won’t we just see new justifications for the same old policies?



I hope not; I’d like to believe that ideas and evidence matter, at least a bit. Otherwise, what am I doing with my life? But I guess we’ll see just how much cynicism is justified.



Thursday, April 25, 2013

George W. Bush Really is Dumb (Really!)

Yes, George W. Bush Was a Terrible President, and No, He Wasn’t Smart

By Jonathan Chait

George W. Bush’s presidency had already collapsed by the end of 2005, with more than three years still remaining in his term. The Bush revisionism industry has thus had an unusually long time in which to plan its campaign and predict its man’s comeback as a misunderstood, unduly maligned and — dare they say it? — successful president. The opening of the Bush museum today has released a flood of pent-up Bush revisionism.



It is worth noting that Bush did some good things during his presidency. Some of these received due credit at the time (his education reform, his support for treating disease in Africa). Others received vastly disproportionate credit at the time owing to what one might call the soft bigotry of low expectations (his post-9/11 speeches, which amounted to telling a unified, leadership-craving country that Al Qaeda is bad.)



It is also true that Bush’s party unfortunately decided, after his presidency, that he failed primarily by being too moderate, too compassionate, and too bipartisan, and moved even further right since, making Bush look retrospectively sane. At the time, some of us simply took for granted Bush’s choices to avoid anti-Muslim bigotry and not propose enormous cuts to government programs for the sickest and most vulnerable Americans. By the standards of the present-day GOP, these decisions make Bush look fair-minded and even statesmanlike.



But the Bush revisionist project has far more ambitious aims than to merely salvage a few specks of decency from the ruins. It aims for a wholesale restoration, both characterologically and substantively.



Keith Hennessey, a former Bush aide, has written a long, wounded attack on those of us who doubt the intellectual faculties of the 43rd president, under the provocative headline “George W. Bush is smarter than you.” The headline is not an exaggeration. Hennessey really means it:



President Bush is extremely smart by any traditional standard. He’s highly analytical and was incredibly quick to be able to discern the core question he needed to answer. It was occasionally a little embarrassing when he would jump ahead of one of his Cabinet secretaries in a policy discussion and the advisor would struggle to catch up. He would sometimes force us to accelerate through policy presentations because he so quickly grasped what we were presenting.



Hennessey writes this with such conviction that the effect is stunning. Ezra Klein concedes, “I’m inclined to agree, actually. You don’t get to be president without being pretty smart.”



I suppose all this hinges on what we mean by “pretty smart.” How smart do you have to be to become a governor, or to make it onto a presidential ticket? That’s just one step away from becoming president, but I wouldn’t call Sarah Palin “pretty smart,” at least not by the standards that ought to apply to a job like president. If you’re talking about a bunch of people you knew from high school, then sure, maybe you’d say Bush or Palin were pretty smart.



But if we’re defining intelligence as an ability to grasp public policy issues, to synthesize information in a coherent way, I would not call George W. Bush “pretty smart.” All the public evidence available to us shows a man who thinks in crude, simplistic slogans. Bush did suffer a lot of ridicule for his speaking flubs. But I don’t think awkward speaking was the problem. His way of discussing policy bore all the hallmarks of a highly simplistic mind. Here he is trying to explain himself on foreign policy:



And here he is trying to defend his regressive tax program:



If you like and sympathize with Bush’s program, you might find some deeper intelligence there. I see only evidence of a man who not only lacks the ability to think analytically but disdains the very notion of it.



One defense of Bush, offered by less sycophantic figures like David Brooks, is that the man on public display is a far dumber version of the real thing. This is fairly hard to believe, but it is also at odds with a fair amount of indirect evidence.



Former administration figures like Paul O’Neill and John DiIulio have painted a disturbing picture of Bush as closed-minded and simplistic. (“The incurious President was so opaque on some important issues that top Cabinet officials were left guessing his mind even after face-to-face meetings.”) Richard Perle recalled, after briefing Bush, that “he didn’t know very much.” Bush didn’t seem to grasp his own limitations — insisting that Sweden didn’t have an army, and holding to his position even when told he was thinking of Switzerland:



One congressman -- the Hungarian-born Tom Lantos, a Democrat from California and the only Holocaust survivor in Congress -- mentioned that the Scandinavian countries were viewed more positively. Lantos went on to describe for the president how the Swedish Army might be an ideal candidate to anchor a small peacekeeping force on the West Bank and the Gaza Strip. Sweden has a well-trained force of about 25,000. The president looked at him appraisingly, several people in the room recall.



''I don't know why you're talking about Sweden,'' Bush said. ''They're the neutral one. They don't have an army.''



Lantos paused, a little shocked, and offered a gentlemanly reply: ''Mr. President, you may have thought that I said Switzerland. They're the ones that are historically neutral, without an army.'' Then Lantos mentioned, in a gracious aside, that the Swiss do have a tough national guard to protect the country in the event of invasion.



Bush held to his view. ''No, no, it's Sweden that has no army.''



When Bush first appeared on the political scene, and especially during the apparently successful first couple of years of his presidency, his defenders swatted away questions about his mental acuity by pointing to his success. If he’s so dumb, how did he achieve so much? Well, he didn’t. He oversaw a disastrous administration for precisely the reason his critics always grasped: Bush was an intellectual simpleton, a man who made up his mind in the absence of facts, who swatted away inconvenient realities as annoyances.



So the main question hanging over Bush is his record itself. The most useful defense of the Bush record is probably Jennifer Rubin’s — useful because it is so slavish and so crude it inadvertently exposes all the catastrophic weaknesses in the Bush record that more clever defenders have usually learned to tiptoe around.



Rubin begins by citing rising approval ratings for Bush in his absence from office, from which she infers, “Bush, like so many other presidents, can be judged best with the passage of time.” Actually, a basic rule of public opinion is that removing oneself from partisan controversy is a nearly foolproof way to enjoy high approval ratings. That’s why first ladies (at least when they stay out of politics) enjoy high approval ratings. It’s why Jimmy Carter has seen his approval rating soar into the mid-60s. Unless Rubin wants to defend the proposition that Carter, too, can be judged best with the passage of time.



Rubin praises Bush for enacting the “fiscally sober” Medicare prescription drug benefit, unlike the “exorbitant program like Obamacare.” She seems genuinely unaware that Bush financed his benefit entirely through deficit spending, while Obama had to pay for Obamacare by finding spending cuts and higher taxes. She praises Bush’s program as “popular” and Obama’s as “unpopular,” which is true, largely because Obama had to do the unpopular thing of paying for the benefits he created while Bush did not.



The core of Rubin’s defense is that Bush was terrific if you exempt him from any blame for the disasters that occurred during his presidency, and credit him entirely for the non-disaster periods. This sentence is a particular masterpiece: “Unlike Obama’s tenure, there was no successful attack on the homeland after 9/11.” In fact, it is not true — there were small terrorist attacks on the United States, both abroad and at home, after 9/11. But exempting the most disastrous attack on the United States from Bush’s record of avoiding terrorism is a feat of propaganda that, while common, continues to boggle the mind. Emperor Honorius Kept Rome Safe, except that one time it was sacked by the Visigoths.



Likewise, Rubin touts “7 1/2 years of job growth and prosperity.” When you’re evaluating a president who served for eight years, you should be suspicious of phrases like “7 ½ years.” Why pick that time frame? Apparently Rubin is chopping off the recession of 2001, which Bush defenders have always, not unfairly, blamed on conditions that preceded him. She is also chopping off the recession of 2008, which is harder to justify given the previous decision.



So the claim here is that, between the two recessions that began under Bush, we were not in a recession. But the period between the two recessions was a giant housing bubble. And even if we ignore that fact, absolve Bush for the first recession because it came at the beginning of his term, absolve him for the second recession because it came at the end, and absolve him for the bubble that he did nothing to deflate, the fact remains that the job and income growth during that middle period was extraordinarily and historically weak.



If you want to look kindly on Bush’s presidency, you can fairly say that, while he deserves significant blame for ignoring warnings of an Al Qaeda strike and the housing bubble, the disasters of his tenure were not entirely his fault. But what did he do? His economic policies exacerbated income inequality without producing prosperity. His massive increase of the structural budget deficit, which ballooned to over a trillion dollars before President Obama took office, left the United States less fiscally equipped to respond to the economic crisis he also left his successor. He initiated a costly war on the basis of both mistaken and deliberately cooked intelligence, and failed to plan for the postwar period. His policies not only ignored the crises of climate change and a costly and cruel health insurance system, but made both much harder to solve.



The failures of Bush’s governing method — the staffing of hacks and cronies, the disdain for evidence — were perfectly reflected in the outcomes. The Bush presidency was a full disaster at home and abroad, and whatever small accomplishments can be salvaged barely rate any mention in comparison with the failures. The general reckoning of Bush is not too harsh. It is too kind.



Wednesday, April 24, 2013

Jane's Game

Game Theory: Jane Austen Had It First
By JENNIFER SCHUESSLER


Published: April 22, 2013
It’s not every day that someone stumbles upon a major new strategic thinker during family movie night. But that’s what happened to Michael Chwe, an associate professor of political science at the University of California, Los Angeles, when he sat down with his children some eight years ago to watch “Clueless,” the 1995 romantic comedy based on Jane Austen’s “Emma.”

Michael Chwe, the political scientist who wrote “Jane Austen, Game Theorist.”

Mr. Chwe (pronounced CHEH), the author of papers like “Farsighted Coalitional Stability” and “Anonymous Procedures for Condorcet’s Model: Robustness, Nonmonotonicity and Optimality,” had never cracked “Emma” or “Pride and Prejudice.” But on screen, he saw glimmers of a strategic intelligence that would make Henry Kissinger blush.



“This movie was all about manipulation,” Mr. Chwe, a practitioner of the hard-nosed science of game theory, said recently by telephone. “I had always been taught that game theory was a mathematical thing. But when you think about it, people have been thinking about strategic action for a long time.”



Mr. Chwe set to doing his English homework, and now his assignment is in. “Jane Austen, Game Theorist,” just published by Princeton University Press, is more than the larky scholarly equivalent of “Pride and Prejudice and Zombies.” In 230 diagram-heavy pages, Mr. Chwe argues that Austen isn’t merely fodder for game-theoretical analysis, but an unacknowledged founder of the discipline itself: a kind of Empire-waisted version of the mathematician and cold war thinker John von Neumann, ruthlessly breaking down the stratagems of 18th-century social warfare.



Or, as Mr. Chwe puts it in the book, “Anyone interested in human behavior should read Austen because her research program has results.”



Modern game theory is generally dated to 1944, with the publication of von Neumann’s “Theory of Games and Economic Behavior,” which imagined human interactions as a series of moves and countermoves aimed at maximizing “payoff.” Since then the discipline has thrived, often dominating political science, economics and biology departments with densely mathematical analyses of phenomena as diverse as nuclear brinkmanship, the fate of protest movements, stock trading and predator behavior.



But a century and a half earlier, Mr. Chwe argues, Austen was very deliberately trying to lay philosophical groundwork for a new theory of strategic action, sometimes charting territory that today’s theoreticians have themselves failed to reach.



First among her as yet unequaled concepts is “cluelessness,” which in Mr. Chwe’s analysis isn’t just tween-friendly slang but an analytic concept worthy of consideration alongside game-theoretic chestnuts like “zero-sum,” “risk dominance” and “prisoner’s dilemma.”



Most game theory, he noted, treats players as equally “rational” parties sitting across a chessboard. But many situations, Mr. Chwe points out, involve parties with unequal levels of strategic thinking. Sometimes a party may simply lack ability. But sometimes a powerful party faced with a weaker one may not realize it even needs to think strategically.



Take the scene in “Pride and Prejudice” where Lady Catherine de Bourgh demands that Elizabeth Bennet promise not to marry Mr. Darcy. Elizabeth refuses to promise, and Lady Catherine repeats this to Mr. Darcy as an example of her insolence — not realizing that she is helping Elizabeth indirectly signal to Mr. Darcy that she is still interested.



It’s a classic case of cluelessness, which is distinct from garden-variety stupidity, Mr. Chwe argues. “Lady Catherine doesn’t even think that Elizabeth” — her social inferior — “could be manipulating her,” he said. (Ditto for Mr. Darcy: gender differences can also “cause cluelessness,” he noted, though Austen was generally more tolerant of the male variety.)
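For readers who want to see the mechanics, here is a toy sketch of that idea (the payoffs and the setup are my own invention, not an example from Mr. Chwe's book): the "clueless" player best-responds to an assumed default move, while the supposedly weaker player best-responds to what the powerful player actually does.

```python
# A toy 2x2 game illustrating unequal strategic depth; all payoffs are invented.
# payoffs[(row_move, col_move)] = (row_payoff, col_payoff)
payoffs = {
    (0, 0): (3, 1),
    (0, 1): (0, 3),
    (1, 0): (1, 0),
    (1, 1): (2, 2),
}

def best_row_move(assumed_col_move):
    """Row's best move if she believes the column player will play assumed_col_move."""
    return max((0, 1), key=lambda r: payoffs[(r, assumed_col_move)][0])

def best_col_move(actual_row_move):
    """Column's best move given the row player's actual choice."""
    return max((0, 1), key=lambda c: payoffs[(actual_row_move, c)][1])

# The "clueless" row player (a Lady Catherine figure) assumes her social inferior
# will simply play the default move 0 rather than reason strategically.
row_move = best_row_move(assumed_col_move=0)

# The column player (an Elizabeth figure) anticipates the row player's move
# and best-responds to it.
col_move = best_col_move(row_move)

print("Outcome the clueless player expects:", payoffs[(row_move, 0)])         # (3, 1)
print("Outcome that actually happens:      ", payoffs[(row_move, col_move)])  # (0, 3)
```

The powerful player loses not because she is stupid but because she never models the other side as a strategic actor, which is the distinction Mr. Chwe is drawing.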



The phenomenon is hardly limited to Austen’s fictional rural society. In a chapter called “Real-World Cluelessness,” Mr. Chwe argues that the moralistic American reaction to the 2004 killing and mutilation of four private security guards working with the American military in Falluja — L. Paul Bremer III, leader of the American occupation of Iraq, later compared the killers to “human jackals”— obscured a strategic truth: that striking back at the city as a whole would only be counterproductive.



“Calling your enemy an animal might improve your bargaining position or deaden your moral qualms, but at the expense of not being able to think about your enemy strategically,” Mr. Chwe writes.



The darker side of Austen is hardly unknown to literary scholars. “Regulated Hatred,” a classic 1940 paper by the psychologist D. W. Harding, argued that her novels explored containment strategies against the “eruption of fear and hatred into the relationships of everyday social life.”



But Mr. Chwe, who identifies some 50 “strategic manipulations” in Austen (in addition to a chapter on the sophisticated “folk game theory” insights in traditional African tales), is more interested in exploring the softer side of game theory. Game theory, he argues, isn’t just part of “hegemonic cold war discourse,” but what the political scientist James Scott called a subversive “weapon of the weak.”



Such analysis may not go over well with military types, to say nothing of literary scholars, many of whom see books like Mr. Chwe’s or “Graphing Jane Austen,” an anthology of Darwinian literary criticism published last year, as examples of ham-handed scientific imperialism.



“These ostensibly interdisciplinary efforts are sometimes seen as attempts to validate the humanities by attaching them to more empirical disciplines,” said Jonathan Kramnick, a professor of English at Johns Hopkins and the author of the 2011 essay “Against Literary Darwinism,” who has not read Mr. Chwe’s book. “But for some, myself included, literary studies doesn’t need to attach itself to any other discipline.”

Even some humanists who admire Mr. Chwe’s work suggest that when it comes to appreciating Austen, social scientists may be the clueless ones. Austen scholars “will not be surprised at all to see the depths of her grasp of strategic thinking and the way she anticipated a 20th-century field of inquiry,” Laura J. Rosenthal, a specialist in 18th-century British literature at the University of Maryland, said via e-mail.



As for Mr. Chwe, he said he was happy if he could spread Janeism among the game-playing wonks. And which Austen character would he want leading America in a nuclear showdown?



Easy, he said with a laugh: “I would want Austen herself.”



A Review of The New Mind of the South (2)

‘The New Mind of the South,’ by Tracy Thompson

By DWIGHT GARNER

Published: April 23, 2013
W. J. Cash, the author of “The Mind of the South” (1941), was a newspaperman, a Mencken devotee and a manic-depressive. When he killed himself, not long after his book was published, he was newly married and had a fresh Guggenheim fellowship in his pocket.

THE NEW MIND OF THE SOUTH



By Tracy Thompson



263 pages. Simon & Schuster. $26.


His book is a complicated American classic, sometimes ahead of its time on racial matters, sometimes troglodytic. But in its contemplation of what Cash called “the savage ideal” in Southern culture, an ingrained white male truculence, it remains a volume to be reckoned with.



It is among the disappointments of Tracy Thompson’s new book, “The New Mind of the South,” that it leans upon Cash’s title while not coming to terms with his masterpiece at all. Where his book was cerebral and probing, hers is featherweight and breezy. Reading it is like watching someone on a Vespa pull up alongside an Allied landing craft left over from D-Day.



Ms. Thompson, born in Georgia in 1955, has worked as a reporter for The Atlanta Journal-Constitution and The Washington Post. She’s lived in the Washington area for the past 24 years.



This isn’t a disqualification for writing about the South. Being away from home can sharpen your observations about what’s left behind. Witness Roy Blount Jr., who used to deliver a wicked column for The Oxford American called “Gone Off Up North.”



Ms. Thompson wrote “The New Mind of the South” out of an intuition that much of what she prizes about her home ground is imperiled. “Tradition in the South is like beachfront property in an era of global warming,” she declares. “As much as you love the view, you live with the knowledge that some morning you will wake up and find it gone.”



This is not a new arena of thought. In his 1974 book, “The Americanization of Dixie: The Southernization of America,” for example, the journalist and historian John Egerton issued a long, lonesome moan over how malled-up and hollow-souled the South was becoming.



Yet Ms. Thompson digs into fresh material. Her book is a series of pop-ins. She’s interested in identity, and visits a Hispanic leadership seminar to get a feel for the South’s new waves of immigration. She’s deft on how Hispanics don’t entirely see themselves as Southerners — to them the American South will always be El Norte.



“Hispanics will change the South,” she says, but “the South will change them.” She notes that it’s hard to find a mariachi band these days that can’t play “Orange Blossom Special.” She’s optimistic. She asks, “Can salsa with grits be far behind?” I’ve heard that cheese grits with a bit of salsa can be sublime. But in general, I hope well-meaning Latinos and good old boys can come together long enough to agree that the collective answer to Ms. Thompson’s question is “Eww.”



Ms. Thompson drops in on a Children of the Confederacy convention, to observe genteel racism of a fading sort. She takes us along to a megachurch presided over by the sublimely named Creflo Dollar. She ponders the overlapping meanings of the 1994 Susan Smith case, in which a mother in South Carolina drowned her two children before claiming a black man had kidnapped them.



She’s good on what she calls “Southern Sonar,” which allows Southerners to “ping” one another to assess their origins. She’s brutal about her hometown, Atlanta, which she feels has backslid on racial issues.



After listing some civil rights-era black leaders, she says: “With the exception of John Lewis, I couldn’t imagine any of them making a career in politics in today’s Atlanta. Their brand of concern for the poor and dispossessed would be instantly construed as an attempt to pick the pockets of the middle class; they’d be booed off the stage.”



She argues, compellingly, that to put the ghosts of slavery to rest, the South needs something akin to the Truth and Reconciliation Commission set up in South Africa by Nelson Mandela in the mid-1990s.



There are grace notes. I enjoyed Ms. Thompson’s observation that discovering that one of her ancestors had been a Union supporter during the Civil War was a shock not unlike “learning that some old family keepsake painting you’d had for decades had, in fact, been hanging upside down.”



“The New Mind of the South” does not, alas, make for a sustaining supper. Ms. Thompson has zero feel for the South’s literature or music or drama or food — the only cook she discusses in any real way is Paula Deen — and not much more for its politics. There’s no deep historical awareness.



I wasn’t sure it was possible to write a book about the mind of the South that didn’t mention George Wallace, Bear Bryant or Ronnie Van Zant, but Ms. Thompson has managed it.



When large and complicated humans do walk into view, they seem diminished in her hands. She describes the great South Carolina writer, planter and Anglican clergyman Charles Woodmason, for example, “as a kind of 18th-century Frasier Crane.”



This is a book about the Southern intellect by a writer who had never visited the South’s intellectual ground zero, Oxford, Miss., until she hurriedly drove through it while researching this book.



Ms. Thompson takes her share of whacks at what’s become of America below the Mason-Dixon line — or, as The Onion has it, below the IHOP-Waffle House line.



“The South has been urbanized, suburbanized, strip-malled, and land-formed to a point that at times I hardly recognize it anymore,” she writes. Yet it remains, she declares, “a place that bears the imprint of that deep sense of community and an almost tribal definition of kin.”



Best of all, this heartsick Southerner reminds us, “in New York you never get the fleeting sense that the polite stranger giving you directions might invite you home for dinner. In Atlanta, you do.”



Tuesday, April 23, 2013

A Quiet Day

It seems to have been a quiet day on the national scene.  No bombings.  No explosions.  A good day by recent standards.

Monday, April 22, 2013

Poe on Prozac

Poe on Prozac wouldn't be much fun, would he? "I'll just ignore that heartbeat. It must be from somewhere else. No need to lose sleep over a silly heartbeat." The Raven says, "What the heck. Things could be a lot worse." Drink your wine, my friend. Would you like a cask to take home? You have nothing to fear from me. No need for me to die under mysterious circumstances. Nothing to be upset about. All is well.



Don Waller: So you took the Prozac like your Dr told you?

Fred Hudson: My doctor is senile. I only go to him because he usually forgets to bill me.

Let Us Hope

That this will be a quiet week on the national scene.  After last Monday's bombing in Boston, the dramatic manhunt for the alleged bombers that ended Friday night, the Senate's rejection of background checks on gun purchases, and the fertilizer plant explosion in Texas, we are due for a good week.

Sunday, April 21, 2013

Our Undivided Past?

By ALAN WOLFE

Published: April 19, 2013
In “The Undivided Past,” David Cannadine challenges those who believe that all history is the history of conflict, whether over class, as Marx and Engels proclaimed, or over religion, nationality, race, gender or civilization. The fact is, mankind’s divisions may not be the most important part of the story. As Cannadine succinctly puts it, “humanity is still here.”

THE UNDIVIDED PAST



Humanity Beyond Our Differences



By David Cannadine



340 pp. Alfred A. Knopf. $26.95.

This is so hopeful a book, and so authoritative in its coverage of history, that a reader wants to believe its thesis. I do, but only in part.



Of all the texts that map the world into hopelessly hostile camps, “The Communist Manifesto” no doubt sold the most copies. Its authors argued that two features of class conflict would make it especially significant: class would trump all other forms of division, and the differences between classes would prove so profound that their strife could be resolved only through revolution. Neither of these arguments has held up in the years since the “Manifesto” was published, allowing Cannadine, who teaches history at Princeton, to dispense with this form of conflict handily. We now know, he reminds us, that socialists around the world rose to the defense of their countries in the run-up to World War I, thereby demonstrating that claims of nationhood can easily overwhelm those of class. In the years after World War II, by contrast, when it seemed as if the world was bitterly divided between capitalist and Communist ways of life, conflicts within the Communist bloc never disappeared and the cold war itself had more to do with classic geopolitical considerations than with anything resembling class struggle.



Communism may no longer be with us, at least as a doctrine of global revolution, but wars between nations very much are. In trying to show that theories proclaiming their inevitability are also wrong, Cannadine gives himself a more difficult challenge. But he asks: How can antagonism between nations be a feature of Western history when nations themselves are a product of recent times? Up until the 18th century, wars between states were really wars between monarchs. Even by World War I, when nationalist appeals trumped those of class, less than 5 percent of people living in what we now call Italy spoke Italian in everyday situations, while both “Germany” and “Austria-Hungary” resembled multinational empires more than unified nation-states. To be sure, the 19th and 20th centuries saw the emergence of nations with some kind of collective identity, but almost as soon as they appeared, globalization began to fracture identities once again.



Because so many religions make claims to exclusive truth, not all of them can be correct, leading the Manichaeans among us to conclude that those who are on one side cannot be on the other. Yet, Cannadine argues, religions have borrowed extensively from one another throughout history. Once the Ottomans took Constantinople, Orthodox churches found new life. Trade brought people from different faiths together. Jews and Christians found places for themselves in Baghdad while Muslims were tolerated in Sicily. The story of the Crusades is told so often that we tend to forget those individuals, like the 16th-century traveler Leo Africanus, who, in Cannadine’s words, “moved across the supposedly impermeable boundaries of religious identity with remarkable ease and frequency.” No doubt the world has its share of religious conflict — just consider Northern Ireland in the recent past and the Middle East today — but religious identity, like that of class and nation, is not fixed and implacable.



So it goes for race and gender. Houston Stewart Chamberlain, whose ideas influenced the Nazis, insisted that racial differences could never be overcome and so, in his own way, did the left-wing radical W. E. B. Du Bois. Similarly, it is not just male chauvinists who insist on the biological superiority of one sex over another but also “essentialist” feminists like Germaine Greer. All these arguments represent examples of “totalizing,” which Cannadine defines as “describing and defining individuals by their membership in one single group, deemed to be more important and more all-encompassing than any other solidarity.” Not only is totalizing empirically wrong, he insists, but it is also politically obnoxious in its claim that human solidarity is illusory.



Cannadine does not say so, but he may well have written his book in response to Samuel Huntington’s famous argument about the clash of civilizations. Hence his sixth and last category on collective identity. Huntington’s thesis did not originate with him, Cannadine is quick to point out, but can be traced back to Gibbon’s “Decline and Fall of the Roman Empire,” which featured clashes between pagan and Christian as well as Roman and barbarian civilizations. Those who followed in Gibbon’s wake — Oswald Spengler and Arnold Toynbee are the best known — viewed history as a process of rising and falling, but remained vague on which civilizations were winning and which deserved to lose.



What’s more, the very term “civilization” was anything but merely descriptive; to German thinkers it was a step down from Kultur, while for British imperialists it was a step up from tribalism. When we get to Huntington, therefore, it comes as no surprise to learn that for Cannadine the civilizations presumed to be clashing “seem on closer inspection to be little more than arbitrary groupings and idiosyncratic personal constructs.” Cannadine rarely puts his emotions on display, but on this question he does: “Future world leaders who invoke ‘civilization,’ ” he writes, “ought to be more circumspect about doing so than many who have recently and irresponsibly been bandying it around to such baleful effect.”



I can only hope that “The Undivided Past” will have all the impact of Huntington’s work, serving as an important reminder that human beings around the world not only have much in common but also have improved the conditions of their lives over time. Here, though, is where I worry that Cannadine’s scheme is just a bit too neat. Each of his six identities is treated in the same way: never were they unified and none have managed to trump the others. At times Cannadine, much like a Spengler, seems to be writing theory rather than history. He has an uplifting story to tell, but one suspects that he is telling it too schematically.



The uplift, in addition, may prove temporary. It is true that some predictions of identity conflict now seem obsolete; feminist theories stressing how women reason differently from men, for example, have by and large given way to liberal ideas about gender equality. Yet who is to say that other forms of conflict may not make a comeback? Should present trends toward income inequality persist in the West, for example, class struggle may re-emerge in our future. Religious hostility is still with us, and the potential for wars between nations is ever-present.



Cannadine is clearly correct that Matthew Arnold’s “darkling plain . . . where ignorant armies clash by night” is receding, and his optimism is both refreshing and necessary. Alas, the past was always divided and the future is very likely to be so as well.



Jackie Robinson - I Never Had It Made

Normally I don't enjoy autobiographies, but I did thoroughly enjoy this one.  Jackie Robinson is one of the iconic figures of our time.  He was the first African-American major league baseball player in 1947 when owner Branch Rickey signed him to play for the Brooklyn Dodgers.

Integrating baseball was tough going.  It took a special person to start it off.  JR was that perfect person.  It is impossible to give Robinson and Rickey too much credit.  Other forces were at work to integrate baseball, and Jackie Robinson in 1947 was not the whole story, but nothing can take away from that story either.

The NY Yankees did not have a black player until Elston Howard in 1955.  The Boston Red Sox were the last team to integrate.  They didn't have their first black player until 1959.  That's hard to believe but it's true.

Jack Roosevelt Robinson (named after TR) was born in Cairo, Georgia.  His father abandoned the family and luckily the mother was able to move the family to Pasadena, California, where her brother lived.  It's a good thing JR grew up in California rather than Georgia.

JR grew up fighting racial slights.  It was in his nature not to take anything off anybody.  He seemed indifferent to school, but attended Pasadena JC and went from there to UCLA, where he lettered in four sports.  That would be unheard of today.  He probably could have played professional football.

Fortunately he went the baseball route.  Rickey carefully prepared the way, and after a year in Montreal, where he was treated wonderfully well by the more racially liberal Canadians, JR came up to the Dodgers in 1947.  He promised Branch Rickey that for the first year he would take the insults and the abuse and not retaliate.  JR kept his promise.

The rest is not exactly history for Jackie Robinson.  He had a tumultuous career.  His aggressive style of baseball raised hackles, not just because of the color of his skin, but because, well, he played the game hard.  He was electric in the field and particularly on the basepaths; his specialty was stealing home.  Is there anything more exciting in baseball than seeing someone steal home?

After Rickey left the Dodgers for the Pirates, JR didn't get along with new owner Walter O'Malley, who took the Dodgers to L.A. in 1958.  After the '56 season the Dodgers traded JR to the Giants.  What an insult.  But JR retired instead and went on to a successful business career.  He died in 1972 of a heart attack at age 53, his health worn down by diabetes.

He was active in politics and early on was a Republican.  This was back when there were actually liberal Republicans.  It seems that JR never lost his affection for Nelson Rockefeller, and that is OK because Rockefeller was a true liberal and would certainly be a Democrat today.  I sort of remember Robinson as a Republican but never would have guessed the affection for Rockefeller.  He makes Rocky sound like a really good fellow.  Over time Jackie learned the truth about Republicans like Richard Nixon and Barry Goldwater.  He supported Hubert Humphrey in 1968.

The saddest part of his life was the death of Jackie, Jr. at the age of 24.  The son battled alcohol and drug addiction and had been clean for three years before he died in an auto accident.

His wife Rachel is still alive.  His other son and daughter are still alive and doing well as far as I know.

Jackie Robinson was a true icon and hero for our time.

He ends his story by repeating "I Never Had It Made" as if he is compelled to say that he had to earn everything that he achieved in life.  Okay.  I believe you, Jackie.  Maybe in life a good defense is the best offense.

Saturday, April 20, 2013

About Boston

It's hard for a Southern boy to pay tribute to Boston, but here goes. I grew up hating the Celtics because I was a Cincinnati Royals fan; they had Oscar Robertson, my favorite player. I hated Bill Russell even though he is the best basketball player ever. Always liked the Red Sox though. Still wear my 2004 championship hat. They had Ted Williams. He refused to come back out of the dugout and tip his hat after hitting a home run in his last at bat at Fenway in 1960.  Good for him! Need I say more? Jogged around Fenway in 1990 but have never been inside. Wandered around Harvard in 1990. Didn't see anybody I knew, ha ha. The streets of Boston are like cow paths. The Shell and the River Charles. Love that dirty water. The population density of liberals side-by-side with conservatives must be the country's greatest. So much history. Old next to new but far more old than new. Boston, not NY or Philly, epitomizes The East. Love to hear them Yankees talk. HAH-VUD. Shall we all sing "Sweet Caroline"?

Thursday, April 18, 2013

The Texas Explosion

As if the Boston bombing wasn't enough for Americans to digest, there's been a violent and devastating explosion at a fertilizer plant in Texas.  Someone has also been sending the poison ricin through the mail.  This has been a tough week in our country.

Tuesday, April 16, 2013

Boston

Yesterday's events in Boston---two bombs placed at the finish line of the Marathon, three people dead, over 150 injured---are unbelievable and sad, made all the more real to me since I was in Boston back in February.  Let us hope the perps are caught and brought to justice.

Sunday, April 14, 2013

Commentary on 42

 The Real Story of Baseball's Integration That You Won't See in 42

The new film ignores the broad-based movement that helped make Jackie Robinson's arrival in baseball possible, as well as the first black major-leaguer's own activism.

Peter Dreier

Apr 11 2013, 2:38 PM ET

One of America's most iconic and inspiring stories—Jackie Robinson breaking baseball's color line in 1947—is retold in the film 42, which opens nationally this weekend. Even if you're not a baseball fan, the film will tug at your heart and have you rooting for Robinson to overcome the racist obstacles put in his way. It is an uplifting tale of courage and determination that is hard to resist, even though you know the outcome before the movie begins.



But despite bravura performances by relatively unknown Chadwick Boseman as Robinson and superstar Harrison Ford as Branch Rickey (the Brooklyn Dodgers' general manager who recruited Robinson and orchestrated his transition from the Negro Leagues to the all-white Major Leagues), the film strikes out as history, because it ignores the true story of how baseball's apartheid system was dismantled.

The film portrays baseball's integration as the tale of two trailblazers—Robinson, the combative athlete, and Rickey, the shrewd strategist—battling baseball's, and society's, bigotry. But the truth is that it was a political victory brought about by a social protest movement. As an activist himself, Robinson would likely have been disappointed by a film that ignored the centrality of the broader civil rights struggle.



That story has been told in two outstanding books, Jules Tygiel's Baseball's Great Experiment (1983) and Chris Lamb's Conspiracy of Silence: Sportswriters and the Long Campaign to Desegregate Baseball (2012). As they recount, Rickey's plan came after more than a decade of effort by black and left-wing journalists and activists to desegregate the national pastime. Beginning in the 1930s, the Negro press, civil rights groups, the Communist Party, progressive white activists, and radical politicians waged a sustained campaign to integrate baseball. It was part of a broader movement to eliminate discrimination in housing, jobs, and other sectors of society. It included protests against segregation within the military, mobilizing for a federal anti-lynching law, marches to open up defense jobs to blacks during World War II, and boycotts against stores that refused to hire African Americans under the banner "don't shop where you can't work." The movement accelerated after the war, when returning black veterans expected that America would open up opportunities for African Americans.



Robinson broke into baseball when America was a deeply segregated nation. In 1946, at least six African Americans were lynched in the South. Restrictive covenants were still legal, barring blacks (and Jews) from buying homes in many neighborhoods—not just in the South. Only a handful of blacks were enrolled in the nation's predominantly white colleges and universities. There were only two blacks in Congress. No big city had a black mayor.



It is difficult today to summon the excitement that greeted Robinson's achievement. The dignity with which Robinson handled his encounters with racism—including verbal and physical abuse on the field and in hotels, restaurants, trains, and elsewhere—drew public attention to the issue, stirred the consciences of many white Americans, and gave black Americans a tremendous boost of pride and self-confidence. Martin Luther King Jr. once told Dodgers star Don Newcombe, another former Negro Leaguer, "You'll never know what you and Jackie and Roy [Campanella] did to make it possible to do my job."

Robinson, who spent his entire major league career (1947 to 1956) with the Dodgers, was voted Rookie of the Year in 1947 and Most Valuable Player in 1949, when he won the National League batting title with a .342 batting average. An outstanding base runner and base stealer, with a .311 lifetime batting average, he led the Dodgers to six pennants and was elected to the Hall of Fame in 1962.


42 is the fourth Hollywood film about Robinson. All of them suffer from what might be called movement myopia. We may prefer our heroes to be rugged individualists, but the reality doesn't conform to the myth embedded in Hollywood's version of the Robinson story.



In The Jackie Robinson Story, released in 1950, Robinson played himself and the fabulous Ruby Dee portrayed his wife Rachel. Produced at the height of the Cold War, five years before the Montgomery bus boycott, the film celebrated Robinson's feat as evidence that America was a land of opportunity where anyone could succeed if he had the talent and will. The movie opens with the narrator saying, "This is a story of a boy and his dream. But more than that, it's a story of an American boy and a dream that is truly American."



In 1990 TNT released a made-for-TV movie, The Court Martial of Jackie Robinson, starring Andre Braugher, which focused on Robinson's battles with racism as a soldier during World War II. In 1944, while assigned to a training camp at Fort Hood in segregated Texas, Robinson, a second lieutenant, refused to move to the back of an army bus when the white driver ordered him to do so, even though buses had been officially desegregated on military bases. He was court martialed for his insubordination, tried, acquitted, transferred to another military base, and honorably discharged four months later. By depicting Robinson as a rebellious figure who chafed at the blatant racism he faced, the film foreshadows the traits he would have to initially suppress once he reached the majors.



HBO's The Soul of the Game, released in 1996, focused on the hopes and then the frustrations of Satchel Paige and Josh Gibson, the two greatest players in the Negro Leagues, whom Branch Rickey passed up to integrate the majors in favor of Robinson, played by Blair Underwood. Rickey had long wanted to hire black players, both for moral reasons and because he believed it would increase ticket sales among the growing number of African Americans moving to the big cities. He knew that if the experiment failed, the cause of baseball integration would be set back for many years. Rickey's scouts identified Robinson—who was playing for the Negro League's Kansas City Monarchs after leaving the army—as a potential barrier-breaker. Rickey could have chosen other Negro League players with greater talent or more name recognition, but he wanted someone who could be, in today's terms, a role model. Robinson was young, articulate and well educated. His mother moved the family from Georgia to Pasadena, California in 1920 when Robinson was 14 months old. Pasadena was deeply segregated, but Robinson lived among and formed friendships with whites growing up there and while attending Pasadena Junior College and UCLA. He was UCLA's first four-sport athlete (football, basketball, track, and baseball), twice led the Pacific Coast League in scoring in basketball, won the NCAA broad jump championship, and was a football All-American. Rickey knew that Robinson had a hot temper and strong political views, but he calculated that Robinson could handle the emotional pressure while helping the Dodgers on the field. Robinson promised Rickey that, for at least his rookie year, he would not respond to the inevitable verbal barbs and even physical abuse he would face on a daily basis.



In 1997, America celebrated Robinson with a proliferation of conferences, museum exhibits, plays, and books. Major League Baseball retired Robinson's number—42—for all teams. President Bill Clinton appeared with Rachel Robinson at Shea Stadium to venerate her late husband.



But the next Hollywood movie about Robinson didn't arrive until this year's 42, written and directed by Brian Helgeland (screenwriter of L.A. Confidential and Mystic River), under the auspices of Warner Brothers and Legendary Pictures. The real story of baseball's integration has plenty of drama and could have easily been incorporated into the film.



* * *

Starting in the 1930s, reporters for African-American papers (especially Wendell Smith of the Pittsburgh Courier, Fay Young of the Chicago Defender, Joe Bostic of the People's Voice in New York, and Sam Lacy of the Baltimore Afro-American), and Lester Rodney, sports editor of the Communist paper, the Daily Worker, took the lead in pushing baseball's establishment to hire black players. They published open letters to owners, polled white managers and players (some of whom were threatened by the prospect of losing their jobs to blacks, but most of whom said that they had no objections to playing with African Americans), brought black players to unscheduled tryouts at spring training centers, and kept the issue before the public. Several white journalists for mainstream papers joined the chorus for baseball integration.



Progressive unions and civil rights groups picketed outside Yankee Stadium, the Polo Grounds, and Ebbets Field in New York City, and Comiskey Park and Wrigley Field in Chicago. They gathered more than a million signatures on petitions, demanding that baseball tear down the color barrier erected by team owners and Commissioner Kennesaw Mountain Landis. In July 1940, the Trade Union Athletic Association held an "End Jim Crow in Baseball" demonstration at the New York World's Fair. The next year, liberal unions sent a delegation to meet with Landis to demand that major league baseball recruit black players. In December 1943, Paul Robeson, the prominent black actor, singer, and activist, addressed baseball's owners at their annual winter meeting in New York, urging them to integrate their teams. Under orders from Landis, they ignored Robeson and didn't ask him a single question.


In 1945, Isadore Muchnick, a progressive member of the Boston City Council, threatened to deny the Red Sox a permit to play on Sundays unless the team considered hiring black players. Working with several black sportswriters, Muchnick persuaded the reluctant Red Sox general manager, Eddie Collins, to give three Negro League players—Robinson, Sam Jethroe, and Marvin Williams—a tryout at Fenway Park in April of that year. The Sox had no intention of signing any of the players, nor did the Pittsburgh Pirates and Chicago White Sox, who orchestrated similar bogus auditions. But the public pressure and media publicity helped raise awareness and furthered the cause.



Other politicians were allies in the crusade. Running for re-election to the New York City Council in 1945, Ben Davis—an African-American former college football star, and a Communist—distributed a leaflet with the photos of two blacks, a dead soldier and a baseball player. "Good enough to die for his country," it said, "but not good enough for organized baseball." That year, the New York State legislature passed the Quinn-Ives Act, which banned discrimination in hiring, and soon formed a committee to investigate discriminatory hiring practices, including one that focused on baseball. In short order, New York City Mayor Fiorello LaGuardia established a Committee on Baseball to push the Yankees, Giants, and Dodgers to sign black players. Left-wing Congressman Vito Marcantonio, who represented Harlem, called for an investigation of baseball's racist practices.



This protest movement set the stage for Robinson's entrance into the major leagues. In October 1945, Rickey announced that Robinson had signed a contract with the Dodgers. He sent Robinson to the Dodgers' minor-league team in Montreal for the 1946 season, then brought him up to the Brooklyn team on opening day, April 15, 1947.



The Robinson experiment succeeded—on the field and at the box office. Within a few years, the Dodgers had hired other black players—pitchers Don Newcombe and Joe Black, catcher Roy Campanella, infielder Jim Gilliam, and Cuban outfielder Sandy Amoros—who helped turn the 1950s Dodgers into one of the greatest teams in baseball history.



* * *

Viewers of 42 will see no evidence of the movement that made Robinson's—and the Dodgers'—success possible. For example, André Holland plays Pittsburgh Courier reporter Wendell Smith, but he's depicted as Robinson's traveling companion and the ghost-writer for Robinson's newspaper column during his rookie season. The film ignores Smith's key role as an agitator and leader of the long crusade to integrate baseball before Robinson became a household name.



Robinson recognized that the dismantling of baseball's color line was a triumph of both a man and a movement. During and after his playing days, he joined the civil rights crusade, speaking out—in speeches, interviews, and his column—against racial injustice. In 1949, testifying before Congress, he said: "I'm not fooled because I've had a chance open to very few Negro Americans."



Robinson viewed his sports celebrity as a platform from which to challenge American racism. Many sportswriters and most other players—including some of his fellow black players, content simply to be playing in the majors—considered Robinson too angry and vocal about racism in baseball and society.



When Robinson retired from baseball in 1956, no team offered him a position as a coach, manager, or executive. Instead, he became an executive with the Chock Full o' Nuts restaurant chain and an advocate for integrating corporate America. He lent his name and prestige to several business ventures, including a construction company and a black-owned bank in Harlem. He got involved in these business activities primarily to help address the shortage of affordable housing and the persistent redlining (lending discrimination against blacks) by white-owned banks. Both the construction company and the bank later fell on hard times and dimmed Robinson's confidence in black capitalism as a strategy for racial integration.



In 1960, Robinson supported Hubert Humphrey, the liberal Senator and civil rights stalwart from Minnesota, in his campaign for president. When John Kennedy won the Democratic nomination, however, Robinson shocked his liberal fans by endorsing Richard Nixon. Robinson believed that Nixon had a better track record than JFK on civil rights issues, but by the end of the campaign—especially after Nixon refused to make an appearance in Harlem—he regretted his choice.



During the 1960s, Robinson was a constant presence at civil rights rallies and picket lines, and chaired the NAACP's fundraising drive. Angered by the GOP's opposition to civil rights legislation, he supported Humphrey over Nixon in 1968. But he became increasingly frustrated by the pace of change.



"I cannot possibly believe," he wrote in his autobiography, I Never Had It Made, published shortly before he died of a heart attack at age 53 in 1972, "that I have it made while so many black brothers and sisters are hungry, inadequately housed, insufficiently clothed, denied their dignity as they live in slums or barely exist on welfare."



In 1952, five years after Robinson broke baseball's color barrier, only six of major league baseball's 16 teams had a black player. It was not until 1959 that the last holdout, the Boston Red Sox, brought an African American onto its roster. The black players who followed Robinson shattered the stereotype—once widespread among many team owners, sportswriters, and white fans—that there weren't many African Americans "qualified" to play at the major league level. Between 1949 and 1960, black players won 8 out of 12 Rookie of the Year awards, and 9 out of 12 Most Valuable Player awards in the National League, which was much more integrated than the American League. Many former Negro League players, including Willie Mays, Henry Aaron, Don Newcombe, and Ernie Banks, were perennial All-Stars.



But academic studies conducted from the 1960s through the 1990s uncovered persistent discrimination. For example, teams were likely to favor a weak-hitting white player over a weak-hitting black player to be a benchwarmer or a utility man. And even the best black players had fewer and less lucrative commercial endorsements than their white counterparts.



In the 16 years he lived after his retirement in 1956, Robinson pushed baseball to hire blacks as managers and executives and even refused an invitation to participate in the 1969 Old Timers game because he did not yet see "genuine interest in breaking the barriers that deny access to managerial and front office positions." No major league team had a black manager until Frank Robinson was hired by the Cleveland Indians in 1975. The majors' first black general manager—the Atlanta Braves' Bill Lucas—wasn't hired until 1977.



* * *

Last season, players of color represented 38.2 percent of major-league rosters, according to a report by the Institute for Diversity and Ethics in Sport at the University of Central Florida. Black athletes represented only 8.8 percent of major-league players—a dramatic decline from the peak of 27 percent in 1975, and less than half the 19 percent in 1995. One quarter of last season's African-American players were clustered on three teams—the Yankees, Angels, and Dodgers. Their shrinking proportion is due primarily to the growing number of Latino (27.3%) and Asian (1.9%) players, including many foreign-born athletes, now populating major league rosters.



But there are also sociological and economic reasons for the decline of black ballplayers. The semi-pro, sandlot, and industrial teams that once thrived in black communities, serving as feeders to the Negro Leagues and then the major leagues, have disappeared. Basketball and football have replaced baseball as the most popular sports in black communities, where funding for public school baseball teams and neighborhood playgrounds with baseball fields has declined. Major league teams more actively recruit young players from Latin America, who are typically cheaper to hire than black Americans, as Adrian Burgos, in Playing America's Game: Baseball, Latinos, and the Color Line (2007), and Rob Ruck, in Raceball: How the Major Leagues Colonized the Black and Latin Game (2012), document.



Among today's 30 teams, there are only four managers of color—three blacks (the Reds' Dusty Baker, the Astros' Bo Porter, and the Rangers' Ron Washington) and one Latino (the Braves' Fredi Gonzalez). (Two of last season's Latino managers—the Indians' Manny Acta, and Ozzie Guillen of the Marlins—were fired). One Latino (Ruben Amaro Jr. of the Phillies) and one African American (Michael Hill of the Marlins) serve as general managers. (White Sox GM Ken Williams, an African American, was promoted to executive VP during the off-season.) Arturo Moreno, a Latino, has owned the Los Angeles Angels since 2003. Basketball great Earvin "Magic" Johnson, part of the new group that purchased the Los Angeles Dodgers last year, is the first African-American owner of a major league team.



Like baseball, American society—including our workplaces, Congress and other legislative bodies, friendships, and even families—is more integrated than it was in Robinson's day. But there is still an ongoing debate about the magnitude of racial progress, as measured by persistent residential segregation, a significantly higher poverty rate among blacks than whites, and widespread racism within our criminal justice and prison systems.



As Robinson understood, these inequities cannot be solved by individual effort alone. Solving them also requires grassroots activism and protest to attain changes in government policy and business practices. 42 misses an opportunity to recap this important lesson. Robinson's legacy is to remind us of the unfinished agenda of the civil rights revolution and of the important role that movements play in moving the country closer to its ideals.