Wednesday, August 31, 2011

Is the Internet Destroying the Middle Class?

How the Internet is destroying the middle class
Artist and theorist Jaron Lanier argues that high-tech "innovations" are making us poorer and less ambitious
By Matt Zoller Seitz

In a wide-ranging interview for the online magazine Edge, theorist Jaron Lanier diagnoses many of the ills that ail the Internet-age economy.

Apropos of nothing except the subject's brilliance, I strongly urge you to read this truly epic interview with Jaron Lanier at Edge. It's about, well, pretty much everything that affects you day-to-day -- the decline and death of the middle class, the awesome utility of the Internet as a means to spread hate, superstition and lies, the dicey relationship between humans and machines, and the reduced expectations of the younger generations.

Lanier also gets into the market dominance of Wal-Mart, Google, Apple and other huge corporate entities, and their role in what he calls "The Local-Global Flip," wherein companies become arrogant and authoritarian global players very quickly, concentrating massive amounts of data in very few hands and creating "a system in which the Internet user becomes the product that is being sold to others."

It's an extraordinary interview, packed with insight and often grimly funny. Lanier is a composer, computer scientist and visual artist, and the author of some wide-ranging and important writing, including "You Are Not a Gadget: A Manifesto" and the 2006 Edge essay "Digital Maoism: The Hazards of the New Online Collectivism." I've quoted a few choice bits of the interview, which runs almost 9,000 words. It's worth taking the time to read it all.

On the notion that the Internet would make people freer and help create wealth:


Everyone's into Internet things, and yet we have this huge global economic trouble. If you had talked to anyone involved in it 20 years ago, everyone would have said that the ability for people to inexpensively have access to a tremendous global computation and networking facility ought to create wealth. This ought to create wellbeing; this ought to create this incredible expansion in just people living decently, and in personal liberty. And indeed, some of that's happened. Yet if you look at the big picture, it obviously isn't happening enough, if it's happening at all.

On young people's diminished expectations in the Internet era:


I'm astonished at how readily a great many people I know, young people, have accepted a reduced economic prospect and limited freedoms in any substantial sense, and basically traded them for being able to screw around online. There are just a lot of people who feel that being able to get their video or their tweet seen by somebody once in a while gets them enough ego gratification that it's okay with them to still be living with their parents in their 30s.


On the possibility of a "third way" of modeling the online economy that "could grow the middle [class] back":


The thing that I'm thinking about is the [Theodor Holm "Ted"] Nelson approach, the third way where people buy and sell each other information, and can live off of what they do with their hearts and minds as the machines get good enough to do what they would have done with their hands. That thing is the thing that could grow the middle back. Then the crucial element of that is what we can call a "social contract," where people would pay for stuff online from each other if they were also making money from it. When people get nothing from a society, they eventually just riot ...


The most complex and important part of the interview concerns what Lanier calls the "local-global flip." The term refers to what happens when a company -- Wal-Mart, Google and Apple are his three main examples -- conquers a sector of the economy quickly and completely. That dominance creates a stranglehold over its part of the market, effectively destroys the so-called "Mom-and-Pop" vendors that used to coexist with it, and turns the company into a gateway that controls how other people get their goods, services or ideas into the marketplace.

The upsides of this phenomenon are (1) consumers get a massive array of cheap, convenient-to-purchase goods and services and (2) the owners of these companies, some of their executives, and certain associates get very, very rich.

The downsides, however, can be immense, and it can take a long time for the big companies to see them because they're accumulating so much loot in the short-term. Such a company's success, Lanier suggests, can impoverish parts of the population that were doing fine before. And the way that the successful company's system is set up -- with a "My way or the highway" mentality -- can turn vendors and business partners into indentured servants who are terrified to innovate, or even quit their association with the big company, for fear of being financially obliterated.

The result is low-level economic paralysis and depression that persists over years or decades, and that has far-ranging, often hard-to-see ripple effects, including localized devastation caused by the maneuverings of these global giants. As Lanier puts it:


The network effects can be so powerful that you cease being a local player. An example of this is Wal-Mart removing so many jobs from their own customers that they start to lose profitability, and suddenly upscale players, like Target, are doing better. Wal-Mart impoverished its own customer base. Google is facing exactly the same issue long-term, although not yet.


We can see this same process at work in other industries, Lanier says. "The finance industry kept on thinking they could eject waste out into the general system, but they became the system ... Insurance companies in America, by trying to only insure people who didn't need insurance, ejected risk into the general system away from themselves, but they became so big that they were no longer local players, and there wasn't some giant vastness to absorb this risk that they'd ejected, and so therefore the system breaks. You see this again and again and again. It's not sustainable."


Rick Perry Embraces New Deal Revisionism

Jonathan Chait
Senior Editor


Rick Perry, in his anti-Obama spiel, drops in some interesting New Deal revisionism:

“What’s dumb is to oversee an economy that has lost that many millions of jobs, to put unemployment numbers that over his four years will stay probably at 9 percent, to downgrade the credit of this good country, to put fiscal policies in place that were a disaster back in the '30s and to try them again in the 2000s,” Perry said on Sean Hannity’s radio show. “That’s what I consider to be the definition of dumb.”
The previously marginal notion that Franklin Roosevelt worsened the Depression has rapidly hardened into conventional wisdom within the vanguard of the conservative movement. (In 2009, I wrote a review essay about the phenomenon.) But it's remained largely hidden from the political stage. My suspicion is that most Americans consider the New Deal a great success, and Franklin Roosevelt a great president. Indeed, the full political force of the conservative backlash has mobilized around the belief that President Obama is taking away good, earned benefits from middle-class Americans and giving them to the undeserving poor.

I could see why this would put Perry in good stead with conservative Republican activists. But does he really want to run in the general election against Obama as Roosevelt?

Hoover Returns

Wasting Away in Hooverville
Jonathan Chait
Senior Editor of The New Republic



The Forgotten Man: A New History of the Great Depression
By Amity Shlaes
(HarperCollins, 464 pp., $26.95)

Herbert Hoover
By William E. Leuchtenburg
(Times Books, 208 pp., $22)

Nothing to Fear: FDR's Inner Circle and the Hundred Days that Created Modern America
By Adam Cohen
(Penguin Press, 372 pp., $29.95)

A generation ago, the total dismissal of the New Deal remained a marginal sentiment in American politics. Ronald Reagan boasted of having voted for Franklin Roosevelt. Neoconservatives long maintained that American liberalism had gone wrong only in the 1960s. Now, decades after Democrats grew tired of accusing Republicans of emulating Herbert Hoover, Republicans have begun sounding ... well, exactly like Herbert Hoover. When President Obama recently met with House Republicans, the eighty-two-year-old Roscoe G. Bartlett told him that "I was there" during the New Deal, and, according to one account, "assert[ed] that government intervention did not work then, either." George F. Will, speaking on the Sunday talk show "This Week," declared not long ago, "Before we go into a new New Deal, can we just acknowledge that the first New Deal didn't work?"

When Republicans announce that the New Deal failed--as they now do, over and over again, without any reproach from their own side--they usually say that the case has been proven by the conservative columnist Amity Shlaes in her book The Forgotten Man. Though Shlaes's revisionist history of the New Deal came out a year and a half ago, to wild acclaim on the right, its popularity seems to be peaking now. Fred Barnes of The Weekly Standard recently called Shlaes one of the Republican party's major assets. "Amity Shlaes's book on the failure of the New Deal to revive the economy, The Forgotten Man, was widely read by Republicans in Washington," he reported. "So were her compelling articles on that subject in mainstream newspapers."

This is no exaggeration. The Forgotten Man has been publicly touted by such Republican luminaries as Newt Gingrich, Rudolph Giuliani, Mark Sanford, Jon Kyl, and Mike Pence. Senator John Barrasso was so eager to tout The Forgotten Man that last month he waved around a copy and announced, "in these economic times, a number of members of the Senate are reading a book called The Forgotten Man, about the history of the Great Depression, as we compare and look for solutions, as we look at a stimulus package." Barrasso offered this unsolicited testimonial, apropos of nothing whatsoever, during the confirmation hearing for Energy Secretary Steven Chu. Chu politely ignored the rave, thus giving no sign as to whether he had heard the Good News. Whether or not The Forgotten Man actually persuaded conservatives that the New Deal failed, in the time of their political exile, which is also a time of grave economic crisis, it has become the scripture to which they have flocked.



When they say that the New Deal "didn't work," conservatives almost always mean New Deal fiscal stimulus. (Other policies, such as Social Security or clearing the way for unions, clearly succeeded on their own terms, whatever their ideological merits.) And then, in turn, they confuse New Deal fiscal stimulus with Keynesian economics, which is also not exactly the same thing. So let me step back and briefly explain for the uninitiated what Keynesian economics means. We may not all be Keynesians now, but we would all benefit from knowing what a Keynesian actually is.

Prior to Keynes, the economy was held to be self-correcting. The only cure for a recession was to let wages and prices fall to their natural level. The prevailing attitude, as Paul Krugman writes in his recently re-issued book The Return of Depression Economics, was "a sort of moralistic fatalism." Keynes upended the orthodoxy in a way that was every bit as dramatic as Galileo challenging geocentrism. He insisted that recessions are not a natural process, or the invisible hand's righteous judgment against our sins, but a simple failure of consumer demand.

When people worry about losing their jobs, they sensibly cut back on their spending. But that decision, in turn, reduces demand for goods and services, which results in reduced income or lost jobs for other workers. Keynes called this phenomenon "the paradox of thrift": what makes sense for individuals turns into a disaster for society as a whole. The recession was therefore a failure of collective action that required government action. Government needed to encourage spending by reducing interest rates or, failing that, to inject spending into the economy directly by deliberately running temporary budget deficits.
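Keynes's argument here is, at bottom, compounding arithmetic, and a toy calculation makes it concrete. The following minimal Python sketch is mine, not Chait's or Keynes's; the $100 initial cut and the 0.8 marginal propensity to consume are purely illustrative assumptions. It shows how one round of reduced spending ripples outward, because each dollar of spending someone withholds is a dollar of income someone else loses:

def paradox_of_thrift(initial_cut, mpc, rounds=100):
    # Sum the chain of income losses set off by an initial cut in
    # spending. Each lost dollar of income reduces the next round of
    # spending by the marginal propensity to consume (mpc).
    total_drop = 0.0
    loss = float(initial_cut)
    for _ in range(rounds):
        total_drop += loss
        loss *= mpc
    return total_drop

# A $100 cut in spending, with people spending 80 cents of each
# marginal dollar, shrinks total income by roughly $500 -- the
# 1 / (1 - mpc) multiplier at work.
print(paradox_of_thrift(100, 0.8))

The series converges to initial_cut / (1 - mpc), which is why thrift that is individually sensible can cost society several times what it saves -- and why Keynes wanted government, the one actor able to move against the cycle, to inject the missing demand.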

At the time, orthodox economists deemed this diagnosis heretical and dangerous, but, in the decades that followed, it became a consensus view. Today economists disagree sharply about how to apply Keynes's insights, with many conservative economists questioning the practicality of large-scale government spending to combat recessions; but the essential framework constructed by Keynes--that recessions are caused by a failure of demand, and that at the very least government should not respond to an economic slowdown by paring back its largesse--is no longer in dispute. Even a right-wing Republican economist such as Gregory Mankiw, a former Bush advisor, writes that "if you were going to turn to only one economist to understand the problems facing the economy, there is little doubt that the economist would be John Maynard Keynes."

But everywhere you look, conservative pundits and elected officials have embraced the pre-Keynesian nostrums. Citing The Forgotten Man, they insist that efforts to stimulate the economy are not just insufficient but also counter-productive. Pence has insisted that The Forgotten Man proves "that it was the spending and taxing policies of 1932 and 1936 that exacerbated the situation." Sanford, for his part, offered this fiscal diagnosis: "When times go south you cut spending. That's what families do, that's what businesses do, and I don't think the government should be exempt from that process." That is, of course, a perfect description of the paradox of thrift, only put forward as the solution rather than the problem. Governor Tim Pawlenty of Minnesota insisted that "we can't solve a crisis caused by the reckless issuance of debt by then recklessly issuing even more debt," and called for a balanced-budget amendment to the Constitution, which would of course massively exacerbate the present crisis. It is 1932 again in the Republican Party.




Now here is the extremely strange thing about The Forgotten Man: it does not really argue that the New Deal failed. In fact, Shlaes does not make any actual argument at all, though she does venture some bold claims, which she both fails to substantiate and contradicts elsewhere. Reviewing her book in The New York Times, David Leonhardt noted that Shlaes makes her arguments "mostly by implication." This is putting it kindly. Shlaes introduces the book by asserting her thesis, but she barely even tries to demonstrate it. Instead she chooses to fill nearly four hundred pages with stories that mostly go nowhere. The experience of reading The Forgotten Man is more like talking to an old person who lived through the Depression than it is like reading an actual history of the Depression. Major events get cursory treatment while minor characters, such as an idiosyncratic black preacher or the founder of Alcoholics Anonymous, receive lengthy portraits. Having been prepared for a revisionist argument against the New Deal, I kept wondering if I had picked up the wrong book.

Many of Shlaes's stories do have an ideological point, but the point is usually made in a novelistic way rather than a scholarly one. She tends to depict the New Dealers as vain, confused, or otherwise unsympathetic. She depicts business owners as heroic and noble. It is a kind of revival of the old de haut en bas sort of social history, except this time the tycoons from whose perspective the events are narrated appear as the underappreciated victims, the giants at the bottom of the heap.

Mostly Shlaes employs wild anecdotal selectivity. At one point she calls the pro-labor Wagner Act "coercive," and elsewhere she alludes to the subtle anti-Semitism of a newspaper column criticizing opponents of the National Recovery Administration. Shlaes ignores the vastly greater use of violent coercion on behalf of employers, or the immensely more common use of anti-Semitic tropes against the New Deal. Does Shlaes think that workers were more coercive than capitalists, or that liberals were more anti-Semitic than conservatives? The book does not say, but clearly she wants her readers to come away with this impression.

Shlaes begins every chapter with a date (say, December 1936), an unemployment percentage (15.3) and a Dow Jones Industrial Average. The tick-tick-tick of statistics is meant to show that conditions did not improve throughout the course of Roosevelt's presidency. Yet her statistics are highly selective. As those of us who get our economic information from sources other than the CNBC ticker know, the stock market is not broadly representative of living standards. Meanwhile, as the historian Eric Rauchway has pointed out, her unemployment figures exclude those employed by the Works Progress Administration and other work-relief agencies. Shlaes has explained in an op-ed piece that she did this because "to count a short-term, make-work project as a real job was to mask the anxiety of one who really didn't have regular work with long-term prospects." So, if you worked twelve hours per day in a coal mine hoping not to contract black lung or suffer an injury that would render you useless, you were employed. But if you constructed the Lincoln Tunnel, you had an anxiety-inducing make-work job.

In response to this criticism, Shlaes has retreated to the defense that unemployment was still high anyway. "Even if you add in all the work relief jobs, as some economists do," she has contended, "Roosevelt-era unemployment averages well above 10 percent. That's a level Obama has referred to once or twice--as a nightmare." But Roosevelt inherited unemployment that was over 20 percent! Sure, the level to which it fell was high by absolute standards, but it is certainly pertinent that he cut that level by more than half. By Shlaes's method of reckoning, Thomas Jefferson rates poorly on the scale of territorial acquisition, because on his watch the United States had less than half the square mileage it has today.



Shlaes's actual critique of the New Deal is not easy to pin down. Defining what she believes depends on whether you are reading the book itself or her incessant stream of spin-off journalism. In one article she adopted the classic right-wing line taken up by Andrew Mellon, Hoover's treasury secretary: "Mellon--unlike the Roosevelt administration--understood that American growth would return if you left the economy alone to right itself." This is the conclusion that most excites Shlaes's conservative admirers. And in keeping with this argument, Shlaes, a committed supply-sider, scolds Roosevelt for raising taxes on the rich, which discouraged them from taking risks. She fails to explain how the economy managed to recover after the outbreak of World War II, which saw even higher taxes on the rich, or in the postwar period, when they remained high.

Moreover, the classic right-wing critique fails to explain how the economy recovered at all. In one of his columns touting Shlaes, George Will observed that "the war, not the New Deal, defeated the Depression." Why, though, did the war defeat the Depression? Because it entailed a massive expansion of government spending. The Republicans who have been endlessly making the anti-stimulus case seem not to realize that, if you believe that the war ended the Depression, then you are a Keynesian.

Shlaes also offers up a more subtle, and slippery, version of this argument. In the second iteration, the problem with the New Deal was not that it involved government but that it involved unpredictable government. Businesses failed to hire workers or open factories not because there was a lack of demand for their products, but because constantly changing government policies made them uncertain. "Uncertainty about what to expect from international events and Washington," Shlaes instructs, "made the Dow Jones Industrial Average gyrate."


I agree that there is something to the notion that unpredictable government policies can spook markets. Shlaes cites an industrialist who urged Roosevelt to "make the program clear and then stick to it." But the answer to this perplexity is not that Roosevelt should have abandoned his public works program, Social Security, and regulatory reforms. It is rather that he should have adopted them sooner, and hewed to them more consistently. That would have been perfectly clear.

Yet that is not the conclusion that conservatives wish to draw. Nor is it a useful club to wield against contemporary liberals. So at other times Shlaes suggests--again, her writing is so convoluted that it is hard to discern what she means--that eliminating uncertainty means eliminating government activism. In her recent op-ed pieces, she urges Obama to forego Keynesian stimulus and instead cut taxes for corporations and stockholders. Her real argument, then, is that changing the rules in a liberal direction paralyzes businesses with fear, while changing the rules in a conservative direction promotes growth.

Several of Shlaes's admirers have taken up the line that the current slowdown is caused, or at least worsened, by the "uncertainty" of Obama's fiscal stimulus. "A main reason there's all of this 'money on the sidelines' out there among private investors is that Wall Street doesn't know what the government will do next," Jonah Goldberg wrote in National Review. "In short, don't just do something, President Obama--stand there." If this prescription were true, you would suppose it would show up in the business press. After all, there is a thriving media industry devoted to taking the temperature of the markets and discerning what causes them to rise or fall on any given day. Not all these business reporters are left-wing stooges. And yet almost never do they report that uncertainty about Keynesian policies has caused the market to sink. Instead, you regularly come across coverage like this, from The Wall Street Journal on December 8, 2008:

The prospect of new government action to create jobs and keep auto makers out of the ditch sent stocks higher for a second consecutive day, amid hopes that a worst-case scenario for the global economy could be avoided. Gains in the U.S. were part of a world-wide rally triggered by President-elect Obama's plan for a stimulus package.



Shlaes's main indictment of the New Deal is contained within the first few pages of her book. It begins with a snapshot of the depressed economy, depicting a desperately poor family, and broadens out to describe a cratering economy, and finally moves on to Washington, where the oblivious Secretary of the Treasury announces his intent to "continue progress toward a balance of the federal budget." The reader is meant to think this is a description of Herbert Hoover's America. But Shlaes concludes the story with an aha! moment: this episode took place in 1937! "The New Deal was almost five years old," Shlaes concludes, after repeating this shocking tale in another op-ed, "but the economy was not back."

If your understanding of the New Deal is limited to the simple notion that Roosevelt spent a lot of money and tamed unemployment, then this story might sound like a persuasive piece of evidence for Shlaes. Yet there is a tip-off within the story that ought to give even the uninformed reader pause: the part where the Treasury Secretary promises to balance the budget. That doesn't sound very New Deal-ish, does it? And indeed it is not. The historical fact is that Roosevelt's administration contained warring factions with often wildly differing ideas. FDR came into office promising to slash the federal budget, but he moved in fits and starts toward a Keynesian policy of fiscal stimulus. After the elections of 1936, though, his more conservative advisors prevailed upon him to roll back the budget. Liberals, including Keynes, protested that this would jeopardize the fragile recovery. And events vindicated them: after impressive growth, the economy plunged back into a recession within a depression.

That Roosevelt see-sawed between Keynesianism and budget-balancing has been conventional wisdom among mainstream historians and economists for decades. The MIT economist E. Cary Brown wrote this in 1956. Keynes made the same point in a pleading letter to Roosevelt in 1937. Economists disagree about the extent to which Roosevelt's fiscal expansion helped. Many give more credit to his abandonment of the gold standard--which Shlaes, naturally, also decries. The fact that he retreated from Keynes in 1937 and that this retarded the recovery, though, bears little dispute.

Adam Cohen's Nothing to Fear, an admiring history of Roosevelt's first hundred days, captures the deep tensions between camps in the Roosevelt administration. Cohen's take is conventional, but it is executed well, tracing the disparate life stories of the main New Dealers in such a way as to make the inevitability of their conflict clear. This was a true team of rivals, some of them winding up as Roosevelt's most unhinged critics. Even before Roosevelt took office, Cohen writes, there was a "fundamental conflict--between spending more to fight the Depression and spending less to balance the budget--that would be a central tension of the Hundred Days." Yet for Shlaes and her admirers, the finding that Roosevelt vacillated between Keynesianism and orthodoxy represents a devastating intellectual blow to the New Deal edifice. And, yes, if you think of the New Deal as a series of unbroken triumphs held together by a clear and consistent ideology, and you have failed to take in any of the scholarship about the period produced over the last half-century, then The Forgotten Man will come as a revelation.


Shlaes writes that her discoveries about the New Deal show that "Roosevelt was unworthy of emulation." But who, exactly, is proposing to emulate everything that Roosevelt did? Much of his program has long been deemed a failure by liberals, including Roosevelt himself. (This includes the National Recovery Administration, which Shlaes dwells on at great length, while breezing past or ignoring altogether vast swaths of the New Deal.) When liberals suggest that Obama follow Roosevelt's model, they do not mean that he should replicate the entire thing. (The way, say, conservatives do when they suggest following Reagan's model.) They mean that he should emulate the Keynesian fiscal policies and other parts of the New Deal that worked. Shlaes has set out to demolish an argument that no serious person has ever made.

At one point in her book, in fact, Shlaes actually concedes that Roosevelt's Keynesian experiment succeeded when he tried it. "The spending was so dramatic that, finally, it functioned as Keynes ... had hoped it would," she writes of 1936. "Within a year unemployment would drop from 22 percent to 14 percent." So Keynesian policy worked, and the main fiscal problem with the New Deal was that Roosevelt made too many concessions to the right. Here we are in agreement. So can conservatives stop carrying around The Forgotten Man like it's Mao's Little Red Book? Can we all go home now?



It should be clear that intellectual coherence is not the purpose of Shlaes's project. The real point is to recreate the political mythology of the period. It does not matter that Shlaes heaps scorn on Roosevelt for doing things that liberals also scorn. Anything that tarnishes his legacy, she seems to think, tarnishes liberalism by association.

The conservative movement has invested enormous effort in crafting a political mythology that gratifies its ideological impulses. The lesson they learned from Ronald Reagan is that ideological purity is not only compatible with political success, but is also the best path to political success. They dutifully applied this interpretation to everything that happened since--George H.W. Bush, then Newt Gingrich, and then George W. Bush all failed because they deviated from the true path--and to all that happened before. Nixon failed because he embraced big government. Kennedy succeeded because he was actually a proto-supply-sider.

From such a perspective, Roosevelt casts a long and threatening shadow over the conservative movement. Here was a case of a wildly unpopular conservative Republican, Herbert Hoover, who gave way to an unabashed liberal Democrat who won four presidential elections. Shlaes goes to great pains to explain away this apparent anomaly. In this instance, she does produce an internally coherent argument. It is, alas, wildly ahistorical.

If the New Deal failed so miserably, one might wonder why voters continued to endorse it. In Shlaes's telling, Roosevelt's first challenger, Alf Landon, lost in 1936 because he "failed to distinguish himself" from Roosevelt. It is certainly true that Landon hailed from the party's moderate wing and shied away from the root-and-branch condemnation of the New Deal favored by, say, Hoover. But as the campaign wore on, Landon's rhetoric grew increasingly harsh. If Roosevelt returned to office, he warned, "business as we know it is to disappear." Voters who opposed the New Deal may not have had a perfect choice, but they did have a clear one. It also takes quite a bit of ideological credulity to believe, as Shlaes apparently does, that Roosevelt's twenty-point victory represented anything other than massive support for his program. Landon himself later remarked that "I don't think that it would have made any difference what kind of a campaign I made as far as stopping this avalanche is concerned."

And Shlaes offers an even odder explanation for Roosevelt's triumph in 1940. Wendell Willkie seized the advantage by attacking the New Deal, she writes, but squandered it with his dovish position on the war. The war, she argues, had become "the single best argument to reelect Roosevelt and give him special powers." After the election, she asserts, the Republicans "concluded, accurately enough, that the outcome would sideline not only their party but their record of accuracy when it came to the economy. They had been right so often in the 1930s and they would not get credit for it. The great error of their isolationism was what stood out."


Shlaes, characteristically, bolsters this highly idiosyncratic reading of history with only bare wisps of data. It is true that the outbreak of war in Europe made Roosevelt, the incumbent, appear safer. But this pro-incumbent upsurge merely cancelled out a powerful current of anti-third term sentiment. Moreover, public opinion opposed entry into the war, and Roosevelt had to fight the suspicion that he was nudging the country into the war by explicitly promising to stay out. Shlaes's portrayal of an electorate seeking activist government abroad and laissez-faire at home gets the history almost perfectly backward. (The Forgotten Man ends with the 1940 race, sparing her readers any further contortions of electoral interpretation.)



The final unanswered question that must nag at the minds of the true believers is how the Depression managed to develop even before Roosevelt assumed office. If, as they hold, his bungling is what caused the economy to stall for years, it is awkward that the Depression was already more than three years old before Roosevelt even took office. Shlaes's answer is to implicate Hoover as a New Deal man himself:

Hoover had called for a bank holiday to end the banking crisis; Roosevelt's first act was to declare a bank holiday to sort out the banks and build confidence. ... Hoover had spent on public hospitals and bridges; Roosevelt created the post of relief administrator for the old Republican progressive Harry Hopkins. Hoover had loved public works; Roosevelt created a Public Works Administration. ... Hoover had known that debt was a problem and created the Reconstruction Finance Corporation; Roosevelt put Jones at the head of the RFC so he might address the debt. ...

Hoover had deplored the shorting of Wall Street's rogues; Roosevelt set his brain trusters to writing a law that would create a regulator for Wall Street.

This part of Shlaes's argument has generated enormous enthusiasm on the right. At last the cultural baggage of Roosevelt's predecessor--Hoovervilles, Hoover flags, and the like--has been lifted off the shoulders of conservatism and onto the real culprit, which is liberalism. Senator Kyl proclaimed on the Senate floor last fall that "in the excellent history of the Great Depression by Amity Shlaes, The Forgotten Man, we are reminded that Herbert Hoover was an interventionist, a protectionist, and a strong critic of markets." Recently the House Republican Steve Austria went so far as to declare that Roosevelt actually caused the Depression. "When Roosevelt did this, he put our country into a Great Depression," Austria said. "He tried to borrow and spend, he tried to use the Keynesian approach, and our country ended up in a Great Depression. That's just history."

There is indeed a revisionist scholarship that recasts Hoover as an energetic quasi-progressive rather than a stubborn reactionary. William Leuchtenburg, in his short new biography Herbert Hoover, makes some allowance for the revisionist case, but finally he settles on a more traditional conclusion. Leuchtenburg shows that Hoover's history of activism consistently left him with the belief in the primacy of voluntarism and the private sector, a faith that left him unsuited to handle a catastrophe like the Depression.

Leuchtenburg also provides a handy rebuttal to Shlaes's preposterous conflation of the two presidents. Hoover's National Credit Corporation, he explains, "did next to nothing." Roosevelt would be amused to hear that his bank holiday aped Hoover's, given that Hoover denounced the Emergency Banking Act as a "move to gigantic socialism." (Does this ring a bell?) Shlaes's attempt to equate Hoover's disdain for short-sellers and Roosevelt's regulation of the market presumes that there is no important difference between expressing disapproval of something and taking public action against it.

Yes, Hoover created the Reconstruction Finance Corporation. But (I am quoting Leuchtenburg) "at Hoover's behest, RFC officials administered the law so stingily that the tens of thousands of jobs the country had been promised were never created. By mid-October, the RFC had approved only three of the 243 applications it had received for public works projects." Hoover's head of unemployment relief said that "federal aid would be a disservice to the unemployed." Hoover was a staunch ideological conservative who remarked, in 1928, that "even if governmental conduct of business could give us more efficiency instead of less efficiency, the fundamental objection to it would remain unaltered and unabated." This was not, to put it mildly, Roosevelt's philosophy.

Hoover himself would have found the notion that Roosevelt mostly carried on his work offensive. During the campaign of 1932 he warned that, if the New Deal came to fruition, "the grass will grow in the streets of a hundred cities, a thousand towns." This was not mere campaign rhetoric. After Roosevelt won, Hoover desperately sought to persuade him to abandon his platform. He spent the rest of his years denouncing Roosevelt's reforms as dangerous Bolshevism. Leuchtenburg records that Hoover wrote a book about the New Deal so acerbic that his own estate suppressed its publication to avoid further tainting his reputation.

Of course, the transition from one presidency to another always involves some level of continuity. The world never begins completely anew with a presidential inauguration. But the break between Roosevelt and Hoover was certainly sharper than that between any president and his predecessor in American history. After 1932, generations of Democrats continued to paint Republicans as neo-Hooverites. This was mostly a calumny. Though Hoover himself continued to assail the New Deal as calamitous socialism right up to his death in 1964, from 1936 on the party remained in the hands of men who understood that the New Deal had built an enduring base of support and could not be directly assailed.

But now we have come to a time when leading Republicans and conservatives--not just cranks, but the leadership of the party and the movement--once again sound exactly like Herbert Hoover. "Prosperity cannot be restored by raids upon the public Treasury," said President Hoover in 1930. "Our plan is rooted in the philosophy that we cannot borrow and spend our way back to prosperity," said House Minority Leader Boehner in 2009. They have come to this point by preferring theology to history, by wiping Hoover's record from their memories and replacing it with something very close to its opposite. It is Hoover, truly, who is the Forgotten Man.

Tuesday, August 30, 2011

Gene Chizik - All In

It's game week, so I've started reading the Coach's book. My plan is to read it slowly, sipping it rather than rushing through it. In the prologue he calls Auburn his dream job and says it didn't take long for his 5-19 record at Iowa State to come up in his Auburn job interview.

Libertarianism is Inherently Autocratic


Why libertarians apologize for autocracy
By Michael Lind

Having denounced liberals as crypto-communists for half a century during the Cold War, the American right now routinely accuses the center-left of being fascist. This libel was given currency in Jonah Goldberg's 2009 book "Liberal Fascism: The Secret History of the American Left, From Mussolini to the Politics of Meaning." From the support of a few progressives a century ago for eugenics, and expressions of admiration by a few 1920s liberals for Mussolini’s ability to make the trains run on time, Goldberg and others on the right have crafted the latest in a series of right-wing conspiracy theories about American history, this one claiming that Woodrow Wilson and Franklin Roosevelt deliberately set the U.S. on the road to an American version of Mussolini’s corporate state.

Given their professed interest in admirers of Mussolini, it is curious that American conservatives and libertarians have not seen fit to discuss the view of fascism held by one of the heroes of modern American libertarianism, the Austrian economist Ludwig von Mises. In his book "Liberalism," published in 1927 after Mussolini had seized power in Italy, Mises wrote:


It cannot be denied that Fascism and similar movements aimed at the establishment of dictatorships are full of the best intentions and that their intervention has for the moment saved European civilization. The merit that Fascism has thereby won for itself will live on eternally in history.

In the same apologetic spirit, here is one defense of Pinochet's Chile:

Indeed, in all 17 years of military rule, the total number of dead and missing -- according to the official Rettig Commission -- was 2,279. Were there abuses? Were there real victims? Without the slightest doubt. A war on terror tends to be a dirty war.

Still, in the case of Chile, and contrary to news reports, the number of actual victims was small.


The libertarian Patri Friedman, in a Cato Unbound manifesto, went further, arguing that democracy itself is the problem:

Democracy is the current industry standard political system, but unfortunately it is ill-suited for a libertarian state. It has substantial systemic flaws, which are well-covered elsewhere,[2] and it poses major problems specifically for libertarians:

1) Most people are not by nature libertarians. David Nolan reports that surveys show at most 16% of people have libertarian beliefs. Nolan, the man who founded the Libertarian Party back in 1971, now calls for libertarians to give up on the strategy of electing candidates! …

2) Democracy is rigged against libertarians. Candidates bid for electoral victory partly by selling future political favors to raise funds and votes for their campaigns. Libertarians (and other honest candidates) who will not abuse their office can't sell favors, thus have fewer resources to campaign with, and so have a huge intrinsic disadvantage in an election.


In his recommendations for further reading, Friedman included the Austrian economist Hans-Hermann Hoppe’s book "Democracy: The God That Failed," which appeared in 2001, following the fall of the Berlin Wall, during the greatest wave of global democratization in history. In his Cato Unbound manifesto, Friedman called on his fellow libertarians to give up on the whole idea of the democratic nation-state and join his movement in favor of "seasteading," or the creation of new, microscopic sovereign states on repurposed oil derricks, where people who think that "Atlas Shrugged" is really cool can be in the majority for a change.

In a similar spirit, a libertarian economics blogger named Arnold Kling has proposed his own alternative to democracy, which he calls "competitive government":


In this essay, I will suggest that competitive government might be better than democratic government at satisfying the desires of the governed. In democratic government, people take jurisdictions as given, and they elect leaders. In competitive government, people take leaders as given, and they select jurisdictions.


When it comes to American history, libertarians tend retrospectively to side with the Confederacy against the Union. Yes, yes, the South had slavery -- but it also had low tariffs, while Abraham Lincoln's free labor North was protectionist. Surely the tariff was a greater evil than slavery.

The posthumous induction of Jefferson Davis into the libertarian hall of fame was too much for David Boaz, a vice president of Cato. In a 2010 essay in Reason magazine titled "Up From Slavery: There’s No Such Thing as a Golden Age of Lost Liberty," Boaz observed that even whites in the antebellum North "did not actually live in a free society ... Liberalism seeks not just to liberate this or that person, but to create a rule of law exemplifying equal freedom. By that standard, even the plantation owners did not live in a free society, nor even did people in the free states."

Boaz asked his fellow libertarians, "If you had to choose, would you rather live in a country with a department of labor and even an income tax or a Dred Scott decision and a Fugitive Slave Act?" It says something that in 2010 this question stirred up a controversy on the libertarian right.

Libertarians and conservatives, to be sure, can point to many examples of naive liberals in the last century who embarrassed themselves by praising this or that squalid, tyrannical communist regime, from the Soviet Union and communist China to petty police states like North Korea, communist Vietnam and Castro’s Cuba. But the apologists for tyranny on the left were always opposed by anti-communist liberals and anti-communist democratic socialists. Where were the anti-authoritarian libertarians, denouncing libertarian fellow travelers of Pinochet like von Hayek and Milton Friedman?

For that matter, where was the libertarian right during the great struggles for individual liberty in America in the last half-century? The libertarian movement has been conspicuously absent from the campaigns for civil rights for nonwhites, women, gays and lesbians. Most, if not all, libertarians support sexual and reproductive freedom (though Rand Paul has expressed doubts about federal civil rights legislation). But civil libertarian activists are found overwhelmingly on the left. Their right-wing brethren have been concerned with issues more important than civil rights, voting rights, abuses by police and the military, and the subordination of politics to religion -- issues like the campaign to expand human freedom by turning highways over to toll-extracting private corporations and the crusade to funnel money from Social Security to Wall Street brokerage firms.

While progressives betray their principles when they apologize for autocracy, libertarians do not. Today’s libertarians claim to be the heirs of the classical liberals of the 19th century. Without exception the great thinkers of classical liberalism, like Benjamin Constant, Thomas Babington Macaulay and John Stuart Mill, viewed universal suffrage democracy as a threat to property rights and capitalism. Mill favored educational qualifications for voters, like the "literacy tests" used to disfranchise most blacks and many whites in the South before the 1960s. After the Civil War, Lord Acton wrote to Robert E. Lee, commiserating with him on the defeat of the Confederacy.

In a letter to an American in 1857, Macaulay wrote:


Dear Sir: You are surprised to learn that I have not a high opinion of Mr. JEFFERSON, and I am surprised at your surprise. I am certain that I never wrote a line, and that I never, in Parliament, in conversation, or even on the hustings -- a place where it is the fashion to court the populace -- uttered a word indicating an opinion that the supreme authority in a State ought to be intrusted to the majority of citizens told by the head; in other words, to the poorest and most ignorant part of society. I have long been convinced that institutions purely democratic must, sooner or later, destroy liberty, or civilization, or both.


By "purely democratic" Macaulay meant universal suffrage; he opposed democracy even with checks and balances and written constitutions.


It is quite plain that your Government will never be able to restrain a distressed and discontented majority. For with you the majority is the Government, and has the rich, who are always a minority, absolutely at its mercy. The day will come when, in the State of New-York, a multitude of people, none of whom has had more than half a breakfast, or expects to have more than half a dinner, will choose a Legislature. Is it possible to doubt what sort of Legislature will be chosen? On one side is a statesman preaching patience, respect for vested rights, strict observance of public faith. On the other is a demagogue ranting about the tyranny of capitalists and usurers, and asking why anybody should be permitted to drink champagne and to ride in a carriage, while thousands of honest folks are in want of necessaries. Which of the two candidates is likely to be preferred by a working man who hears his children cry for more bread?


Macaulay’s solution was to limit voting rights to those who drink champagne and ride in carriages, on the proto-Reaganite theory that some of their wealth would trickle down to people with hungry, crying children, "none of whom has had more than half a breakfast, or expects to have more than half a dinner."

The history of democratic nation-states since the 19th century proves that Macaulay, and von Mises, and Hayek, as well as lesser lights like Patri Friedman, have been right to argue that democracy is incompatible with libertarianism. Every modern, advanced democracy, including the United States, devotes between a third and half of its GDP to government, both in direct spending on public services like defense and in transfer payments. Given the power to vote, most populations will not only vote for some system of government-backed social insurance, but also for all sorts of interventions in individual behavior that libertarians object to, from laws banning nudity in public to laws mandating that people support their children, refrain from torturing or neglecting their pets, and water their lawns during droughts only according to scheduled rationing.

Unfortunately for libertarians who, like Hayek, prefer libertarian dictatorships to welfare-state democracies, even modern authoritarians reject the small-government creed. The most successful authoritarian capitalist regimes, such as today’s China and South Korea and Taiwan before their recent transitions to democracy, have been highly interventionist in economics, promoting economic growth by means of state-controlled banking, state-owned enterprises, government promotion of cartels, suppression of wages and consumption, tariffs and nontariff barriers to imports, toleration of intellectual piracy, massive infrastructure projects to help industry, and subsidies to manufacturers in the form of artificially cheap raw materials, energy and land.

The dread of democracy by libertarians and classical liberals is justified. Libertarianism really is incompatible with democracy. Most libertarians have made it clear which of the two they prefer. The only question that remains to be settled is why anyone should pay attention to libertarians.

Michael Lind is Policy Director of the Economic Growth Program at the New America Foundation and is the author of "The Next American Nation: The New Nationalism and the Fourth American Revolution."

Monday, August 29, 2011

Is Perry Dumb?

Yes -- dumb as dishwater.


Christopher Hitchens asks: "Does the Texas governor believe his idiotic religious rhetoric, or is he just pandering for votes?" I ask: why not both?

Friday, August 26, 2011

Embracing the Left

Why won't America embrace the left?
Over two centuries, the movement's history in America has been plagued by failure. An expert explains why
By Mandy Van Deven

What has the left really accomplished over the past two centuries? FDR's New Deal remains one of the great American success stories. In the '60s, leftist politics created a massive countercultural movement -- and sexual and feminist revolutions. The civil rights movement transformed both American society and the American soul. But, if you compare the accomplishments of the American left to those of other parts of the world, like Western Europe, its record is remarkably dismal, with a surprising lack of real political and social impact.

At least, that's the main takeaway from "American Dreamers," a new book by Michael Kazin, professor of history at Georgetown University, which covers nearly 200 years of struggle for civil rights, sexual equality and radical rebellion. His book explores the way the national conversation has been changed by union organizers, gay rights activists and feminists. He also writes about how their techniques have now been adopted by the Tea Party movement. From Michael Moore to "Wall-E," he argues that, although the left has been successful at transforming American culture, when it comes to practical change, it's been woefully unsuccessful.

Salon spoke to Kazin over the phone about the difference between Europe and America, the rise of the professional left -- and why the Lorax is a progressive icon.

In the book, you argue that the left has been very successful at changing American culture -- but not at making real economic or political change. Why?

It's easier to get people to think about things differently than it is to construct institutions that alter the basic building blocks of society. When leftists talk about having a vision of how things might be different, they attract an audience and create a new way of perceiving things. It's a different issue altogether to go up against entrenched structures of wealth and political power. There are few obstacles to talking differently, singing different kinds of songs, or making a different kind of art, but it takes a sustained movement of millions of people to really change the structures, and that is much harder to organize. Also, most Americans accept the basic ground rules of capitalist society. The ideas are that if you work hard you can get ahead and that it's better to be self-employed than employed by other people. They believe that the basics of a capitalist society are just or can be made just with small alterations. Americans want capitalism to work well for everybody, which is somewhat of a contradiction in terms, since capitalism is about people competing with each other to get ahead, and everyone's not going to be able to do well at the same time. That's simply not possible.

Why has the left in Europe been so much more successful at making real change?

The left in Europe arises out of a more traditional class structure, and the left parties there were formed on the basis of those class divisions. Most European countries had feudal societies before they transformed into nation-states. When those societies became capitalist, they retained many of the old divisions both in terms of people's consciousness and in terms of the new social structure. Peasants and lords became workers and employers. So, the parties there tended to fall along class lines much more than in the United States, and people growing up on either side of the class boundary fueled the movements on the left. Even though the differences between the labor or socialist parties and the centrist or right-wing parties have diminished over time, the vision of a socialist society is still alive in many European countries. In America, however, socialism and communism were never more than marginal beliefs.

You would think that the left would become more popular during a bad economy, but that doesn't seem to be happening right now. Why?

That idea is based more on what happened in the Great Depression era than anything that has happened since. The left's success in the 1930s was based on a lot of preparation that went back to the Gilded Age and the Progressive Era, when corporations were seen as malefactors of great wealth. When the Great Depression hit, there was immediate support for ideas that people on the left had been talking about, like that corporations are selfish and exploit their workers or that wealth should be more evenly spread out. For the past 35 years, conservative notions about Big Government rather than liberal ones about Big Business have been dominant. When the economic crisis hit in 2008, Americans were already primed to believe the government couldn't do anything right because it hasn't been doing anything right for years. Ironically, the conservatives were proved right when the stimulus didn't do what the Obama administration hoped it would do, and clearly the Tea Party has been able to grow on that policy mistake. The reaction depends on what people think when an economic crisis hits, not what people say to make their case after it has happened.

So what arguments does the left make well?

The ones regarding equality and rights. That's clear when you look at how popular support for gay marriage is now, but Keynesian economics is not so popular. It'd be nice if they were both popular, but to make political change, you need sustained mobilization of social forces in your favor. You need to make good arguments and also put pressure on people in power. For all kinds of reasons, it's been more difficult to do that. The support Americans have for what could be called "moral capitalism" goes very deep. The myth of the self-made man that emerged in the 19th century wasn't entirely a myth. There were people who came to America and did very well for themselves. They had to do things like kill Native Americans and destroy the land in the process, but they made better lives for their families.

Historically, a lot of leftist activism has been based in religion, but these days, few people would make that connection. Why does that get lost in the retelling?

The wide political divide we have now between people who go to church regularly and people who don't tends to break down along liberal and conservative lines. As a result, we tend to forget that evangelical Protestants in the 19th and 20th centuries were attracted to a social gospel that taught them to be their brother's keeper and that Christ called on them to change the world. That belief system was true for the abolitionists, the Populists, the labor movement, for many early socialists, and for black radicals like Frederick Douglass and David Walker. We've lost that history since the 1950s or so because this growing division frames the understanding of religious politics for a lot of people. I think it's a real shame that we allow the arguments about whether there is a God or not to obscure the potential consequences of what people do with their beliefs.

So what influence has the left actually had on American ideals?

The left has promoted a lot of the important changes that have occurred in American society, especially in expanding the meaning of "individual freedoms" to include African-Americans, women and homosexuals. The United States says it is committed to individual freedoms, but in practice those freedoms have been either betrayed or not fully realized. The left in this country has always been the vanguard of calling for complete equal rights and social equality. A lot of the major movements for equal rights that we celebrate -- the black freedom movement, the women's movement, the gay liberation movement -- were all started by people who were considered to be radicals in their time. The memorial for Dr. Martin Luther King Jr. is being unveiled this week in Washington, D.C., and most people don't realize how daring and dangerous it was for him to talk about civil rights and take part in that movement. The March on Washington was actually a protest for jobs and freedom that was heavily financed by and mobilized by the labor movement, even though people remember it as a march for African-Americans' civil rights.

Is that because we revise our own history once it's no longer fashionable to hold a particular point of view?

Once things are accepted, they become sanitized to a certain degree. Some parts of what it means to be radical get accepted and other parts get sheared off. Privately, King called himself a Democratic Socialist and wanted a much more profound redistribution of wealth. Politicians like Dennis Kucinich or Ralph Nader wouldn't dare advocate today for the things King was struggling to change in the 1960s. Most Americans know King as a charismatic leader who wanted the races to be nicer to each other and worked for African-Americans to have legal equality, but that was only part of it. He was after a much more radical dream. Of course, he wouldn't have a monument or a national holiday if he were perceived to be a radical, so there is good and bad in the revision.

Well, if anything, the left knows how to capture the media's attention -- from burning bras in the 1968 Miss America protest to SlutWalks.

There are a lot of examples of leftists doing outlandish things that bring attention to the issues they support. Americans have been attracted to mass spectacles since the evangelical Great Awakening in the mid-18th century. We like yelling and protesting in colorful ways. The United Auto Workers was really established in the 1930s with the sit-down strikes in Flint, Mich., in which the workers occupied the factories and kept the bosses out. It was a very imaginative event that was organized by members of the Communist Party. The workers weren't just staying inside the warm plants in the middle of the winter in Michigan. They were saying the plants were as much theirs as the employers', because without the workers no cars would be made. So, they slept on the upholstery of the cars they'd made until the union was recognized.

These days, a surprising number of Americans actually make their living by working in leftist activism. When did being a leftist become a career?

The professionalization of the left was inevitable in some ways because the work of the 1960s was primarily anchored in colleges and college communities. It's not surprising that people like me became liberals instead of radicals after the revolution didn't happen. When we had to find a way to make a living, it made sense to become professionals. That is essentially what we were going to college to become, even though we took a detour for a while. To some degree, you need professionals to organize. The people who organized the labor movement in the 1930s were often skilled workers, but there were also professionals like lawyers and journalists. The problem, of course, is when the movement is perceived as a movement of the better-educated, wealthy, privileged elite who are simply self-interested. That image is a problem the left, including liberals, continues to have because it has been cut off from a lot of ordinary working people.

How has the Internet changed the left in America?

The Internet makes it easier to mobilize if you already have a group that's organizing around some issue. It's good for meet-ups more than movements. Even the word "movement" has gotten away from the idea of making change. Now it just means people are moving. As wonderful as the Internet is, it doesn't obviate the need for some of the old things that movements need to grow -- like face-to-face organizing. That builds up a sense of trust among people who work together. Some people tend to be wowed by a great new idea or video, as if that is going to be enough. The Internet can quickly educate people about issues, but it's not going to replace the need for a civil society.

What lessons do you think contemporary leftists should learn from their own history?

In order for the left to be successful, it needs to build institutions that involve people who are not intellectuals and professionals, and ones that aren't full of people who only talk to each other. The left should welcome debate because it is healthiest when it argues with itself as well as with other Americans who think differently. When people on the left talk, they have to figure out ways of connecting their ideas to American ideals. Liberty and equality for all are wonderful and utopian standards that most Americans identify with, and this is a good thing for the left because it's what we have been fighting for all along.

Looking back at the whole history of the left in the United States, who are your favorite American leftists?

I have been made fun of recently for saying this, but I think Dr. Seuss has been greatly overlooked as a leftist. He wasn't a propagandist, but many of his best-selling books -- like "Yertle the Turtle," "The Lorax" and "The Butter Battle Book" -- show that he had a leftist political message. Most successful political messages come from people who aren't very closely associated with a particular left-wing group. Also, although the Greenwich Village artists and writers of the early 20th century aren't exactly neglected, they are written off as bohemian dilettantes. But Max Eastman, the editor of the magazine the Masses who later became a conservative, was a major voice of industrial labor unions, sexual liberation, birth control and modernism. In a lot of ways, whether they know it or not, the cultural left today has been inspired by the things the Masses was doing a hundred years ago.

Thursday, August 25, 2011

Jo Wharton Heath - Sarah's Alice

Jo Heath is a retired Auburn math professor. This is her first novel.

I enjoyed it immensely because of the story and because of the local geographical references.

Sarah marries young, to a preacher who abuses her. While having lunch in Sumiton after visiting in Hamilton, she bolts from the restaurant and hitches a ride with a trucker. She takes the new name of Alice to reinvent herself, and after returning to Auburn, where she lives with Joseph, her abusive husband, she heads out again, taking all of their money. She ends up in Lafayette, Louisiana, and the story ends after a final confrontation with Joseph.

Dr. Heath mentions the streets of Auburn, like Magnolia, and she makes reference to the "Heart of Alabama" motel, which I assume is an inside joke referring to the former "Heart of Auburn" motel. It's very funny.

The writing is simple and straightforward, short and sweet. I like it very much.

Sunday, August 21, 2011

Richard Beeman - Plain, Honest Men: The Making of the Constitution (2)

The leaders of the Revolutionary Era were men of their generation, not men of the 21st century. They were the products of a particular place and moment of the late 18th century. The production of our constitution was highly improbable. P. xi

Americans argued about the interpretation of their constitution from the beginning, giving the lie to those who point to “original intent” as the proper way to interpret our founding document. P. xiii

The three indispensable men of the convention were Washington, Madison, and Franklin. P. 40

The thing that has always puzzled me most about the Constitutional Convention is how a convention called for the express purpose of amending the Articles of Confederation instead ended up with a completely new charter that we call our Constitution. Beeman’s first explanation is that the first delegates to assemble in Philadelphia, before the convention formally got underway, were the delegates from Virginia and Pennsylvania. These delegates, led by James Madison and Gouverneur Morris, wanted a stronger national government and, with Madison’s preparation beforehand, hatched what came to be called the Virginia Plan, whose recommendations were predicated on a new constitution totally replacing the Articles. And so it was the Virginia Plan that first dominated discussion when the convention opened May 25, 1787. Chapter 3

Twenty-five of the 55 men who attended this convention owned slaves. Most of them had one or more slaves with them at the convention. P. 67

The convention was completely secret. The Constitution would never have happened if the meetings had been public. P. 83

"Many of the delegates were stunned by the revolutionary character of the proposal (The Virginia Plan as presented by Edmund Randolph) so boldly laid before them. They had come to Philadelphia to revise the Articles of Confederation and were now being asked to create an entirely new kind of government. The Articles of Confederation had created 'federal' government, in the common understanding of that term. The resolutions presented by Randolph repeatedly used the world 'national' rather than 'federal' to describe the various brances of the proposed government, and their insistence that the powers of this government were superior to those of the states left no doubt of their intention." P. 91.

Shays' Rebellion was on the minds of the nationalist-minded delegates, but it was the excess of democracy in the states---the perceived irresponsibility of the state legislatures---that was paramount in their minds. Chapter 5

"There it was---the first explicit proposal to scrap the Articles of Confederation and substitute in its place a supreme national government. The strategy of the Virginia Plan's advocates was to get the delegates to accept the basic priciple of a 'supreme national government' before getting bogged down in the details of the plan itself. P. 100

"And thus on the third full day of business the Convention rejected the princple of federalism on which the American republic had been founded and endorsed in its place the notion of a supreme national government. P. 102

We the people or we the states? It seems that our Constitution has some of both, but the preamble does start with "we the people." Liberals prefer we the people, whereas conservatives prefer we the states. P. 105

The delegates had deep misgivings about democracy, though they considered themselves republican in that they stood solidly against hereditary monarchy and in favor of some kind of representative government. Their democracy had to be mediated. They would debate the proper place of elitists vs. the proper role of the people. P. 123

"As the delegaes took their seats at ten o'clock on the unseasonably cool and cloudy June 1 morning, Madison and his nationalist allies remained in control of the agenda. P. 125

Without George Washington we might have had a parliamentary rather than a presidential system of government. P. 129

The counterattack to the Virginia Plan, led by William Paterson, could be called the New Jersey Plan, but it ultimately failed. P. 144

The Constitution is remarkably secular. P. 179

. . . The principal division of interests within the country, he (Madison) observed, would never lie between the large and small states, but between the Northern and Southern states, based on whether or not they had slaves. P. 183

George Washington did not miss a single session of the convention. He rightly believed that the fate of the 13 states depended on what happened at this meeting in Philadelphia, and that success or failure depended mostly on him. It seems to me that this alone made him a great man. P. 193

Discussions of slavery at the Convention including the three-fifths compromise were completely devoid of moral considerations. P. 213

“. . . while our Founding Fathers were for the most part farseeing men living in an age of Enlightenment, the year in which they carried out their deliberations, 1787, was more closely linked in time to 1692---the year of the Salem witchcraft trials and executions in Massachusetts---than it is to our own era.” P. 227

Fueled by his deep mistrust of misconduct by state legislatures, Madison fought to the bitter end for a congressional veto of state laws, but the convention wisely and decisively rejected it. P. 229

The convention rejected direct election of the chief executive by a vote of 9 to 1. P. 232

The debates can give ammunition to those who favor a “living constitution” (as I do) as well as those who favor “original intent,” but the facts far more strongly support the former interpretation. P. 269-70

The insertion of the “necessary and proper” clause, most likely by James Wilson, came to assume great importance in constitutional history. P. 274

Led by Benjamin Franklin, the convention voted down a proposal to impose a property requirement for voting. P. 280

Slavery is the paradox at the nation’s core. Attention was diverted from this paradox by the economic interests of slaveholders and those Americans who benefited from slavery even if they were not slaveholders themselves, and by the belief that Africans were fundamentally inferior human beings, people so culturally and physically different from white people that they could never function responsibly as equal citizens in a free republic. P. 314

The racial consensus of the times---belief in the inherent inferiority of people who were different whether blacks or Indians---made it highly unlikely that the Philadelphia delegates would move decisively to abolish slavery. P. 314

The interest of protecting slavery came mainly from the South Carolina and Georgia delegates. P. 315

There WERE some Americans, like Luther Martin, who were deeply disturbed by the inherent contradictions of America’s commitment to liberty on the one hand, and the protection of slavery on the other hand. P. 320

For sure the Founders who owned slaves and yet opposed slavery in theory were hypocrites; the complicating factor is that these men could not imagine a society with free blacks. They could not imagine Africans as equal citizens. So what to do? P. 323

The fugitive slave provision is in Article IV, Section 2, Paragraph 3. P. 350

Saturday, August 20, 2011

Do You Suffer From Decision Fatigue?

By John Tierney
New York Times
17 August 2011

Three men doing time in Israeli prisons recently appeared before a parole board consisting of a judge, a criminologist and a social worker. The three prisoners had completed at least two-thirds of their sentences, but the parole board granted freedom to only one of them. Guess which one:

Case 1 (heard at 8:50 a.m.): An Arab Israeli serving a 30-month sentence for fraud.

Case 2 (heard at 3:10 p.m.): A Jewish Israeli serving a 16-month sentence for assault.

Case 3 (heard at 4:25 p.m.): An Arab Israeli serving a 30-month sentence for fraud.

There was a pattern to the parole board’s decisions, but it wasn’t related to the men’s ethnic backgrounds, crimes or sentences. It was all about timing, as researchers discovered by analyzing more than 1,100 decisions over the course of a year. Judges, who would hear the prisoners’ appeals and then get advice from the other members of the board, approved parole in about a third of the cases, but the probability of being paroled fluctuated wildly throughout the day. Prisoners who appeared early in the morning received parole about 70 percent of the time, while those who appeared late in the day were paroled less than 10 percent of the time.
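
To make the timing analysis concrete, here is a minimal sketch in Python. This is my own illustration, not the researchers' code or data: it invents rulings whose approval odds decay with each case heard since the judge's last break (the 0.65 starting rate and 0.08 per-case decay are assumptions chosen only to echo the reported swing), then recovers the pattern by bucketing rulings by their position in the session.

import random
from collections import defaultdict

random.seed(0)

# Hypothetical stand-in for the ~1,100 rulings: assume the approval
# probability falls by a fixed step with each case since the last break.
def simulate_ruling(cases_since_break):
    p_approve = max(0.10, 0.65 - 0.08 * cases_since_break)
    return random.random() < p_approve

totals = defaultdict(lambda: [0, 0])  # position -> [approvals, hearings]
for session in range(150):            # judging sessions between breaks
    for position in range(8):         # cases heard before the next break
        totals[position][1] += 1
        if simulate_ruling(position):
            totals[position][0] += 1

for position in sorted(totals):
    approved, heard = totals[position]
    print(f"case #{position + 1} after a break: {approved / heard:.0%} approved")

Bucketing by position since the last break, rather than by clock time, is what lets a pattern like this show through even when different judges sit at different hours.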

The odds favored the prisoner who appeared at 8:50 a.m. — and he did in fact receive parole. But even though the other Arab Israeli prisoner was serving the same sentence for the same crime — fraud — the odds were against him when he appeared (on a different day) at 4:25 in the afternoon. He was denied parole, as was the Jewish Israeli prisoner at 3:10 p.m., whose sentence was shorter than that of the man who was released. They were just asking for parole at the wrong time of day.

There was nothing malicious or even unusual about the judges’ behavior, which was reported earlier this year by Jonathan Levav of Stanford and Shai Danziger of Ben-Gurion University. The judges’ erratic judgment was due to the occupational hazard of being, as George W. Bush once put it, “the decider.” The mental work of ruling on case after case, whatever the individual merits, wore them down. This sort of decision fatigue can make quarterbacks prone to dubious choices late in the game and C.F.O.’s prone to disastrous dalliances late in the evening. It routinely warps the judgment of everyone, executive and nonexecutive, rich and poor — in fact, it can take a special toll on the poor. Yet few people are even aware of it, and researchers are only beginning to understand why it happens and how to counteract it.

Decision fatigue helps explain why ordinarily sensible people get angry at colleagues and families, splurge on clothes, buy junk food at the supermarket and can’t resist the dealer’s offer to rustproof their new car. No matter how rational and high-minded you try to be, you can’t make decision after decision without paying a biological price. It’s different from ordinary physical fatigue — you’re not consciously aware of being tired — but you’re low on mental energy. The more choices you make throughout the day, the harder each one becomes for your brain, and eventually it looks for shortcuts, usually in either of two very different ways. One shortcut is to become reckless: to act impulsively instead of expending the energy to first think through the consequences. (Sure, tweet that photo! What could go wrong?) The other shortcut is the ultimate energy saver: do nothing. Instead of agonizing over decisions, avoid any choice. Ducking a decision often creates bigger problems in the long run, but for the moment, it eases the mental strain. You start to resist any change, any potentially risky move — like releasing a prisoner who might commit a crime. So the fatigued judge on a parole board takes the easy way out, and the prisoner keeps doing time.

Decision fatigue is the newest discovery involving a phenomenon called ego depletion, a term coined by the social psychologist Roy F. Baumeister in homage to a Freudian hypothesis. Freud speculated that the self, or ego, depended on mental activities involving the transfer of energy. He was vague about the details, though, and quite wrong about some of them (like his idea that artists “sublimate” sexual energy into their work, which would imply that adultery should be especially rare at artists’ colonies). Freud’s energy model of the self was generally ignored until the end of the century, when Baumeister began studying mental discipline in a series of experiments, first at Case Western and then at Florida State University.

These experiments demonstrated that there is a finite store of mental energy for exerting self-control. When people fended off the temptation to scarf down M&M’s or freshly baked chocolate-chip cookies, they were then less able to resist other temptations. When they forced themselves to remain stoic during a tearjerker movie, afterward they gave up more quickly on lab tasks requiring self-discipline, like working on a geometry puzzle or squeezing a hand-grip exerciser. Willpower turned out to be more than a folk concept or a metaphor. It really was a form of mental energy that could be exhausted. The experiments confirmed the 19th-century notion of willpower being like a muscle that was fatigued with use, a force that could be conserved by avoiding temptation.

To study the process of ego depletion, researchers concentrated initially on acts involving self-control — the kind of self-discipline popularly associated with willpower, like resisting a bowl of ice cream. They weren’t concerned with routine decision-making, like choosing between chocolate and vanilla, a mental process that they assumed was quite distinct and much less strenuous. Intuitively, the chocolate-vanilla choice didn’t appear to require willpower.

But then a postdoctoral fellow, Jean Twenge, started working at Baumeister’s laboratory right after planning her wedding. As Twenge studied the results of the lab’s ego-depletion experiments, she remembered how exhausted she felt the evening she and her fiancé went through the ritual of registering for gifts. Did they want plain white china or something with a pattern? Which brand of knives? How many towels? What kind of sheets? Precisely how many threads per square inch?

“By the end, you could have talked me into anything,” Twenge told her new colleagues. The symptoms sounded familiar to them too, and gave them an idea. A nearby department store was holding a going-out-of-business sale, so researchers from the lab went off to fill their car trunks with simple products — not exactly wedding-quality gifts, but sufficiently appealing to interest college students. When they came to the lab, the students were told they would get to keep one item at the end of the experiment, but first they had to make a series of choices. Would they prefer a pen or a candle? A vanilla-scented candle or an almond-scented one? A candle or a T-shirt? A black T-shirt or a red T-shirt? A control group, meanwhile — let’s call them the nondeciders — spent an equally long period contemplating all these same products without having to make any choices. They were asked just to give their opinion of each product and report how often they had used such a product in the last six months.

Afterward, all the participants were given one of the classic tests of self-control: holding your hand in ice water for as long as you can. The impulse is to pull your hand out, so self-discipline is needed to keep the hand underwater. The deciders gave up much faster; they lasted 28 seconds, less than half the 67-second average of the nondeciders. Making all those choices had apparently sapped their willpower, and it wasn’t an isolated effect. It was confirmed in other experiments testing students after they went through exercises like choosing courses from the college catalog.

For a real-world test of their theory, the lab’s researchers went into that great modern arena of decision making: the suburban mall. They interviewed shoppers about their experiences in the stores that day and then asked them to solve some simple arithmetic problems. The researchers politely asked them to do as many as possible but said they could quit at any time. Sure enough, the shoppers who had already made the most decisions in the stores gave up the quickest on the math problems. When you shop till you drop, your willpower drops, too.

Any decision, whether it’s what pants to buy or whether to start a war, can be broken down into what psychologists call the Rubicon model of action phases, in honor of the river that separated Italy from the Roman province of Gaul. When Caesar reached it in 49 B.C., on his way home after conquering the Gauls, he knew that a general returning to Rome was forbidden to take his legions across the river with him, lest it be considered an invasion of Rome. Waiting on the Gaul side of the river, he was in the “predecisional phase” as he contemplated the risks and benefits of starting a civil war. Then he stopped calculating and crossed the Rubicon, reaching the “postdecisional phase,” which Caesar defined much more felicitously: “The die is cast.”

The whole process could deplete anyone’s willpower, but which phase of the decision-making process was most fatiguing? To find out, Kathleen Vohs, a former colleague of Baumeister’s now at the University of Minnesota, performed an experiment using the self-service Web site of Dell Computers. One group in the experiment carefully studied the advantages and disadvantages of various features available for a computer — the type of screen, the size of the hard drive, etc. — without actually making a final decision on which ones to choose. A second group was given a list of predetermined specifications and told to configure a computer by going through the laborious, step-by-step process of locating the specified features among the arrays of options and then clicking on the right ones. The purpose of this was to duplicate everything that happens in the postdecisional phase, when the choice is implemented. The third group had to figure out for themselves which features they wanted on their computers and go through the process of choosing them; they didn’t simply ponder options (like the first group) or implement others’ choices (like the second group). They had to cast the die, and that turned out to be the most fatiguing task of all. When self-control was measured, they were the ones who were most depleted, by far.

The experiment showed that crossing the Rubicon is more tiring than anything that happens on either bank — more mentally fatiguing than sitting on the Gaul side contemplating your options or marching on Rome once you’ve crossed. As a result, someone without Caesar’s willpower is liable to stay put. To a fatigued judge, denying parole seems like the easier call not only because it preserves the status quo and eliminates the risk of a parolee going on a crime spree but also because it leaves more options open: the judge retains the option of paroling the prisoner at a future date without sacrificing the option of keeping him securely in prison right now. Part of the resistance against making decisions comes from our fear of giving up options. The word “decide” shares an etymological root with “homicide,” the Latin word “caedere,” meaning “to cut down” or “to kill,” and that loss looms especially large when decision fatigue sets in.

Once you’re mentally depleted, you become reluctant to make trade-offs, which involve a particularly advanced and taxing form of decision making. In the rest of the animal kingdom, there aren’t a lot of protracted negotiations between predators and prey. To compromise is a complex human ability and therefore one of the first to decline when willpower is depleted. You become what researchers call a cognitive miser, hoarding your energy. If you’re shopping, you’re liable to look at only one dimension, like price: just give me the cheapest. Or you indulge yourself by looking at quality: I want the very best (an especially easy strategy if someone else is paying). Decision fatigue leaves you vulnerable to marketers who know how to time their sales, as Jonathan Levav, the Stanford professor, demonstrated in experiments involving tailored suits and new cars.

The idea for these experiments also happened to come in the preparations for a wedding, a ritual that seems to be the decision-fatigue equivalent of Hell Week. At his fiancée’s suggestion, Levav visited a tailor to have a bespoke suit made and began going through the choices of fabric, type of lining and style of buttons, lapels, cuffs and so forth.

“By the time I got through the third pile of fabric swatches, I wanted to kill myself,” Levav recalls. “I couldn’t tell the choices apart anymore. After a while my only response to the tailor became ‘What do you recommend?’ I just couldn’t take it.”

Levav ended up not buying any kind of bespoke suit (the $2,000 price made that decision easy enough), but he put the experience to use in a pair of experiments conducted with Mark Heitmann, then at Christian-Albrechts University in Germany; Andreas Herrmann, at the University of St. Gallen in Switzerland; and Sheena Iyengar, of Columbia. One involved asking M.B.A. students in Switzerland to choose a bespoke suit; the other was conducted at German car dealerships, where customers ordered options for their new sedans. The car buyers — and these were real customers spending their own money — had to choose, for instance, among 4 styles of gearshift knobs, 13 kinds of wheel rims, 25 configurations of the engine and gearbox and a palette of 56 colors for the interior.

As they started picking features, customers would carefully weigh the choices, but as decision fatigue set in, they would start settling for whatever the default option was. And the more tough choices they encountered early in the process — like going through those 56 colors to choose the precise shade of gray or brown — the quicker people became fatigued and settled for the path of least resistance by taking the default option. By manipulating the order of the car buyers’ choices, the researchers found that the customers would end up settling for different kinds of options, and the average difference totaled more than 1,500 euros per car (about $2,000 at the time). Whether the customers paid a little extra for fancy wheel rims or a lot extra for a more powerful engine depended on when the choice was offered and how much willpower was left in the customer.
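
One way to see why ordering matters is a toy fatigue budget: assume each decision drains effort in proportion to its option count, and once the budget is spent the buyer takes the default. The sketch below is my own construction, not the study's method, and the budget of 60 is arbitrary; it shows the same four choices from the article producing more defaults when the 56-color decision comes first.

def engaged_choices(option_counts, effort_budget=60):
    """Count how many decisions the buyer actively makes before
    falling back on defaults, given the sequence of option counts."""
    engaged = 0
    for options in option_counts:
        effort_budget -= options   # bigger menus drain the budget faster
        if effort_budget < 0:
            break
        engaged += 1
    return engaged

decisions = [4, 13, 25, 56]        # knobs, rims, engine/gearbox, colors

# Easy choices first: energy is left over for the big menus.
print(engaged_choices(decisions))                   # -> 3
# Hardest choice first: fatigue arrives sooner, more defaults accepted.
print(engaged_choices(list(reversed(decisions))))   # -> 1

The total effort demanded is the same either way; only where the fatigue lands changes, which is consistent with the finding that sequence alone shifted what buyers ended up paying.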

Similar results were found in the experiment with custom-made suits: once decision fatigue set in, people tended to settle for the recommended option. When they were confronted early on with the toughest decisions — the ones with the most options, like the 100 fabrics for the suit — they became fatigued more quickly and also reported enjoying the shopping experience less.

Shopping can be especially tiring for the poor, who have to struggle continually with trade-offs. Most of us in America won’t spend a lot of time agonizing over whether we can afford to buy soap, but it can be a depleting choice in rural India. Dean Spears, an economist at Princeton, offered people in 20 villages in Rajasthan in northwestern India the chance to buy a couple of bars of brand-name soap for the equivalent of less than 20 cents. It was a steep discount off the regular price, yet even that sum was a strain for the people in the 10 poorest villages. Whether or not they bought the soap, the act of making the decision left them with less willpower, as measured afterward in a test of how long they could squeeze a hand grip. In the slightly more affluent villages, people’s willpower wasn’t affected significantly. Because they had more money, they didn’t have to spend as much effort weighing the merits of the soap versus, say, food or medicine.

Spears and other researchers argue that this sort of decision fatigue is a major — and hitherto ignored — factor in trapping people in poverty. Because their financial situation forces them to make so many trade-offs, they have less willpower to devote to school, work and other activities that might get them into the middle class. It’s hard to know exactly how important this factor is, but there’s no doubt that willpower is a special problem for poor people. Study after study has shown that low self-control correlates with low income as well as with a host of other problems, including poor achievement in school, divorce, crime, alcoholism and poor health. Lapses in self-control have led to the notion of the “undeserving poor” — epitomized by the image of the welfare mom using food stamps to buy junk food — but Spears urges sympathy for someone who makes decisions all day on a tight budget. In one study, he found that when the poor and the rich go shopping, the poor are much more likely to eat during the shopping trip. This might seem like confirmation of their weak character — after all, they could presumably save money and improve their nutrition by eating meals at home instead of buying ready-to-eat snacks like Cinnabons, which contribute to the higher rate of obesity among the poor. But if a trip to the supermarket induces more decision fatigue in the poor than in the rich — because each purchase requires more mental trade-offs — by the time they reach the cash register, they’ll have less willpower left to resist the Mars bars and Skittles. Not for nothing are these items called impulse purchases.

And this isn’t the only reason that sweet snacks are featured prominently at the cash register, just when shoppers are depleted after all their decisions in the aisles. With their willpower reduced, they’re more likely to yield to any kind of temptation, but they’re especially vulnerable to candy and soda and anything else offering a quick hit of sugar. While supermarkets figured this out a long time ago, only recently did researchers discover why.

The discovery was an accident resulting from a failed experiment at Baumeister’s lab. The researchers set out to test something called the Mardi Gras theory — the notion that you could build up willpower by first indulging yourself in pleasure, the way Mardi Gras feasters do just before the rigors of Lent. In place of a Fat Tuesday breakfast, the chefs in the lab at Florida State whipped up lusciously thick milkshakes for a group of subjects who were resting in between two laboratory tasks requiring willpower. Sure enough, the delicious shakes seemed to strengthen willpower by helping people perform better than expected on the next task. So far, so good. But the experiment also included a control group of people who were fed a tasteless concoction of low-fat dairy glop. It provided them with no pleasure, yet it produced similar improvements in self-control. The Mardi Gras theory looked wrong. Besides tragically removing an excuse for romping down the streets of New Orleans, the result was embarrassing for the researchers. Matthew Gailliot, the graduate student who ran the study, stood looking down at his shoes as he told Baumeister about the fiasco.

Baumeister tried to be optimistic. Maybe the study wasn’t a failure. Something had happened, after all. Even the tasteless glop had done the job, but how? If it wasn’t the pleasure, could it be the calories? At first the idea seemed a bit daft. For decades, psychologists had been studying performance on mental tasks without worrying much about the results being affected by dairy-product consumption. They liked to envision the human mind as a computer, focusing on the way it processed information. In their eagerness to chart the human equivalent of the computer’s chips and circuits, most psychologists neglected one mundane but essential part of the machine: the power supply. The brain, like the rest of the body, derived energy from glucose, the simple sugar manufactured from all kinds of foods.

To establish cause and effect, researchers at Baumeister’s lab tried refueling the brain in a series of experiments involving lemonade mixed either with sugar or with a diet sweetener. The sugary lemonade provided a burst of glucose, the effects of which could be observed right away in the lab; the sugarless variety tasted quite similar without providing the same burst of glucose. Again and again, the sugar restored willpower, but the artificial sweetener had no effect. The glucose would at least mitigate the ego depletion and sometimes completely reverse it. The restored willpower improved people’s self-control as well as the quality of their decisions: they resisted irrational bias when making choices, and when asked to make financial decisions, they were more likely to choose the better long-term strategy instead of going for a quick payoff.

The ego-depletion effect was even demonstrated with dogs in two studies by Holly Miller and Nathan DeWall at the University of Kentucky. After obeying sit and stay commands for 10 minutes, the dogs performed worse on self-control tests and were also more likely to make the dangerous decision to challenge another dog’s turf. But a dose of glucose restored their willpower.

Despite this series of findings, brain researchers still had some reservations about the glucose connection. Skeptics pointed out that the brain’s overall use of energy remains about the same regardless of what a person is doing, which doesn’t square easily with the notion of depleted energy affecting willpower. Among the skeptics was Todd Heatherton, who worked with Baumeister early in his career and eventually wound up at Dartmouth, where he became a pioneer of what is called social neuroscience: the study of links between brain processes and social behavior. He believed in ego depletion, but he didn’t see how this neural process could be caused simply by variations in glucose levels. To observe the process — and to see if it could be reversed by glucose — he and his colleagues recruited 45 female dieters and recorded images of their brains as they reacted to pictures of food. Next the dieters watched a comedy video while forcing themselves to suppress their laughter — a standard if cruel way to drain mental energy and induce ego depletion. Then they were again shown pictures of food, and the new round of brain scans revealed the effects of ego depletion: more activity in the nucleus accumbens, the brain’s reward center, and a corresponding decrease in the amygdala, which ordinarily helps control impulses. The food’s appeal registered more strongly while impulse control weakened — not a good combination for anyone on a diet. But suppose people in this ego-depleted state got a quick dose of glucose? What would a scan of their brains reveal?

The results of the experiment were announced in January, during Heatherton’s speech accepting the leadership of the Society for Personality and Social Psychology, the world’s largest group of social psychologists. In his presidential address at the annual meeting in San Antonio, Heatherton reported that administering glucose completely reversed the brain changes wrought by depletion — a finding, he said, that thoroughly surprised him. Heatherton’s results did much more than provide additional confirmation that glucose is a vital part of willpower; they helped solve the puzzle over how glucose could work without global changes in the brain’s total energy use. Apparently ego depletion causes activity to rise in some parts of the brain and to decline in others. Your brain does not stop working when glucose is low. It stops doing some things and starts doing others. It responds more strongly to immediate rewards and pays less attention to long-term prospects.

The discoveries about glucose help explain why dieting is a uniquely difficult test of self-control — and why even people with phenomenally strong willpower in the rest of their lives can have such a hard time losing weight. They start out the day with virtuous intentions, resisting croissants at breakfast and dessert at lunch, but each act of resistance further lowers their willpower. As their willpower weakens late in the day, they need to replenish it. But to resupply that energy, they need to give the body glucose. They’re trapped in a nutritional catch-22:

1. In order not to eat, a dieter needs willpower.

2. In order to have willpower, a dieter needs to eat.

As the body uses up glucose, it looks for a quick way to replenish the fuel, leading to a craving for sugar. After performing a lab task requiring self-control, people tend to eat more candy but not other kinds of snacks, like salty, fatty potato chips. The mere expectation of having to exert self-control makes people hunger for sweets. A similar effect helps explain why many women yearn for chocolate and other sugary treats just before menstruation: their bodies are seeking a quick replacement as glucose levels fluctuate. A sugar-filled snack or drink will provide a quick improvement in self-control (that’s why it’s convenient to use in experiments), but it’s just a temporary solution. The problem is that what we identify as sugar doesn’t help as much over the course of the day as the steadier supply of glucose we would get from eating proteins and other more nutritious foods.

The benefits of glucose were unmistakable in the study of the Israeli parole board. In midmorning, usually a little before 10:30, the parole board would take a break, and the judges would be served a sandwich and a piece of fruit. The prisoners who appeared just before the break had only about a 20 percent chance of getting parole, but the ones appearing right after had around a 65 percent chance. The odds dropped again as the morning wore on, and prisoners really didn’t want to appear just before lunch: the chance of getting parole at that time was only 10 percent. After lunch it soared up to 60 percent, but only briefly. Remember that Jewish Israeli prisoner who appeared at 3:10 p.m. and was denied parole from his sentence for assault? He had the misfortune of being the sixth case heard after lunch. But another Jewish Israeli prisoner serving the same sentence for the same crime was lucky enough to appear at 1:27 p.m., the first case after lunch, and he was rewarded with parole. It must have seemed to him like a fine example of the justice system at work, but it probably had more to do with the judge’s glucose levels.

It’s simple enough to imagine reforms for the parole board in Israel — like, say, restricting each judge’s shift to half a day, preferably in the morning, interspersed with frequent breaks for food and rest. But it’s not so obvious what to do with the decision fatigue affecting the rest of society. Even if we could all afford to work half-days, we would still end up depleting our willpower all day long, as Baumeister and his colleagues found when they went into the field in Würzburg in central Germany. The psychologists gave preprogrammed BlackBerrys to more than 200 people going about their daily routines for a week. The phones went off at random intervals, prompting the people to report whether they were currently experiencing some sort of desire or had recently felt a desire. The painstaking study, led by Wilhelm Hofmann, then at the University of Würzburg, collected more than 10,000 momentary reports from morning until midnight.

Desire turned out to be the norm, not the exception. Half the people were feeling some desire when their phones went off — to snack, to goof off, to express their true feelings to their bosses — and another quarter said they had felt a desire in the past half-hour. Many of these desires were ones that the men and women were trying to resist, and the more willpower people expended, the more likely they became to yield to the next temptation that came along. When faced with a new desire that produced some I-want-to-but-I-really-shouldn’t sort of inner conflict, they gave in more readily if they had already fended off earlier temptations, particularly if the new temptation came soon after a previously reported one.

The results suggested that people spend between three and four hours a day resisting desire. Put another way, if you tapped four or five people at any random moment of the day, one of them would be using willpower to resist a desire. The most commonly resisted desires in the phone study were the urges to eat and sleep, followed by the urge for leisure, like taking a break from work by doing a puzzle or playing a game instead of writing a memo. Sexual urges were next on the list of most-resisted desires, a little ahead of urges for other kinds of interactions, like checking Facebook. To ward off temptation, people reported using various strategies. The most popular was to look for a distraction or to undertake a new activity, although sometimes they tried suppressing it directly or simply toughing their way through it. Their success was decidedly mixed. They were pretty good at avoiding sleep, sex and the urge to spend money, but not so good at resisting the lure of television or the Web or the general temptation to relax instead of work.
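
A quick back-of-the-envelope check of that "four or five people" restatement, assuming roughly 16 waking hours in a day (my assumption; the article doesn't specify):

hours_resisting = 3.5    # midpoint of "between three and four hours"
waking_hours = 16        # assumed
fraction = hours_resisting / waking_hours
print(f"{fraction:.0%} of waking moments -> about 1 in {1 / fraction:.1f} people")
# prints: 22% of waking moments -> about 1 in 4.6 people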

We have no way of knowing how much our ancestors exercised self-control in the days before BlackBerrys and social psychologists, but it seems likely that many of them were under less ego-depleting strain. When there were fewer decisions, there was less decision fatigue. Today we feel overwhelmed because there are so many choices. Your body may have dutifully reported to work on time, but your mind can escape at any instant. A typical computer user looks at more than three dozen Web sites a day and gets fatigued by the continual decision making — whether to keep working on a project, check out TMZ, follow a link to YouTube or buy something on Amazon. You can do enough damage in a 10-minute online shopping spree to wreck your budget for the rest of the year.

The cumulative effect of these temptations and decisions isn’t intuitively obvious. Virtually no one has a gut-level sense of just how tiring it is to decide. Big decisions, small decisions, they all add up. Choosing what to have for breakfast, where to go on vacation, whom to hire, how much to spend — these all deplete willpower, and there’s no telltale symptom of when that willpower is low. It’s not like getting winded or hitting the wall during a marathon. Ego depletion manifests itself not as one feeling but rather as a propensity to experience everything more intensely. When the brain’s regulatory powers weaken, frustrations seem more irritating than usual. Impulses to eat, drink, spend and say stupid things feel more powerful (and alcohol causes self-control to decline further). Like those dogs in the experiment, ego-depleted humans become more likely to get into needless fights over turf. In making decisions, they take illogical shortcuts and tend to favor short-term gains and delayed costs. Like the depleted parole judges, they become inclined to take the safer, easier option even when that option hurts someone else.

“Good decision making is not a trait of the person, in the sense that it’s always there,” Baumeister says. “It’s a state that fluctuates.” His studies show that people with the best self-control are the ones who structure their lives so as to conserve willpower. They don’t schedule endless back-to-back meetings. They avoid temptations like all-you-can-eat buffets, and they establish habits that eliminate the mental effort of making choices. Instead of deciding every morning whether or not to force themselves to exercise, they set up regular appointments to work out with a friend. Instead of counting on willpower to remain robust all day, they conserve it so that it’s available for emergencies and important decisions.

“Even the wisest people won’t make good choices when they’re not rested and their glucose is low,” Baumeister points out. That’s why the truly wise don’t restructure the company at 4 p.m. They don’t make major commitments during the cocktail hour. And if a decision must be made late in the day, they know not to do it on an empty stomach. “The best decision makers,” Baumeister says, “are the ones who know when not to trust themselves.”