Wednesday, December 31, 2008

Assessing the New Deal

It is fashionable these days to evaluate the New Deal with an eye toward what the Obama administration needs to do to jumpstart the economy. Politicos/historians/economists take sides on the New Deal. Economist Paul Krugman defends the New Deal unequivocally. Righties like Amity Shlaes and Uncle Tom Sowell condemn the New Deal unequivocally. Here is the most balanced and nuanced piece on the New Deal I've ever read, by the respected historian Alan Brinkley.

* * * * * * * *


The New Republic
No Deal
by Alan Brinkley
Learning from FDR's mistakes
Post date: Wednesday, December 31, 2008

Does the New Deal provide a useful model for fixing our own troubled economy? In many respects, yes. The frenzy of activity and innovation that marked Franklin Roosevelt's initial months in office--a welcome contrast to the seeming paralysis of the discredited Hoover regime--helped first and foremost to lessen the panic that had gripped the nation. And, during the prewar years of his presidency, Roosevelt's actions produced an unprecedented array of tangible achievements as well. He moved quickly and effectively to address a wave of bank failures that threatened to shut down the financial system. He created the Securities and Exchange Commission, which helped make the beleaguered stock market more transparent and thus more trustworthy. He responded to out-of-control unemployment by launching the Civil Works Administration, the Public Works Administration, and the Works Progress Administration, which created jobs for millions of the unemployed. He passed the Social Security Act, which over time provided support to the jobless, the indigent, and the elderly--and the Wagner Act, which eventually raised wages by giving unions the right to bargain collectively with employers. He signed the Fair Labor Standards Act, which created the minimum wage and the 40-hour workweek.

Yet, despite these extraordinary achievements, Roosevelt's initiatives did not, in the end, lift the country out of the Great Depression. At no time in the first eight years of the New Deal did unemployment drop below 15 percent. At no time did economic activity reach levels comparable to those of a decade earlier; and, while there were periods when the economy seemed to be recovering, none of them lasted very long. And so this bold, active, and creative moment in our history proved to be a failure at its central task. Understanding what went wrong could help us avoid making the same mistakes today.

Some of the New Deal's most important initiatives were active obstacles to economic renewal.

The National Recovery Administration (NRA), created in 1933 to help stabilize the volatile economy, was enormously popular for a time, mostly because it created the illusion of forceful action. The NRA sought to help corporations cooperate with one another in keeping production low and prices up, effectively creating cartels. This effort proved almost impossible to administer: No one in the federal government had any experience or expertise in managing an economic project of this magnitude; control quickly moved to the corporations themselves, with no better results. But the NRA was even worse when it worked as it was supposed to, because its goal was exactly the opposite of what the economy needed: Instead of expanding economic activity, the NRA worked to constrict it. At the same time, the Federal Reserve Board--operating under classical economic assumptions--saw the economic wreckage around it and responded by raising interest rates so as to protect the solvency of the Federal Reserve Bank itself. No one today would even consider high interest rates in a slumping economy, but the Fed of the early 1930s had not absorbed new economic ideas that would later become almost universally accepted. (In fairness, this catastrophic mistake was not a product of New Deal policy, but few New Dealers recognized the magnitude of the error for years.)

The more important failure of the New Deal, however, was what it did not do. The only way to break the deadlock that paralyzed the U.S. economy in the 1930s was to enormously expand economic activity--quickly and decisively. Instead, the New Deal wavered and equivocated--spending large sums of money with one hand while reducing spending with the other. One of the first acts Congress passed for Roosevelt in 1933 was the Economy Act, which slashed government spending in ways that reduced economic activity. It cut the salaries (and, in some cases, the jobs) of government employees and dramatically reduced payments to World War I veterans, taking $500 million from the economy in a single stroke. The Social Security system, so valuable over the long term, was in the short term a drag on the economy. It began collecting taxes in 1936 but paid out few benefits until the 1940s. In 1937, deluded by a weak economic recovery, Roosevelt (urged on by his Treasury secretary) set out to balance the budget through severe spending cuts. The result was a sudden and dramatic economic downturn--a recession within the Depression that produced some of the highest levels of unemployment and lowest levels of production of the decade.

In the aftermath of the 1937-1938 setback, Roosevelt launched a new $5 billion spending plan to try to shock the economy back to life. This infusion of funds helped undo the damage that the 1937 budget cuts helped to create, spurring a modest recovery that at least got the economy back to the weak and fragile condition of a year earlier. The idea of spending as an antidote to recession--an idea that had never found much favor in the past even among the most progressive figures in the New Deal--began slowly to achieve legitimacy. American economists were now eagerly reading Keynes and imagining more robust uses of fiscal and monetary powers to stimulate growth. It is possible, though by no means certain, that even without a war the influence of Keynesian ideas might have led New Dealers to embark on a spending program large enough to push the economy to somewhere close to full employment. But, in the end, the Great Depression--an unprecedented crisis that had stubbornly resisted the efforts of two presidential administrations over twelve years to restore prosperity--came to a close only because of the massive spending required by the greatest and most terrible war in human history.

Economic orthodoxy--which gave high priority to balanced budgets and fiscal restraint--remained a powerful force in the 1930s, even as its limitations became increasingly obvious. Similar arguments can still be heard today: While most liberals--and most financiers and economists--agree on the necessity of government doing something dramatic to jump-start the economy, there remain powerful voices, particularly on the right, that oppose such efforts on ideological grounds. Hence Republicans' initial opposition to the stimulus package in September and their more recent threat to block, through filibuster, federal aid to the auto industry. The New Deal was least successful when it was least aggressive--when it let concerns about fiscal prudence override the urgent need to pump enormous sums of money into a moribund economy. There is much for the Obama administration to learn from the many achievements of the New Deal. But there may be even more to learn from its failures.

Tuesday, December 30, 2008

Reading 2009

This year I did not read as many books as I had hoped. The several I did enjoy include Coetzee's Life & Times of Michael K. Michael is a somewhat idealized version of myself - detached, the salt of the earth, yet in tune with what is real and true. I also liked McCarthy's The Road, which contains some of the best prose I've ever read. Roth's Indignation was especially stirring. McCourt's Teacher Man is marvelously written and a truthful account of being a teacher.

For the new year, I have many things I want to read. I want to finish Ernest J. Gaines' A Lesson Before Dying and Yates' Revolutionary Road. I also want to read:

o John Kennedy Toole, A Confederacy of Dunces
o George Eliot, Middlemarch
o Truman Capote, In Cold Blood
o Herman Melville, Typee
o F. Scott Fitzgerald, Tender is the Night
o Elisha Cooper, ridiculous/hilarious/terrible/cool
o Herculine Barbin
o more Roth

Republican Damage

Here is a good summary of the Bush/Republican damage of the last 8 years.


By BOB HERBERT
Published: December 29, 2008
Does anyone know where George W. Bush is?

You don’t hear much from him anymore. The last image most of us remember is of the president ducking a pair of size 10s that were hurled at him in Baghdad.

We’re still at war in Iraq and Afghanistan. Israel is thrashing the Palestinians in Gaza. And the U.S. economy is about as vibrant as the 0-16 Detroit Lions.

But hardly a peep have we heard from George, the 43rd.

When Mr. Bush officially takes his leave in three weeks (in reality, he checked out long ago), most Americans will be content to sigh good riddance. I disagree. I don’t think he should be allowed to slip quietly out of town. There should be a great hue and cry — a loud, collective angry howl, demonstrations with signs and bullhorns and fiery speeches — over the damage he’s done to this country.

This is the man who gave us the war in Iraq and Guantánamo and torture and rendition; who turned the Clinton economy and the budget surplus into fool’s gold; who dithered while New Orleans drowned; who trampled our civil liberties at home and ruined our reputation abroad; who let Dick Cheney run hog wild and thought Brownie was doing a heckuva job.

The Bush administration specialized in deceit. How else could you get the public (and a feckless Congress) to go along with an invasion of Iraq as an absolutely essential response to the Sept. 11 attacks, when Iraq had had nothing to do with the Sept. 11 attacks?

Exploiting the public’s understandable fears, Mr. Bush made it sound as if Iraq was about to nuke us: “We cannot wait,” he said, “for the final proof — the smoking gun that could come in the form of a mushroom cloud.”

He then set the blaze that has continued to rage for nearly six years, consuming more than 4,000 American lives and hundreds of thousands of Iraqis. (A car bomb over the weekend killed two dozen more Iraqis, many of them religious pilgrims.) The financial cost to the U.S. will eventually reach $3 trillion or more, according to the Nobel laureate economist Joseph Stiglitz.

A year into the war Mr. Bush was cracking jokes about it at the annual dinner of the Radio and Television Correspondents Association. He displayed a series of photos that showed him searching the Oval Office, peering behind curtains and looking under the furniture. A mock caption had Mr. Bush saying: “Those weapons of mass destruction have got to be somewhere.”

And then there’s the Bush economy, another disaster, a trapdoor through which middle-class Americans can plunge toward the bracing experiences normally reserved for the poor and the destitute.

Mr. Bush traveled the country in the early days of his presidency, promoting his tax cut plans as hugely beneficial to small-business people and families of modest means. This was more deceit.

The tax cuts would go overwhelmingly to the very rich.

The president would give the wealthy and the powerful virtually everything they wanted. He would throw sand into the regulatory apparatus and help foster the most extreme income disparities since the years leading up to the Great Depression. Once again he was lighting a fire.

This time the flames would engulf the economy and, as with Iraq, bring catastrophe.

If the U.S. were a product line, it would be seen now as deeply damaged goods, subject to recall.

There seemed to be no end to Mr. Bush’s talent for destruction. He tried to hand the piggy bank known as Social Security over to the marauders of the financial sector, but saner heads prevailed.

In New Orleans, the president failed to intervene swiftly and decisively to aid the tens of thousands of poor people who were very publicly suffering and, in many cases, dying. He then compounded this colossal failure of leadership by traveling to New Orleans and promising, in a dramatic, floodlit appearance, to spare no effort in rebuilding the flood-torn region and the wrecked lives of the victims.

He went further, vowing to confront the issue of poverty in America “with bold action.”

It was all nonsense, of course. He did nothing of the kind.

The catalog of his transgressions against the nation’s interests — sins of commission and omission — would keep Mr. Bush in a confessional for the rest of his life. Don’t hold your breath. He’s hardly the contrite sort.

He told ABC’s Charlie Gibson: “I don’t spend a lot of time really worrying about short-term history. I guess I don’t worry about long-term history, either, since I’m not going to be around to read it.”

The president chuckled, thinking — as he did when he made his jokes about the missing weapons of mass destruction — that there was something funny going on.

Monday, December 29, 2008

My Favorite Books of 2008

Larry McMurtry's Lonesome Dove tetralogy comes immediately to mind. This series, with its great story lines and even more memorable characters, was a wonderful reading experience.

Barbara Leaming's book on the growth of JFK is my favorite John F. Kennedy book of all time.

Thurston Clarke's book The Last Campaign brought back poignant memories of the greatest politician of my time---Robert F. Kennedy.

I was thrilled to reread Harper Lee. It was as if I were reading the book for the first time. The depiction of small-town Alabama life in the 30's struck me as not much different than small-town Alabama life in the 60's when I was growing up.

Doris Kearns Goodwin's Team of Rivals is the best Lincoln book I've read. Its portrayal of 19th century life in the US (I'm a 19th century man) with these upwardly mobile people like Lincoln, Chase, and Seward is unmatched.

David S. Reynolds's Waking Giant is a great picture of life in the US during the Jacksonian Era.

2008 was a great reading year.

Sunday, December 28, 2008

The Huffington Post by Arianna Huffington

Arianna Huffington: Sunday Roundup
This week, John Snow, Bush's former Secretary of the Treasury, claimed that one of the causes of the mortgage meltdown was that, in its zeal to increase homeownership, the Bush administration "forgot" that people had to be "able to afford their houses." Perfectly understandable. And it explains so much: Bush and company didn't mean to illegally torture prisoners at Gitmo - they simply "forgot" torture was illegal. They didn't mean to trample civil rights - they simply "forgot" about the Constitution. They didn't gin up phony intel to hype us into war - they simply "forgot" that WMD had to be real. Turns out their guiding principle has been Steve Martin's classic catch-all defense: "I forgot!" So, will Bush's parting words be, "Well, excuuuuuse me!"?

Friday, December 26, 2008

Republican Disarray

Here is an article which accurately outlines the complete disarray and bewilderment of the Republicans.



Tuesday, December 23, 2008 at 2:16 pm
Dark Days Ahead: Why Republicans Need Xmas Vacation
Posted by Michael Scherer
It's bad. Never mind that for two elections in a row Republicans have lost political independents by wide margins. Never mind that their reputation for competence is approaching Bernie Madoff-like levels, or that the nation's demographic shift, particularly the growth in Latino voters, imperils their electoral future. Never mind that party leaders will return to Washington next year with less actual power than at any point since 1995. The real Republican concern is this: The deteriorating economy now threatens to undermine the political value of the GOP's fundamental identity as the party of private markets and limited government.

The terrifying credit crisis is, as I write, morphing into a "liquidity trap," a term from the bygone era of men's hats and ladies' girdles, or at least Japanese kimonos, when the monetary masters of the universe risk losing control over the financial system, when regular people hoard their money, and when the economy looks to cycle ever downward. It's not yet the Great Depression Redux, but it can be talked about in the same sentence. And in that lies the potential for great calamity for the party of Ronald Reagan and Barry Goldwater. Liquidity traps are fought with government interventions. They are fought successfully with big ones. Republicans now face the real possibility of a generation of American voters who will see government not as the problem, but as the solution.

The last time America faced such a major economic retrenchment, Franklin Delano Roosevelt responded with a massive expansion of government spending and regulation, new programs like Social Security and new protections for unions and workers, which were controversial at the time, but which proved to be popular over the long haul. It took leaders like Goldwater more than two decades to gain some significant popular traction in opposition to Roosevelt's vision. Conservative economic ideas did not really impose themselves on the White House until 1981, more than 40 years after the bulk of the New Deal era had been established.

In the face of this peril, conservatives find themselves without leadership, direction, or even a cogent ideological response to the crisis. Conservative lodestars, like Dick Cheney, are warning of Herbert Hoover times if Republicans don't open up the federal pocketbooks. Even President Bush has admitted that he "abandoned free market principles to save the free market system."

And he did not succeed, clearing the way for much more abandoning to come.

Following widely accepted Keynesian theories, Barack Obama has proposed an economic stimulus next year of perhaps $1 trillion over two years, money that will take time to filter into an ever-worsening economy. Whether or not it succeeds, all the voters who get jobs because of this new spending will know its source: For a time, Obamadollars will pay their mortgage or rent. Obamadollars will feed their children. As such, the Democratic president has the ability to build a vast new political coalition of support, much like the one that FDR built during the 1930s. Ask Republican political strategists to honestly tell you why they hate government spending and they all offer the same answer: It creates Democratic voters.

So where does that leave the Republican Party? Totally confused. Lines like "redistributing wealth" make less sense when the wealth is running scared and your husband just got laid off. The "ownership society" becomes a joke when homes and stocks, the things we aspire to own, are consistently losing value. If government is the only bank willing to lend, it is not hard to understand that voters will support government lending. To make matters worse, Republicans in the House are not a unified bunch. Through the credit and auto bailout debates, Congressional Republicans behaved less like statesmen than stray cats on a sinking ship. According to his own strategists, John McCain had only a slim chance to win the White House this year, but once the House GOP voted down the first version of the bank bailout, his fate was sealed. Then, after sinking their own electoral hopes, Republicans reversed course, clearing the way for a slightly altered bailout a few days later. That's not leadership you can believe in. As a strictly political matter, it was malpractice, or, more sympathetically, a lack of self-control and vision.

As it stands, Republican thinkers appear to be casting about wildly. With his own Rolodex under strain, House Minority Leader John Boehner has put out a public call for any economist who can give some rationale for opposing the Obama stimulus package. The response so far is less than impressive. At best, conservatives have retrenched to argue that the stimulus should focus more on tax cuts than spending. There is a highly technical debate going on between economists about why spending on public works should provide more stimulation than tax cuts for business and the wealthy. (In the classic textbooks, at least, the spending argument beats the tax cut argument.) It does not help the conservatives that their principal academic reference point to argue for tax cuts is a controversial interpretation of a paper written by Christina Romer, the expert on depression economics who is helping to draft Obama's spend-heavy stimulus plan.

So what will Republicans do? In the short term, the answer is clear. They will retrench to a guerrilla war, a tactical battle much like the one adopted by McCain at the end of the general election. Next year will bring Republicans a great unifying gift in the form of the Employee Free Choice Act, a bill Democrats plan to push over the objections of many small- and middle-sized businesses that would allow the unionization of workplaces without a so-called "secret ballot." Combined with the Blagojevich scandal, which does not appear to do the labor movement any favors, this could be a short-term political winner for Republicans, especially if it passes. They can cast Democrats as the party of big labor in much the same way that Democrats cast Republicans as the party of big oil. In the near term, Republicans also have the ability to recast themselves as reformers, as the loyal opposition to the waste, fraud and abuse that is endemic to government, and certain to pop up in any massive new spending program. But winning those battles will not win them the war.

What Republicans need is a new ideological message for a new economic era. One of the smartest Republican strategists I know suggested to me that the frame should not be about whether government is good or bad, but whether the solutions to our woes are classic Democrat--industrial age, centralized and top-down--or whether they are new Republican--Internet-age, market-based and bottom-up. Republicans, he recommends, should claim the mantle of innovative government, not just small government. The challenge here is that Obama will try to fill that space as well.

All that said, we also know that the Republican dilemma is not permanent. Karl Rove's predictions of a generation of Republican rule seem ridiculous in retrospect. Democrats succumb to comfort or hubris at their own peril. But as the cups of eggnog clink and the yule log burns, Republican households across the country would do well to realize the grim future that now faces them in the short term. Christmas is a time for rest and self-reflection. Republicans have little time to rest, and much to reflect upon. Next year begins a new political era for America.

Wednesday, December 24, 2008

A Krugman Comment on the Crazy Right

December 22, 2008, 8:32 pm
Crazy conspiracy theorists
So Rush Limbaugh, Bill O’Reilly, and Karl Rove all claim that the financial crisis was a liberal conspiracy, generated either by evil mastermind Chuck Schumer or by wily journalists.
Why does such stuff flourish? Probably because there is no punishment for it — as long as you’re on the right, and I mean right, side. Let Michael Moore point out, entirely correctly, the close ties between the Saudis and the Bush family, and he’s blasted as a crazy conspiracy theorist. On the other hand, let Donald Luskin suggest, in 2004, that George Soros is planning to engineer a financial crisis to defeat Bush, and he gets to publish front-page articles in the Washington Post Outlook section declaring that there isn’t a recession.

Monday, December 22, 2008

A Review of Revolutionary Road

The New Republic
Blaming the 'Burbs
by Adelle Waldman
'Revolutionary Road,' considered the original anti-suburban novel, isn't actually anti-suburbs--but something far more devastating than that
Post date: Monday, December 22, 2008

The novel of suburban malaise has been in fashion for as long as people have been commuting from leafy pastures just beyond the city limits. Never mind that the majority of Americans actually live in suburbs (and have therefore voted with their feet in favor of suburbia), American readers are apparently hungry for books that seek to reveal how stultifying that life really is. Rick Moody made his career with The Ice Storm, an account of a Connecticut family's expensively appointed but empty lives. Similarly, Tom Perrotta's Little Children depicts a seemingly pleasant Massachusetts town in which rage and depravity lurk behind flower boxes and picture windows, and the banality of child-rearing naturally gives rise to adultery.

Now, Revolutionary Road, Richard Yates's brilliant 1961 novel, stands poised for a comeback. Often considered the original anti-suburban novel, the book--long a staple on bookstore shelves labeled "our favorites" and "staff picks"--tells the story of an unhappy young Connecticut couple; it has just been reissued in tandem with a Hollywood adaptation, due to hit theaters the day after Christmas. The film is directed by none other than Sam Mendes, the man behind American Beauty, perhaps the apotheosis of suburban exposé.

But if Mendes's new film is to do Revolutionary Road justice, it will transcend the easy anti-suburban categorization. While Yates's depiction of suburban life is nightmarish enough to exceed the worst fears of Jane Jacobs's devotees, Revolutionary Road is far more than a complacent takedown of the 'burbs. It is in fact less an anti-suburban novel than a novel about people who blame their unhappiness on the suburbs.

Once upon a time, Frank and April Wheeler were bohemians in Greenwich Village, but one thing led to another--well, sex led to pregnancy--and Frank, who'd graduated from Columbia on the GI Bill and worked odd jobs while trying to "find" himself, finally took a "real" job at Knox Business Machines, the dullest of dull corporations, quintessential Organization Man territory.

But Frank doesn't see himself as a victim of 1950s-style pressure to conform. Taking the job was an ironic gesture. "The thing I'm most anxious to avoid," he said to a friend, "is any kind of work that can be considered 'interesting' in its own right. I want something that can't possibly touch me." Moreover, Knox was the very same corporation for which Frank's own father had toiled his whole sad, Willy Loman-like career. How rich! What better way to thumb his nose at his father and his outmoded bourgeois values than to breeze right into a higher position than his dad had ever achieved--and then treat the whole thing as a joke?

Indeed, Frank's trouble is that he doesn't do much of anything sincerely. When we first meet him, he is sitting through a disastrous amateur theater performance, in which April stars. Afterwards, he approaches her backstage:
He ... started toward her with the corners of his mouth stretched tight in a look that he hoped would be full of love and humor and compassion; what he planned to do was bend down and kiss her and say "Listen: you were wonderful." But an almost imperceptible recoil of her shoulders told him she didn't want to be touched, which left him uncertain what to do with his hands, and that was when it occurred to him that "You were wonderful" might be exactly the wrong thing to say--condescending, or at the very least naïve and sentimental, and much too serious.
What he said instead was, "Well, I guess it wasn't exactly a triumph or anything, was it?" Ouch.

The remark wasn't hurtful the way it might have been if April were another woman or this, another marriage. In fact, Frank was probably not wrong to have suspected that "you were wonderful" would have grated on April's nerves. The problem is ... well, it's complicated.

Frank's love for April is real, the only thing in his life that is wholly authentic. That doesn't, however, mean it's good; it is in fact utterly poisonous both for him and for April. From the early days of their affair, April, in spite of moments of feeling something that may be, could be love--she thought he was smart, she liked the countercultural thrill of living together in a cheap, cigarette smoke-filled West Village apartment--has "held herself poised for immediate flight."

And that really galls Frank. It's not the morality of the times or the unavailability of abortion that caused the newly married couple to have their first child--and take their first tentative steps towards conventional middle-class life--but Frank's frustration with April's aloofness. She intended to induce an abortion, which enraged Frank even though "the idea [of ending the pregnancy], God knew, was more than a little attractive." But April's unwillingness to bear his child seemed to bespeak an intolerable lack of love. He just wanted her not to be so indifferent--to him:
"You do this--you do this and I swear to god I'll--"

"Oh, you'll what? You'll leave me. What's that supposed to be, a threat or a promise?"

Feeling threatened, Frank did what was easy, natural--and despicable--he took up the "moral position," as if that were the true reason for his objection. And it worked. Frank convinced April to have a baby she didn't want, without having ever considered her feelings in any light other than how they reflected on him.

What is so unique here is that Yates isn't seduced by his characters' emotions, no matter how earnestly experienced; he lays bare the roiling pools of vanity and narcissism that underlie them. While Frank is the nicer of the two--by conventional standards, at least--it's no wonder that April feels revulsion at his "precious moral maxims" and his "'love' and ... mealy-mouthed little--." It's hard for her to articulate, but the reader has no trouble understanding what she means. Meanwhile, Frank, fearful of her flaring temper yet resentful of her power over him, catalogs her flaws--the widening hips, how certain facial expressions make her look old--but to no avail. No matter how much he wants to, Frank can't talk himself out of the absolute stranglehold April has on his sense of self--that is, his great and abiding love for her.

Continuation of Revolutionary Road Review

At least, that's the state of Frank and April's marriage at the beginning of the book, when things are going comparatively well.

Then April does something that really terrifies Frank, even more than the threat of her temper.

She takes his denunciations of "these damn little suburban types" and his diatribes about "Conformity, or the Suburbs, or Madison Avenue, or American Society Today" at face value. She decides that they should move to France, where she will get a secretarial job and Frank can find himself. They'll finally be free from the banal routine that just has to be the source of their unhappiness.

It has to be, hasn't it? The picture Yates paints of suburbia is unremittingly bleak. Frank and April's world, with its houses that look "as foolishly misplaced as a great many bright new toys that have been left outside overnight," is bad, but even more depressing are the Wheelers' attempts to make their existence congenial: the amateur theater endeavor, for example, so paltry, so chintzy--and so small in light of all that is wrong. Even less likely to yield real relief are their only friends, the fawning, insipid Campbells, who offer little but meaningless distraction and the ignoble pleasure of being looked up to. It's a hellish life, or it should be. It certainly is for April, anyway. While Frank drones on with stock talk about Conformity and other clichéd generalities, April's acidic observations have Dorothy Parker-like precision. "I know these damn artsy-crafty things," she said of the theater group initially, before she got talked into participating. "There'll be a woman with blue hair and wooden beads who met Max Reinhardt once ... and seven girls with bad complexions."

April's unhappiness is real, but Yates, unlike a more sentimental author, doesn't applaud her daring--her willingness to buck convention and propose escape. Instead, he exposes the foolishness and the self-delusion behind her Paris plan. In a moment of particularly abject misery, April decides that "it"--the whole dreary shebang of suburban family life--is her fault. "I put the whole burden ... on you," she says to Frank. "It was like saying if you want this baby, it's going to be All Your Responsibility. You're going to have to turn yourself inside out to provide for us. You'll have to give up any idea of being anything in the world but a father." It sounds plausible enough, even if it has little bearing on the messier, more complicated truth. But Yates portrays the glee with which April latched onto her new analysis as part and parcel of the human tendency to self-dramatize: "Her whole day had been a heroic build-up for this moment of self-abasement."

Nor does Yates let us forget that April's plan is riddled with holes, something even the affable drunkard who shares a cubicle with Frank can see. (As he says to Frank over lunch, "Assuming there is a true vocation lurking in wait for you, don't you think you'd be as apt to discover it here as there?") It also rests on the assumption that Frank really is cut out for a different kind of life. But whatever it may have been when she met him, Frank's anti-suburban talk has by this time become a mere gesture, a way of making himself feel sophisticated and being once more in April's eyes "the most interesting person [she's] ever met." Frank is not so much lying as he is being insincere. It's as Lionel Trilling observed about Mansfield Park's vivacious Mary Crawford: She is "impersonat[ing] the woman she thinks she ought to be." Likewise, Frank is pretending to be the nonconformist he--and April--want him to be.

The truth is Frank is relatively content in Connecticut. He likes the idea of family life; he likes prattling on to the Campbells; there are aspects of his job that he finds pleasant and even gratifying. Besides, he's not sure he has it in him to do much else other than work at Knox. The vision that comes to mind as April waxes on about the virtues of her European plan:
[April] coming from a day at the office--wearing a Parisian tailored suit, briskly pulling off her gloves--coming home and finding him hunched in an egg-stained bathrobe, on an unmade bed, picking his nose.

This, not the supposedly "hopeless emptiness" of middle-class American life, is what really terrifies Frank Wheeler.

But April, having convinced herself that her misery is merely a function of geography, is rapturous about France. And Frank gives in. After all, how much more tantalizing is it to join in April's euphoric excitement and her breathy account of him as the stifled genius than it is to feel like the dreary mediocrity whose touch she recoils from?

Rarely in literature has there been a second honeymoon quite so chillingly portrayed as the one the Wheelers embark upon after they decide to move to France--that is, decide that they will both embrace the same flattering fantasy of themselves. As unromantic as Emma Bovary's affairs, it's a sort of mutually masturbatory arrangement that, underneath all the murmured "darlings" and doors held open and nights of passion, is as unstable and as devoid of real tenderness as any described in game theory. Because as we know, and Frank only half-suspects, he is more interested in enjoying April's newly reinstated tenderness and admiration than in going to France.

How this all plays out makes for a deeply disquieting account of modern dysfunction. Not that Revolutionary Road is perfect: It is poised uneasily on the brink of satire, wanting to its detriment to have it both ways--the psychological sophistication of realism and the mercilessness of satire--granting nothing to its characters that isn't either corrosive or affected. It is, undoubtedly, a brutal book.

Still, it's a great deal bigger and more ambitious than most of the anti-suburban novels it's so often lumped with. In the vein of many a great 20th century novel, Revolutionary Road turns the towering Victorian novels on their heads. Without itself being morally obtuse, it sets up a scenario in which the moral questions that preoccupied those authors are largely beside the point. How much of life, Revolutionary Road reminds us, defies the kind of analysis that parcels out responsibility and blame--and how terrible the realization of that is, because if goodness, or at least its attempt, has so little bearing on happiness, then what can any of us do?
Adelle Waldman is working on a novel called The Love Affairs of Nathaniel P.
© The New Republic 2008

Sunday, December 21, 2008

Paul Krugman - The Return of Depression Economics

When it comes to economics, I take an intellectual pass. Which is to say: Economics does not greatly interest me. I am only interested in the big picture. The nitty-gritty particulars of economics are boring to me. So when it comes to trying to understand the world's current economic tribulations, I look for someone to trust, and for me, that person is the most recent Nobel winner in econ, Paul Krugman, professor of economics at Princeton.

Krugman is proudly progressive, as I am. I check his blog---http://krugman.blogs.nytimes.com---daily.

This book is a revision of a book Krugman published in the '90s. The original edition dealt with the then-current economic crises in Asia and Mexico. Krugman warned that the troubles affecting those areas of the world would come visit this country one day. That day has come.

We are in a situation of what Krugman calls depression economics. The conventional answer to a recession is a lowering of interest rates by the Fed. The Fed seeks to keep interest rates low enough to promote economic growth but not so low as to bring on inflation. It's a delicate balancing act.

But now the Fed discount rate is practically zero, and this will not be enough. What to do?

According to Professor Krugman, two things. One is to properly regulate what he calls shadow banking. The problem is not conventional, regulated banks; the problem is shadow banking, which was unleashed in 1999 by a Republican-sponsored act of Congress that allowed insurance companies to act like banks. The failure of these types of companies---Lehman Brothers, AIG, etc.---is what has dried up the credit system. We need to do whatever is necessary to properly loosen up credit while at the same time implementing proper regulation of shadow banking.

And we need massive federal spending on infrastructure to get the economy moving again. Fiscal responsibility needs to take a temporary backseat to governmental stimulation of the economy. Not rebates like the ones Bush pushed through earlier this year, but direct governmental spending.

This is what Krugman says, and I go along with his views.

We should combine direct governmental stimulation with individual responsibility. The days of cheap and artificial prosperity in the US based on credit are over. We will return to economic basics, which means consuming less and living within our means. It's about time.

And it's about time for a New Deal a la FDR for 2009.

Saturday, December 20, 2008

Cheney's Delusions

While Bush is commonly described as just plain dumb, Cheney is commonly described as pure evil. The following editorial says just that about our lame-duck VP. Who can disagree?

FROM Los Angeles Times
20 December 2008

We probably shouldn't have been surprised at Vice President Dick Cheney's blustering, obstinate insistence on ABC News on Monday that he's been right all along about virtually everything. But that doesn't mean we have to agree.

In the interview, Cheney not only acknowledged that he was involved in approving the harsh interrogation methods used by the CIA on suspected terrorists, but said he still thinks that waterboarding was an appropriate way to extract information. He said -- contradicting even President Bush -- that he believes the notorious American prison at Guantanamo Bay should remain open for the foreseeable future, and he reiterated that the U.S. invasion of Iraq was justified by, believe it or not, Saddam Hussein's weapons programs.

Maybe this was just a desperate, last-minute effort to rescue what appears to be a legacy in deep trouble, but it came across as nothing less than self-delusion. Despite public opinion polls showing that only about a third of the country believes the U.S. should have invaded Iraq, undaunted by the irrefutable fact that America's prestige has plummeted around the world, notwithstanding the outcry by human rights advocates against torture and the fact that there is no meaningful peace in sight in Iraq or Afghanistan, Cheney soldiers on with the same cocky, we-know-what-we're-doing-and-laws-be-damned attitude that has come to define him. Even though the American electorate rose up last month to sweep out the GOP in an extraordinary rejection of Bush administration policy, he persists in defending eight years of morally and legally disastrous decisions.

Cheney, of course, was a hawkish, self-righteous and, ultimately, malevolent figure in the Bush inner circle from day one. In "Angler," Barton Gellman's excellent analysis of his tenure, he emerges as a man willing to bend virtually any rule, a true believer with "a sense of mission so acute it drove him to seek power without limit." In Jane Mayer's "The Dark Side," he's portrayed as pushing his colleagues into ever more morally questionable situations. "The most dangerous vice president we've had probably in American history" is how Vice President-elect Joe Biden accurately described him during the campaign.

Cheney likes to joke about himself that when he told his wife, Lynne, that he had been nicknamed "Darth Vader," she didn't get angry. Instead, she responded: "It humanizes you."

With that, we agree.

Thursday, December 18, 2008

An Arsonist's Guide to Writers' Homes in New England by Brock Clarke

I read this book at the behest of Jamie.

The story sounded enthralling, but it disappointingly was not. It is about Sam Pulsifer, who is convicted of burning down the Emily Dickinson house, killing a couple in the process. After leaving prison, he finds his relationship with his parents strained. He goes to university, marries, and returns to the Amherst area. Soon, other writers' homes begin to burn. Though he is the immediate suspect, he investigates the fires himself, only to be confronted with issues of truth, love, and family.

The book is entirely too verbose, with too many didactic ruminations about life and too many rhetorical questions. The story itself is intriguing, but I felt as if I were trekking through sludge to reach the engaging parts.

Richard Yates - Revolutionary Road

Until December 2 I had never heard of this author or this book. This is surprising, since this particular novel was a finalist for the National Book Award in 1961. The only reason I became aware of it on December 2 is that it was displayed in the bookstore because a movie based on the book is coming out. I happened to pick it up, and the prose instantly attracted me.

Yates is an exceptional writer. His phrasings, his images, his thoughts connect with me. I won't recount the story here except to say it's a period piece, taking place in the 50's, and it deals with a couple who struggle to make sense of their lives. The only negative I can mention is that the book is certainly depressing---there is no light at the end of the tunnel---and the conclusion seems obvious long before the end.

I will likely read Yates's other novels and his collected short stories, if I can find them. Yates has connections to Alabama. Freddy says he once lived in Tuscaloosa, and he died at the VA hospital in Birmingham in 1992.

Tuesday, December 16, 2008

The Myth that Republicans are for the Working Class

Republicans have tried for four decades (starting with Nixon) to maintain the myth that they are the champions of the working class against the elitist Democrats. The reality is that Republicans are, and have always been, plutocrats who couldn't care less about working Americans. The debate over our domestic auto industry amply demonstrates this.



15.12.2008
Bill Kristol Wants His Culture War Back

It’s always a bit disconcerting to find oneself agreeing with a figure as odious as Bill Kristol, but he’s right to call out conservatives for their naked contempt for the American auto industry, especially when compared with their solicitousness for the even more screwed up financial sector. With Kristol, though, you can pretty much assume bad faith (and if you couldn’t, he gives it away by listing, as a feature of the bailout bill he supports, the fact that it would shift “difficult decisions to the Obama administration”). Thus I disagree with Noam that this constitutes some sudden outbreak of reasonableness on his part.

So what’s with his newfound objections to GOP class warfare? Perhaps it’s because the Republican Party likes to keep its anti-union side a bit more on the down low. After decades in which conservatives pretended to be the champion of average working people, the drama in Detroit is the starkest possible illustration of the GOP’s essentially plutocratic nature. Tom Frank could scarcely have invented a more telling scenario. Here are Republican Senators actively fighting for lower wages and more foreign corporate ownership! Here they are blithely dismissing the need for domestic industry! Here they are grousing about working people getting overly generous retirement benefits and health care! In the face of this reality, the narrative that has sustained the party since Nixon--one in which decadent liberal elites are the real enemies of the hardworking silent majority, with all their sturdy volkish virtues--is exposed in all its naked preposterousness. And that narrative is one that Kristol desperately wants to preserve.

After all, Kristol is the heir to a neoconservative tradition that needed that narrative to look at itself in the mirror. The former socialists who spawned (literally) the current generation of right-wing pundits always liked to feel that they, and not the sneering radical chic intellectuals and activists they reviled, were the champions of the plain people. Kristol tries to keep this idea alive in his column, writing, ridiculously, that most “limousine liberals are embarrassed by their political alliance with the workers who built those limousines.” This slur might have had some validity four decades ago, but it has not a shred today, when Jim Webb is a hero of the Democratic Party. Rich liberals lionize unions, and, like rich conservatives, tend to romanticize the earthy authenticity of the working class. (I wouldn’t be surprised if Tim Robbins has already optioned the story of the workers who sat in at Chicago's Republic Windows & Doors.) Kristol, though, is deeply invested in keeping the categories of the Nixon-era culture war alive. Without them, the pseudo-populist Joe-the-Plumber pandering that constitutes Palinism--Kristol’s preferred cultural mode--is revealed as a con. “Senate Republicans now run the risk of being portrayed as Marie Antoinettes with Southern accents,” he writes. In other words, they risk being portrayed accurately.

Friday, December 12, 2008

The Better Educated Tend to Vote Democratic

The better educated and more highly skilled a person is, the more likely that person is to vote Democratic, according to this article and the research it's based on. Conversely, the less educated a person is, the more likely that person is to vote Republican. Since education leads to higher income, affluent people are trending Democratic. However, at the very highest levels of affluence---the top 2%, who are subject to the estate tax---people are more likely to vote Republican. The country is becoming more, not less, Balkanized. We are separating into the progressive (blue) and the regressive (red). I am on the Blue side!



How the Rich Are Different From You and Me
Places that went for Obama are richer and smarter than places that went for McCain.
By Bill Bishop and Robert Cushing
Updated Thursday, Dec. 11, 2008, at 8:04 PM ET

Last month's election was historic and may even have been transformative, as many commentators said. But in one important respect, it changed nothing. The divide between Republicans and Democrats in America continues to grow.

And it isn't just about politics. The division is also between rich and poor, between those with college educations and those without. On average, Republican communities have lower incomes and less education than Democratic communities. And those differences are growing as people migrate.

Just more than 600 counties (of more than 3,100 nationally) voted Republican more heavily in this year's presidential contest than in 2004. The average per capita yearly income in those counties was about $18,800, according to county income tallies issued each year by the Internal Revenue Service. (Income in this article is determined by the amount of adjusted gross taxable income listed on individual tax returns from 2004-07. Per capita income equals gross income divided by the number of personal exemptions.) By contrast, those living in the 500-plus counties that voted more heavily Democratic this year than in 2004 had average personal incomes of $28,000—nearly 50 percent higher than the communities trending Republican. The most Democratic counties (those where Barack Obama won by more than 20 percentage points) had average per capita incomes of $28,207. Those counties where John McCain won by similar margins had average personal incomes of just $21,308.
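To make the method concrete, here is a minimal sketch in Python of the arithmetic the authors describe. The county names and dollar figures below are hypothetical placeholders, not the IRS data the article draws on; only the formula (per capita income = adjusted gross income divided by the number of personal exemptions, then averaged within each group of counties) comes from the article.

# Hypothetical illustration of the per capita income method described above.
counties = [
    # (county, total adjusted gross income in $, personal exemptions, trend)
    ("County A", 1_200_000_000, 64_000, "trending Republican"),
    ("County B", 2_800_000_000, 99_000, "trending Democratic"),
    ("County C",   900_000_000, 47_000, "trending Republican"),
]

def per_capita_income(gross_income, exemptions):
    # Per capita income = gross income / number of personal exemptions.
    return gross_income / exemptions

for trend in ("trending Republican", "trending Democratic"):
    incomes = [per_capita_income(agi, ex)
               for _, agi, ex, t in counties if t == trend]
    average = sum(incomes) / len(incomes)
    print(f"{trend}: average per capita income = ${average:,.0f}")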

Places divided by income are also separated by education. In landslide Democratic counties, 32.7 percent of the adult population had a bachelor's degree or better. In Republican counties where McCain won by 20 points or better, 20.4 percent of adults had finished college or graduate school.

More than 30 years ago, pollster Everett Carll Ladd Jr. wrote about the "inversion of the New Deal Order." Ladd was one of the first to notice that white workers without a college degree were voting Republican in larger numbers and that educated white workers were turning Democratic.

The debate over whether working-class white voters have abandoned the Democratic Party rages on. (See this recent paper on the "shifting and diverging white working class in U.S. presidential elections.") In the meantime, the results from this year's election show that there is certainly a geographic division in America based on class and status. Democrats won in the richest and most educated communities in the country.

As people migrate, these divisions (political, educational, and economic) among American communities are increasing. Again using IRS records, we tracked the average income of people who moved between counties since the 2004 election. Those who trekked across state lines from 2003-07 and settled in counties that grew more Republican this year had average incomes of $18,300. The people who moved into counties that became more Democratic in 2008 averaged $28,100 in yearly income. So those who moved to blue counties had incomes more than 50 percent higher than those migrating to the reddest of counties.

And in the "flip" counties, the contrast is even starker. In all of the United States, there were only 44 counties that voted for John Kerry in 2004 but for John McCain in 2008. The average annual per capita income of the people who moved into these counties between the two elections was $16,500. That's 34 percent less than those who migrated into the 331 counties that went for George Bush in '04 but Obama in '08.

People with fewer money-making skills are moving into counties that are voting increasingly Republican. Those with higher incomes (and more education) are moving into counties that are voting more Democratic. The more lopsided the local political victory, the greater the differences in income and education.

This phenomenon held true in cities and rural communities alike. In those urban centers that voted overwhelmingly for John McCain, 23.6 percent of the adult population had at least a bachelor's degree. In urban counties that voted in a landslide for Obama, 33.3 percent had at least a college degree. In rural counties that voted in a landslide for McCain, 15.2 percent of adults had a college degree or better. In rural Obama landslide counties, it was 19.2 percent.

We don't pretend to understand the full meaning of how this country is dividing. We can see, however, that America is becoming more polarized not only politically but also educationally and economically—and that a country Balkanized by skills and by income has more troubles than one that is simply divided by votes.

Thursday, December 11, 2008

Indignation by Philip Roth

This was a delight to read.

Marcus Messner aims to be independent, to defy the social codes of respectable conduct, moral supremacy, ethical propriety, dignity, and order, the very values that define the 1950s. The backdrop to his rebellion is the Korean War, the symbol of honor, courage, and service.

He insists on not conforming, following his emotions rather than the restrictive cultural dictums espoused by those around him. Ever fearful, he mistrusts everyone, especially his father, his roommates, Dean Caudwell, and Olivia.

Ironically, it is this bullheadedness that causes his tragic fate. As Roth observes, "one's most banal, incidental, even comical choices achieve the most disproportionate result."

I consider Marcus a precursor to the progressivism of the 1960s and its unrest against social rigidity.

Sunday, December 7, 2008

Jon Meacham - American Lion

This volume by journalist Jon Meacham is likely to be the definitive biography of Andrew Jackson for some time to come. I enjoyed it, although I am more interested in what was happening in the country during the Age of Jackson than in the particulars of Jackson's life.

Jackson was the first man of the people, the first non-elitist, the first so-called self-made man to make it to this country's highest office. He was the first chief executive to directly connect to the American people. He was our first President to use the executive branch to try to better the lives of average Americans. No wonder he is the founder of the Democratic Party!

Jackson built up the office of President as the direct representative of the people, by which he meant working-class people rather than the privileged. He fought eternally against what he called the "money power" and the "privileged" in favor of "the people." Historians will forever debate his particular policies. Some were good; some were not.

Was he right in fighting Biddle's Bank of the United States? Historians disagree. Was he right in the Eaton affair? (The Eaton affair was a long, convoluted situation that defies simple summary.) I am not competent to understand the implications of the Eaton mess. Was he right in his Indian Removal policy? No, he was tragically wrong: his policy of removing Native Americans from the Southeast to Oklahoma resulted in thousands of tragic deaths. Was he right in stressing the President as the direct representative of the people against entrenched special interests? You bet!

If you wish to understand the person of Andrew Jackson, this is your book. If you wish to understand his times, you must delve deeper, which can be the job of a lifetime.

Wednesday, December 3, 2008

The Kindle

I have followed discussion of the Kindle for the last year. So far I am not interested in purchasing one.



A year later, Amazon's Kindle finds a niche
Story Highlights
o Amazon.com's electronic Kindle reader celebrates its first birthday
o Device holds about 200 digital books and can reduce bookshelf clutter
o Sales have been steady, but the device so far remains mostly a tech novelty
o Oprah Winfrey has praised it, but J.K. Rowling vows no e-versions of "Harry Potter"

(CNN) -- It has the curves of a Lamborghini, looks like something an astronaut might take into space and weighs only 10.3 ounces.

Amazon's Kindle e-reader is wireless and can hold about 200 books, plus newspapers and magazines.

Amazon.com's electronic Kindle reader -- a device meant to remove the paper from the page and make reading both more convenient and eco-friendly -- is celebrating its first birthday.

Released in November 2007, the Kindle has sold more than a quarter million units. Its texts account for 10 percent of Amazon's book sales despite the fact that only 200,000 titles -- a tiny fraction of the books offered on the site -- are available in digital form.

While exact sales figures are hard to come by, recent estimates have put the Kindle's sales on par with other high-profile mobile devices in their first year. Amazon.com says that the Kindle is currently sold out due to heavy demand.

So what has spurred its success? After all, electronic books have been around, in small numbers, for about a decade. Even Jeff Bezos, Amazon's founder and CEO, has admitted that the book is "elegantly suited to its purpose. It's hard to improve on."

One thing that's helped the Kindle is marketing. Where other readers failed to connect with consumers, the Kindle has excelled. The media-savvy Bezos has hardly been publicity shy, gaining his electronic toy a level of exposure most CEOs couldn't begin to fathom.

"You can't discount the prominence of having Amazon behind this," says Paul Reynolds, technology editor at Consumer Reports. "Jeff Bezos is respected for what he's done with Amazon, and if he feels this is a future product in media, people are willing to trust him."
Second, the gadget has been heralded by Oprah Winfrey, whose influence in the publishing world is immense. It's also been embraced by some prominent writers, including Nobel laureate Toni Morrison and best-selling thriller author James Patterson.

Third, with more and more consumers accustomed to reading text on their cell phones and BlackBerrys, the world finally may be ready for an electronic version of a book.

"I checked it out on Amazon and thought it was an intriguing idea, a great way to have a lot of books that don't take up a lot of space," says Emily Branch of Florida, who was moved to buy a Kindle after seeing the hosts of "The View" chatting about it.

"I figured if I didn't like it I could return it within 30 days," Branch says. "There wasn't a chance of that happening once I got it in my hands though."

One clutter-killing Kindle can hold about 200 books. And while other e-readers such as Sony's Reader must connect through a USB port to upload content, the Kindle is a wireless device, thanks to Whispernet, which is powered by Sprint's high-speed data network.

"I think the Whispernet is what sets the Kindle apart from all the other e-readers on the market," says Leslie Nicoll of Portland, Maine, who co-authored "The Amazon Kindle F.A.Q." book after her tech-loving teenage daughter urged her to get a Kindle.

Like Branch, Nicoll says she likes the Kindle's low-impact effect on her bookshelves. "I don't have to worry about giving it to someone else, reselling it on Amazon or finding a place to store it in my house," she says. "For the enjoyment and convenience it has given me in the past seven months, I consider that it has paid for itself already."

Readers can visit Amazon's online store and download a new book right to their Kindle. Subscribers also can have electronic versions of The New York Times and other newspapers and magazines delivered automatically to their Kindles in time for reading with their morning cup of coffee.

"The large and tightly interacting collection of Kindle features, that go far beyond those of any other previous e-Book attempt, will cause the Kindle to be the first e-Book to succeed," wrote one reviewer on an Amazon discussion board.

But not everything in Kindle world is roses and gumdrops. There's a difference between modest early success and making a centuries-old print format obsolete. The Kindle sells for $359, a steep price for the average reader in the current economic climate.

"I'm not going to pay $360 for that. I can get books for free," says Nikki Johnson, a college student in Atlanta, Georgia, speaking for traditionalists who are wary of giving up their bound paper volumes.

"There's nothing like reading a nice paperback," she says. "There's nothing like holding or carrying a book, having that tangible quality and it being more than just a piece of data."
So in an unforgiving economy and in a stubbornly old-fashioned medium, will the Kindle ever expand from a tech novelty to a mainstream accessory? It might be too soon to tell.

Blockbuster writers such as J.K. Rowling, author of the "Harry Potter" series, have said they'll never allow their books to appear on the market in electronic form. Yet future, better versions of e-readers may seduce younger consumers who grew up on PSPs and iPhones.

A next-generation model of the Kindle is due in 2009. Early reports indicate the new device will be thinner and will have fixed some current design bugs, such as poorly placed buttons that cause readers to turn pages accidentally.

"I think it's certainly a ways away from hitting the mainstream ... because of the price and the experience a reader gets from long-form reading," says Reynolds of Consumer Reports. "Whether these ... are successful, stand-alone devices remains to be seen. From what I've seen and heard, I think the technology is here to stay."

Monday, December 1, 2008

Obama the Centrist?

It amuses me to see pundits pointing out that Obama, based on his appointments so far, seems to be a pragmatic Centrist. Is he a Centrist? Based on everything I've learned about him, I would say yes. The amusing thing is that those righties spent the election calling him a socialist and a radical. What rubbish! Compared to Bush, he might be called liberal. Compared to reality, he is probably a centrist. Compared to Bush, Genghis Khan was a liberal. We shall see what happens when he takes office, but so far, based on his appointments, Obama will be right in the center (as opposed to right of center), which is what we need after 8 years of right-wing ideology.

Saturday, November 29, 2008

The Well-Tended Bookshelf

By LAURA MILLER
Published: November 28, 2008
In order to have the walls of my diminutive apartment scraped and repainted, I recently had to heap all of my possessions in the center of the room. The biggest obstacle was my library. Despite what I like to think of as a rigorous “one book in, one book out” policy, it had begun to metastasize quietly in corners, with volumes squeezed on top of the taller cabinets and in the horizontal crannies left above the spines of books that had been properly shelved. It was time to cull.

I am not a collector or a pack rat, unlike a colleague of mine who once expressed the fear that he might perish someday under a toppled pile of books and papers, like a woman whose obituary he once read. I was baffled the first time a friend explained to me that the book in my hand was his “reading copy,” while the “collection copy” resided upstairs, in some impenetrable sanctum. Having reviewed hundreds of books over the past 20-some years, I no longer subscribe to the notion that I have a vague journalistic responsibility to keep a copy of every title I have ever written about. I am not sentimental.

Nevertheless, things had gotten out of hand. The renovations forced me to pull every copy off every shelf and ask: Do I really want this? I filled four or five cartons with volumes destined for libraries, used-book stores and the recycling bin, and as I did so, certain criteria emerged.

There are two general schools of thought on which books to keep, as I learned once I began swapping stories with friends and acquaintances. The first views the bookshelf as a self-portrait, a reflection of the owner’s intellect, imagination, taste and accomplishments. “I’ve read ‘The Magic Mountain,’ ” it says, and “I love Alice Munro.” For others, especially those with literary careers, a personal library can be “emotional and totemic,” in the words of the agent Ira Silverberg. Books become stand-ins for friends and clients. Silverberg cherishes the copy of Céline given to him when he was 19 by William Burroughs, while “people I’ve stopped talking to go out immediately. There are people whose books I refuse to live with.”

The other approach views a book collection less as a testimony to the past than as a repository for the future; it’s where you put the books you intend to read. “I like to keep something on my shelf for every mood that might strike,” said Marisa Bowe, a nonprofit consultant and an editor of “Gig: Americans Talk About Their Jobs.” At its most pragmatic, and with the aid of technology, this attitude can be breathtakingly ruthless. Lisa Palac, a freelance writer, and Andrew Rice, a public relations executive, ultimately chose their beloved but snug house in Venice, Calif., over their library. “We’d been lugging these books around for years, and why?” Palac wrote in an e-mail message. Her husband said, “Do we really need to keep that copy of ‘The Scarlet Letter’ from college on hand? I can order up another copy online and have it tomorrow if I need it.” They kept only one carton of books apiece, donating the rest to a fund-raising bazaar for their son’s school.

Older people, curiously enough, seem to favor the less nostalgic approach. When you’re young and still constructing an identity, the physical emblems of your inner life appear more essential, and if you’re single, your bookshelves provide a way of advertising your discernment to potential mates. I’ve met readers who have jettisoned whole categories of titles — theology, say, or poststructuralist theory — that they once considered desperately important. Most of them express no regrets, although Nicholson Baker, who wrote an entire book protesting the “weeding” of books and periodicals from American libraries, still mourns the collection of science fiction paperbacks he discarded in his youth. “I’m not good at it,” Baker wrote in an e-mail message when asked about his own culling. “When I’m doing research, I buy lots of used, out-of-print books, preferably with underlining and torn covers. I like watching them pile up on the stairs.”

For the most part, I’ve been pragmatic in my purging, and for years reference books were the most likely survivors. I needed them for work, for those occasions when I suddenly had to know at what age Faulkner published “Absalom, Absalom” (39) or the name of the Greek muse of lyric poetry (Euterpe). Now the Internet can tell me all that. Apart from the rare reference that’s worth reading in its own right, like David Thomson’s Biographical Dictionary of Film, these titles have been drifting away as the trust I’m willing to put in Wikipedia gradually equalizes with the faith I’ve invested in, say, Benet’s Reader’s Encyclopedia. (It doesn’t help that reference books tend to be shelf hogs.)

Nevertheless, most of the nonfiction I’ve kept consists of books I’ve already read and know I’m likely to refer to in my own writing. Richard Holmes’s biography of Coleridge has come in handy for more than one project, as has Carol J. Clover’s study of slasher films, “Men, Women, and Chainsaws.” In fiction, on the other hand, apart from a few choice favorites, the list is weighted toward classics I optimistically plan to get around to someday. Like John Irving, I hold one substantial unread Dickens novel (“Barnaby Rudge”) in reserve, for emergencies. This method has its pitfalls. The novelist Jonathan Franzen used to limit the unread books on his shelves to no more than 50 percent of the total. “The weight of those books seemed to represent a standing reproach to me of how little I was reading,” he said in a phone interview. “I want to be surrounded by books I love, although now sometimes I worry that it’s too familiar, what I see when I look around me, that it’s become a sort of narcissistic mirror.”

When it comes to novels, I’m probably too sanguine about what my future can accommodate. “Eventually the truth hits home,” Brian Drolet, a television producer in New York, told me. “As the actuarial tables advance, the number of books you’ve got time to read diminishes.” Dr. Johnson once said of second marriages that they represent the triumph of hope over experience. So, too, do my bookshelves. I have turned out to be less rational about this than I thought, and have made my library into a charm against mortality. As long as I have a few unread books beckoning to me from across the room, I tell myself I can always find a little more time.

Thursday, November 27, 2008

Back to Law & Order in the US

By ROGER COHEN
Published: November 26, 2008
It’s Thanksgiving. I’m thankful for many things right now, despite the stock market, and first among them is the fact that the next U.S. commander in chief is a constitutional law expert and former law professor.

Before I get to why, allow me to add two other reasons for thankfulness. The first is that Barack Obama is a man of sufficient self-confidence to entrust the critical job of secretary of state to his former rival, Hillary Rodham Clinton. She has the strength and focus to produce results.

The second is that he's a man of sufficient good sense to retain the remarkable Robert Gates as defense secretary.

President Bush had one overriding criterion in choosing his inner circle: loyalty. The result was nobody would pull the plug on stupidity. Obama wants the kind of competence and brainpower that challenge him. The God-gut decision-making of The Decider got us in this mess. Getting out of it will require an Oval Office where smart dissent is prized.

But back to the law, which is what defines the United States, for it is a nation of laws. Or was until Bush, in the aftermath of 9/11, unfurled what the late historian Arthur Schlesinger Jr. called “the most dramatic, sustained and radical challenge to the rule of law in American history.”

There is no need to rehearse here the whole sordid history of the Bush administration’s work on Vice President Dick Cheney’s “dark side:” the “enhanced” interrogation techniques in “black sites” outside the United States justified by invocation of a “new paradigm” that rendered the Geneva Conventions “quaint.”

When governments veer onto the dark side, language always goes murky. Direct speech makes dirty deeds too clear. A new paradigm sounds bland enough. What it meant was trashing habeas corpus.

The facts speak for themselves. This month, almost seven years after detainees began arriving at Guantánamo Bay on Jan. 11, 2002, a verdict was handed down in the first hearing on the government’s evidence for holding so-called unlawful enemy combatants at the U.S. naval base in Cuba.

Yes, this was the first hearing in a habeas corpus case, so long has the legal battle been to get to this point, and so stubborn has the administration been in seeking to keep Guantánamo detainees out of reach of civilian courts.

Judge Richard J. Leon of Federal District Court in Washington ruled that five Algerian men had been unlawfully held at Guantánamo and ordered their release. He said: “Seven years of waiting for our legal system to give them an answer to a question so important is, in my judgment, more than plenty.”

Of the 770 detainees grabbed here and there and flown to Guantánamo, only 23 have ever been charged with a crime. Of the more than 500 so far released, many traumatized by those "enhanced" techniques, not one has received an apology or compensation for their season in hell.

What they got on release was a single piece of paper from the American government. A U.S. official met one of the dozens of Afghans now released from Guantánamo and was so appalled by this document that he forwarded me a copy.

Dated Oct. 7, 2006, it reads as follows:
“An Administrative Review Board has reviewed the information about you that was talked about at the meeting on 02 December 2005 and the deciding official in the United States has made a decision about what will happen to you. You will be sent to the country of Afghanistan. Your departure will occur as soon as possible.”

That’s it, the one and only record on paper of protracted U.S. incarceration: three sentences for four years of a young Afghan’s life, written in language Orwell would have recognized.

We have “the deciding official,” not an officer, general or judge. We have “the information about you,” not allegations, or accusations, let alone charges. We have “a decision about what will happen to you,” not a judgment, ruling or verdict. This is the lexicon of totalitarianism. It is acutely embarrassing to the United States.

That is why I am thankful above all that the next U.S. commander in chief is a constitutional lawyer. Nothing has been more damaging to the United States than the violation of the legal principles at the heart of the American idea.

As well as closing Guantánamo, Obama should set up an independent commission to investigate what happened there, as suggested in a fine recent report, "Guantánamo and its Aftermath," from the University of California, Berkeley. Only then will "deciding officials" become identifiable human beings who can, if necessary, be judged.

Obama should also ensure that former detainees receive an apology and compensation. An American official showing up, envelope in hand, at some dusty Afghan compound and delivering U.S. contrition and cash to a man whose life has been ravaged by U.S. abuse, will in the long term make the United States safer.

Give thanks on this day for the law. It’s what stands between the shining city on a hill and the dark side.

Wednesday, November 26, 2008

A Good Summary of Bush

Here is a good summary of our lame-duck President by Joe Klein. I would only add that Bush is a typical Republican in his arrogance, ignorance, and ineptitude.

* * * * * * *

We have "only one President at a time," Barack Obama said in his debut press conference as President-elect. Normally, that would be a safe assumption — but we're learning not to assume anything as the charcoal-dreary economic winter approaches. By mid-November, with the financial crisis growing worse by the day, it had become obvious that one President was no longer enough (at least not the President we had). So, in the days before Thanksgiving, Obama began to move — if not to take charge outright, then at least to preview what things will be like when he does take over in January. He became a more public presence, taking questions from the press three days in a row. He named his economic team. He promised an enormous stimulus package that would somehow create 2.5 million new jobs, and began to maneuver the new Congress toward having the bill ready for him to sign — in a dramatic ceremony, no doubt — as soon as he assumes office.

That we have slightly more than one President for the moment is mostly a consequence of the extraordinary economic times. Even if George Washington were the incumbent, the markets would want to know what John Adams was planning to do after his Inauguration. And yet this final humiliation seems particularly appropriate for George W. Bush. At the end of a presidency of stupefying ineptitude, he has become the lamest of all possible ducks.

It is in the nature of mainstream journalism to attempt to be kind to Presidents when they are coming and going but to be fiercely skeptical in between. I've been feeling sorry for Bush lately, a feeling partly induced by recent fictional depictions of the President as an amiable lunkhead in Oliver Stone's W. and in Curtis Sittenfeld's terrific novel American Wife. There was a photo in the New York Times that seemed to sum up his current circumstance: Bush in Peru, dressed in an alpaca poncho, standing alone just after the photo op at the Asia Pacific Economic Cooperation forum, with various Asian leaders departing the stage, none of them making eye contact with him. Bush has that forlorn what-the-hell-happened? expression on his face, the one that has marked his presidency at difficult times. You never want to see the President of the United States looking like that.

So I've been searching for valedictory encomiums. His position on immigration was admirable and courageous; he was right about the Dubai Ports deal and about free trade in general. He spoke well, in the abstract, about the importance of freedom. He is an impeccable classicist when it comes to baseball. And that just about does it for me. I'd add the bracing moment of Bush with the bullhorn in the ruins of the World Trade Center, but that was neutered in my memory by his ridiculous, preening appearance in a flight suit on the deck of the aircraft carrier beneath the "Mission Accomplished" sign. The flight-suit image is one of the two defining moments of the Bush failure. The other is the photo of Bush staring out the window of Air Force One, helplessly viewing the destruction wrought by Hurricane Katrina. This is a presidency that has wobbled between those two poles — overweening arrogance and paralytic incompetence.

The latter has held sway these past few months as the economy has crumbled. It is too early to rate the performance of Bush's economic team, but we have more than enough evidence to say, definitively, that at a moment when there was a vast national need for reassurance, the President himself was a cipher. Yes, he's a lame duck with an Antarctic approval rating — but can you imagine Bill Clinton going so gently into the night? There are substantive gestures available to a President that do not involve the use of force or photo ops. For example, Bush could have boosted the public spirit — and the auto industry — by announcing that he was scrapping the entire federal automotive fleet, including the presidential limousine, and replacing it with hybrids made in Detroit. He could have jump-started — and he still could — the Obama plan by releasing funds for a green-jobs program to insulate public buildings. He could start funding the transit projects already approved by Congress.

In the end, though, it will not be the creative paralysis that defines Bush. It will be his intellectual laziness, at home and abroad. Bush never understood, or cared about, the delicate balance between freedom and regulation that was necessary to make markets work. He never understood, or cared about, the delicate balance between freedom and equity that was necessary to maintain the strong middle class required for both prosperity and democracy. He never considered the complexities of the cultures he was invading. He never understood that faith, unaccompanied by rigorous skepticism, is a recipe for myopia and foolishness. He is less than President now, and that is appropriate. He was never very much of one.

Monday, November 24, 2008

Screen Literacy vs. Print Literacy

From the NY Times


By KEVIN KELLY
Published: November 21, 2008

Everywhere we look, we see screens. The other day I watched clips from a movie as I pumped gas into my car. The other night I saw a movie on the backseat of a plane. We will watch anywhere. Screens playing video pop up in the most unexpected places — like A.T.M. machines and supermarket checkout lines and tiny phones; some movie fans watch entire films in between calls. These ever-present screens have created an audience for very short moving pictures, as brief as three minutes, while cheap digital creation tools have empowered a new generation of filmmakers, who are rapidly filling up those screens. We are headed toward screen ubiquity.

Video Citing: TimeTube, on the Web, gives a genealogy of the most popular videos and their descendants, and charts their popularity in time-line form.

When technology shifts, it bends the culture. Once, long ago, culture revolved around the spoken word. The oral skills of memorization, recitation and rhetoric instilled in societies a reverence for the past, the ambiguous, the ornate and the subjective. Then, about 500 years ago, orality was overthrown by technology. Gutenberg’s invention of metallic movable type elevated writing into a central position in the culture. By the means of cheap and perfect copies, text became the engine of change and the foundation of stability. From printing came journalism, science and the mathematics of libraries and law. The distribution-and-display device that we call printing instilled in society a reverence for precision (of black ink on white paper), an appreciation for linear logic (in a sentence), a passion for objectivity (of printed fact) and an allegiance to authority (via authors), whose truth was as fixed and final as a book. In the West, we became people of the book.

Now invention is again overthrowing the dominant media. A new distribution-and-display technology is nudging the book aside and catapulting images, and especially moving images, to the center of the culture. We are becoming people of the screen. The fluid and fleeting symbols on a screen pull us away from the classical notions of monumental authors and authority. On the screen, the subjective again trumps the objective. The past is a rush of data streams cut and rearranged into a new mashup, while truth is something you assemble yourself on your own screen as you jump from link to link. We are now in the middle of a second Gutenberg shift — from book fluency to screen fluency, from literacy to visuality.

The overthrow of the book would have happened long ago but for the great user asymmetry inherent in all media. It is easier to read a book than to write one; easier to listen to a song than to compose one; easier to attend a play than to produce one. But movies in particular suffer from this user asymmetry. The intensely collaborative work needed to coddle chemically treated film and paste together its strips into movies meant that it was vastly easier to watch a movie than to make one. A Hollywood blockbuster can take a million person-hours to produce and only two hours to consume. But now, cheap and universal tools of creation (megapixel phone cameras, Photoshop, iMovie) are quickly reducing the effort needed to create moving images. To the utter bafflement of the experts who confidently claimed that viewers would never rise from their reclining passivity, tens of millions of people have in recent years spent uncountable hours making movies of their own design. Having a ready and reachable audience of potential millions helps, as does the choice of multiple modes in which to create. Because of new consumer gadgets, community training, peer encouragement and fiendishly clever software, the ease of making video now approaches the ease of writing.

This is not how Hollywood makes films, of course. A blockbuster film is a gigantic creature custom-built by hand. Like a Siberian tiger, it demands our attention — but it is also very rare.

In 2007, 600 feature films were released in the United States, or about 1,200 hours of moving images. As a percentage of the hundreds of millions of hours of moving images produced annually today, 1,200 hours is tiny. It is a rounding error.

We tend to think the tiger represents the animal kingdom, but in truth, a grasshopper is a truer statistical example of an animal. The handcrafted Hollywood film won’t go away, but if we want to see the future of motion pictures, we need to study the swarming food chain below — YouTube, indie films, TV serials and insect-scale lip-sync mashups — and not just the tiny apex of tigers. The bottom is where the action is, and where screen literacy originates.

An emerging set of cheap tools is now making it easy to create digital video. There were more than 10 billion views of video on YouTube in September. The most popular videos were watched as many times as any blockbuster movie. Many are mashups of existing video material. Most vernacular video makers start with the tools of Movie Maker or iMovie, or with Web-based video editing software like Jumpcut. They take soundtracks found online, or recorded in their bedrooms, cut and reorder scenes, enter text and then layer in a new story or novel point of view. Remixing commercials is rampant. A typical creation might artfully combine the audio of a Budweiser “Wassup” commercial with visuals from “The Simpsons” (or the Teletubbies or “Lord of the Rings”). Recutting movie trailers allows unknown auteurs to turn a comedy into a horror flick, or vice versa.

Rewriting video can even become a kind of collective sport. Hundreds of thousands of passionate anime fans around the world (meeting online, of course) remix Japanese animated cartoons. They clip the cartoons into tiny pieces, some only a few frames long, then rearrange them with video editing software and give them new soundtracks and music, often with English dialogue. This probably involves far more work than was required to edit the original cartoon but far less work than editing a clip a decade ago. The new videos, called Anime Music Videos, tell completely new stories. The real achievement in this subculture is to win the Iron Editor challenge. Just as in the TV cookoff contest “Iron Chef,” the Iron Editor must remix videos in real time in front of an audience while competing with other editors to demonstrate superior visual literacy. The best editors can remix video as fast as you might type.

In fact, the habits of the mashup are borrowed from textual literacy. You cut and paste words on a page. You quote verbatim from an expert. You paraphrase a lovely expression. You add a layer of detail found elsewhere. You borrow the structure from one work to use as your own. You move frames around as if they were phrases.

Digital technology gives the professional a new language as well. An image stored on a memory disc instead of celluloid film has a plasticity that allows it to be manipulated as if the picture were words rather than a photo. Hollywood mavericks like George Lucas have embraced digital technology and pioneered a more fluent way of filmmaking. In his “Star Wars” films, Lucas devised a method of moviemaking that has more in common with the way books and paintings are made than with traditional cinematography.

In classic cinematography, a film is planned out in scenes; the scenes are filmed (usually more than once); and from a surfeit of these captured scenes, a movie is assembled. Sometimes a director must go back for “pickup” shots if the final story cannot be told with the available film. With the new screen fluency enabled by digital technology, however, a movie scene is something more flexible: it is like a writer’s paragraph, constantly being revised. Scenes are not captured (as in a photo) but built up incrementally. Layers of visual and audio refinement are added over a crude outline of the motion, the mix constantly in flux, always changeable. George Lucas’s last “Star Wars” movie was layered up in this writerly way. He took the action “Jedis clashing swords — no background” and laid it over a synthetic scene of a bustling marketplace, itself blended from many tiny visual parts. Light sabers and other effects were digitally painted in later, layer by layer. In this way, convincing rain, fire and clouds can be added in additional layers with nearly the same kind of freedom with which Lucas might add “it was a dark and stormy night” while writing the script. Not a single frame of the final movie was left untouched by manipulation.
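To make the layering concrete, here is a minimal sketch of that writerly compositing in Python with the Pillow imaging library. The file names and layer order are hypothetical stand-ins for a background plate, a keyed foreground action pass, and an effects pass; it assumes all three are RGBA images of the same size, with transparency where the lower layers should show through.

```python
# A minimal sketch of layered compositing with Pillow.
# The file names are hypothetical stand-ins for a background plate,
# a keyed foreground action pass, and an effects pass; all three are
# assumed to be same-sized RGBA images.
from PIL import Image

background = Image.open("marketplace.png").convert("RGBA")
actors = Image.open("jedis_clashing.png").convert("RGBA")
rain = Image.open("rain_overlay.png").convert("RGBA")

# Build the frame up layer by layer; transparent pixels in each upper
# layer let the layers beneath show through.
frame = Image.alpha_composite(background, actors)
frame = Image.alpha_composite(frame, rain)
frame.convert("RGB").save("final_frame.png")
```

Each pass revises the frame the way a writer revises a paragraph: the intermediate result is always viewable and always changeable.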

The recent live-action feature movie “Speed Racer,” while not a box-office hit, took this style of filmmaking even further. The spectacle of an alternative suburbia was created by borrowing from a database of existing visual items and assembling them into background, midground and foreground. Pink flowers came from one photo source, a bicycle from another archive, a generic house roof from yet another. Computers do the hard work of keeping these pieces, no matter how tiny and partial they are, in correct perspective and alignment, even as they move. The result is a film assembled from a million individual existing images. In most films, these pieces are handmade, but increasingly, as in “Speed Racer,” they can be found elsewhere.

In the great hive-mind of image creation, something similar is already happening with still photographs. Every minute, thousands of photographers are uploading their latest photos on the Web site Flickr. The more than three billion photos posted to the site so far cover any subject you can imagine; I have not yet been able to stump the site with a request. Flickr offers more than 200,000 images of the Golden Gate Bridge alone. Every conceivable angle, lighting condition and point of view of the Golden Gate Bridge has been photographed and posted. If you want to use an image of the bridge in your video or movie, there is really no reason to take a new picture of this bridge. It’s been done. All you need is a really easy way to find it.
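For what it's worth, Flickr already exposes that "easy way to find it" to programmers. Here is a hedged sketch of a search against Flickr's public REST API in Python; it assumes the third-party requests library, and YOUR_API_KEY is a placeholder for a key issued by Flickr.

```python
# A sketch of querying Flickr's public REST API for existing photos.
# Assumes the requests library; YOUR_API_KEY is a placeholder.
import requests

resp = requests.get(
    "https://api.flickr.com/services/rest/",
    params={
        "method": "flickr.photos.search",
        "api_key": "YOUR_API_KEY",
        "text": "Golden Gate Bridge",
        "per_page": 10,
        "format": "json",
        "nojsoncallback": 1,
    },
)
for photo in resp.json()["photos"]["photo"]:
    # Flickr documents a URL pattern for photo files built from these fields.
    url = "https://farm{farm}.staticflickr.com/{server}/{id}_{secret}.jpg".format(**photo)
    print(photo["title"], url)
```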

Similar advances have taken place with 3D models. On Google SketchUp’s 3D Warehouse, you can find insanely detailed three-dimensional virtual models of most major building structures of the world. Need a street in San Francisco? Here’s a filmable virtual set. With powerful search and specification tools, high-resolution clips of any bridge in the world can be circulated into the common visual dictionary for reuse. Out of these ready-made “words,” a film can be assembled, mashed up from readily available parts. The rich databases of component images form a new grammar for moving images.

After all, this is how authors work. We dip into a finite set of established words, called a dictionary, and reassemble these found words into articles, novels and poems that no one has ever seen before. The joy is recombining them. Indeed it is a rare author who is forced to invent new words. Even the greatest writers do their magic primarily by rearranging formerly used, commonly shared ones. What we do now with words, we’ll soon do with images.

For directors who speak this new cinematographic language, even the most photo-realistic scenes are tweaked, remade and written over frame by frame. Filmmaking is thus liberated from the stranglehold of photography. Gone is the frustrating method of trying to capture reality with one or two takes of expensive film and then creating your fantasy from whatever you get. Here reality, or fantasy, is built up one pixel at a time as an author would build a novel one word at a time. Photography champions the world as it is, whereas this new screen mode, like writing and painting, is engineered to explore the world as it might be.

But merely producing movies with ease is not enough for screen fluency, just as producing books with ease on Gutenberg’s press did not fully unleash text. Literacy also required a long list of innovations and techniques that permit ordinary readers and writers to manipulate text in ways that make it useful. For instance, quotation symbols make it simple to indicate where one has borrowed text from another writer. Once you have a large document, you need a table of contents to find your way through it. That requires page numbers. Somebody invented them (in the 13th century). Longer texts require an alphabetic index, devised by the Greeks and later developed for libraries of books. Footnotes, invented in about the 12th century, allow tangential information to be displayed outside the linear argument of the main text. And bibliographic citations (invented in the mid-1500s) enable scholars and skeptics to systematically consult sources. These days, of course, we have hyperlinks, which connect one piece of text to another, and tags, which categorize a selected word or phrase for later sorting.

All these inventions (and more) permit any literate person to cut and paste ideas, annotate them with her own thoughts, link them to related ideas, search through vast libraries of work, browse subjects quickly, resequence texts, refind material, quote experts and sample bits of beloved artists. These tools, more than just reading, are the foundations of literacy.

If text literacy meant being able to parse and manipulate texts, then the new screen fluency means being able to parse and manipulate moving images with the same ease. But so far, these “reader” tools of visuality have not made their way to the masses. For example, if I wanted to visually compare the recent spate of bank failures with similar events by referring you to the bank run in the classic movie “It’s a Wonderful Life,” there is no easy way to point to that scene with precision. (Which of several sequences did I mean, and which part of them?) I can do what I just did and mention the movie title. But even online I cannot link from this sentence to those “passages” in an online movie. We don’t have the equivalent of a hyperlink for film yet. With true screen fluency, I’d be able to cite specific frames of a film, or specific items in a frame.

Perhaps I am a historian interested in oriental dress, and I want to refer to a fez worn by someone in the movie “Casablanca.” I should be able to refer to the fez itself (and not the head it is on) by linking to its image as it “moves” across many frames, just as I can easily link to a printed reference of the fez in text. Or even better, I’d like to annotate the fez in the film with other film clips of fezzes as references.
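No such citation scheme exists yet, but it is easy to imagine what one would have to carry. The Python sketch below is purely hypothetical, an invented fragment syntax by analogy to anchors in hypertext: a citation names the work, a time span, and optionally a region within the frame, so you could point at the fez rather than the head it sits on.

```python
# Purely hypothetical: a data structure for a "hyperlink into film".
# Nothing like this is deployed; the fragment syntax is invented here
# for illustration only.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass(frozen=True)
class FilmCitation:
    work: str        # some catalog identifier for the film
    start: float     # start of the cited span, in seconds
    end: float       # end of the cited span, in seconds
    region: Optional[Tuple[float, float, float, float]] = None  # x, y, w, h as fractions of the frame

    def to_fragment(self) -> str:
        """Render the citation as a URL-fragment-style string."""
        frag = f"{self.work}#t={self.start:.2f},{self.end:.2f}"
        if self.region is not None:
            frag += "&xywh=" + ",".join(f"{v:.3f}" for v in self.region)
        return frag

# Citing the fez (not the head it sits on) across a few seconds of film:
print(FilmCitation("casablanca-1942", 512.0, 516.5, (0.42, 0.10, 0.08, 0.12)).to_fragment())
```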

With full-blown visuality, I should be able to annotate any object, frame or scene in a motion picture with any other object, frame or motion-picture clip. I should be able to search the visual index of a film, or peruse a visual table of contents, or scan a visual abstract of its full length. But how do you do all these things? How can we browse a film the way we browse a book?

It took several hundred years for the consumer tools of text literacy to crystallize after the invention of printing, but the first visual-literacy tools are already emerging in research labs and on the margins of digital culture. Take, for example, the problem of browsing a feature-length movie. One way to scan a movie would be to super-fast-forward through the two hours in a few minutes. Another way would be to digest it into an abbreviated version in the way a theatrical-movie trailer might. Both these methods can compress the time from hours to minutes. But is there a way to reduce the contents of a movie into imagery that could be grasped quickly, as we might see in a table of contents for a book?

Academic research has produced a few interesting prototypes of video summaries but nothing that works for entire movies. Some popular Web sites with huge selections of movies (like porn sites) have devised a way for users to scan through the content of full movies quickly in a few seconds. When a user clicks the title frame of a movie, the window skips from one key frame to the next, making a rapid slide show, like a flip book of the movie. The abbreviated slide show visually summarizes a few-hour film in a few seconds. Expert software can be used to identify the key frames in a film in order to maximize the effectiveness of the summary.
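As a rough illustration of how such a flip-book summary can be computed, here is a hedged sketch in Python using the OpenCV library: walk through the video and keep a frame whenever its color histogram differs sharply from the last frame kept. The input file name and the 0.4 threshold are arbitrary assumptions for the sketch, not anything a real site has published.

```python
# A rough sketch of key-frame extraction with OpenCV: keep a frame
# whenever its color histogram differs sharply from the last kept frame.
# "movie.mp4" and the 0.4 threshold are arbitrary assumptions.
import cv2

cap = cv2.VideoCapture("movie.mp4")
keyframes, last_hist = [], None
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hist = cv2.calcHist([frame], [0, 1, 2], None, [8, 8, 8],
                        [0, 256, 0, 256, 0, 256])
    cv2.normalize(hist, hist)
    # Bhattacharyya distance is near 0 for similar frames, near 1 at a hard cut.
    if last_hist is None or cv2.compareHist(last_hist, hist,
                                            cv2.HISTCMP_BHATTACHARYYA) > 0.4:
        keyframes.append(frame)
        last_hist = hist
cap.release()

for i, kf in enumerate(keyframes):
    cv2.imwrite(f"keyframe_{i:04d}.jpg", kf)  # frames of the flip-book summary
```

Flipping through the saved frames approximates the rapid slide show described above.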

The holy grail of visuality is to search the library of all movies the way Google can search the Web. Everyone is waiting for a tool that would allow them to type key terms, say “bicycle + dog,” which would retrieve scenes in any film featuring a dog and a bicycle. In an instant you could locate the moment in “The Wizard of Oz” when the witchy Miss Gulch rides off with Toto. Google can instantly pinpoint desirable documents out of billions on the Web because computers can read text, but computers are only starting to learn how to read images.