Thursday, July 31, 2014

David McCullough - Truman

I have switched to rereading this biography and will finish the John Wooden biography after this.  Will have my first post on this book on Monday.

Like David Copperfield

Like David Copperfield I used to wonder growing up if I would be the hero of my own story. Now that I know the answer is NO I can relax. The pressure is off.

Monday, July 28, 2014

Jack and Jill

Jack and Jill went up the hill
To fetch a pail of water.
Jack fell down and broke his crown,
And Jill came tumbling after.

I just got a text from Jill saying that Jack didn't fall: she pushed him down. Her tumbling was all for show. That naughty Jill! She says he had it coming because he wouldn't buy her a popsicle. Shouldn't a guy always be willing to buy his girl a popsicle?
I guess it's time to take sides. Whose side are you on?

Sunday, July 27, 2014

The Decline of Harper Lee

By Boris Kachka
Slate
21 July 2014

For Monroeville, Alabama, population 6,400 and shrinking, the summer of 2010 was momentous. Over a long July weekend, locals reenacted historical vignettes, held a silent auction, cooked a southern feast, and led tours of local landmarks. There was a documentary screening, two lawn parties, and a marathon reading of the novel whose 50th anniversary was the grand occasion. To Kill a Mockingbird, which needs no introduction—because it is the introduction, for most American children, to civil rights, literature, and the justice system—had sold nearly a million copies for each year in print. There were at least 50 other celebrations nationwide, but the epicenter was Monroeville, a place whose only real industry (the lingerie plant having recently shuttered) was Mockingbird-related tourism. It was not only the model for the novel’s fictional Maycomb but the home of its author, Harper Lee. She lived less than a mile from the festival, but she never came.

If our country had a formalized process for anointing literary saints, Harper Lee might be first in line, and one of the miracles held up as proof would be her choice to live out her final years in the small town that became the blueprint for our collective ideal of the Small Town. But at 88, the author finds her life and legacy in disarray, a sad state of litigious chaos brought on by ill health and, in no small part, the very community she always believed, for all its flaws, would ultimately protect her. Maycomb was a town where love and neighborly decency could overcome prejudice. To the woman who immortalized it and retreated to it for stability and safety, Monroeville is something very different: suffocating, predatory, and treacherous.

For much of her life, Nelle Harper Lee (known to friends as Nelle) spent more time in the comforting anonymity of New York than in the Monroeville redbrick ranch house her family had occupied since 1952. Then, in 2007, a stroke left her wheelchair-bound, forgetful, and largely deaf and blind, and forced her to sell her Upper East Side apartment and move into a Monroeville assisted-living facility. It was a loss but also a homecoming: For decades she’d relied on another local living legend, Alice Lee—her older sister, part-time housemate, and lawyer—to maintain her uneasy armistice with her hometown and her fame.

Alice, who retired two years ago at the age of 100, had inherited her partnership in the family firm from their father, A.C. Lee, the model for Mockingbird’s righteous lawyer, Atticus Finch. (Nelle calls her “Atticus in a skirt.”) The same family practice whose modest virtues are inculcated, via Mockingbird, in generation after generation of schoolchildren was charged with protecting the legacy of its author—a job that one of the best-selling novelists of all time wanted nothing to do with. Yet as both women passed into very old age, what should have been a peaceful and prosperous decline became a surprisingly turbulent decade, robbing Nelle of not just her health but old friends, her dearly held privacy, the town’s good will, and, for a time, the copyright to the book she sometimes wishes she hadn’t written.

It wasn’t just infirmity that kept Nelle from basking in those 2010 celebrations; it was disillusion. Allergic to both attention and commerce, she’d always found the Mockingbird industrial complex tacky and intrusive, but had managed to carve out a separate existence in its shadow. Now too many “well wishers” were stopping by her new apartment—including her literary agent, whom she eventually barred from the facility. (He’d already had her sign over her copyright.) Just a month before the anniversary, a family friend entered her room with a Daily Mail reporter in tow. The journalist flew back to London with an unflattering photo and a cruel 2,000-word profile to match. Monroeville had finally confirmed her fear that there really was nowhere to hide. She’d once explained to Oprah Winfrey, over lunch in a private suite at the Four Seasons, why she’d never appear on her show: Everyone compares her to Scout, the sweetly pugnacious tomboy who narrates Mockingbird. But as she told Oprah, “I’m really Boo”—Boo Radley, the young recluse in the creepy house who winds up saving the day.

By the time of Mockingbird’s golden anniversary, Nelle’s agent was denying in court that he represented her. The courthouse gift shop, “The Bird’s Nest,” was selling To Kill a Mockingbird onesies and car decals. A former next-door neighbor, Marja Mills, was working on a memoir called The Mockingbird Next Door—which came out this week, lifting the veil of Nelle’s privacy amid a confounding volley of statements between lawyers, sisters, and friends over whether and when she approved of the project. It was left to Alice’s successor in the family firm, Tonja Carter, to sort things out. Carter restricted Lee’s visitors and instituted lawsuits against not just the literary agent but also the courthouse museum. She nearly sued Marja Mills, too, and released a letter last week reaffirming Nelle’s objections—objections that her own sister, Alice, had claimed Carter had ginned up on her behalf. “It’s a terrible thing to happen toward the end of a person’s life,” says Thomas Lane Butts, a preacher who was among Lee’s best friends but hasn’t seen her in a year. Whatever Nelle’s intentions, Carter has upended the town’s delicate status quo, making as many enemies as headlines. Nelle never did like making headlines, even for the right reasons, but she did once love Monroeville.

* * *

In 1964, in one of her last interviews, Lee laid out her mission as a writer. “This is a small-town middle-class southern life as opposed to the Gothic,” she said. “I believe there is something universal in this little world, something decent to be said for it, and something to lament in its passing.” She concluded, joking, “All I want to be is the Jane Austen of South Alabama.”

Mockingbird plays on Southern Gothic, only to demystify it and mythologize the ordinary instead. Amasa Coleman Lee may have been, as his daughter said, “one of the most beloved men in this part of the state,” but he wasn’t Atticus Finch; he was a tax lawyer. He left his childhood farm in Florida, married a prominent village daughter (Frances Finch of Finchburg), and moved to Monroeville in order to manage the finances of the law firm of Barnett, Bugg & Lee, as it shortly became, a partnership of businessmen-attorneys who owned half the town. A.C. did try one criminal case, at age 29, defending two black men on a murder charge. He lost and they were hanged, pieces of their scalps mailed to the son of the victim.

Though Atticus defends a black man wrongly accused—and ultimately convicted—of rape, nothing quite so brutal happens in Mockingbird. And by making Atticus a widower, Lee also omitted a much more personal experience: her mother’s instability. According to Mills, Frances suffered a nervous breakdown after her daughter Louise failed to thrive. (The Lees had five children in five-year increments.) Dr. William Harper came to the rescue of both mother and baby, and Harper became their next child’s middle name. Truman Capote, Nelle’s best childhood friend, later described her upbringing as “Southern grotesque.” He claimed Frances had tried to drown Nelle in the bathtub. Lee denied it vehemently, and for all her rebelliousness—Butts once said she had “hell and pepper in her”—she never said a word against her family, in fiction or otherwise. In her work and life, madness is banished in the light of reason and authority.

A.C. passed his august authority on to Alice. During the Depression, she had to leave college but was quickly brought under her father’s wing and into his law firm. Nelle tried to follow the same path—attending the same girls’ college as Alice and then transferring to the University of Alabama, where she loved writing but hated her sorority and law classes. After a summer at Oxford University, she dropped out. She wanted to make a go, like her friend Truman, of living and writing in New York. A.C., who’d been paying for school, said she’d have to make it on her own.

In New York, Lee found a tight-knit replacement family. Capote introduced her to Broadway lyricist Michael Martin Brown and his wife, Joy. They hooked her up with an agent, Maurice Crain, and on Christmas, 1956, they gave her the gift her father wouldn’t: enough money to do nothing but write for a year. She remembered it later as “a full, fair chance for a new life.” Within five months, she had a draft of Atticus out on submission, and was already partway into a second novel when a Lippincott editor took it on.

Most of Mockingbird’s characters have real-life antecedents, and Scout’s delicate friend Dill is clearly Capote. He was Nelle’s first writing partner and her social fixer in New York, and Lee helped him research his true-crime classic, In Cold Blood. But Capote eventually spurned her. Rebutting his vicious gossip seems to have been one of Lee’s motivations for talking to Marja Mills. “They fled from the truth like Dracula from the cross,” Lee told Mills, meaning him and his aunt, whose memoir Lee claimed to have thrown into a bonfire. “Truman was a psychopath, honey.” Capote drifted away in a miasma of drugs and self-hatred—a cautionary tale of frustrated fame. His former best friend tacked fiercely in the opposite direction.

A.C. Lee was shocked by his daughter’s success. “It’s very rare indeed when a thing like this happens to a country girl going to New York,” he told a reporter. “She will have to do a good job next time.” He died in 1962, after meeting Gregory Peck but before seeing him play Atticus in Alan Pakula’s film. Nelle spent the next couple of years trying to write, but couldn’t shake the fear that there was, as her father had worried, nowhere to go but down. At one press conference to promote the movie, Lee’s humor was edged with tension. “Will success spoil Harper Lee?” asked a reporter. “She’s too old,” Lee said. “How do you feel about your second novel?” asked another. “I’m scared,” she said.

In his unauthorized 2006 biography, Mockingbird, Charles J. Shields quotes Lee telling a friend, “I wouldn’t go into downtown Manhattan for the world.” Mills once made Lee a gift of E.B. White’s Here Is New York. Nelle “wept at the first sentence.” It reads, “On any person who desires such queer prizes, New York will bestow the gift of loneliness and the gift of privacy.” White later pictures “a young girl arriving from a small town in Mississippi to escape the indignity of being observed by her neighbors.” After Mockingbird won acclaim and a Pulitzer, Lee felt observed by everyone—the whole world a small town. At least when she stayed in Monroeville, she had Alice.

By 1970, when her beloved agent died, there was no one else left—not Capote, not her parents. “The close circle she was relying on fell away over the course of a decade, and her tight Monroeville clique was practically all that remained,” says Shields. “I think the Lees have kind of an old-fashioned notion,” he adds. “Keep your friends close to your breast with hoops of iron and rely on them. And the novel, being one of the most popular of the 20th century, makes tremendous demands that go well beyond their abilities.”

* * *

Maybe it wasn’t just Nelle’s insecurity that held her back from becoming “the Jane Austen of South Alabama,” but also the dismaying decline of the “small-town middle-class” idyll she’d staked her career on documenting. She had, after all, written a historical novel. To Kill a Mockingbird was filmed not in Monroeville but on an L.A. lot. There were—still are—remnants of Depression-era Monroeville, not least the old Federal-style courthouse. But even as the film came out, a drab new courthouse was being built next door. Downtown’s only movie theater burned down not long after Mockingbird had its first run, and was never rebuilt. In 1997, the city was dubbed “The Literary Capital of Alabama,” prompting Lee, who wasn’t consulted on the nickname, to remark, “The literary capital of Alabama doesn’t read.”

Harper Lee’s assisted-living apartment is on Highway Bypass 21, just a couple of blocks from the town’s real commercial center, a series of malls. There’s a place called Radley’s Fountain Grill down that way, and an old stone wall that once separated Lee’s childhood home from Capote’s—both long gone, replaced by a takeout shack called Mel’s Dairy Dream. Lee prefers the more generic places by the lingerie factory outlet (a remnant of the old Vanity Fair plant). Before her stroke, she could be found at Hardee’s, or better yet at McDonald’s, gulping down coffee during long chats with friends. (There were higher-end expeditions to the local golf club and to casinos on the Gulf coast.) When she watched an advance screening of the biopic Capote at a neighbor’s house—the Lees had no television—she opted for Burger King.

In 1961, Lee told Life that, unlike Thomas Wolfe, “I can go home again.” That’s debatable, as is the question of why Harper Lee chose to spend so much of her life in a town whose only claim to fame was her fame—a fame she claimed to despise. The Mockingbird Next Door dwells on rural trips out of town, fishing and duck watching and off-the-record country drives. (Romantic inquiries were “not up for discussion.”) Lee seemed to prefer the countryside to her hometown. “I was surprised that she was living here, to tell you the truth,” says Butts, who was often on those drives. “It’s like being in a fishbowl.”

Marja Mills’s astonishing access to Lee was the product of luck, both good and bad. Sent to Monroeville by the Chicago Tribune to find out what Harper Lee thought of Mockingbird being chosen for “One Book, One Chicago,” she expected to strike out. But, after a polite introductory letter, Alice not only answered the ranch house door but also secured her an audience with Nelle. On Mills’s second visit to town, Butts gave her his rationale for the sisters’ openness: “When she and Alice go, people are going to start ‘remembering’ things as they didn’t happen, or outright making things up, and they won’t be here to set the record straight. So keep taking notes, girl.” Mills suffers from lupus, and she had a flare-up just before leaving Monroeville again. Nelle claimed to be her mother-in-law so she could stay with her in the local hospital. Mills became an honorary member of “the old in a nation geared toward the young.”

In 2004, sapped by her illness, Mills decided to leave her job and try to write a book. She wound up moving in next door to the Lees, securing a $450 rental with the sisters’ help. Over endless coffees and drives, Nelle opened up enough to give a solid sense of herself: unconfident in her looks and therefore unconcerned; witty and garrulous within the strict limits she sets for talk; conservative by northern standards; cranky and principled; moody but predictable.

Mills makes it clear in the book that she intended at first to write a broader Alabama history. Monroeville was confused, years later, by the news of a memoir. “I think that lady kind of pulled wool over their eyes,” George Jones, the 91-year-old town historian and gossip, told me. Mills says only very few friends knew just how much time she and the sisters spent together. The Lees, she says, “managed to have a parallel existence” within Monroeville—a smaller bubble within the bubble of a hard-to-reach county seat, apart from tourists and nosy locals alike.

One of Nelle’s friends, retired Auburn history professor Wayne Flynt, is skeptical of Nelle’s participation—but not Alice’s. “Alice wanted the family story told and Alice has an agenda, and I think Marja Mills fits that agenda quite well,” he says. “Nelle is afraid that telling the family story will be telling her story, and I can’t believe she cooperated.” He adds that, around that time, he tried to persuade Nelle to record a sealed oral history, and she flatly refused. Last Monday’s letter, signed by Lee, seems to confirm his impressions: “I was hurt, angry and saddened, but not surprised,” upon learning of Mills’s “true mission: another book about Harper Lee.” She concludes, “rest assured, as long as I am alive any book purporting to be with my cooperation is a falsehood.” Butts says she may well feel that way now, but didn’t at the time. “There was no break,” he says—contrary to the letter’s claim—“until somebody talked to her, said she should oppose the book.” He says he witnessed Nelle insisting on putting personal things on the record.

Mills’ portrait is gentle almost to a fault, but her mission was to humanize Lee, not to lionize her. Butts warned Mills she might get angry late-night phone calls from Nelle: “She accuses people, chews them out. The alcohol fuels it.” Mills repeats speculation that drinking contributed to Lee’s abandoning a true-crime book in the '80s. Overall, Lee comes off both plain and complicated. She can be paranoid, but often for good reason. In Monroeville, Mills writes, “information about Nelle was currency. It could be spent, traded, or saved for the right moment.” On Nelle’s earliest meeting with Mills, in a sweltering room at the Best Western, one of the first things she told the reporter was, “This is not the Monroeville in which I grew up. I don’t like it one bit.” Mills writes of Lee looking over a ravine. “Nelle suggested that perhaps she could toss all her belongings in there and burn them, preferably shortly before she died, so she wouldn’t have to worry about her personal things falling into the wrong hands. She was only half kidding.”

The case of Samuel Pinkus would make any writer paranoid. Pinkus had briefly run McIntosh & Otis on behalf of his ailing father-in-law—and Nelle’s longtime agent—Eugene Winick, but then suddenly left and took with him the estate-heavy firm’s most lucrative living authors, Mary Higgins Clark and Harper Lee. (No one knows exactly how he persuaded Lee to leave.) “It was an absolute betrayal,” Winick told me last year, “not only as an employee, but also as a family member.” The Winicks sought relief in mediation. Over the years, Pinkus set up a succession of corporations that, M&O’s and Lee’s lawyers claimed, were designed to avoid those debts. In the process of shifting around millions in royalties, Pinkus managed to take over Harper Lee’s copyright.

Lee’s 2013 complaint against Pinkus begins by describing her close ties to the agency: “Both Harper Lee and her sister trusted and relied on M&O virtually all the time since the publication of her famous novel.” That account elides a lot of drift. After Maurice Crain died, Lee was passed along to his wife, but by the time Pinkus was brought into the company, it was Alice whom Nelle counted on most of all. When Nelle heard the courthouse-museum was putting out a book called Calpurnia’s Cookbook, using the name of Mockingbird’s maid, Alice sent the letter that took it off the shelves. M&O never even heard of it.

While working on his biography, Charles Shields called M&O and couldn’t get any real answers about their prized client. Maybe they were just being protective, but Shields found a willing correspondent in Alice Lee. They had a few written exchanges about Lee family history, and things seemed to be opening up—until, one day in 2006, he received “an imperious letter” from Pinkus, by then her exclusive agent, warning him off any further contact with the Lee sisters.

* * *

In June of 2007, Lee had a lunch appointment with friends in New York. When she didn’t show up, they went to her apartment, and found her lying on the floor. She’d been there for more than a day. Even before the debilitating stroke, she’d had hearing problems and macular degeneration, and had been forced to accessorize her khakis and sneakers with glasses fitted with side panels. Now she went through months of rehab, gave up her New York apartment, and moved straight from the hospital into assisted living.

Around this time, she signed an assignment of copyright to Sam Pinkus—an act she later forgot. Her lawyer during this period was still officially her sister Alice, 94. Eventually, Tonja Carter began pressing Pinkus to give up his copyright. (She had, however, notarized a reaffirmation of Pinkus’s copyright—something she’s never explained.) Finally, in 2012, Nelle got her copyright back, but according to the lawsuit, Pinkus continued to instruct publishers foreign and domestic to pay royalties into one of several companies. It wasn’t until a New York litigation firm filed suit—a move that put the elusive Harper Lee all over the news—that Nelle was finally able to free herself of Pinkus. The case was settled last September.

In 2011, while Carter and Pinkus haggled, Penguin Press acquired The Mockingbird Next Door after a heated auction. The day after it was announced, Carter released a statement from Harper Lee: “Contrary to recent news reports, I have not willingly participated in any book written or to be written by Marja Mills.” Penguin Press responded by producing a statement, signed by Alice Lee, agreeing to participate. Few people paid attention when, a month later, the AP reported that Alice Lee now claimed that Carter’s statement was made without the sisters’ consent. That story concluded, “A woman who answered the phone at Barnett, Bugg declined comment and hung up on a reporter seeking comment.”

Carter, who reportedly has power of attorney over Lee, replied to one email—“I can correspond by email when and if I become available”—but never answered my questions. It isn’t clear exactly what spurred Lee—or Carter—to file for a trademark to Lee’s name and the title of her book early last year. The Monroe County Heritage Museum fought the trademark, and Lee’s lawyers responded last October by suing them. Like the Pinkus suit, this complaint alleged that the defendant was taking advantage of Lee’s ill health—in this case, by ramping up gift shop operations and naming their website tokillamockingbird.com. (Both the shop and the website are more than 15 years old.)

The complaint begins immediately with a dig at Monroeville: “Although the story was set in the 1930’s, her realistic and highly critical portrayal of Maycomb’s residents shone a harsh light on the attitudes of communities that were the focal point of the civil rights movement in the 1960s … The town’s desire to capitalize upon the fame of To Kill a Mockingbird is unmistakable: Monroeville’s town logo features an image of a mockingbird and the cupola of the Old County Courthouse.”

Seeking unspecified damages, the suit listed all the Mockingbird-branded items in the gift shop, including clothes for adults and children, tote bags, towels, “glass ware, plastic/acrylic tumbler glasses, seat cushions, car decals, coasters,” and a dozen other tchotchkes. It estimated 2011 museum revenue at more than $500,000, without mentioning that expenses were almost as high—the difference being just a bit more than the roughly $30,000 the gift shop earns annually. Nor did the suit mention that the museum is a nonprofit, or that Tonja Carter and her husband, a distant cousin of Truman Capote, own a tourist-filled restaurant across from the courthouse.

Museum attorney Matthew Goforth released a statement in October firing back: “It is sad that Harper Lee's greedy handlers have seen fit to attack the non-profit museum in her hometown that has been honoring her legacy.” Whatever the merit of Goforth’s argument, it brought to mind something Lee told Mills: “Greed is the coldest of deadly sins, don’t you think?”

“I was shocked,” says Stephanie Rogers, executive director of the museum. “I tried to talk to the family and say, ‘let’s stop this.’” After that 50th-anniversary commemoration in 2010, she’d sent Nelle leftover cake (shaped like Mockingbird’s iconic knot-holed tree), and Nelle had written back thanking her “friends.” Since last month’s settlement, the website URL has been changed, but all the Mockingbird knickknacks are still for sale. Once the trademark goes through, they’ll be licensed through Lee. The Mockingbird Next Door will be sold there, too.

Friends were hurt by both the lawsuit and notes from Carter informing them they could no longer visit Nelle. One of them, Sam Therrell, owns Radley’s Fountain Grill and recently resigned as a member of the museum’s board. “I don’t think Miss Nelle or Alice had anything to do with it,” he says. “It’s her agent and her local lawyer,” Tonja Carter. “I don’t know what kind of relationship they entered into, how she ever became of counsel, and I don’t give a rat’s ass, to tell you the truth. It was stupid to let it happen, I can tell you that.”

Other friends do emphasize her lifelong ambivalence over Monroeville. “She never has liked the museum,” says Butts. “But a lot of her attitudes about things changed after the stroke. She becomes excitable in all sorts of ways.” It’s perfectly plausible for Lee to be against the book, against the town—even against her own sister—without being fully accountable. “Nelle Harper’s at this stage in her life,” says Butts, “at which she’s readily influenced about anybody who’s around her.” He doesn’t fault Alice for failing to safeguard Lee’s rights; he faults Nelle for never relying on anyone else. “She lived as if Alice would never die.”

Wayne Flynt, the Auburn professor, trusts Carter and believes she’s just honoring Nelle’s sense of being fed up. “Monroeville is like most small towns in the South,” says Flynt, whose work focused on Alabama and poverty. “It’s wonderful because of its tremendous sense of curiosity and community, but it’s also nosy and intrusive. The world she wrote about is the world she now inhabits, with all the good stuff and the bad stuff.”

In responding to Lee’s new letter last week, Penguin Press released a handwritten letter Alice Lee wrote to Marja Mills in 2011. It read, in part: “When I questioned Tonja”—her onetime protégé, inheritor of A.C. Lee’s firm—“I learned that without my knowledge she had typed out the statement, carried it to [Nelle’s apartment], and had Nelle Harper sign it … Poor Nelle Harper can’t see and can’t hear and will sign anything put before her by anyone in whom she has confidence. Now she has no memory of the incident … I am humiliated, embarrassed, and upset about the suggestion of lack of integrity at my office.”

The letter signed by Nelle last week points out that “my sister would have been 100 years old” when she wrote those words. Butts insists Alice was “bright as a penny”—at least back then. Around the time of that letter, Alice stopped visiting the office regularly. She had a fall, then contracted pneumonia and began to decline. She moved out of the Lees’ redbrick ranch house and into a different assisted-living facility. Whatever Wayne Flynt’s suspicions about Marja Mills, he agrees with Nelle’s latest biographer on one point: Silence has not served Nelle Harper Lee. “In the absence of her being willing to talk, the only versions we’ll ever have are other people’s versions.”

The Matter with Libertarianism

The Fair Society

Human nature and the pursuit of a more just political system

What’s the Matter with Libertarianism?

Its models of human nature and society are terminally deficient.
Who can object to the libertarian principles of individual freedom, personal responsibility, and the right to hold property - at least in the abstract?  The problem is that the real world is never "abstract."  All philosophies must ultimately confront reality, and the more radical versions of libertarianism (there are many, from extreme anarchism to limited government "minarchism") rely on terminally deficient models of human nature and society.  Let's (very briefly) take a look at the problem.

The libertarian model of individual psychology is grounded in the utilitarian, neo-classical economics model of "Homo economicus" (economic man).  Our motivations can be reduced to the single-minded pursuit of our (mostly material) self-interests. Accordingly, mainstream economists seem to consider it their mission in life to help us do so more "efficiently." The Nobel economist Amartya Sen many years ago scathingly characterized this simplistic model as "rational fools who are decked out in their one, all-purpose preference function."

The selfish actor model of human nature was tacitly endorsed with the rise of "Neo-Darwinism" in evolutionary biology during the 1970s, as epitomized in biologist Richard Dawkins' famous book The Selfish Gene.  As Dawkins summed it up, "We are survival machines - robot vehicles blindly programmed to preserve the selfish molecules known as genes....I shall argue that a predominant quality to be expected in a successful gene is ruthless selfishness....we are born selfish."
A line from libertarian philosopher Robert Nozick's path-breaking book, Anarchy, State, and Utopia, says it all: "Individuals have rights, and there are things no person or group [or state] may do to them without violating their rights." (When asked to specify what those rights are, libertarians often cite philosopher John Locke's mantra "life, liberty, and property.")  Not to worry, though.  Through the "magic" of Adam Smith's "invisible hand," the efficient pursuit of our self-interests in "free markets" will ensure the greatest good for the greatest number.
One problem with this (utopian) model is that we now have overwhelming evidence that the individualistic, acquisitive, selfish-gene model of human nature is seriously deficient; it is simplistic and one-sided, and in reality it resembles the pathological extremes among the personality traits that we find in our society.  The evidence about human evolution indicates that our species evolved in small, close-knit social groups in which cooperation and sharing overrode our individual, competitive self-interests for the sake of the common good. (This scenario is reviewed in my books The Fair Society and Holistic Darwinism.)  We evolved as intensely interdependent social animals, and our sense of empathy toward others, our sensitivity to reciprocity, our desire for inclusion and our loyalty to the groups we bond with, the intrinsic satisfaction we derive from cooperative activities, and our concern for having the respect and approval of others all evolved in humankind to temper and constrain our individualistic, selfish impulses (as Darwin himself pointed out in The Descent of Man).
So we are not, after all, like bumper cars in a carnival, where we all range freely, and, if we cause "harm" by crashing into others, we simply say "excuse me" and move on.  Rather, we are (most of us) embedded in an exceedingly complex network of social relationships, many of which are vital to our well-being.  Every day we confront issues relating to the needs and wants of others and must continually make accommodations.  And in addressing these conflicting interests, the operative norm is - or should be - fairness, a balancing of the interests and needs of other parties, other "stakeholders."
Indeed, libertarians generally have no model of society as an interdependent group with a common purpose and common interests.  For instance, the canonized conservative economist Friedrich Hayek posited a stark choice between two alternative models - either a "free market" of atomized individuals rationally pursuing their self-interests in transactional relationships or an authoritarian, coercive "state" that seeks control over us. In Hayek's words, "socialism means slavery."
The libertarian novelist Ayn Rand went even further.  As she saw it, there is a perpetual class war going on between the "creators" and "producers," on the one hand, and the great mass of "parasites" and "moochers" who use government to steal whatever they can from the deserving few.  One of Rand's heroes, the defiant architect Howard Roark in her novel The Fountainhead, tells us:  "All that proceeds from man's independent ego is good.  All that which proceeds from man's dependence upon men is evil...The egotist in the absolute sense is not the man who sacrifices for others....Man's first duty is to himself...His moral law is to do what he wishes, provided his wish does not depend primarily upon other men....The only good which men can do to one another and the only statement of their proper relationship is - hands off!"
The benign free market model of society is equally deficient.  Many libertarians seem to be myopic about the prevalence of self-interested "organizations" in the marketplace, from the many millions of mom-and-pop businesses with only a few employees to mega-corporations with hundreds of thousands of workers (whose freedom they may severely restrict). These "corporate interests" sometimes oppose the common interest and perpetrate malfeasance. (Do we need to rehearse the recent examples of Enron, Capital Management, Countrywide, Goldman Sachs, BP, Massey Energy and other disasters?)  So-called free markets are routinely distorted by the wealthy and powerful, and the libertarians' crusade for lower taxes, less regulation and less government plays into their hands.  Perhaps unwittingly, anti-government libertarians would have us trade democratic self-government for an oligarchy.
A more serious concern is that the libertarian fixation with individual freedom distracts us from the underlying biological purpose of a society.  The basic, continuing, inescapable problem for humankind, as for all other living organisms, is biological survival and reproduction.  Whether we are conscious of it or not, most of us spend a large majority of our time and energy engaged in activities that are directly or indirectly related to meeting no fewer than 14 domains of "basic needs" - biological imperatives ranging from such commonplace things as food, clothing and shelter to physical and mental health and the reproduction and nurturance of the next generation.
In a very real sense, therefore, every organized economy and society represents a "collective survival enterprise" - an immensely complex "combination of labor" (a term I prefer to the traditional "division of labor") upon which all of our lives literally depend.  And our first collective obligation is to ensure that all of our basic needs are met.  If there is a "right to life," as our Declaration of Independence and pro-life conservatives aver, it does not end at birth; it extends throughout our lives, and it imposes on all of us a responsibility for ensuring the "no-fault" needs of others, when they cannot for various reasons provide for themselves.
So why is libertarianism unfair?  It rejects any responsibility for our mutual right to life, where we are all created approximately equal.  It would put freedom and property rights ahead of our basic needs, rather than the other way around.  It is also oblivious to the claims for reciprocity, an obligation to contribute a fair share to support the collective survival enterprise in return for the benefits that each of us receives.  And it is weak on the subject of equity (or social merit) as a criterion for respecting property rights.  It presumes a priori that property holdings are deserved, rather than making merit a precondition.  Imposing a test of merit would put strict limits on property rights.  Finally, it is anti-democratic in that it rejects the power of the majority to restrain our freedom and limit our property rights in the common interest, or for the general welfare.      
The conservative Washington Post op-ed writer Michael Gerson observed in a recent column that "Conservatives hold a strong preference for individual freedom. But they traditionally have recognized a limited role for government in smoothing the rough edges of a free society. This concern for the general welfare helps minimize the potential for revolutionary change, while honoring a shared moral commitment to the vulnerable."  This kind of "traditional" Burkean (and Platonic) conservatism is radically opposed to libertarianism, which Gerson calls a teenage fantasy that we need to outgrow.  Freedom, as the social commentator Charles Morgan put it many years ago, is the space created by the surrounding walls.  It's time to begin paying attention to the walls.

Saturday, July 26, 2014

Reading List

I am reading a new biography of Coach John Wooden.  The three greatest basketball coaches of all-time are Wooden, Red Auerbach, and Phil Jackson.  After this I will reread McCullough's bio of Truman, the best political biography ever.

Wooden came from a small town in Indiana and grew up with no indoor plumbing.  He was most influenced by his father.  He came of age in the Hoosier State at a time when basketball was taking over high school sports there.  He was also a baseball player and could have given professional baseball a try, but he chose basketball.  A great All-American at Purdue, he started coaching at a small school in Indiana while also working as an English teacher.  He loved poetry and aphorisms all his life.

His is quite a life story.

The Dangerous Tea Party

Today’s Right-Wingers Surpass Wingers of the Past
by Norm Ornstein
July 26, 2014


A terrific essay published this week in the National Journal looks at historical “tugs of war” within both the Republican and Democratic parties, then explains why the current conservative movement’s efforts to yank Republicans right are unprecedented. The author, Norm Ornstein — a journalist, resident political science scholar at the American Enterprise Institute and a former guest on Moyers & Company — writes that today’s tea party is meeting with more success than other radical movements of the last hundred years, with frightening consequences for America.

After wending his way through notable political schisms of the 20th century, Ornstein finds many of the roots of today’s conservative movement took hold in the 1990s:

Clinton’s election in 1992 moved the Democrats firmly to the center on previously divisive issues like welfare and crime. But it also provided the impetus for the forces that have led to the current Republican problem. These forces were built in part around insurgent Newt Gingrich’s plans to overturn the Democratic 38-year hegemony in Congress, and in part around a ruthlessly pragmatic decision by GOP leaders and political strategists to hamper the popular Clinton by delegitimizing him and using the post-Watergate flowering of independent counsels to push for multiple crippling investigations of wrongdoing (to be sure, he gave them a little help along the way). No one was more adroit at using ethics investigations to demonize opponents than Newt. In 1994, Gingrich recruited a passel of more radical candidates for Congress, who ran on a path to overturn most of the welfare state and who themselves demonized Congress and Washington. At a time of rising populist anger—and some disillusionment on the left with Clinton—the approach worked like a charm, giving the GOP its first majority in the House in 40 years, and changing the face of Congress for decades to come.

Newt’s strategy and tactics were abetted and amplified by the new force of political talk radio, which had been activated by the disastrous federal pay raise in 1989-90, and of tribal cable television news. As Sean Theriault details in his book The Gingrich Senators, many of Newt’s progeny moved on to the Senate and began to change it from an old club into a new forum for tribal warfare. Move on through right-wing frustration with George W. Bush’s combination of compassionate conservatism and unfunded social policy (and wars) and then the election of Barack Obama, and the ingredients for a rise of radicalism and a more explosive intra-party struggle were set. They were expanded again with the eager efforts in 2010 of the U.S. Chamber of Commerce and the Young Guns (Eric Cantor, Kevin McCarthy, and Paul Ryan) to exploit the deep populist right-wing anger at the financial collapse and the bailouts of 2008 and 2009 by inciting the tea-party movement. But their expectation that they could then co-opt these insurgents backfired badly.

A lot of history to get to the point. What began as a ruthlessly pragmatic, take-no-prisoners parliamentary style opposition to Obama was linked to constant efforts to delegitimize his presidency, first by saying he was not born in the U.S., then by calling him a tyrant trying to turn the country into a Socialist or Communist paradise. These efforts were not condemned vigorously by party leaders in and out of office, but were instead deflected or encouraged, helping to create a monster: a large, vigorous radical movement that now has large numbers of adherents and true believers in office and in state party leadership. This movement has contempt for establishment Republican leaders and the money to go along with its beliefs. Local and national talk radio, blogs, and other social media take their messages and reinforce them for more and more Americans who get their information from these sources. One result is that even today, a Rasmussen survey shows that 23 percent of Americans still believe Obama is not an American, while an additional 17 percent are not sure. Forty percent of Americans! This is no longer a fringe view.

As for the radicals in elected office or in control of party organs, consider a small sampling of comments:

“Sex that doesn’t produce people is deviate.” —Montana state Rep. Dave Hagstrom.

“It is not our job to see that anyone gets an education.” —Oklahoma state Rep. Mike Reynolds.

“I hear you loud and clear, Barack Obama. You don’t represent the country that I grew up with. And your values is not going to save us. We’re going to take this country back for the Lord. We’re going to try to take this country back for conservatism. And we’re not going to allow minorities to run roughshod over what you people believe in!” —Arkansas state Sen. Jason Rapert, at a tea-party rally.

Friday, July 25, 2014

Political Typology Quiz

http://www.people-press.org/quiz/political-typology/

Technology Will Ruin Education

The plot to destroy education: Why technology could ruin American classrooms — by trying to fix them

A Silicon Valley scheme to "disrupt" America's education system would hurt the people who need it the most

How does Silicon Valley feel about college? Here's a taste: Seven words in a tweet provoked by a conversation about education started by Silicon Valley venture capitalist Marc Andreessen.
Arrogance? Check. Supreme confidence? Check. Oblivious to the value actually provided by a college education? Check.
The $400 billion a year that Americans pay for education after high school is being wasted on an archaic brick-and-mortar irrelevance. We can do better! 
But how? The question becomes more pertinent every day — and it’s one that Silicon Valley would dearly like to answer.
The robots are coming for our jobs, relentlessly working their way up the value chain. Anything that can be automated will be automated. The obvious — and perhaps the only — answer to this threat is a vastly improved educational system. We’ve got to leverage our human intelligence to stay ahead of robotic A.I.! And right now, everyone agrees, the system is not meeting the challenge. The cost of a traditional four-year college education has far outpaced inflation. Student loan debt is a national tragedy. Actually achieving a college degree still bequeaths better job prospects than the alternative, but for many students, the cost-benefit ratio is completely out of whack.
No problem, says the tech industry. Like a snake eating its own tail, Silicon Valley has the perfect solution for the social inequities caused by technologically induced “disruption.” More disruption!
Universities are a hopelessly obsolete way to go about getting an education when we’ve got the Internet, the argument goes. Just as Airbnb is disemboweling the hotel industry and Uber is annihilating the taxi industry, companies such as Coursera and Udacity will leverage technology and access to venture capital in order to crush the incumbent education industry, supposedly offering high-quality educational opportunities for a fraction of the cost of a four-year college.


There is an elegant logic to this argument. We'll use the Internet to stay ahead of the Internet. Awesome tools are at our disposal. In MOOCs — "Massive Open Online Courses" — hundreds of thousands of students will imbibe the wisdom of Ivy League "superprofessors" via pre-recorded lectures piped down to their smartphones. No need even for overworked graduate student teaching assistants. Intelligent software will take care of the grading. (That's right — we'll use robots to meet the robot threat!) The market, in other words, will provide the solution to the problem that the market has caused. It's a wonderful libertarian dream.
But there’s a flaw in the logic. Early returns on MOOCs have confirmed what just about any teacher could have told you before Silicon Valley started believing it could “fix” education. Real human interaction and engagement are hugely important to delivering a quality education. Most crucially, hands-on interaction with teachers is vital for the students in the most desperate need of an education — those with the least financial resources and the most challenging backgrounds.
Of course, it costs money to provide greater human interaction. You need bodies — ideally, bodies with some mastery of the subject material. But when you raise costs, you destroy the primary attraction of Silicon Valley’s “disruptive” model. The big tech success stories are all about avoiding the costs faced by the incumbents. Airbnb owns no hotels. Uber owns no taxis. The selling point of Coursera and Udacity is that they own no universities.
But education is different from running a hotel. There’s a reason why governments have historically considered providing education a public good. When you start throwing bodies into the fray to teach people who can’t afford a traditional private education, you end up disastrously chipping away at the profits that the venture capitalists backing Coursera and Udacity demand.
And that’s a tail that the snake can’t swallow.
* * *
The New York Times famously dubbed 2012 “The Year of the MOOC.” Coursera and Udacity (both started by Stanford professors) and an MIT-Harvard collaboration called EdX exploded into the popular imagination. But the hype ebbed almost as quickly as it had flowed. In 2013, after a disastrous pilot experiment in which Udacity and San Jose State collaborated to deliver three courses, MOOCs were promptly declared dead — with the harshest schadenfreude coming from academics who saw the rush to MOOCs as an educational travesty.
At the end of 2013, the New York Times had changed its tune: “After Setbacks, Online Courses are Rethought.”
But MOOC supporters have never wavered. In May, Clayton Christensen, the high priest of “disruption” theory, scoffed at the unbelievers: “[T]heir potential to disrupt — on price, technology, even pedagogy — in a long-stagnant industry,” wrote Christensen, “is only just beginning to be seen.”
At the end of June, the Economist followed suit with a package of stories touting the inevitable “creative destruction” threatened by MOOCs: “[A] revolution has begun thanks to three forces: rising costs, changing demand and disruptive technology. The result will be the reinvention of the university …” It’s 2012 all over again!
Sure, there have been speed bumps along the way. But as Christensen explained, the same is true for any would-be disruptive start-up. Failures are bound to happen. What makes Silicon Valley so special is its ability to learn from mistakes, tweak its biz model and try something new. It’s called “iteration.”
There is, of course, great merit to the iterative process. And it would be foolish to claim that new technology won’t have an impact on the educational process. If there’s one thing that the Internet and smartphones are insanely good at, it is providing access to information. A teenager with a phone in Uganda has opportunities for learning that most of the world never had through the entire course of human history. That’s great.
But there’s a crucial difference between “access to information” and “education” that explains why the university isn’t about to become obsolete, and why we can’t depend — as Marc Andreessen tells us — on the magic elixir of innovation plus the free market to solve our education quandary.
Nothing better illustrates this point than a closer look at the Udacity-San Jose State collaboration.
* * *
When Gov. Jerry Brown announced the collaboration between Udacity, founded by Stanford computer science professor Sebastian Thrun, and San Jose State, a publicly funded university in the heart of Silicon Valley, in January 2013, the match seemed perfect. Where else would you want to test out the future of education? The plan was to focus on three courses: elementary statistics, remedial math and college algebra. The target student demographic was notoriously ill-served by the university system: “Students were drawn from a lower-income high school and the underperforming ranks of SJSU’s student body,” reported Fast Company.
The results of the pilot, conducted in the spring of 2013, were a disaster, reported Fast Company:
Among those pupils who took remedial math during the pilot program, just 25 percent passed. And when the online class was compared with the in-person variety, the numbers were even more discouraging. A student taking college algebra in person was 52 percent more likely to pass than one taking a Udacity class, making the $150 price tag–roughly one-third the normal in-state tuition–seem like something less than a bargain.
A second attempt during the summer achieved better results, but with a much less disadvantaged student body; and, even more crucially, with considerably greater resources put into human interaction and oversight. For example, San Jose State reported that the summer courses were improved by “checking in with students more often.”
But the prime takeaway was stark. Inside Higher Education reported that a research report conducted by San Jose State on the experiment concluded that “it may be difficult for the university to deliver online education in this format to the students who need it most.”
In an iterative world, San Jose State and Udacity would have learned from their mistakes. The next version of their collaboration would have incorporated the increased human resources necessary to make it work, to be sure that students didn’t fall through the cracks. But the lesson that Udacity learned from the collaboration turned out to be something different: There isn’t going to be much profit to be made attempting to apply the principles of MOOCs to students from a disadvantaged background.
Thrun set off a firestorm of commentary when he told Fast Company’s Max Chafkin this:
“These were students from difficult neighborhoods, without good access to computers, and with all kinds of challenges in their lives,” he says. “It’s a group for which this medium is not a good fit….”
“I’d aspired to give people a profound education–to teach them something substantial… But the data was at odds with this idea.”
Henceforth, Udacity would “pivot” to focusing on vocational training funded by direct corporate support.
Thrun later claimed that his comments were misinterpreted by Fast Company. And in his May op-ed, Christensen argued that Udacity’s pivot was a boon:
Udacity, for its part, should be applauded for not burning through all of its money in pursuit of the wrong strategy. The company realized — and publicly acknowledged — that its future lay on a different path than it had originally anticipated. Indeed, Udacity’s pivot may have even prevented a MOOC bubble from bursting.
Educating the disadvantaged via MOOCs is the wrong strategy? That’s not a pivot — it’s an abject surrender.
The Economist, meanwhile, brushed off the San Jose State episode by noting that “online learning has its pitfalls.” But the Economist also published a revealing observation: “In some ways MOOCs will reinforce inequality … among students (the talented will be much more comfortable than the weaker outside the structured university environment) …”
But isn’t that exactly the problem? No one can deny that the access to information facilitated by the Internet is a fantastic thing for talented students — and particularly so for those with secure economic backgrounds and fast Internet connections. But such people are most likely to succeed in a world full of smart robots anyway. The challenge posed by technological transformation and disruption is that the jobs being automated away first are the ones most suited to the less talented or advantaged. In other words, the population that MOOCs are least suited to serving is the population that technology is putting in the most vulnerable position.
Innovation and the free market aren’t going to fix this problem, for the very simple reason that there is no money in it. There’s no profit to be mined in educating people who not only can’t pay for an education, but also require greater human resources to be educated.
This is why we have public education in the first place.
“College is a public good,” says Jonathan Rees, a professor at Colorado State University who has been critical of MOOCs. “It’s what industrialized democratic society should be providing for students.”

Thursday, July 24, 2014

Khaled Hosseini - A Thousand Splendid Suns

It's quite a novel, and I enjoyed it very much.  Hosseini is an amazing storyteller and he certainly knows Afghan history.

The sweep of the story against 30 years of Afghan history is riveting.  I wish I knew more about that war-torn country.  Why have this country and these people been invaded so many times?  How much can these people take?  Is it geography or what?  The picture we get of the Taliban and fundamentalist Islam is horrible.  How tragic if true.

I kept wondering how it would end.  I thought Rasheed would die and the ladies would be left to fend for themselves.  I didn't dream they would kill him although I was cheering imagining the shovel cracking the brute's head.  I didn't dream one would sacrifice her life for the other. How stunning.

Great book.  One of the best I've ever read.


Tuesday, July 22, 2014

Carotid Talk

My cardiologist is a likeable enough fellow. He was a history major at Millsaps in Jackson, Mississippi. I have never met a fellow history major who wasn't a likeable person. He isn't the brightest candle in the room, but I've had him for 18 years so I guess I'll continue running the gauntlet with him. He wanted to run a test today on my carotid arteries but my insurance won't pay for it. The only way the insurance will pay for it is for me to have a stroke first. I don't think it's worth that.


Sunday, July 20, 2014

A Thousand Splendid Suns by Khaled Hosseini

This book is as riveting as The Kite Runner.  It spans over forty years, charting the lives of Mariam and Laila.  Mariam is born an illegitimate child, living in poverty in Herat.  Laila, born a generation later, is more affluent, until tragedy strikes her family.  Eventually, their lives intersect.

Hosseini describes this book as a mother-daughter story, whereas The Kite Runner is a father-son story.  Both are about family, redemption, guilt, and national pride.  This book has more Afghan history than its predecessor.  We get a sense of the suffering that Afghanistan has experienced at the hands of the Soviets and Taliban - the bombings, the inequality toward women, the poverty, the fear, and so on.  In the Afterword, Hosseini says that during this time, as many as 8 million Afghan refugees had fled the country, usually landing in Pakistan or Iran.  With this story, we understand the country better than we ever could from the news alone.

I will have to eventually read And the Mountains Echoed.

James Garner

James Garner was perhaps my favorite actor.  I was an avid watcher of "The Rockford Files" from 1974 to 1980.  This was a stylish private detective show with likeable characters including Jim Rockford.  I would rewatch the reruns in a heartbeat.

Friday, July 18, 2014

Khaled Hosseini

I have started A Thousand Splendid Suns.  What a great novel!

Do I Really Want to be A Better Online Reader?

Maria Konnikova

July 16, 2014

Being a Better Online Reader

Soon after Maryanne Wolf published “Proust and the Squid,” a history of the science and the development of the reading brain from antiquity to the twenty-first century, she began to receive letters from readers. Hundreds of them. While the backgrounds of the writers varied, a theme began to emerge: the more reading moved online, the less students seemed to understand. There were the architects who wrote to her about students who relied so heavily on ready digital information that they were unprepared to address basic problems onsite. There were the neurosurgeons who worried about the “cut-and-paste chart mentality” that their students exhibited, missing crucial details because they failed to delve deeply enough into any one case. And there were, of course, the English teachers who lamented that no one wanted to read Henry James anymore. As the letters continued to pour in, Wolf experienced a growing realization: in the seven years it had taken her to research and write her account, reading had changed profoundly—and the ramifications could be felt far beyond English departments and libraries. She called the rude awakening her “Rip van Winkle moment,” and decided that it was important enough to warrant another book. What was going on with these students and professionals? Was the digital format to blame for their superficial approaches, or was something else at work?
Certainly, as we turn to online reading, the physiology of the reading process itself shifts; we don’t read the same way online as we do on paper. Anne Mangen, a professor at the National Centre for Reading Education and Research at the University of Stavanger, in Norway, points out that reading is always an interaction between a person and a technology, be it a computer or an e-reader or even a bound book. Reading “involves factors not usually acknowledged,” she told me. “The ergonomics, the haptics of the device itself. The tangibility of paper versus the intangibility of something digital.” The contrast of pixels, the layout of the words, the concept of scrolling versus turning a page, the physicality of a book versus the ephemerality of a screen, the ability to hyperlink and move from source to source within seconds online—all these variables translate into a different reading experience.
The screen, for one, seems to encourage more skimming behavior: when we scroll, we tend to read more quickly (and less deeply) than when we move sequentially from page to page. Online, the tendency is compounded as a way of coping with an overload of information. There are so many possible sources, so many pages, so many alternatives to any article or book or document that we read more quickly to compensate. When Ziming Liu, a professor at San Jose State University whose research centers on digital reading and the use of e-books, conducted a review of studies that compared print and digital reading experiences, supplementing their conclusions with his own research, he found that several things had changed. On screen, people tended to browse and scan, to look for keywords, and to read in a less linear, more selective fashion. On the page, they tended to concentrate more on following the text. Skimming, Liu concluded, had become the new reading: the more we read online, the more likely we were to move quickly, without stopping to ponder any one thought.
The online world, too, tends to exhaust our resources more quickly than the page. We become tired from the constant need to filter out hyperlinks and possible distractions. And our eyes themselves may grow fatigued from the constantly shifting screens, layouts, colors, and contrasts, an effect that holds for e-readers as well as computers. Mary Dyson, a psychologist at the University of Reading who studies how we perceive and interact with typography and design online and in print, has found that the layout of a text can have a significant effect on the reading experience. We read more quickly when lines are longer, but only to a point. When lines are too long, it becomes taxing to move your eyes from the end of one to the start of the next. We read more efficiently when text is arranged in a single column rather than multiple columns or sections. The font, color, and size of text can all act in tandem to make our reading experience easier or more difficult. And while these variables surely exist on paper just as they do on-screen, the range of formats and layouts online is far greater than it is in print. Online, you can find yourself transitioning to entirely new layouts from moment to moment, and, each time you do so, your eyes and your reading approach need to adjust. Each adjustment, in turn, takes mental and physical energy.
The shift from print to digital reading may lead to more than changes in speed and physical processing. It may come at a cost to understanding, analyzing, and evaluating a text. Much of Mangen’s research focusses on how the format of reading material may affect not just eye movement or reading strategy but broader processing abilities. One of her main hypotheses is that the physical presence of a book—its heft, its feel, the weight and order of its pages—may have more than a purely emotional or nostalgic significance. People prefer physical books, not out of old-fashioned attachment but because the nature of the object itself has deeper repercussions for reading and comprehension. “Anecdotally, I’ve heard some say it’s like they haven’t read anything properly if they’ve read it on a Kindle. The reading has left more of an ephemeral experience,” she told me. Her hunch is that the physicality of a printed page may matter for those reading experiences when you need a firmer grounding in the material. The text you read on a Kindle or computer simply doesn’t have the same tangibility.
In new research that she and her colleagues will present for the first time at the upcoming conference of the International Society for the Empirical Study of Literature and Media, in Torino, Italy, Mangen is finding that that may indeed be the case. She, along with her frequent collaborator Jean-Luc Velay, Pascal Robinet, and Gerard Olivier, had students read a short story—Elizabeth George’s “Lusting for Jenny, Inverted” (their version, a French translation, was called “Jenny, Mon Amour”)—in one of two formats: a pocket paperback or a Kindle e-book. When Mangen tested the readers’ comprehension, she found that the medium mattered a lot. When readers were asked to place a series of events from the story in chronological order—a simple plot-reconstruction task, not requiring any deep analysis or critical thinking—those who had read the story in print fared significantly better, making fewer mistakes and recreating an over-all more accurate version of the story. The words looked identical—Kindle e-ink is designed to mimic the printed page—but their physical materiality mattered for basic comprehension.
Wolf’s concerns go far beyond simple comprehension. She fears that as we turn to digital formats, we may see a negative effect on the process that she calls deep reading. Deep reading isn’t how we approach looking for news or information, or trying to get the gist of something. It’s the “sophisticated comprehension processes,” as Wolf calls them, that those young architects and doctors were missing. “Reading is a bridge to thought,” she says. “And it’s that process that I think is the real endangered aspect of reading. In the young, what happens to the formation of the complete reading circuitry? Will it be short-circuited and have less time to develop the deep-reading processes? And in already developed readers like you and me, will those processes atrophy?”
Of course, as Wolf is quick to point out, there’s still no longitudinal data about digital reading. As she put it, “We’re in a place of apprehension rather than comprehension.” And it’s quite possible that the apprehension is misplaced: perhaps digital reading isn’t worse so much as different from print reading. Julie Coiro, who studies digital reading comprehension in elementary- and middle-school students at the University of Rhode Island, has found that good reading in print doesn’t necessarily translate to good reading on-screen. Not only do the students differ in their abilities and preferences; they also need different sorts of training to excel at each medium. The online world, she argues, may require students to exercise much greater self-control than a physical book. “In reading on paper, you may have to monitor yourself once, to actually pick up the book,” she says. “On the Internet, that monitoring and self-regulation cycle happens again and again. And if you’re the kind of person who’s naturally good at self-monitoring, you don’t have a problem. But if you’re a reader who hasn’t been trained to pay attention, each time you click a link, you’re constructing your own text. And when you’re asked comprehension questions, it’s like you picked up the wrong book.”
Maybe the decline of deep reading isn’t due to reading skill atrophy but to the need to develop a very different sort of skill, that of teaching yourself to focus your attention. (Interestingly, Coiro found that gamers were often better online readers: they were more comfortable in the medium and better able to stay on task.) In a study comparing digital and print comprehension of a short nonfiction text, Rakefet Ackerman and Morris Goldsmith found that students fared equally well on a post-reading multiple-choice test when they were given a fixed amount of time to read, but that their digital performance plummeted when they had to regulate their time themselves. The digital deficit, they suggest, isn’t a result of the medium as such but rather of a failure of self-knowledge and self-control: we don’t realize that digital comprehension may take just as much time as reading a book.
Last year, Patricia Greenfield, a psychologist at the University of California, Los Angeles, and her colleagues found that multitasking while reading on a computer or a tablet slowed readers down, but their comprehension remained unaffected. What did suffer was the quality of a subsequent report that they wrote to synthesize their reading: if they read the original texts on paper or a computer with no Internet access, their end product was superior to that of their Internet-enabled counterparts. If the online readers took notes on paper, however, the negative effects of Internet access were significantly reduced. It wasn’t the screen that disrupted the fuller synthesis of deep reading; it was the allure of multitasking on the Internet and a failure to properly mitigate its impact.
Indeed, some data suggest that, in certain environments and on certain types of tasks, we can read equally well in any format. As far back as 1988, the University College of Swansea psychologists David Oborne and Doreen Holton compared text comprehension for reading on different screens and paper formats (dark characters on a light background, or light characters on a dark background), and found no differences in speed and comprehension between the four conditions. Their subjects, of course, didn’t have the Internet to distract them. In 2011, Annette Taylor, a psychologist at the University of San Diego, similarly found that students performed equally well on a twenty-question multiple-choice comprehension test whether they had read a chapter on-screen or on paper. Given a second test one week later, the two groups’ performances were still indistinguishable. And it’s not just reading. Last year, Sigal Eden and Yoram Eshet-Alkalai found no difference in accuracy between students who edited a six-hundred-word paper on the screen and those who worked on paper. Those who edited on-screen did so faster, but their performance didn’t suffer.
We need to be aware of the effects of deeper digital immersion, Wolf says, but we should be equally cautious when we draw causal arrows or place blame without adequate longitudinal research. “I’m both the Cassandra and the advocate of digital reading,” she says. Maybe her letter writers’ students weren’t victims of digitization so much as victims of insufficient training—and insufficient care—in the tools of managing a shifting landscape of reading and thinking. Deep-reading skills, Wolf points out, may not be emphasized in schools that conform to the Common Core, for instance, and need to meet certain test-taking reading targets that emphasize gist at the expense of depth. “Physical, tangible books give children a lot of time,” she says. “And the digital milieu speeds everything up. So we need to do things much more slowly and gradually than we are.” Not only should digital reading be introduced more slowly into the curriculum; it also should be integrated with the more immersive reading skills that deeper comprehension requires.
Wolf is optimistic that we can learn to navigate online reading just as deeply as we once did print—if we go about it with the necessary thoughtfulness. In a new study, the introduction of an interactive annotation component helped improve comprehension and reading strategy use in a group of fifth graders. It turns out that they could read deeply. They just had to be taught how. Wolf is now working on digital apps to train students in the tools of deep reading, to use the digital world to teach the sorts of skills we tend to associate with quiet contemplation and physical volumes. “The same plasticity that allows us to form a reading circuit to begin with, and short-circuit the development of deep reading if we allow it, also allows us to learn how to duplicate deep reading in a new environment,” she says. “We cannot go backwards. As children move more toward an immersion in digital media, we have to figure out ways to read deeply there.”
Wolf has decided that, despite all of her training in deep reading, she, too, needs some outside help. To finish her book, she has ensconced herself in a small village in France with shaky mobile reception and shakier Internet. Faced with the endless distraction of the digital world, she has chosen to tune out just a bit of it. She’s not going backward; she’s merely adapting.

Wednesday, July 16, 2014

The All-Star Game

I didn't notice until just now that the AL beat the NL in the all-star game last night, 5 to 3.  From a nostalgia point of view, it's really sad, for this used to be a big deal.  No more.

Part of it is that baseball is not as TV popular as football and basketball.  Part of it is oversaturation on TV.  Part of it is that interleague play has destroyed part of the uniqueness of baseball's all-star game. In any event, the baseball all-star game is not the attraction that it used to be.

I can talk baseball from the '50s and '60s.  That's it.  Still, baseball is the greatest game of all.

Tuesday, July 15, 2014

Sgt. Bowe Bergdahl Recaptured By Taliban After Wandering Off Texas Base

The Onion
14 July 2014

WASHINGTON—Just weeks after Sgt. Bowe Bergdahl’s release from captivity in Afghanistan, U.S. defense officials announced that the 28-year-old had been recaptured by Taliban forces Monday shortly after wandering off base in Texas. “It is with regret that we inform you that at approximately 1200 hours today, Sgt. Bergdahl left his post, was seized by insurgents outside San Antonio, and taken into Taliban custody,” said Pentagon spokesman John Herndon, explaining that, on his first day back on active duty, Bergdahl slipped out of Fort Sam Houston with only a backpack and a notebook, ventured for five miles on foot, and was shortly thereafter abducted by a group of militant jihadists, a sequence of events that was largely corroborated by Bergdahl himself in a Taliban propaganda video released this afternoon. “Based on emails he sent this morning, it appears that Sgt. Bergdahl may have grown disillusioned with his return to service and voluntarily ventured outside the base. We have reclassified him as ‘missing/captured,’ and the U.S. Army will do everything in its power to secure his release and repatriate him, once again, to the United States.” At press time, a spokesman for the Obama administration announced that the president was currently in negotiations to hand over five high-value Taliban prisoners in exchange for Bergdahl.

Sunday, July 13, 2014

The Data of Hate

BY Seth Stephens-Davidowitz
New York Times
12 July 2014
 
VIKINGMAIDEN88 is 26 years old. She enjoys reading history and writing poetry. Her signature quote is from Shakespeare. She was impressed when the dialect quiz in The New York Times correctly identified where she was from: Tacoma and Spokane, Wash. “Completely spot on,” she wrote, followed by a smiling green emoji.
 
I gleaned all this from her profile and posts on Stormfront.org, America’s most popular online hate site.
 
I recently analyzed tens of thousands of the site’s profiles, in which registered members can enter their location, birth date, interests and other information. Call it Big Hatred meets Big Data.
 
Stormfront was founded in 1995 by Don Black, a former Ku Klux Klan leader. Its most popular “social groups” are “Union of National Socialists” and “Fans and Supporters of Adolf Hitler.” Over the past year, according to Quantcast, roughly 200,000 to 400,000 Americans visited the site every month. A recent Southern Poverty Law Center report linked nearly 100 murders in the past five years to registered Stormfront members.
 
The white nationalist posters on Stormfront have issues with many different groups. They often write about crimes committed by African-Americans against whites; they complain about an “invasion” of Mexicans; and they love to mock gays and feminists. But their main problem appears to be with Jewish people, who are often described as super-powerful and clever — the driving force, generally speaking, behind the societal changes they do not like. They sometimes call the Holocaust the “Holohoax.”
 
Stormfront members tend to be young, at least according to self-reported birth dates. The most common age at which people join the site is 19. And four times more 19-year-olds sign up than 40-year-olds. Internet and social network users lean young, but not nearly that young.
 
Profiles do not have a field for gender. But I looked at all the posts and complete profiles of a random sample of American users, and it turns out that you can work out the gender of most of the membership: I estimate that about 30 percent of Stormfront members are female.
 
The states with the most members per capita are Montana, Alaska and Idaho. These states tend to be overwhelmingly white. Does this mean that growing up with little diversity fosters hate?
 
Probably not. Since those states have a higher proportion of non-Jewish white people, they have more potential members for a group that attacks Jews and nonwhites. The percentage of Stormfront’s target audience that joins is actually higher in areas with more minorities. This is particularly true when you look at Stormfront’s members who are 18 and younger and therefore do not themselves choose where they live.
 
Among this age group, California, a state with one of the largest minority populations, has a membership rate 25 percent higher than the national average.
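The normalization the piece relies on here, counting joiners as a share of the site's "target audience" rather than of the whole population, can be sketched in a few lines of Python. Every figure below is invented purely for illustration; none comes from the article's actual data.

```python
# Sketch of the article's normalization point: raw per-capita membership
# can understate a hate site's draw in a diverse state, because the
# site's potential audience there is a smaller slice of the population.
# All numbers are hypothetical.

def membership_rates(members, population, target_share):
    """Return (members per capita, members per potential member),
    where target_share is the fraction of the population belonging
    to the group the site recruits from."""
    per_capita = members / population
    per_target = members / (population * target_share)
    return per_capita, per_target

# Hypothetical comparison: a homogeneous state vs. a diverse one.
for state, members, pop, share in [
    ("Homogeneous state", 300, 1_000_000, 0.90),
    ("Diverse state", 4_000, 20_000_000, 0.40),
]:
    pc, pt = membership_rates(members, pop, share)
    print(f"{state}: {pc:.2e} per capita, {pt:.2e} per potential member")
```

With these invented figures, the diverse state has the lower raw per-capita rate but the higher rate among potential members, which is the pattern the article describes for California.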
 
One of the most popular social groups on the site is “In Support of Anti-Semitism.” The percentage of members who join this group is positively correlated with a state’s Jewish population. New York, the state with the highest Jewish population, has above-average per capita membership in this group.
 
In 2001, Dna88 joined Stormfront, describing himself as a “good looking, racially aware” 30-year-old Internet developer living in “Jew York City.” In the next four months, he wrote more than 200 posts, like “Jewish Crimes Against Humanity” and “Jewish Blood Money,” and directed people to a website, jewwatch.com, which claims to be a “scholarly library” on “Zionist criminality.”
 
Stormfront members complain about minorities’ speaking different languages and committing crimes. But what I found most interesting were the complaints about competition in the dating market.
 
A man calling himself William Lyon Mackenzie King, after a former prime minister of Canada who once suggested that “Canada should remain a white man’s country,” wrote in 2003 that he struggled to “contain” his “rage” after seeing a white woman “carrying around her half black ugly mongrel niglet.” In her profile, Whitepride26, a 41-year-old student in Los Angeles, says, “I dislike blacks, Latinos, and sometimes Asians, especially when men find them more attractive” than “a white female.”
 
POLITICAL developments certainly play a role. The day that saw the biggest single increase in membership in Stormfront’s history, by far, was Nov. 5, 2008, the day after Barack Obama was elected president.
 
The top reported interest of Stormfront members is “reading.” Most notably, Stormfront users are news and political junkies. One interesting data point here is the popularity of The New York Times among Stormfront users. According to the economists Matthew Gentzkow and Jesse M. Shapiro, when you compare Stormfront users to people who go to the Yahoo News site, it turns out that the Stormfront crowd is twice as likely to visit nytimes.com.
 
Perhaps it was my own naïveté, but I would have imagined white nationalists’ inhabiting a different universe from that of my friends and me. Instead, they have long threads praising “Breaking Bad” and discussing the comparative merits of online dating sites, like Plenty of Fish and OkCupid.
 
There was also no relationship between monthly membership registration and a state’s unemployment rate. States disproportionately affected by the Great Recession saw no comparative increase in Google searches for Stormfront.
 
Some of this research adds to recent literature in the field that is frankly shocking and should change the way we think about hate.
 
In the 1930s, Arthur F. Raper reported a correlation between bad economic conditions and lynchings of blacks. This led many scholars to the intuitive conclusion that people turn to hate because their lives are going poorly.
 
But evidence is increasingly casting doubt on this idea. In 2001, the political scientists Donald P. Green, Laurence H. McFalls and Jennifer K. Smith used more data and found that there was actually no relationship between lynchings and economic hardship. Lynchings actually fell during the Great Depression.
 
The economist Alan B. Krueger has shown that terrorists are not disproportionately poor. And the economists Roland G. Fryer Jr. and Steven D. Levitt found that Ku Klux Klan members were actually better educated than the typical American.
 
Return to VikingMaiden88. When you read her 189 posts since joining the site, she often seems like a perfectly nice and intelligent young woman.
 
But she also has a lot of hatred. She praises a store for having “100% white employees.” She says the media is promoting a “Jewish agenda.” And she says she finds Asians “repulsive physically, socially, religiously, etc.”
 
Why do some people feel this way? And what is to be done about it? I have pored over data of an unprecedented breadth and depth, thanks to our new digital era. And I can honestly offer the following answer: I have no idea.