Saturday, July 31, 2010

A Political Extrapolation of the Movie "Inception"

Media Criticism Friday, Jul 30, 2010 20:17 ET
The deception of real-world "Inception"
What the new science-fiction movie tells us about the willful ignorance that dominates America's political culture
By David Sirota

For all of its "Matrix"-like convolutions and "Alice in Wonderland" allusions, the new film "Inception" adds something significant to the ancient ruminations about reality's authenticity -- something profoundly relevant to this epoch of confusion. In the movie's tale of corporate espionage, we are asked to ponder this moment's most disturbing epistemological questions: Namely, how are ideas deposited in people's minds, and how incurable are those ideas when they are wrong?

Many old sci-fi stories, like politics and advertising of the past, subscribed to the "Clockwork Orange" theory that says blatantly propagandistic repetition is the best way to pound concepts into the human brain. But as "Inception's" main character, Cobb, posits, the "most resilient parasite" of all is an idea that individuals are subtly led to think they discovered on their own.

This argument's real-world application was previously outlined by Cal State Fullerton's Nancy Snow, who wrote in 2004 that today's most pervasive and effective propaganda is the kind that is "least noticeable" and consequently "convinces people they are not being manipulated." The flip side is also true: When an idea is obviously propaganda, it loses credibility. Indeed, in the same way the subconscious of "Inception's" characters eviscerates known invaders, we are reflexively hostile to ideas when we know they come from agenda-wielding intruders.

These laws of cognition, of course, are brilliantly exploited by a 24-7 information culture that has succeeded in making "your mind the scene of the crime," as "Inception's" trailer warns. Because we are now so completely immersed in various multimedia dreamscapes, many of the prefabricated -- and often inaccurate -- ideas in those phantasmagorias can seem wholly self-realized and, hence, totally logical.

The conservative media dreamland, for instance, ensconces its audience in an impregnable bubble -- you eat breakfast with the Wall Street Journal's editorial page, you drive to the office with right-wing radio, you flit between Breitbart and Drudge at work, you come home to Fox News. The ideas bouncing around in this world -- say, ideas about the Obama administration allegedly favoring blacks -- don't seem like propaganda to those inside the bubble. With heavily edited videos of screaming pastors and prejudiced-sounding USDA officials, these ideas are cloaked in the veneer of unchallenged fact, leaving the audience to assume its bigoted conclusions are completely self-directed and incontrovertible.

Same thing for those living in the closed loop of the "traditional" media. Replace conservative news outlets with the New York Times, NPR, Washingtonpost.com and network newscasts, and it's just another dreamscape promulgating certain synthetic ideas (for instance, militarism and market fundamentalism), excluding other ideas (say, antiwar opinions and critiques of the free market) and bringing audiences to seemingly self-conceived and rational judgments -- judgments that are tragically misguided.

Taken together, our society has achieved the goal of "Inception's" idea-implanting protagonists -- only without all the technological subterfuge. And just as they arose with Cobb's wife, problems are emerging in our democracy as the dreams sow demonstrable fallacies.

As writer Joe Keohane noted in a recent Boston Globe report about new scientific findings, contravening facts no longer "have the power to change our minds" when we are wrong.

"When misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds," he wrote. "In fact, they often became even more strongly set in their beliefs."

What is the circuit breaker in this delusive cycle? It's hard to know if one exists, just as it is difficult to know whether Cobb's totem ever stops spinning. For so many, meticulously constructed fantasies seem like indisputable reality. And because those fantasies' artificial inception is now so deftly obscured, we can no longer wake up, even if facts tell us we're in a dream -- and even when the dream becomes a nightmare.

Thursday, July 29, 2010

The eReader Wars

In Price War, New Kindle Sells for $139

The new Amazon Kindle Wi-Fi, above, will sell for $139 but connect to the Web only by Wi-Fi. A new model to replace the Kindle 2 will sell for $189 and connect to the Internet through a cellphone network.

By CLAIRE CAIN MILLER
Published: July 29

Amazon is hoping to convince even casual readers that they need a digital reading device. By firing another shot in an e-reader price war leading up to the year-end holiday shopping season, the e-commerce giant turned consumer electronics manufacturer is also signaling that it intends to do battle with Apple and its iPad, as well as with other makers of e-readers like Sony and Barnes & Noble.

Unlike previous Kindles, the $139 “Kindle Wi-Fi” will connect to the Internet using only Wi-Fi instead of a cellphone network as other Kindles do. Amazon is also introducing a model to replace the Kindle 2, which it will sell for the same price as that model, $189. Both new Kindles are smaller and lighter, with higher contrast screens and crisper text.

“The hardware business for us has been so successful that we’re going to continue,” Jeffrey P. Bezos, Amazon’s chief executive, said in an interview at the company’s headquarters. “I predict there will be a 10th-generation and a 20th-generation Kindle. We’re well-situated to be experts in purpose-built reading devices.”

When Amazon introduced the Kindle in 2007, Mr. Bezos described it as a must-have for frequent travelers and people who read “two, three, four books at the same time.” Now, Amazon hopes that at $10 less than the least expensive reading devices from Barnes & Noble and Sony, the new Kindle has broken the psychological price barrier for even occasional readers or a family wanting multiple Kindles.

“At $139, if you’re going to read by the pool, some people might spend more than that on a swimsuit and sunglasses,” Mr. Bezos said.

Some analysts are predicting that e-readers could become this year’s hot holiday gift. James L. McQuivey, a principal analyst specializing in consumer electronics at Forrester Research, said a price war could for the first time reduce at least the price of one e-reader to under $100, often the tipping point for impulse gadget purchases.

Amazon has slashed the price of the Kindle at a speed that is unusual, even for electronic gadgets. By last year, the price of the device had fallen to $259, down from its starting price of $399 in late 2007. In June, hours after Barnes & Noble dropped the price of its Nook e-reader to $199, Amazon dropped the price of the Kindle to $189. The Kindle DX, which has a larger, 9.7-inch screen, is $379.

With Amazon’s latest announcement, it is again waging a price war. Barnes & Noble offers a Wi-Fi version of the Nook for $149 and Sony offers the Reader Pocket Edition, which does not have Wi-Fi, for $150.

Of course, price is just one factor people consider before making a purchase. The quality of the product, adequate inventory and appealing marketing are just as important, said Eric T. Anderson, a professor of marketing at Northwestern University’s Kellogg School of Management.

But as the e-reader marketplace has grown crowded, "there are lots of substitutes out there so the only way they can create demand is by lowering the price," he said.

Still, the iPad’s $499-and-up price tag has not stifled demand for that device. Though the iPad does much more than display books, customers often choose between the two, and are willing to pay much more for the iPad because it is an Apple product, said Dale D. Achabal, executive director of the Retail Management Institute at Santa Clara University. “The price point Apple can go to is quite a bit higher than the price point other firms have to go to that don’t have the same ease of use, design and functionality,” he said.

Apple says it has sold 3.3 million iPads since introducing it in April. Amazon does not release Kindle sales figures, but says that sales tripled in the month after its last price cut.

Two of the most compelling aspects of the iPad — a color display and touch screen — are elements that some customers have been yearning for on the Kindle. Keep waiting, Mr. Bezos said.

“There will never be a Kindle with a touch screen that inhibits reading. It has to be done in a different way. It can’t be a me-too touch screen,” he said. Earlier this year, Amazon bought Touchco, a start-up specializing in touch-screen technology, but current touch-screen technology adds reflections and glare and makes it hard to shift one’s hands while reading for long periods of time, he said. Color is also “not ready for prime time,” Mr. Bezos said.

The new Kindles, which will ship Aug. 27, have the same six-inch reading area as earlier Kindles but weigh about 15 percent less and are 21 percent smaller. The Kindles have twice the storage, up to 3,500 books.

Wednesday, July 28, 2010

Stieg Larsson (3)

I'm reading the second volume in the trilogy---"The Girl Who Played with Fire." So far I like it.

Monday, July 26, 2010

Great Democratic Summary of the Current Political Situation

by Howard Dean


For some time now, various "reporters" and on-air personalities on the Fox News Network have failed to report the full story or relevant facts, instead indulging in race baiting in order to exploit people's fears and crank up the fringe of their audience. This was exemplified by Glenn Beck's nightly assault on Van Jones earlier this year. Recently, Fox has cranked up stories about the Department of Justice's decision not to prosecute a voter intimidation case against a Black Panther group and even worse, calls for Atty. General Holder's resignation. And now, the Sherrod Debacle.

Turns out Van Jones' name was added to a website without his permission, a fact the group finally admitted some time after he resigned. And maybe he said some things about the Republican Party that he shouldn't have -- but that has nothing to do with the fact that he is a brilliant environmental organizer. It also turns out that it was the Bush Administration that decided not to prosecute the case against the Black Panthers because, as Bush's Assistant Attorney General Perez testified, "the facts did not constitute a prosecutable violation of the criminal statutes," and under the Obama Administration's Justice Department a judgment was won in a civil case.

And by now we all know how the Sherrod story went down. Despite his claims to the contrary on Fox News Sunday, Chris Wallace didn't have his facts quite right. As a Media Matters study showed, Fox News did in fact spend a lot of airtime on July 19th and 20th cranking up the false story. Not to mention that foxnews.com bragged that shortly after it posted a "report" about the video, Mrs. Sherrod resigned.

None of this is new. I don't believe all or even most Republican voters are racist, but going at least as far back as Lee Atwater and the Willie Horton ads -- and through the attacks on John McCain in the South Carolina primaries in both 2000 and 2008 and the immigration debate in 2006 -- there has been a persistent willingness in the Republican Party to use race baiting for electoral advantage. The fact is, this is racist behavior.

Now if the Tea Party, which is not a professional group of politicians, can have the decency to repudiate the racist fringe in its ranks, why can't the Republicans? Obviously they think this approach works on the margins. But even if this stuff works, it sure doesn't produce good leaders or a civil society, and it certainly doesn't produce a stronger America; it produces an even more polarized and angry America. It's that willingness to put party ahead of country that has the Republicans in such low regard.

There are lessons to be learned here. Tom Vilsack stated the first one best: don't make decisions without all the facts. To that I would add: consider the source. If it is a group of individuals or a corporation that has chronically ignored the facts and engaged in race baiting in the past, it is likely to do it again. A report by Fox News, Breitbart or Matt Drudge ought to have -- as it does in most people's minds -- little credibility.

The second lesson is harder. Stand up for what you believe in. I admire Nancy Pelosi because she is tough, gets things done, and doesn't take crap from the right wing or anyone else. After the year and a half this country has just been through, it is pretty obvious that the right wing has no intention of cooperating with anyone, and that they will do anything to regain power, just as they were willing to do anything to hold on to it. The only reasonable approach is to stand up to them as you would any group of bullies. Call them out for what they do -- or don't do, as the case may be. If the Tea Party can call out some of its own members, surely we can call out a group of people who have put their party ahead of their country.

I have often said the biggest problem with the Democrats is that we are not tough enough. Now is the time to be tough. The fact is that the stimulus package has reduced unemployment from where it would otherwise have been in this Bush-induced recession (a recession based on policies most of the Republicans now in Congress voted for). The fact is, as 60 members of the House and the CBO showed last week, the Public Option -- or Medicare Buy-in, as it should more correctly be called -- would have reduced the deficit over ten years by an additional $68 billion. The fact is that President Obama -- despite Republicans killing the climate change bill -- has done more in 18 months to change America's approach to the environment and green jobs than any president in memory.

The fact is that if we are going to tackle the deficit, it makes no sense to cut taxes for people with plenty of money while we tell people who depend on Social Security and Medicare that they have to do with less, or to play games with unemployment insurance for those who need it most.

The fact is that the Democrats won the election in 2008. The Republicans refuse to do anything for the country except say "no". That means we have to work hard and do what we believe is right. And we have to stop apologizing for it. We have to stand up for what we believe in and stop trying to make deals with people who cannot be trusted to make deals for the good of our country. It's not too late to win in 2010. Conviction politics works. Just ask the right wing!

Sunday, July 25, 2010

Inception

I'd call it a good summer movie, but certainly not a classic film. I did actually enjoy the special effects, and I am not a special effects man.

There are two central ideas dressed up in the conceit of invading other people's dreams: reconciling with the past -- coming to terms with its unfinished business -- and the effects that we can have on other people.

I realized while watching this movie that I didn't have to figure out exactly everything that was going on. Just grasp the idea of dream levels, relax, and go with the flow, and everything would come together in the end. And that's what happened.

So now I've seen my one summer movie of 2010.

Saturday, July 24, 2010

Another Review of "The Shallows"

Sunday, May 9, 2010 19:01 ET
Yes, the Internet is rotting your brain
And Nicholas Carr's "The Shallows" has the evidence to prove it
By Laura Miller

Two years ago, Nicholas Carr, a technology writer, published an essay titled "Is Google Making Us Stupid?" in the Atlantic Monthly magazine. Despite being saddled with a grabby but not very accurate headline (the defendant was the Internet itself, not just its most popular search engine), the piece proved to be one of those rare texts that condense and articulate a fog of seemingly idiosyncratic worries into an urgently discussed issue in contemporary life.

It turned out that a whole lot of people were just then realizing that, like Carr, they had lost their ability to fully concentrate on long, thoughtful written works. "I get fidgety, lose the thread, begin looking for something else to do," Carr wrote. "I feel as if I'm always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle." At first assuming that his fractured mental state was the result of "middle-age mind rot," Carr eventually concluded that his heavy Internet usage was to blame. His article about this realization instantly rose to the top of the "most-read" list on the Atlantic's website and stayed there for months.

"The Shallows: What the Internet Is Doing to Our Brains" is Carr's new, book-length version of the Atlantic piece. It expands on the points he made in 2008, but it addresses some of the responses he got, as well. In addition to the usual moronic japes ("This article is too long!" -- can anyone really be witless enough to believe that joke is clever?), commenters, bloggers and pundits asked if Carr wasn't confusing the medium with how people choose to use it. Still others dared to argue that the value of what Carr calls "literary reading" has been inflated.

While "The Shallows" does contain significant chunks of the Atlantic essay, this isn't one of those all-too-familiar annoyances, the book that should have remained an article. In the brief period between the writing of the original piece and the publication of "The Shallows," neuroscientists have performed and reviewed important studies on the effects of multitasking, hyperlinks, multimedia and other information-age innovations on human brain function, all of which add empirical heft to Carr's arguments.

The results are not cheering, and the two chapters in which Carr details them are, to my mind, the book's payload. This evidence -- that even the microseconds of decision-making attention demanded by hyperlinks sap cognitive power from the reading process, that multiple sensory inputs severely degrade memory retention, that overloading the limited capacity of our short-term memory hampers our ability to lay down long-term memories -- is enough to make you want to run right out and buy Internet-blocking software.

Above all, Carr points to the past 20-some years of neurological research indicating that the human brain is, in the words of one scientific pioneer, "massively plastic" -- that is, much like our muscles, it can be substantially changed and developed by what we do with it. In a study that is quickly becoming as popular a touchstone as the Milgram experiment, the brains of London cab drivers were discovered to be much larger in the posterior hippocampus (the part of the brain devoted to spatial representations of one's surroundings) than was the case with a control group. These masses of neurons are the physiological manifestation of "the Knowledge," the cabbies' legendary mastery of the city's geography. The drivers' anterior hippocampus, which manages certain memory tasks, is correspondingly smaller. There's only so much space inside a skull, after all.

References to the cabbie study don't often mention this evidence that cognitive development may be a zero-sum game. The more of your brain you allocate to browsing, skimming, surfing and the incessant, low-grade decision-making characteristic of using the Web, the more puny and flaccid become the sectors devoted to "deep" thought. Furthermore, as Carr recently explained in a talk at the Los Angeles Times Festival of Books, distractibility is part of our genetic inheritance, a survival trait in the wild: "It's hard for us to pay attention," he said. "It goes against the native orientation of our minds."

Concentrated, linear thought doesn't come naturally to us, and the Web, with its countless spinning, dancing, blinking, multicolored and goodie-filled margins, tempts us away from it. (E-mail, that constant influx of the social acknowledgment craved by our monkey brains, may pose an even more potent diversion.) "It's possible to think deeply while surfing the Net," Carr writes, "but that's not the type of thinking the technology encourages or rewards." Instead, it tends to transform us into "lab rats constantly pressing levers to get tiny pellets of social or intellectual nourishment."

A good portion of "The Shallows" is devoted to persuading readers of the truth in Marshall McLuhan's famous pronouncement, "The medium is the message." This includes potted histories of such mind-altering "intellectual technologies" as the map, the clock and the printed book. To anyone moderately versed in this history, it may feel unnecessary, but the response to Carr's original article suggests that many people remain perilously sanguine about our ability to control the technology in our lives.

"The Shallows" certainly isn't the first examination of this subject, but it's more lucid, concise and pertinent than similar works by Winifred Gallagher and Sven Birkerts. Carr presents far more scientific material than those writers do, and avoids both the misty Spenglerian melancholia of Birkerts and Gallagher's muddled efforts to inject Buddhist spirituality into the debate.

What the book doesn't do, unfortunately, is offer a sufficient rejoinder to Carr's most puckish critics, people like Clay Shirky, who responded to one Web addict's complaint that he "can't read 'War and Peace' anymore" by proclaiming Tolstoy's epic novel to be "too long and not so interesting." While Shirky was no doubt playing the provocateur, he speaks for a very real anti-authoritarian cultural impulse to dismiss the judgments of experts, of history, even of a majority of other readers when they clash with the (often half-baked) evaluations of the individual. Shirky effectively asserted that, as far as Tolstoy is concerned, the emperor has no clothes -- at least not by the standards of today's multitasking digital natives. And why shouldn't their opinions be just as valid as anyone else's?

Carr sensibly replies that anyone who lacks the time or the cognitive "facility" to read a long novel like "War and Peace" will naturally find it too long and not so interesting. But in that case, how would we persuade such a person that it's worth learning how? For someone like Carr, the value of the intimate, intellectually nourishing practice of "literary reading" (and by extension, literary thinking) may be self-evident. Yet he's able to quote apparently intelligent and well-educated sources (including a Rhodes scholar who claims to never read books) who simply don't agree.

While "The Shallows" does present a good case for the richness of organic, biological memory over the crude information storage of digital media, I would have appreciated a more concerted effort to show the advantages of linear thinking over the scattered, skittering, browsing mind-set fostered by the Internet. What will we lose if (when?) this mode of thought passes into obscurity?

Carr and I (and perhaps you) may know that reading "War and Peace" can be a far more profound experience than navigating through a galaxy of up-to-date blog postings, but to someone who can't or won't believe this, what else can we point to as a consequence of the withering of such a skill? What will we lose socially, politically, civilly, scientifically, psychologically, if a majority decides that the intellectual "shallows" are the proper habitat for the 21st-century mind? This needs to be spelled out because, as Carr's critics have demonstrated, fewer and fewer people take it for granted. But with that caveat, "The Shallows" remains an essential, accessible dispatch about how we think now.

Thursday, July 22, 2010

Nicholas Carr - The Shallows

As we enjoy the Web and all it offers---a wealth of information, easy access to that information, and a vehicle for expressing ourselves---are we sacrificing our ability to read deeply (the Web fosters quick, shallow reading) and to think deeply? Carr says yes.

Carr cites research study after research study showing that the Web is rewiring our brains (our brains remain plastic even into adulthood), rerouting our neural pathways and lessening our ability to think and concentrate.

The Web greatly enhances our ability to scan and skim, but at the cost of reducing our attention span, so that some of us find it more difficult to concentrate long enough to read linear print---a book.

Some heavy Internet users have stopped reading books. They no longer have the patience and the ability to concentrate necessary to read books. Books focus our attention. The computer screen fragments it as we read bits of information, jumping around from one source to another, our attention moving from one thing to the next.

So says Nicholas Carr.

Wednesday, July 21, 2010

Why The Next Big Pop-Culture Wave After Cupcakes Might Be Libraries

by Linda Holmes (from NPR online)

I realize we're picking the bones from the Old Spice campaign at this point, but when I saw that the Brigham Young University parody of the Old Spice ads had gotten more than 1.2 million views (Old Spicy himself — that's what I'm calling him — did a video for libraries), it got me thinking.

Specifically, it got me thinking about the very enjoyable Librarians Do Gaga video that everyone sent my way after the debut of the NPR Does Gaga video.

And about the fact that a local news story skeptically questioning whether libraries are "necessary" set off a response from Vanity Fair, and a later counterpunch by Chicago's Public Library Commissioner won her support from such diverse, non-library-specific outlets as The A.V. Club and Metafilter, and from as far away as The Guardian.

Call it a hunch, but it seems to me that the thing is in the air that happens right before something — families with a million kids, cupcakes, wedding coordinators — suddenly becomes the thing everyone wants to do happy-fuzzy pop-culture stories about. Why?

Libraries get in fights. Everybody likes a scrapper, and between the funding battles they're often found fighting and the body-checking involved in their periodic struggles over sharing information, there's a certain ... pleasantly plucky quality to the current perception of libraries and librarians. Yes, it plays a little ironically against the hyper-stereotypical buttoned-up notion of what a librarian is, but the sense that they're okay with getting mad in public — like Chicago's Public Library Commissioner did — gives library people a spark they might not otherwise have.


Librarians know stuff. You know how the words "geek" and "nerd" have gone from actual insults to words used to lovingly describe enthusiasts? Well, if we haven't gotten past venerating people who don't know anything, we've certainly reduced, I'd argue, the degree to which we stigmatize people for knowing a lot. This alone might not make libraries cool, but it takes away from the sense that they're actively not cool. More specifically, they live in the world of information, and are employed in part to organize and make accessible large quantities of data. If your computer had feet and a spiffy personality, you see.

Libraries are green and local. This is where there's a lot of potential appeal for the same people who like organic produce and reusable grocery bags. You can pretty easily position a library as environmentally friendly (your accumulation of books and magazines you are not reading is fewer trees for the rest of us, you know), not to mention economical (obvious) and part of your local culture. This is the part of the potential appeal that's anti-chain-store, anti-sprawl, anti-anonymity, and so forth.

Libraries will give you things for free. Hi, have you noticed how much hardcover books cost? Not a Netflix person? They will hand you things for free. That's not an especially hard concept to sell.

"Open to the public" means "some days, you really have to wonder about people." This is where you get the spark of an idea for TLC or somebody to do some goofball show called The Stacks, which follows a small local library through funding problems, trying to get book clubs started, whatever. When your building is open to the public, that means open ... to ... the ... public. And you know what's a little unpredictable? The public. This is where you might get your drama. (When I was in college, the information desk used to post the best questions it received, one of which was "How long do you cook spaghetti?" I suspect many libraries have similar stories.)

There seems to be a preposterous level of goodwill. Quite honestly, I feel like you can go on YouTube and act like a complete goof (in the best way), and if it's for libraries, people have that same rush of warmth that they used to get about people who had sextuplets, before ... well, you know. Before.

I don't know whether it's going to come in the form of a more successful movie franchise about librarians than that TV thing Noah Wyle does, or a basic-cable drama about a crime-fighting librarian (kinda like the one in the comic Rex Libris), or that reality show I was speculating about, but mark my words, once you've got Old Spicy on your side and you can sell a couple of YouTube parodies in a couple of months, you're standing on the edge of your pop-culture moment. Librarians: prepare.

Paper Cuts with Nicholas Carr


June 4, 2010, 7:00 am
Stray Questions for: Nicholas Carr
By THE NEW YORK TIMES


Nicholas Carr’s new book, “The Shallows: What the Internet Is Doing to Our Brains,” is reviewed in this Sunday’s book review.

What will this Web Q. and A. do to readers’ brains?

Not much, unless I say something remarkably memorable. What changes our brains is, on the one hand, repetition and, on the other hand, neglect. That’s why I believe the Net is having such far-reaching intellectual consequences. When we’re online, we tend to perform the same physical and mental actions over and over again, at a high rate of speed and in a state of perpetual distractedness. The more we go through those motions, the more we train ourselves to be skimmers and scanners and surfers. But the Net provides no opportunity or encouragement for more placid, attentive thought. What we’re losing, through neglect, is our capacity for contemplation, introspection, reflection — all those ways of thinking that require attentiveness and deep concentration.

What are you working on now?

My last three books have been about computers and the Internet, and at this point I think I’ve said all I have to say on those subjects. With one exception: I’d like to write an essay on the hyperlink. It’s such a small, simple thing, but it’s had a vast and incredibly complicated effect on our intellectual lives, and on our culture, over the last few years. I’d love to unpick the link.

I’m spending most of my time, though, casting about for an idea for my next book, so far without much success. My dream is to disappear for ten years and then reappear, in sandals and a beard, with a strange and wondrous thousand-page manuscript written in longhand. Something tells me that’s not going to happen.

What role does the Internet play in your writing life?

It plays a very beneficial role in helping me to do research efficiently, to find, very quickly and with a minimum of effort, relevant books, articles, and facts. At the same time, it plays a very damaging role in constantly disrupting my train of thought and leading me down endless rabbit holes. Robert Frost had a lover’s quarrel with the world. I’m having a lover’s quarrel with the Net.

What have you been reading or recommending lately?

I’m currently making my third attempt to read David Foster Wallace’s “Infinite Jest” all the way through, and this time I plan to succeed. I quote, in “The Shallows,” some advice that Wallace gave to college students a couple of years before he died. “Learning how to think,” he said, “means being conscious and aware enough to choose what you pay attention to and to choose how you construct meaning from experience.” Those words strike me as being worthy of contemplation.

A Review of "The Shallows"

Our Cluttered Minds
By JONAH LEHRER
Published: May 27, 2010

THE SHALLOWS

What the Internet Is Doing to Our Brains

By Nicholas Carr

276 pp. W. W. Norton & Company. $26.95

Socrates started what may have been the first technology scare. In the “Phaedrus,” he lamented the invention of books, which “create forgetfulness” in the soul. Instead of remembering for themselves, Socrates warned, new readers were blindly trusting in “external written characters.” The library was ruining the mind.

Needless to say, the printing press only made things worse. In the 17th century, Robert Burton complained, in “The Anatomy of Melancholy,” of the “vast chaos and confusion of books” that make the eyes and fingers ache. By 1890, the problem was the speed of transmission: one eminent physician blamed “the pelting of telegrams” for triggering an outbreak of mental illness. And then came radio and television, which poisoned the mind with passive pleasure. Children, it was said, had stopped reading books. Socrates would be pleased.

In “The Shallows: What the Internet Is Doing to Our Brains,” the technology writer Nicholas Carr extends this anxiety to the 21st century. The book begins with a melodramatic flourish, as Carr recounts the pleas of the supercomputer HAL in “2001: A Space Odyssey.” The machine is being dismantled, its wires unplugged: “My mind is going,” HAL says. “I can feel it.”

For Carr, the analogy is obvious: The modern mind is like the fictional computer. “I can feel it too,” he writes. “Over the last few years, I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory.” While HAL was silenced by its human users, Carr argues that we are sabotaging ourselves, trading away the seriousness of sustained attention for the frantic superficiality of the Internet. As Carr first observed in his much discussed 2008 article in The Atlantic, “Is Google Making Us Stupid?,” the mere existence of the online world has made it much harder (at least for him) to engage with difficult texts and complex ideas. “Once I was a scuba diver in a sea of words,” Carr writes, with typical eloquence. “Now I zip along the surface like a guy on a Jet Ski.”

This is a measured manifesto. Even as Carr bemoans his vanishing attention span, he’s careful to note the usefulness of the Internet, which provides us with access to a near infinitude of information. We might be consigned to the intellectual shallows, but these shallows are as wide as a vast ocean.

Nevertheless, Carr insists that the negative side effects of the Internet outweigh its efficiencies. Consider, for instance, the search engine, which Carr believes has fragmented our knowledge. “We don’t see the forest when we search the Web,” he writes. “We don’t even see the trees. We see twigs and leaves.” One of Carr’s most convincing pieces of evidence comes from a 2008 study that reviewed 34 million academic articles published between 1945 and 2005. While the digitization of journals made it far easier to find this information, it also coincided with a narrowing of citations, with scholars citing fewer previous articles and focusing more heavily on recent publications. Why is it that in a world in which everything is available we all end up reading the same thing?

But wait: it gets worse. Carr’s most serious charge against the Internet has nothing to do with Google and its endless sprawl of hyperlinks. Instead, he’s horrified by the way computers are destroying our powers of concentration. As the blogger Cory Doctorow, a co-editor of the wildly popular Web site Boing Boing, has observed, the typical electronic screen is an “ecosystem of interruption technologies,” encouraging us to peek at our e-mail in-box, glance at Twitter and waste away the day on eBay. And so we lurch from site to site, if only because we constantly crave the fleeting pleasure of new information. But this isn’t really the fault of the Internet. The online world has merely exposed the feebleness of human attention, which is so weak that even the most minor temptations are all but impossible to resist.

Carr extends these anecdotal observations by linking them to the plasticity of the brain, which is constantly being shaped by experience. While plasticity is generally seen as a positive feature — it keeps the cortex supple — Carr is interested in its dark side. He argues that our mental malleability has turned us into servants of technology, our circuits reprogrammed by our gadgets.

It is here that he starts to run into problems. There is little doubt that the Internet is changing our brain. Everything changes our brain. What Carr neglects to mention, however, is that the preponderance of scientific evidence suggests that the Internet and related technologies are actually good for the mind. For instance, a comprehensive 2009 review of studies published on the cognitive effects of video games found that gaming led to significant improvements in performance on various cognitive tasks, from visual perception to sustained attention. This surprising result led the scientists to propose that even simple computer games like Tetris can lead to “marked increases in the speed of information processing.” One particularly influential study, published in Nature in 2003, demonstrated that after just 10 days of playing Medal of Honor, a violent first-person shooter game, subjects showed dramatic increases in visual attention and memory.

Carr’s argument also breaks down when it comes to idle Web surfing. A 2009 study by neuroscientists at the University of California, Los Angeles, found that performing Google searches led to increased activity in the dorsolateral prefrontal cortex, at least when compared with reading a “book-like text.” Interestingly, this brain area underlies the precise talents, like selective attention and deliberate analysis, that Carr says have vanished in the age of the Internet. Google, in other words, isn’t making us stupid — it’s exercising the very mental muscles that make us smarter.

This doesn’t mean that the rise of the Internet won’t lead to loss of important mental talents; every technology comes with trade-offs. Look, for instance, at literacy itself: when children learn to decode letters, they usurp large chunks of the visual cortex previously devoted to object recognition. The end result is that literate humans are less able to “read” the details of the natural world.

While Carr tries to ground his argument in the details of modern neuroscience, his most powerful points have nothing to do with our plastic cortex. Instead, “The Shallows” is most successful when Carr sticks to cultural criticism, as he documents the losses that accompany the arrival of new technologies. The rise of the written text led to the decline of oral poetry; the invention of movable type wiped out the market for illuminated manuscripts; the television show obliterated the radio play (if hardly radio itself). Similarly, numerous surveys suggest that the Internet has diminished our interest in reading books. Carr quotes Wallace Stevens’s poem “The House Was Quiet and the World Was Calm,” in which stillness allows the reader to “become a book.” The incessant noise of the Internet, Carr concludes, has turned the difficult text into an obsolete relic.

Or maybe even these worries are mistaken; it can be hard to predict the future through the haze of nostalgia. In 1916, T. S. Eliot wrote to a friend about his recent experiments with composing poetry on the typewriter. The machine “makes for lucidity,” he said, “but I am not sure that it encourages subtlety.” A few years later, Eliot presented Ezra Pound with a first draft of “The Waste Land.” Some of it had been composed on the typewriter.

Nicholas Carr - The Shallows: What the Internet Is Doing to Our Brains

Nicholas Carr, author of the famous article in The Atlantic called "Is Google Making Us Stupid?", is back with this book that explores how the internet is messing with our minds. If Carr is right, the internet, for all of its exciting attributes, has a serious downside. I will be doing a number of posts on this fascinating book.

Tuesday, July 20, 2010

The Words of Aldous Huxley

"To be well informed, one must read quickly a great number of merely instructive books. To be cultivated, one must read slowly and with a lingering appreciation the comparatively few books that have been written by men who lived, thought, and felt with style."

- Aldous Huxley

Sunday, July 18, 2010

"Inception" ?

I've been waiting in vain all summer for a good movie to see with buttered popcorn. "Inception" is receiving strong reviews. I'll see it next weekend.

Tuesday, July 13, 2010

Hemingway Shoes Anyone?

Ernest Hemingway's Son Approves New Line Of Shoes
07/10/10 06:08 PM



BOZEMAN, Mont. — The 82-year-old son of writer Ernest Hemingway says his famous father would approve of a new line of shoes named after the Pulitzer and Nobel Prize-winning author, divided into angler, literary and sportsman collections.

Bozeman resident Patrick Hemingway tried on a pair of loafers Friday at Schnee's Boots and Shoes and says the best part is that he can wear them without socks, adding that his father also hated socks.

Thomas Raymond & Co. is launching the Hemingway line of men's footwear and plans to distribute them to 12 different retailers around the U.S. this fall.

The Hemingway shoes are made in El Salvador using bison and calf hide and cost from $150 to $235.

Patrick Hemingway says a lot of celebrity endorsements are phony but not this one.

Sunday, July 11, 2010

Janet Evanovich - Finger Lickin' Fifteen

I picked up this mass market book on an impulse at Publix, read it, and found it laugh-out-loud funny and breezy, but finally boring. The author is a mass market machine, having turned out 16 books in this series featuring the heroine Stephanie Plum, who chases bail bond jumpers. It's a good summer reading experience, but I think I'll stop with this one.

Friday, July 9, 2010

The Words of William Faulkner

"He has never been known to use a word that might send a reader to the dictionary."

- William Faulkner, commenting about Ernest Hemingway

Books vs. the Internet

If there is a war between the internet and books, I am on the side of books even though my mind is in both worlds.


The Medium Is the Medium
By DAVID BROOKS
Published: July 8, 2010
NY Times columnist

Recently, book publishers got some good news. Researchers gave 852 disadvantaged students 12 books (of their own choosing) to take home at the end of the school year. They did this for three successive years.

Then the researchers, led by Richard Allington of the University of Tennessee, looked at those students’ test scores. They found that the students who brought the books home had significantly higher reading scores than other students. These students were less affected by the “summer slide” — the decline that especially afflicts lower-income students during the vacation months. In fact, just having those 12 books seemed to have as much positive effect as attending summer school.

This study, along with many others, illustrates the tremendous power of books. We already knew, from research in 27 countries, that kids who grow up in a home with 500 books stay in school longer and do better. This new study suggests that introducing books into homes that may not have them also produces significant educational gains.

Recently, Internet mavens got some bad news. Jacob Vigdor and Helen Ladd of Duke’s Sanford School of Public Policy examined computer use among a half-million 5th through 8th graders in North Carolina. They found that the spread of home computers and high-speed Internet access was associated with significant declines in math and reading scores.

This study, following up on others, finds that broadband access is not necessarily good for kids and may be harmful to their academic performance. And this study used data from 2000 to 2005, before Twitter and Facebook took off.

These two studies feed into the debate that is now surrounding Nicholas Carr’s book, “The Shallows.” Carr argues that the Internet is leading to a short-attention-span culture. He cites a pile of research showing that the multidistraction, hyperlink world degrades people’s abilities to engage in deep thought or serious contemplation.

Carr’s argument has been challenged. His critics point to evidence that suggests that playing computer games and performing Internet searches actually improves a person’s ability to process information and focus attention. The Internet, they say, is a boon to schooling, not a threat.

But there was one interesting observation made by a philanthropist who gives books to disadvantaged kids. It’s not the physical presence of the books that produces the biggest impact, she suggested. It’s the change in the way the students see themselves as they build a home library. They see themselves as readers, as members of a different group.

The Internet-versus-books debate is conducted on the supposition that the medium is the message. But sometimes the medium is just the medium. What matters is the way people think about themselves while engaged in the two activities. A person who becomes a citizen of the literary world enters a hierarchical universe. There are classic works of literature at the top and beach reading at the bottom.

A person enters this world as a novice, and slowly studies the works of great writers and scholars. Readers immerse themselves in deep, alternative worlds and hope to gain some lasting wisdom. Respect is paid to the writers who transmit that wisdom.

A citizen of the Internet has a very different experience. The Internet smashes hierarchy and is not marked by deference. Maybe it would be different if it had been invented in Victorian England, but Internet culture is set in contemporary America. Internet culture is egalitarian. The young are more accomplished than the old. The new media is supposedly savvier than the old media. The dominant activity is free-wheeling, disrespectful, antiauthority disputation.

These different cultures foster different types of learning. The great essayist Joseph Epstein once distinguished between being well informed, being hip and being cultivated. The Internet helps you become well informed — knowledgeable about current events, the latest controversies and important trends. The Internet also helps you become hip — to learn about what’s going on, as Epstein writes, “in those lively waters outside the boring mainstream.”

But the literary world is still better at helping you become cultivated, mastering significant things of lasting import. To learn these sorts of things, you have to defer to greater minds than your own. You have to take the time to immerse yourself in a great writer’s world. You have to respect the authority of the teacher.

Right now, the literary world is better at encouraging this kind of identity. The Internet culture may produce better conversationalists, but the literary culture still produces better students.

It’s better at distinguishing the important from the unimportant, and making the important more prestigious.

Perhaps that will change. Already, more “old-fashioned” outposts are opening up across the Web. It could be that the real debate will not be books versus the Internet but how to build an Internet counterculture that will better attract people to serious learning.

Monday, July 5, 2010

The Perils of Progress


What about all the beautiful things that new technologies will take away from us?

by Rochelle Gurstein

We had just heard a lecture by an exquisitely sensitive, painfully alert poet friend of ours about how we live today. She ranged widely and brilliantly and did not shy away from hazarding, ever so gently, a few doubts about what the Internet was doing to the feel of our daily life. These days, even a few well-considered, measured reservations about digital gadgetry apparently cannot be tolerated, and our poet friend was informed by forward-looking members of the audience that she was fearful of change, nostalgic, in short, reactionary with all its nasty political connotations. How, I whispered to my husband, is being pro- or anti-technology a political stance? It says nothing, I thought to myself, about where one stands on justice, equality, or freedom, except in the rather debased form of "access" to information. I was abruptly brought back to the lecture hall, however, when I heard an ardent champion of blogs speak the word "dinosaur" with equal parts conviction and contempt. How tiresome, I again whispered to my husband. How many times have I seen skeptics of progress (myself included) turned into dinosaurs? Was there really no way of responding to one's opponents except to doom them to extinction?

The other morning, I felt the same unfair stacking of the deck against my sensibility when I read an op-ed piece in The New York Times by Steven Pinker, a popularizer of evolutionary psychology, where he defended Twitter, e-mail, PowerPoint, and Google from the charge that they are "making us stupid." (Whether they are or not, typing that sentence, I couldn't help thinking that their silly names and broken punctuation have a decidedly stupid adolescent feel about them.) It is a tried-and-true strategy of boosters of progress, even if they don't know they are following in a well-established tradition, to offer a catalogue of what now appears to be irrational fears about earlier versions of whatever they are promoting, the better to discredit present-day naysayers. Pinker, true to type, opens his piece: "New forms of media have always caused moral panics. The printing press, newspapers, paperbacks, and television were all once denounced as threats to their consumers' brainpower and moral fiber."

Just as these, in Pinker's estimation, proved to be false alarms, so, too, he confidently predicts, will be the case with the current moral panic over new electronic technologies. When I read his list of "reality checks" that are supposed to mollify critics—for example, "the decades of television, transistor radios, and rock videos were also decades in which I.Q. scores rose continually"—I can't say that I felt reassured. Instead, I was struck, as I often am at such moments, by the thought that if intelligent, sensitive people have long and consistently been alarmed by a particular class of thing, instead of automatically assuming our superiority to them, we might better assume they were aware of something to which we have since become oblivious and that it is worth our while to attend carefully to their warnings.

Pinker's description of earlier fears about the dangers of newspapers, paperbacks, and television as "threats to their consumers' brainpower and moral fiber," like his belief that a rise in "I.Q. scores" somehow offset or discredited earlier anxieties about what television, transistor radios, and rock videos were doing to people's sensibility and consciousness sounded tone-deaf to me.


When it came to the new mass-circulation press of the last quarter of the nineteenth century, leading critics like E.L. Godkin, editor of The Nation, thought its worst offense was "that its pervading spirit is one of vulgarity, indecency, and reckless sensationalism; that it steadily violates the canons alike of good taste and sound morals; that it cultivates false standards of life and demoralizes its readers." In the 1940s and '50s, critics on the Left perceived similar destructive forces in new forms of mass-produced entertainment. Dwight Macdonald spoke for writers associated with The Partisan Review and his own magazine Politics when he warned that "the deadening and warping effect of long exposure to movies, pulp magazines, and radio can hardly be overestimated."

Pinker closes his piece by praising Twitter and e-books and online encyclopedias for "helping us manage, search and retrieve our collective intellectual output at different scales." ("Manage, search and retrieve"—when, I asked myself, had thinking taken on the character of an army reconnaissance mission?) "Far from making us stupid," Pinker triumphantly concludes, "these technologies are the only things that will keep us smart." In disputes about the consequences of innovation, those on the side of progress habitually see only gains. They have no awareness that there are also losses—equally as real as the gains (even if the gain is as paltry as "keeping us smart")—and that no form of bookkeeping can ever reconcile the two. I had recently been reading some stirring essays in defense of the humanities and reading Pinker made me think of the kinds of losses that worried the great literary scholar Harry Levin back in 1954, the supposed golden age of the humanities: "This is the heyday of reprints and anthologies, not to speak of digests and abridgments. ... It may be that a commendable zeal for widespread literacy has somehow ended by spreading it too thin, with a resulting cultural inflation." I was interested to find that Levin was also troubled by mass culture. He pointed to the ever-increasing popularity of picture magazines like Life, of television and the phonograph, and worried aloud that "we are moving so quickly into the audio-visual epoch that the reading habit itself is seriously jeopardized."

In her lecture, our poet-friend expressed similar reservations about the fate of what remains of the reading habit in our digital era. She was also alert to another kind of loss, more elusive, having to do with a sense that the world we have on our computer screens lacks physical, tangible materiality and that it is changing the feel of our lives in unpredictable ways. Her observation has been much on my mind as I read in newspapers and magazines almost daily about the end of the book as we know it, how convenient it will be to download and read "content" on inert, blank screens with names like Amazon Kindle or Apple iPad. I recoil at this consumerist approach to books as immaterial content to be consumed. For me, books, like paintings, are tangible manifestations of a mind, of a person—

Camerado, this is no book,
Who touches this touches a man,
(Is it night? are we here together alone?)
It is I you hold and who holds you,
I spring from the pages into your arms—decease calls me
forth. (Whitman, Leaves of Grass)

Their physical being matters to us who pour our very being into writing and reading, we want the fruits of our labor to exist between hard or even soft covers in our own time and after us (and accept that the pages containing our being will turn brown and become brittle), it means something to us to see and speak of a book as a weighty tome or a slender volume, we like to be able to locate a passage we've already read spatially on a page, we are interested, even as we are dismayed, to discover that we are the first person in 61 years, eight months, and three days (according to the "due date" slip) to check a book out of the library, it pleases us to think of Whitman's leaves of grass as pages of a book, we are in awe of the perfection of the ending of Gabriel Garcia Marquez's One Hundred Years of Solitude where the reader's act of reading coincides with Aureliano's act of deciphering the pages of a mysterious set of parchments, and that as Aureliano comes to the end of his book—"he began to decipher the instant that he was living, deciphering it as he lived it, prophesying himself in the act of deciphering the last page of the parchments, as if he were looking into a speaking mirror"—so the reader comes to the last page and end of Marquez's novel.

All this to be lost for the sake of consumers who like the ease and efficiency of immaterial electronic "books"... But that is not all. The concrete, material presence of books on our bookshelves transports us back to the time and place where we first read them, we sometimes are pleased and other times shudder when we think of what a book meant to us then, what it has come to mean to us now, we are sometimes comforted to see the continuity of ourselves when we read our earlier marginalia, sometimes disconcerted by its now-alien quality, and occasionally we have dreams about books, like the one I had after my mentor died. When I was in graduate school, he used to lend me his books, their margins overflowing with neat, handwritten questions, objections, notes to himself (I can still picture the fine purple line quality of his felt-tip pen), teaching me how to read in conversation with the author, that is, when I paid attention to the author and not, as I was inclined to do, to the always more interesting thoughts of my mentor. When he died, I dreamt that he had left me a book that he had annotated especially for me and how grateful I was to have it ("who touches this touches a man") and how sorry I was to wake up.

Saturday, July 3, 2010

The Memory Theatre

My library is where I go to find my books. It is also the place where I go to remember, for the memories of my life are tied to my books.

In Defense of the Memory Theater

By Nathan Schneider

What concerns me about the literary apocalypse that everybody now expects—the at least partial elimination of paper books in favor of digital alternatives—is not chiefly the books themselves, but the bookshelf. My fear is for the eclectic, personal collections that we bookish people assemble over the course of our lives, as well as for their grander, public step-siblings. I fear for our memory theaters.

Thursday, July 1, 2010

The Best Presidents

FDR Rated Best President In Survey Of 238 Scholars
07/01/10 09:54 AM



LOUDONVILLE, N.Y. — Franklin Delano Roosevelt has been ranked the top president in U.S. history by 238 scholars surveyed by Siena College.

Roosevelt has topped each of the five presidential scholar surveys conducted by the Albany, N.Y.-area college since 1982. Theodore Roosevelt came in at No. 2 in the survey released Thursday, followed by Abraham Lincoln, George Washington and Thomas Jefferson.

Scholars ranked the 43 presidents on attributes such as integrity, intelligence, leadership and communication, as well as their accomplishments.

Lincoln's beleaguered successor, Andrew Johnson, was rated the worst president.

Multitasking: One of the perils of technology

I will be reading Nicholas Carr's new book.





Nicholas Carr On Colbert: Multitasking Makes Us Worse At Multitasking

Nicholas Carr, author of the new book "The Shallows: What the Internet Is Doing to Our Brains," talked to Stephen Colbert last night about how the Internet is making us more superficial. He said that we have begun to multitask more and more, and have a harder time focusing and thinking deeply because of it. In fact, Carr said, the more we multitask, the worse we get at multitasking itself, in addition to other cognitive tasks. Colbert offered up the website Reddit as an example, admitting his love for the site: "I could burn my entire life on that site," he said.

Carr said that what we need is a return to attentive thinking, that introspection and reflection have been the sources of most of the advances and knowledge that we have today, and that they should not be lost.