Sunday, May 31, 2015

Carol Berkin - A Brilliant Solution

I wasn't sure about reading this concise account of the Constitutional Convention because I knew nothing of the reputation of the author, a historian at Baruch College, and she does not include footnotes.  Every scholarly work should have footnotes!

But Gordon Wood endorses it on the back of the book, as does Molly Ivins, so I gave it a go.  It's okay, but I didn't learn anything new.

If nothing else, the author gives the reader a strong feeling for the compromises that went into the final product.  Anyone like Justice Scalia who goes by "originalism" is clearly mistaken.  Our Constitution wasn't written on stone tablets.  It was the best that these 55 men could come up with and nothing more.  I think they would be appalled at Justice Scalia and his "dead" document.

The book reinforces in my mind that the delegates dealt with the establishment of the new executive branch only at the end of the conclave.  I am reminded that they were most concerned with the legislative branch, as was typical of 18th-century thinking.

The author doesn't deal much with slavery, and this is a weakness in her presentation.  This is the subject of most interest to me.  Madison said the biggest differences were between the slave and the non-slave states, not the big states vs. the small states, and the author does not mention this.

The book is worth reading, but I will not be referring back to it.

Saturday, May 30, 2015

The Dylan Library (6)

"Things Have Changed" is such a great song.

The Democratic Nomination

I see that former Maryland Governor O'Malley has announced his candidacy.  I know next to nothing about him.  We do need an alternative to Hillary in case she falters, and I hope she doesn't.

History at Risk


Google is not the answer: How the digital age imperils history

From floppy disks to thumb drives, we get better at storing things -- while trapping history in obsolete formats

Our species created about 5 billion gigabytes of information from the dawn of time until 2003.  Before long, we will create that much information many times per day, according to IBM.  The problem: No one is doing enough to select and preserve the bits that really matter.
One of the great paradoxes of the digital age is that we are producing vastly more information than ever before, but we are not very good at preserving knowledge in digital form for the long haul.  There’s a difference between creating big server farms to store the information somewhere for near-term retrieval (industry is very good at that) and in fact choosing and preserving the data that matters, and being able to render it useful, at some time in the future (something that, scarily, we are not nearly as good at). We are radically underinvesting in the processes and technologies that will allow us to preserve our cultural, literary and scientific records.
Consider the experience of pulling out an old shoebox from under a bed and discovering a series of floppy disks there from the 1980s. Perhaps you smile, thinking of what might be on them; perhaps you shiver. How would you find out? Most of us have not preserved a vintage Macintosh SE to be able to play them back. Data formats have changed multiple times since then. From 8-inch to 5-and-a-quarter inch to 3-and-a-half-inch floppy disks to compact disks to thumb drives, we are continuously making progress in how we store our media — and trapping information in lost formats in the process. Best that you put the box back under the bed and not worry too much about it.
Obsolescence of this kind may, in fact, be a blessing. It’s important that much of the information we create is ephemeral. Otherwise, the world would become far too cluttered. Our behaviors would shift, torqued by the constant surveillance to which we increasingly subject ourselves. We would have an even harder time finding the knowledge that’s important in the vast ocean of the unimportant – much less making sense of it all.
It’s fine when it’s your old term papers that are locked away in an obsolete format.  And many blogs, tweets, photos and status updates don’t need to be kept for the long run. It’s not so fine, though, when the lost knowledge has historical significance.
The problem is not that it’s impossible to transfer information from one format to another; with enough effort and cost, most data can be transferred to formats that can be read today. A cloud-based world, to which we are headed, is likely to be simpler to manage than a world of shoeboxes, floppy disks and thumb drives.

Wednesday, May 27, 2015

Charles Leerhsen - Ty Cobb: A Terrible Beauty (2)

In March of 1994 we watched the filming of a scene from the Ron Shelton movie about Ty Cobb at Rickwood Field, with Tommy Lee Jones playing the title role.  The script was based on a magazine article by a man named Al Stump.  In this major revisionist biography of Ty Cobb, the author says the whole thing, article and movie, is pure fiction.  Mr. Stump had a habit of disguising fiction as fact.

Ty Cobb was not the meanest and most hated man in baseball.  He had his bad qualities as we all do, but he was not the monster he's been made out to be over the decades.  Sharpening his spikes and trying to hurt opposing players with them?  Not so, according to this author.

He got into his share of fights on the diamond, but so did lots of players during Cobb's era.  He had a hair-trigger temper, but so what?  If opposing players didn't like him, it was probably because he was such a fierce competitor.

Before I go on, here is something I want to say.  I grew up collecting baseball cards and memorizing statistics, for baseball of all the sports is a game of statistics.  Two that always stuck in my mind are that Cobb's lifetime average was .367 and he had 4,191 hits.  Somewhere in the last few years some statistical do-gooder did some research and took some hits and average away from Cobb.  This book shows an average of .366 and 4,189 hits.  I refuse to accept this!  I'm sticking with the original numbers.

In this book I learned that he tried to tutor Joe DiMaggio in his early years.  He tried to help Mantle.  He criticized Ted Williams for always trying to pull the ball and not hit to left field.  Boo!

He was a shrewd businessman who bought GM and Coca-Cola stock early on.  Ty Cobb was certainly a multimillionaire.

The book makes me see that Cobb was a "push" hitter.  He tried to hit balls over the infield but did not "swing for the fences."  This was dead-ball, small-ball baseball.  That changed when the ball was made livelier and a player named Babe Ruth came along.  By the '20s Cobb's style of baseball was in decline.  He held the bat with his hands spread apart and then adjusted his grip.  He would be laughed at today.

Billy Martin and Ty Cobb would have been a great combo.  They both played and taught the same brand of baseball.  Cobb would drive other teams crazy when on the base paths.  He would pressure the other team into making mistakes.  He stole 892 bases in his career, but according to this author, he was not exceptionally fast; he did it as much by studying pitchers and getting a good jump.  Ty Cobb was the ultimate competitor in the dead-ball era of small ball.

The reader appreciates that Cobb was a keen student of the game, that he was good at spotting and exploiting the weaknesses of his opponents, and that he could be a trickster.  I do not particularly like the latter.  One technique was to put dirt on a catcher's foot to distract him.  That is bush league in my opinion.  Pressure and agitate the opponent like Billy Martin did, but don't use juvenile tricks in place of outperforming your opponent on the field.

Will this book rehabilitate Cobb's image?  Only to those who read the book.

There are other Ty Cobb books out there, but I think I'm done after this one.  For me, the subject is exhausted.

Tuesday, May 26, 2015

Print Lives

Why digital natives prefer reading in print. Yes, you read that right.

Frank Schembari loves books — printed books. He loves how they smell. He loves scribbling in the margins, underlining interesting sentences, folding a page corner to mark his place.
Schembari is not a retiree who sips tea at Politics and Prose or some other bookstore. He is 20, a junior at American University, and paging through a thick history of Israel between classes, he is evidence of a peculiar irony of the Internet age: Digital natives prefer reading in print.
“I like the feeling of it,” Schembari said, reading under natural light in a campus atrium, his smartphone next to him. “I like holding it. It’s not going off. It’s not making sounds.”
Textbook makers, bookstore owners and college student surveys all say millennials still strongly prefer print for pleasure and learning, a bias that surprises reading experts given the same group’s proclivity to consume most other content digitally. A University of Washington pilot study of digital textbooks found that a quarter of students still bought print versions of e-textbooks that they were given for free.
“These are people who aren’t supposed to remember what it’s like to even smell books,” said Naomi S. Baron, an American University linguist who studies digital communication. “It’s quite astounding.”
Earlier this month, Baron published “Words Onscreen: The Fate of Reading in a Digital World,” a book (hardcover and electronic) that examines university students’ preferences for print and explains the science of why dead-tree versions are often superior to digital. Readers tend to skim on screens, distraction is inevitable and comprehension suffers.
In years of surveys, Baron asked students what they liked least about reading in print. Her favorite response: “It takes me longer because I read more carefully.”
The preference for print over digital can be found at independent bookstores such as the Curious Iguana in downtown Frederick, Md., where owner Marlene England said millennials regularly tell her they prefer print because it’s “easier to follow stories.” Pew studies show the highest print readership rates are among those ages 18 to 29, and the same age group is still using public libraries in large numbers.
It can be seen in the struggle of college textbook makers to shift their businesses to more profitable e-versions. Don Kilburn, North American president for Pearson, the largest publisher in the world and the dominant player in education, said the move to digital “doesn’t look like a revolution right now. It looks like an evolution, and it’s lumpy at best.”
And it can be seen most prominently on college campuses, where students still lug backpacks stuffed with books, even as they increasingly take notes (or check Facebook) on laptops during class. At American, Cooper Nordquist, a junior studying political science, is even willing to schlep around Alexis de Tocqueville’s 900-plus-page “Democracy in America.”
“I can’t imagine reading Tocqueville or understanding him electronically,” Nordquist said in between classes while checking his e-mail. “That would just be awful.”
Without having read Baron’s book, he offered reasons for his print preference that squared with her findings.
The most important one to him is “building a physical map in my mind of where things are.” Researchers say readers remember the location of information simply by page and text layout — that, say, the key piece of dialogue was on that page early in the book with that one long paragraph and a smudge on the corner. Researchers think this plays a key role in comprehension.
But that is more difficult on screens, primarily because the time we devote to reading online is usually spent scanning and skimming, with few places (or little time) for mental markers. Baron cites research showing readers spend a little more than one minute on Web pages, and only 16 percent of people read word-by-word. That behavior can bleed into reading patterns when trying to tackle even lengthier texts on-screen.
“I don’t absorb as much,” one student told Baron. Another said, “It’s harder to keep your place online.”
Another significant problem, especially for college students, is distraction. The lives of millennials are increasingly lived on screens. In her surveys, Baron writes that she found “jaw-dropping” results to the question of whether students were more likely to multitask in hard copy (1 percent) vs. reading on-screen (90 percent).
Earlier this month, while speaking to sophomores about digital behavior, Baron brought up the problem of paying close attention while studying on-screen.
“You just get so distracted,” one student said. “It’s like if I finish a paragraph, I’ll go on Tumblr, and then three hours later you’re still not done with reading.”
There are quirky, possibly lazy reasons many college students prefer print, too: They like renting textbooks that are already highlighted and have notes in the margins.
While Nordquist called this a crapshoot, Wallis Neff, a sophomore studying journalism, said she was delighted to get a psychology textbook last year that had been “run through the mill a few times.”
“It had a bunch of notes and things, explaining what this versus that was,” she said. “It was very useful.”
When do students say they prefer digital?
For science and math classes, whose electronic textbooks often include access to online portals that help walk them through study problems and monitor their learning. Textbook makers are pushing these “digital learning environments” to make screen learning more attractive.
They prefer them for classes in which locating information quickly is key — there is no control-F in a printed book to quickly find key words.
And they prefer them for cost — particularly when the price is free. The Book Industry Study Group recently found that about a quarter of 1,600 students polled either downloaded or knew someone who downloaded pirated textbooks. Students, it turns out, are not as noble in their reading habits when they need beer money. They become knowledge thieves.
But stealing texts probably is more a reflection on the spiraling cost of higher education — and the price of textbooks, up 82 percent from 2002 to 2012 — than some secret desire of students to read digitally. If price weren’t a factor, Baron’s research shows that students overwhelmingly prefer print. Other studies show similar results.
The problem, Baron writes, is that there has been a “pedagogical reboot” where faculty and textbook makers are increasingly pushing their students to digital to help defray costs “with little thought for educational consequences.”
“We need to think more carefully about students’ mounting rejection of long-form reading,” Baron writes.
And that thinking shouldn’t be limited to millennials, Baron said. Around the country, school systems are buying millions of tablets and laptops for classroom use, promising easier textbook updates, lower costs, less back strain from heavy book bags, and more interactivity. But the potential downsides aren’t being considered, she said.
“What’s happening in American education today?” she said. “That’s what I’m concerned about. What’s happening to the American mind?”
When Baron started researching her book on reading, some of her colleagues responded with pity.
“Did I fail to understand that technology marches on?” she writes. “That cars supplanted horses and buggies? That printing replaced handwritten manuscripts, computers replaced typewriters and digital screens were replacing books? Hadn’t I read the statistics on how many eReaders and tablets were being sold? Didn’t I see all those people reading eBooks on their mobile devices? Was I simply unable to adapt?”
But after learning what millennials truly think about print, Baron concluded, “I was roundly vindicated.”

From Michael O'Brien


The Minds of the South

Disunion follows the Civil War as it unfolded.
Early 1861 found the 23-year-old Henry Adams in Washington, working as the private secretary to his father, Charles Francis Adams, a representative from Massachusetts. Adams was a keen observer even at that early age, and he focused much of his attention on the Southern political delegations going through the throes of secession. To Adams, the Southerners were little more than madmen. In a Jan. 8 letter to one of his brothers, he wrote, “I do not want to fight them … They are mad, mere maniacs, and I want to lock them up till they become sane; not kill them. I want to educate, humanize, and refine them, not send fire and sword among them.”
Such stereotypes, though, ran counter to the way most Southerners saw themselves. To them, they were among the best and the brightest of their time: they read and wrote political philosophy, they studied statistics, they took an interest in sociology, they travelled and kept up with the latest European trends and they were fastidious Biblical and classical scholars. Above all, they had regularly produced philosophically adept politicians, not only in the earliest generation (Thomas Jefferson, James Madison) and in the middle generation (John C. Calhoun), but in the generation that opted for secession. To them the decision to secede was not rash, but rational, the result of reasoned discussion. On Feb. 18, 1861, the South Carolinian diarist Mary Chesnut wrote, “This southern Confederacy must be supported now by calm determination — & cool brains,” and she did not doubt that the decision to leave the Union evidenced a coolness of calm judgment.
She was not wrong. Historical evidence abounds that Southerners were not stupid or close-minded, let alone mad, but rather as capable of well-informed analysis as anyone else (though also as capable of making the wrong analysis). For good or ill, theirs was a culture which believed that ideas mattered and had consequences. In this spirit, the case for secession had been reasoned out over generations, in books, periodicals, pamphlets, sermons and speeches. It was not something invented, in a moment of panic after Lincoln’s election, but a machine made by many hands, which needed only to be started up when the moment seemed right. If secession was a mistake, as events were to prove, it was an intellectual blunder, not because it was incoherent as an assertion of political principle, but because it fatally mistook Northern resolve and failed.
An 1861 Currier & Ives cartoon depicting secession as a race off of a cliff. “We go it blind!” yells the figure depicting Alabama. (Library of Congress)
Antebellum Southerners had often disagreed with one another, and secession was no exception. It is probable that, at least before the summer of 1861, more Southerners opposed secession than agreed with it. Despite this habit of dissent, however, a few ideas had focused debate on what it meant to be a Southerner. Slavery was seen as fundamental to social order, though opinion was divided about why. Almost everyone agreed that the Bible and Christianity sanctioned the institution, some thought its contribution to sustaining racial hierarchy was indispensable, and no one doubted that the Southern economy would be wrecked by the elimination of forced labor.
Like other Americans, Southerners were interested in ideas of progress, thought themselves modern and understood how deeply enmeshed they were in modern capitalism. Most were earnest free traders. But, more than most Americans, Southerners had a sharp sense that progress did not come easily, that there were usually hard choices to be made about how its benefits could best be shared. It would be nice, they said, if everyone could painlessly benefit from progress. But many believed that tradeoffs were a “necessity,” a term that cropped up frequently in debates: among other “necessities,” they believed, was that in order for the majority whites of European descent to prosper, many — including Africans, Indians and Mexicans — had to lose out. A few Southerners felt guilty about this, many were smug and not a few respectfully blamed God for arranging life this way. The point is, they thought and argued about it, at length and in depth. Contrary to Henry Adams’s impression, ideas mattered in the Old South.
One idea in particular had gained purchase by the end of the antebellum era. Southerners were less interested in individualism than they had been in the 18th century, and more interested in the proposition that community was desirable and that a government which did not rest on shared social habits must fail. The “South” was one such community, but states even more so. Whereas a state in 1776 was mostly a polity, by 1860 it was thought to be a social world, with its own literature, habits, character and cultural institutions.
This emphasis on local values was not always in conflict with Unionism; Calhoun, for example, had been a dogged Unionist all his life, as well as a devoted South Carolinian. For all that, the cultural institutions of the states grew more elaborate, while those of the nation remained thinner. By 1860 there was a South Carolina Historical Society, but not yet an American Historical Association. There was a University of Virginia, but not a national university. So, when tensions within the Union grew intolerable, Southerners had alternatives — their states, their South, their version of America — ready to hand.
Indeed, the options seemed all the more attractive because they seemed so conformable to the political traditions of 1776. As Jefferson Davis put it in his inaugural address as provisional president of the Confederacy, “Our present position … illustrates the American idea that government rests upon the consent of the governed, and that it is the right of the people to alter or abolish a government whenever it becomes destructive of the ends for which it was established.”
This contrasted with prevailing Unionist ideas, articulated by Abraham Lincoln, which held that first there had been an American people and a Union, and then this Union had sanctioned the political subdivisions that were the states. But Davis believed this was backward: first there had been the individual colonies, which had broken with Britain and had established themselves as sovereign states, and then these states had agreed to create the United States as a workmanlike compact. The states that had made the compact were entitled to unmake it, if they followed the appropriate democratic procedures of consulting the popular will.
This was not just a difference of convenience, but a philosophical division about the nature of the state and time. Unionists were inclined to the view that, with the creation of the American republic, history had stopped and that the Union was literally perpetual, at least as long as the whole American people found it satisfactory. Davis thought that history could move on, that the political geography of God’s purposes for Americans could be rearranged without serious damage to providence.
As William Henry Trescot, one of the earliest proponents of secession, had put it in 1850, “We believe that the interests of the southern country demand a separate and independent government … The Union has redeemed a continent to the Christian world … It has developed a population with whom liberty is identical with law, and in training thirty-three States to manhood, has fitted them for the responsibility of independent national life … It has achieved its destiny. Let us achieve ours.”
Today Americans necessarily embrace the Unionist point of view of 1861, so that talk like Trescot and Davis’s gets branded as extremist and incoherent. Yet it was not always so: many American political thinkers before 1860 dissented from Lincoln’s version of history and thought Davis’s compact theory to be, at a minimum, a cogent proposition.
This much is clear: far from being a case of the crazy South splitting from the wise North, in 1861 the balance of rationality and irrationality was poised. Both sides had a delicate mix of wisdom and folly, clarity and confusion, altruism and self-interest. Neither had a marked advantage when it came to rationality or madness. And neither had the least idea what war would mean, as is often the way with the best and the brightest.

Monday, May 25, 2015

Charles Leerhsen - Ty Cobb: A Terrible Beauty

Many people still call Ty Cobb the greatest baseball player of all time.  My opinion is that such statements are foolish.  All you can say of any athlete is that he or she is the greatest of his or her time.  In the so-called dead-ball, pre-Babe Ruth era you can authoritatively say that Cobb was the best.  I am enjoying this new biography of the Georgia Peach.

Cobb was the best hitter of his time, but his split-handed slap hitting wouldn't work today.

I suspect his wild base running wouldn't work today either.

My conclusion so far is that Ty Cobb was a baseball player of his time.

The History of Memorial Day

Forgetting Why We Remember

MOST Americans know that Memorial Day is about honoring the nation’s war dead. It is also a holiday devoted to department store sales, half-marathons, picnics, baseball and auto racing. But where did it begin, who created it, and why?
At the end of the Civil War, Americans faced a formidable challenge: how to memorialize 625,000 dead soldiers, Northern and Southern. As Walt Whitman mused, it was “the dead, the dead, the dead — our dead — or South or North, ours all” that preoccupied the country. After all, if the same number of Americans per capita had died in Vietnam as died in the Civil War, four million names would be on the Vietnam Veterans Memorial, instead of 58,000.
Officially, in the North, Memorial Day emerged in 1868 when the Grand Army of the Republic, the Union veterans’ organization, called on communities to conduct grave-decorating ceremonies. On May 30, funereal events attracted thousands of people at hundreds of cemeteries in countless towns, cities and mere crossroads. By the 1870s, one could not live in an American town, North or South, and be unaware of the spring ritual.
But the practice of decorating graves — which gave rise to an alternative name, Decoration Day — didn’t start with the 1868 events, nor was it an exclusively Northern practice. In 1866 the Ladies’ Memorial Association of Columbus, Ga., chose April 26, the anniversary of Gen. Joseph Johnston’s final surrender to Gen. William T. Sherman, to commemorate fallen Confederate soldiers. Later, both May 10, the anniversary of Gen. Stonewall Jackson’s death, and June 3, the birthday of Jefferson Davis, were designated Confederate Memorial Day in different states.
Memorial Days were initially occasions of sacred bereavement, and from the war’s end to the early 20th century they helped forge national reconciliation around soldierly sacrifice, regardless of cause. In North and South, orators and participants frequently called Memorial Day an “American All Saints Day,” likening it to the European Catholic tradition of whole towns marching to churchyards to honor dead loved ones.
But the ritual quickly became the tool of partisan memory as well, at least through the violent Reconstruction years. In the South, Memorial Day was a means of confronting the Confederacy’s defeat but without repudiating its cause. Some Southern orators stressed Christian notions of noble sacrifice. Others, however, used the ritual for Confederate vindication and renewed assertions of white supremacy. Blacks had a place in this Confederate narrative, but only as time-warped loyal slaves who were supposed to remain frozen in the past.
The Lost Cause tradition thrived in Confederate Memorial Day rhetoric; the Southern dead were honored as the true “patriots,” defenders of their homeland, sovereign rights, a natural racial order and a “cause” that had been overwhelmed by “numbers and resources” but never defeated on battlefields.
Yankee Memorial Day orations often righteously claimed the high ground of blood sacrifice to save the Union and destroy slavery. It was not uncommon for a speaker to honor the fallen of both sides, but still lay the war guilt on the “rebel dead.” Many a lonely widow or mother at these observances painfully endured expressions of joyous death on the altars of national survival.
Some events even stressed the Union dead as the source of a new egalitarian America, and a civic rather than a racial or ethnic definition of citizenship. In Wilmington, Del., in 1869, Memorial Day included a procession of Methodists, Baptists, Unitarians and Catholics; white Grand Army of the Republic posts in parade with a black post; and the “Mount Vernon Cornet Band (colored)” keeping step with the “Irish Nationalists with the harp and the sunburst flag of Erin.”
But for the earliest and most remarkable Memorial Day, we must return to where the war began. By the spring of 1865, after a long siege and prolonged bombardment, the beautiful port city of Charleston, S.C., lay in ruin and occupied by Union troops. Among the first soldiers to enter and march up Meeting Street singing liberation songs was the 21st United States Colored Infantry; their commander accepted the city’s official surrender.
Whites had largely abandoned the city, but thousands of blacks, mostly former slaves, had remained, and they conducted a series of commemorations to declare their sense of the meaning of the war.
The largest of these events, forgotten until I had some extraordinary luck in an archive at Harvard, took place on May 1, 1865. During the final year of the war, the Confederates had converted the city’s Washington Race Course and Jockey Club into an outdoor prison. Union captives were kept in horrible conditions in the interior of the track; at least 257 died of disease and were hastily buried in a mass grave behind the grandstand.
After the Confederate evacuation of Charleston black workmen went to the site, reburied the Union dead properly, and built a high fence around the cemetery. They whitewashed the fence and built an archway over an entrance on which they inscribed the words, “Martyrs of the Race Course.”
The symbolic power of this Low Country planter aristocracy’s bastion was not lost on the freedpeople, who then, in cooperation with white missionaries and teachers, staged a parade of 10,000 on the track. A New York Tribune correspondent witnessed the event, describing “a procession of friends and mourners as South Carolina and the United States never saw before.”
The procession was led by 3,000 black schoolchildren carrying armloads of roses and singing the Union marching song “John Brown’s Body.” Several hundred black women followed with baskets of flowers, wreaths and crosses. Then came black men marching in cadence, followed by contingents of Union infantrymen. Within the cemetery enclosure a black children’s choir sang “We’ll Rally Around the Flag,” the “Star-Spangled Banner” and spirituals before a series of black ministers read from the Bible.
After the dedication the crowd dispersed into the infield and did what many of us do on Memorial Day: enjoyed picnics, listened to speeches and watched soldiers drill. Among the full brigade of Union infantrymen participating were the famous 54th Massachusetts and the 34th and 104th United States Colored Troops, who performed a special double-columned march around the gravesite.
The war was over, and Memorial Day had been founded by African-Americans in a ritual of remembrance and consecration. The war, they had boldly announced, had been about the triumph of their emancipation over a slaveholders’ republic. They were themselves the true patriots.
Despite the size and some newspaper coverage of the event, its memory was suppressed by white Charlestonians in favor of their own version of the day. From 1876 on, after white Democrats took back control of South Carolina politics and the Lost Cause defined public memory and race relations, the day’s racecourse origin vanished.
Indeed, 51 years later, the president of the Ladies’ Memorial Association of Charleston received an inquiry from a United Daughters of the Confederacy official in New Orleans asking if it was true that blacks had engaged in such a burial rite in 1865; the story had apparently migrated westward in community memory. Mrs. S. C. Beckwith, leader of the association, responded tersely, “I regret that I was unable to gather any official information in answer to this.”
Beckwith may or may not have known about the 1865 event; her own “official” story had become quite different and had no place for the former slaves’ march on their masters’ racecourse. In the struggle over memory and meaning in any society, some stories just get lost while others attain mainstream recognition.
AS we mark the Civil War’s sesquicentennial, we might reflect on Frederick Douglass’s words in an 1878 Memorial Day speech in New York City, in which he unwittingly gave voice to the forgotten Charleston marchers.
He said the war was not a struggle of mere “sectional character,” but a “war of ideas, a battle of principles.” It was “a war between the old and the new, slavery and freedom, barbarism and civilization ... and in dead earnest for something beyond the battlefield.” With or against Douglass, we still debate the “something” that the Civil War dead represent.
The old racetrack is gone, but an oval roadway survives on the site in Hampton Park, named for Wade Hampton, former Confederate general and the governor of South Carolina after the end of Reconstruction. The old gravesite of the Martyrs of the Race Course is gone too; they were reinterred in the 1880s at a national cemetery in Beaufort, S.C.
But the event is no longer forgotten. Last year I had the great honor of helping a coalition of Charlestonians, including the mayor, Joseph P. Riley, dedicate a marker to this first Memorial Day by a reflecting pool in Hampton Park.
By their labor, their words, their songs and their solemn parade on their former owners’ racecourse, black Charlestonians created for themselves, and for us, the Independence Day of a Second American Revolution.

Sunday, May 24, 2015

Dylan Endures

Say what you will: Dylan lives. Dylan endures. Out here on Highway 61 let us endure with him. He doesn't have to explain himself. There is nothing more he needs to say. One year short of 75, Dylan lives.

Friday, May 22, 2015

Joseph J. Ellis - The Quartet

Joseph Ellis is my favorite early American historian.  He writes clearly and persuasively for the layman.

His thesis here is that the ratification of the Constitution was the real founding of the United States.  The Declaration was an affirmation by 13 sovereign states, which then bound themselves together in a confederation under what was called the Articles of Confederation.  That confederation was not the country founded by the Constitution.

The drive to adopt the new Constitution was principally the work of James Madison, John Jay, Alexander Hamilton, and, of course, George Washington.  These four made it happen.

Ellis starts with the bold proclamation that Lincoln was wrong in his famous Gettysburg Address.  Four score and seven years ago our fathers did NOT bring forth a new nation.  A proclamation of independence did not create the United States.  The states were not united until the Constitution went into effect in 1788.  Americans tend to see a straight line from 1776 to the Constitution.  In fact it's a blurry line that could have gone in a different direction, but didn't.

All four leading Federalists preferred a stronger statement of federal sovereignty.

There is a long chapter on John Jay.  He is the one founding father that I've never had a feel for.  I still don't after reading this book.

The power and influence of George Washington is amazing.  There never has been and never will be a President of his stature.

Lincoln was wrong in his Gettysburg Address saying that the country began on 7/4/76.  P. XI

"In 1863 Lincoln has some compelling reasons for bending the arc of American history in a national direction, since he was then waging a civil war on behalf of a union that he claimed predated the existence of the states.  P. XII

The transition from the DOI to the Constitution cannot be described as natural and inevitable.  P. XIII

Ellis thinks the motivation present in Philadelphia was more political than Beardian economic.  I say there is still something to the Progressive position.

By 1787 the confederation was on the verge of dissolution.  P. XVII

The framers of the Constitution subordinated the moral to the political in putting slavery into the framework of the document permitting the continuance and expansion of the peculiar institution.   Did they do the right thing?   They certainly could not have imagined a biracial society.  P. XIX

It is clear that the political framework drawn up by the Articles of Confederation was not designed to function as a national government.  P. 7

For Washington the West replaced the War as the common bond of the people.  P. 27

Ellis says that Hamilton predicted that if the states did not disappear there would be a civil war over slavery.  How chilling.  P. 166

The Federalists had an almost mystical feeling that ratification of the new Constitution was foreordained.  P. 167

New York voted for ratification 30 to 27 after 10 states had already voted affirmatively.  P. 188

There seemed to be a providential feel to the founders as the Constitution went into effect.  P. 191

Jefferson was SO concerned about restricting the powers of the national government whereas Madison initially was not so concerned.

Neither Madison nor Jefferson could foresee that the Supreme Court would become the ultimate arbiter of the Constitution and the Bill of Rights.  P. 204

Madison correctly believed that more abuses would occur at the state rather than the federal level.  P. 210

Madison's motivations in drafting the BOR were political, not philosophical.  He had no sense that he was writing sacred script.  He was acting not as a political philosopher but a political strategist.  He repeatedly said he didn't think the Constitution needed a bill of rights.  The BOR for Madison was the completion of the ratification process.   P. 212-213

The thing that most intrigues me about Madison is how quickly he went from being an ardent nationalist to a states-righter in 1791.  Ellis says this is a historical mystery, one that has "baffled historians."  P. 215

Legitimate government must rest on a popular foundation, but popular majorities cannot be trusted to act responsibly, a paradox that served the country well.  P. 218

Jefferson went to his grave believing that federal authority over domestic policy was a betrayal of the American Revolution rather than a rescue.  Jefferson was wrong.  P. 219

"Jefferson spoke for all the most prominent members of the revolutionary generation in urging posterity not to regard their political prescriptions as sacred script.  It is richly ironic that one of the few original intentions they all shared was opposition to any judicial doctrine of 'original intent.'  To be sure, they all wished to be remembered, but they did not want to be embalmed."  P. 220


Thursday, May 21, 2015

The Oregon Trail Generation: Life Before and After Mainstream Tech

BY Anna Garvey
Social Media Week
21 April 2015

We’re an enigma, those of us born at the tail end of the 70s and the start of the 80s. Some of the “generational” experts lazily glom us on to Generation X, and others just shove us over to the Millennials they love to hate – no one really gets us or knows where we belong.

We’ve been called Generation Catalano, Xennials, and The Lucky Ones, but no name has really stuck for this strange micro-generation that has both a healthy portion of Gen X grunge cynicism, and a dash of the unbridled optimism of Millennials.

A big part of what makes us the square peg in the round hole of named generations is our strange relationship with technology and the internet.  We came of age just as the very essence of communication was experiencing a seismic shift, and it’s given us a unique perspective that’s half analog old school and half digital new school.


You Have Died of Dysentery


If you can distinctly recall the excitement of walking into your weekly computer lab session and seeing a room full of Apple IIe's displaying the start screen of Oregon Trail, you’re a member of this nameless generation, my friend.

We were the first group of kids who grew up with household computers, machines still novel enough to elicit confusion and wonder.  Gen X individuals were already fully formed teens or young adults when computers became mainstream, and Millennials can’t even remember a time before computers.

But, when we first placed our sticky little fingers on a primitive Mac, we were elementary school kids whose brains were curious sponges.  We learned how to use these impressive machines at a time when average middle class families were just starting to be able to afford to buy their own massive desktops.

This made us the first children to grow up figuring it out, as opposed to having an innate understanding of new technology the way Millennials did, or feeling slightly alienated from it the way Gen X did.


An AOL Adolescence


Did you come home from middle school and head straight to AOL, praying all the time that you’d hear those magic words, “You’ve Got Mail” after waiting for the painfully slow dial-up internet to connect?  If so, then yes, you are a member of the Oregon Trail Generation.  And you are definitely part of this generation if you hopped in and out of sketchy chat rooms asking others their A/S/L (age/sex/location for the uninitiated).

Precisely at the time that you were becoming obsessed with celebrities, music and the opposite sex, you magically had access to “the internet,” a thing that few normal people even partially grasped the power of at the time.

We were the first group of high school kids to do research for papers both online and in an old-fashioned card catalogue, which many millennials have never even heard of by the way (I know because I asked my 21-year-old intern and he started stuttering about library cards).

Because we had one foot in the traditional ways of yore and one foot in the digital information age, we appreciate both in a way that other generations don’t.  We can quickly turn curmudgeonly in the face of teens who’ve never written a letter, but we’re glued to our smartphones just like they are.

Those born in the late 70s and early 80s were the last group to have a childhood devoid of all the technology that makes childhood and adolescence today pretty much the worst thing imaginable.  We were the last gasp of a time before sexting, Facebook shaming, and constant communication.

We used pay-phones; we showed up at each other’s houses without warning; we often spoke to our friends’ parents before we got to speak to them; and we had to wait at least an hour to see any photos we’d taken.  But for the group of kids just a little younger than us, the whole world changed, and that’s not an exaggeration.  In fact, it’s possible that you had a completely different childhood experience than a sibling just 5 years your junior, which is pretty mind-blowing.


Napster U


Thanks to the evil genius of Sean Parker, most of us were in college in the heyday of Napster and spent many a night using the university’s communal Ethernet to pillage our friends’ music libraries at breakneck speeds.  With mouths agape at having downloaded the entire OAR album in under five seconds, we built our music libraries faster than any other dorm-dwelling generation in history.

We were the first to experience the beauty of sharing and downloading mass amounts of music faster than you can say, “Third Eye Blind,” which made the adoption of MP3 players and music streaming apps perfectly natural.  Yet, we still distinctly remember buying cassette singles, joining those scam-tastic CD clubs and recording songs onto tapes from the radio.  The very nature of buying and listening to music changed completely within the first 20 years of our lives.


A Youth Untouched by Social Media


The importance of going through some of life’s toughest years without the toxic intrusion of social media really can’t be overstated.  Myspace was born in 2003 and Facebook became available to all college students in 2004.  So if you were born in 1981-1982, for example, you were literally in the last graduating class to finish college without social media being part of the experience.

When we get together with our fellow Oregon Trail Generation friends, we frequently discuss how insanely glad we are that we escaped the middle school, high school and college years before social media took over and made an already challenging life stage exponentially more hellish.

We all talked crazy amounts of shit about each other, took pictures of ourselves and our friends doing shockingly inappropriate things and spread rumors like it was our jobs, but we just never had to worry about any of it ending up in a place where everyone and their moms (literally) could see it a hot second after it happened.

But unlike our older Gen X siblings, we were still young and dumb enough to get really into MySpace and Facebook in their first few years, so we understand what it feels like to overshare on social media and stalk a new crush’s page.

Time after time, we late 70s and early 80s babies were on the cusp of changes that essentially transformed modern life and, for better or worse, it’s shaped who we are and how we relate to the world.

Anna Garvey is the Director of Content and Social Media for WebRev Marketing & Design, a boutique firm in Chicago. In past lives, she’s also been an ex-pat in Italy and a 6th grade teacher on the Southside of Chicago. When she’s not scouring the internet for social media and blog fodder, she enjoys Netflix binges, soulful music and New Orleans culture.

Wednesday, May 20, 2015

Hillary's Favorite Historian

Meet Hillary's Historian: Professor Sean Wilentz, Partisan Jacksonian Democrat


WASHINGTON -- As a presidential candidate, Hillary Clinton has a tight circle of advisers who counsel her on economic policy, foreign affairs and politics in general. In Sean Wilentz, she also has something of a house historian.
Wilentz, a Princeton professor, was an outspoken supporter of Clinton during her previous presidential bid, and has remained close to her since, according to Clinton insiders. He has been helping Clinton understand where and how her potential administration, and that of her husband Bill Clinton, fit into the arc of progressive history over the last half-century or more, according to people who know both him and the candidate.
Wilentz, Princeton's George Henry Davis 1886 professor of American history, was a guest of honor at a Ready for Hillary event in the Hamptons, one Clinton source said, and remains in close touch with Clinton.
The role of Wilentz is noteworthy because of the political perspective he brings as an expert on President Andrew Jackson, considered the founder of the Democratic Party. Jackson was a relentless partisan and a populist, who attacked the aristocracy on behalf of the working class. (The white working class, that is; Jackson was also viciously racist and genocidal in his treatment of Native Americans.)
Wilentz, reached Tuesday in Germany, where he is teaching a course and working on a new book, declined to discuss his conversations with Clinton or her campaign. He said the best way to understand his current thinking on politics is to read his essay, "The Mirage," published in 2011 in The New Republic.
The lesson that Wilentz has drawn from his study of 19th century politics, as well as the Reagan era, which he dates to the mid-'70s, is that partisanship is a necessary element of the political process, and that those preaching non-partisan or post-partisan politics are naive at best, and more likely guided by an agenda to benefit those already in power. Wilentz notes that the pre-Civil War South was an example of post-partisan politics in action, as the South lived under one-party, Democratic rule. The Confederacy itself, he argues, was the most robust attempt at government without parties in the last 200 years of American history. It was also, not coincidentally, a wildly unequal society, with a few families controlling nearly all the wealth, the rest of the white population subsisting on little, and millions of black people enslaved.
Wilentz was highly critical of President Barack Obama during his first campaign, and continuing into his presidency, for his message of post-partisanship and his relentless efforts to strike grand bargains with the GOP. Former Rep. Barney Frank (D-Mass.), before Obama was even sworn in, had articulated the same critique in a more playful way. "I think he overestimates his ability to take people -- particularly our colleagues on the right -- and sort of charm them into being nice. I know he talks about being post-partisan. But I’ve worked frankly with Newt Gingrich, Tom DeLay, and the current Republican leadership," Frank said. "When he talks about being post-partisan, having seen these people and knowing what they would do in that situation, I suffer from post-partisan depression."
Instead, argues Wilentz, a clear-eyed partisan battle must be waged if a party wants to implement its agenda. If Clinton were to follow his counsel, the result would be a more combative administration than House and Senate Republicans have dealt with under Obama.
Our greatest presidents have been the fiercest partisans, Wilentz argued in a 2011 Stanford Lecture Series, titling his talk, "The True and Tragical History of Post Partisanship." His list of great partisan presidents: Thomas Jefferson, Andrew Jackson, Abraham Lincoln, Ulysses S. Grant, Theodore Roosevelt (before his 1912 run on a third-party ticket), Woodrow Wilson, Franklin Roosevelt and Bill Clinton. (Wilentz is also the author of a best-selling biography of Bob Dylan and knew the musician as a young child growing up in New York.)
Over the course of his career, however, Wilentz has been a better analyst of partisan political history than as a partisan operator himself. His 2007 and 2008 polemics against Obama on behalf of Clinton, mostly published in the New Republic, fell flat or backfired. And a more recent attack on Edward Snowden, Glenn Greenwald and Julian Assange was the subject of ridicule. (Which Wilentz told HuffPost he accepted "with pride," standing by his argument.)
In January 2013, shortly after Hillary Clinton suffered a concussion at home, Sidney Blumenthal, the journalist and former aide to Bill Clinton, wrote an email to her expressing concern. The message was later leaked after his email account was hacked. Sean Wilentz, Blumenthal assured Clinton, was thinking of her. Blumenthal added that he'd been doing some Clintonland matchmaking. "I've hooked up Sean, who flew to New Orleans for a few days, with James [Carville], who's giving him a tour of the music scene tomorrow, Thursday, and bringing him to the field of the Battle of NO. James is on the 200th anniversary commission and Sean, of course, is the Andrew Jackson expert," Blumenthal wrote.
During the 2008 Democratic primary, Clinton often positioned herself as the experienced politician who knew how to deal with Republican opponents, an approach that the advice of Wilentz would only reinforce.
The influence of Wilentz can also be seen in some of Clinton's more recent rhetoric. "Democracy can come undone. It's not something that's necessarily going to last forever once it's been established," Wilentz wrote in his book, The Rise of American Democracy. If people lose faith that politics is on the level, democracy itself can erode, creating what Wilentz often refers to as "a crisis of democracy." His new book, he said, is about the fate of democracy, a study of the arc of democratic politics and inequality.
"We've got to do a better job of getting our economy growing again and producing results and renewing the American dream so that Americans feel that they have a stake in the future and that the economy and the political system is not stacked against them, because that will erode the trust that is at the basis of our democracy," Clinton said in Aspen last June in remarks that HuffPost interpreted as influenced by Elizabeth Warren, but may in fact be more Wilentzian.

Monday, May 18, 2015

Billy Martin (4)

Alfred Manuel "Billy" Martin died on December 25, 1989.  I remember because we were in Modesto that Christmas where the O'Rileys always passed around the morning paper and I remember reading about the car accident that killed Billy the next morning.  I was saddened by the news.

Not that I followed Billy's career that closely, but I was always somewhat aware of his presence in the baseball world mainly because he was several times the manager of the New York Yankees when a big time jerk named George Steinbrenner owned the team and fired managers almost annually.

Billy grew up in West Berkeley, California.  This is the poor side of Berkeley.  From the start baseball was his passion.  He told everyone as a kid that he would one day play for the Yankees and by golly he did.  He was Mr. New York during his time in the big town, buddies with The Mick, Yogi, and Whitey Ford.  He and Mantle had many a good time together.  Billy Martin knew everybody of note in New York, and everybody knew him.

Billy Martin was a good second baseman, but he gained his rep as a manager.  He had 9 gigs as a manager and was very successful, yet he was fired 9 times because he always had off-the-field issues.  Steinbrenner fired him 5 times.  The book makes you think he might have been back managing the Yankees in 1989 had he lived.

Billy Martin was a true baseball genius, a student of the game like no other.  Most of his managerial competitors said he was the best.  His vision saw the whole field.  His modus operandi was to put pressure on the other team, to rattle, to force them into mistakes.  Billy Ball it was called.  Stealing bases.  Double steals.  Even triple steals.  Stealing home.  He loved stealing home.  Yelling at opposing players.  Taking advantage of their weaknesses and tendencies.  In the language of poker, he looked for "tells" against his opponents.  Working umpires.  His confrontations with umps are legendary.  Anything to score runs.

He was married four times and divorced three times.  He had two children, including Billy, Jr.  We have to say that he was not a good husband or father.

The author says that Martin liked to talk about the Civil War and in particular General Lee's tactical choices at Gettysburg.  I wonder which side he was on.  The author says that Billy could entertain by talking like Donald Duck.  How hilarious to think about!

Baseball writers constantly refer to the 50's as baseball's golden age.  Why is this?

If Billy Martin were Auburn's coach, we'd win the conference title every year.  With a team like we have this year, he would extract every run he could out of this bunch.

He was a big-time alcoholic, which was his undoing; it led to barroom fights that constantly got him into trouble and to his repeated firings.  He couldn't hold a managerial job not because he couldn't manage but because of off-the-field issues.  The worst situation came that Christmas Day, when his pickup ran off the road and into a ditch in Fenton, New York, in the Binghamton area where he was living with his last wife, Jill.  Another man was in the truck with him.  Most likely both were drunk as sots.  Who was driving the vehicle?  It was a subject of litigation.  Most likely the other man was driving.  Neither was wearing a seat belt.  If Billy had been wearing one, he likely would have survived the crash.

The man had a passion.  That passion was baseball.  It is thrilling to read about a man with such a passion.  Billy Martin was a man of awesome talent and awesome weaknesses.  He had an identity.  He was always a New York Yankee even from childhood.  Once a Marine, always a Marine.  Once a Yankee, always a Yankee.


It Was a Crime

Surprise! It turns out that there’s something to be said for having the brother of a failed president make his own run for the White House. Thanks to Jeb Bush, we may finally have the frank discussion of the Iraq invasion we should have had a decade ago.
But many influential people — not just Mr. Bush — would prefer that we not have that discussion. There’s a palpable sense right now of the political and media elite trying to draw a line under the subject. Yes, the narrative goes, we now know that invading Iraq was a terrible mistake, and it’s about time that everyone admits it. Now let’s move on.
Well, let’s not — because that’s a false narrative, and everyone who was involved in the debate over the war knows that it’s false. The Iraq war wasn’t an innocent mistake, a venture undertaken on the basis of intelligence that turned out to be wrong. America invaded Iraq because the Bush administration wanted a war. The public justifications for the invasion were nothing but pretexts, and falsified pretexts at that. We were, in a fundamental sense, lied into war.
The fraudulence of the case for war was actually obvious even at the time: the ever-shifting arguments for an unchanging goal were a dead giveaway. So were the word games — the talk about W.M.D. that conflated chemical weapons (which many people did think Saddam had) with nukes, the constant insinuations that Iraq was somehow behind 9/11.
And at this point we have plenty of evidence to confirm everything the war’s opponents were saying. We now know, for example, that on 9/11 itself — literally before the dust had settled — Donald Rumsfeld, the secretary of defense, was already plotting war against a regime that had nothing to do with the terrorist attack. “Judge whether good enough [to] hit S.H. [Saddam Hussein] ...sweep it all up things related and not”; so read notes taken by Mr. Rumsfeld’s aide.
This was, in short, a war the White House wanted, and all of the supposed mistakes that, as Jeb puts it, “were made” by someone unnamed actually flowed from this underlying desire. Did the intelligence agencies wrongly conclude that Iraq had chemical weapons and a nuclear program? That’s because they were under intense pressure to justify the war. Did prewar assessments vastly understate the difficulty and cost of occupation? That’s because the war party didn’t want to hear anything that might raise doubts about the rush to invade. Indeed, the Army’s chief of staff was effectively fired for questioning claims that the occupation phase would be cheap and easy.
Why did they want a war? That’s a harder question to answer. Some of the warmongers believed that deploying shock and awe in Iraq would enhance American power and influence around the world. Some saw Iraq as a sort of pilot project, preparation for a series of regime changes. And it’s hard to avoid the suspicion that there was a strong element of wagging the dog, of using military triumph to strengthen the Republican brand at home.
Now, you can understand why many political and media figures would prefer not to talk about any of this. Some of them, I suppose, may have been duped: may have fallen for the obvious lies, which doesn’t say much about their judgment. More, I suspect, were complicit: they realized that the official case for war was a pretext, but had their own reasons for wanting a war, or, alternatively, allowed themselves to be intimidated into going along. For there was a definite climate of fear among politicians and pundits in 2002 and 2003, one in which criticizing the push for war looked very much like a career killer.
On top of these personal motives, our news media in general have a hard time coping with policy dishonesty. Reporters are reluctant to call politicians on their lies, even when these involve mundane issues like budget numbers, for fear of seeming partisan. In fact, the bigger the lie, the clearer it is that major political figures are engaged in outright fraud, the more hesitant the reporting. And it doesn’t get much bigger — indeed, more or less criminal — than lying America into war.
But truth matters, and not just because those who refuse to learn from history are doomed in some general sense to repeat it. The campaign of lies that took us into Iraq was recent enough that it’s still important to hold the guilty individuals accountable. Never mind Jeb Bush’s verbal stumbles. Think, instead, about his foreign-policy team, led by people who were directly involved in concocting a false case for war.
So let’s get the Iraq story right. Yes, from a national point of view the invasion was a mistake. But (with apologies to Talleyrand) it was worse than a mistake, it was a crime.

Saturday, May 16, 2015

Woody Allen Muses About Emma Stone, Death, and His “Catastrophic Mistake”

By Julie Miller
Vanity Fair
15 May 2015

Warning: there are mild spoilers about the film plot ahead.

Woody Allen may have had a difficult time understanding the questions from international reporters at Friday’s Cannes press conference for Irrational Man, but there was one query he jumped to answer.

In the spirit of Allen’s latest picture, which involves a murder plot, a foreign journalist wondered why the filmmaker killed off so many of his comedy characters (or had characters considering the act) over the years. And, while on that subject, had Allen himself ever “considered murdering someone?”

“Even as you speak,” Allen quipped, to laughter from adoring press. Flanked by his Irrational Man actresses Emma Stone and Parker Posey—star Joaquin Phoenix was M.I.A. from the Mediterranean—Allen was making a rare public appearance begrudgingly, or at least so says Page Six. Last decade, the famously press-shy actor made a point of flying out for the Cannes Film Festival whenever he had a project because, as he told The Guardian, “The French people have been so supportive of my films for so many years. . . . I thought I ought to make a gesture of reciprocity.” Of course, Allen never said that he would be demonstratively happy about promoting his films in the South of France.

In an interview immediately before the press conference began that was streamed live throughout the Palais des Festivals, Allen answered questions as succinctly as possible. When asked why he cast Stone again, after their first collaboration in Magic in the Moonlight, Allen said, “She was at the door. . . . And she looked hungry, so we gave her a part.” Asked about the making of this film, he offered the unfortunate reporter these words, “It’s just the idea of a story and you tell the story as the narrative unfolds.”

Several minutes later, after being tossed into a room full of journalists, Allen was able to crank up the charm a few notches.

Asked about working with Stone—essentially the same question he had been asked moments before—the filmmaker delivered a monologue about the joys of the Internet’s sweetheart.

“I saw her in a movie by accident. I was walking on the treadmill and looking for something to distract me and then I saw Emma. And she looked beautiful. And then she was very funny in the movie and I thought to myself, This is someone who would be very interesting to work with. And I worked with her in the first movie. She was great, absolutely great. Lived up to all of the hype. . . . So I had no problem thinking of her for Irrational Man. . . she amazed me.”

(Seeming to notice that it would be rude to leave Posey out of the praise-fest, as she was the only other person on the panel, Allen also volunteered his reasoning for why he wanted to work with the fabulous, if under-appreciated, indie actress. “I had always wanted to work with Parker Posey because I liked the name Parker Posey. I had seen her in all of these offbeat movies. She was always great, and I always thought, Will I ever get to be on set and say, ‘Where is Parker Posey?’”)

Stone was asked whether, as the latest Woody Allen muse, there were plans to collaborate again. The actress revealed, “There are no plans as it stands to do a third [film].” Looking in Allen’s direction, Stone added, “But wouldn’t that be nice?”

A reporter wondered whether Allen had been in touch with another of his brightest stars, Cate Blanchett, since her Oscar win for his 2013 drama, Blue Jasmine. The filmmaker surprised reporters by saying that he had actually not been in touch with Blanchett since way before the Academy Award.

“I have not seen or spoken to Cate since that movie [wrapped],” Allen explained. “It's all very professional . . . people go their separate ways after a film. You come in and you shoot. The last day everyone is very teary because you are not going to see the people anymore. But then you go off, and get on with your life. So I have not seen or spoken to Cate since that picture was over.”

The actors aren’t the only ones Allen disconnects with the second a project ends—he also leaves each of his films in the rear-view mirror. When the moderator brought up Allen’s 1989 release, Crimes and Misdemeanors, the filmmaker asked, quite genuinely, “Crimes and Misdemeanors—yeah, what happened in it? I can’t remember.”

“When I make a film, I never ever look at it again once I put it out because, if you look at it again, you can always see what you did wrong and how you could improve it.” Allen explained that if humanly possible, he would absolutely remake every single one of his films. “Charlie Chaplin had the luxury of shooting a whole film all at one time. He could look at it and study it and then shoot it again and again if he wanted to. Films were not that costly at that time . . . I would very happily take every movie that I have and improve it if I could get the cast back and the circumstances exact, roll back time, get the money.”

Recently, it was announced that the filmmaker will create a television series for Amazon, a project that Allen really regrets signing onto. “It was a catastrophic mistake for me,” he grumbled about the endeavor. “I'm struggling with it at home. . . . I never should have gotten into it. I thought it would be easy. You know, you do a movie . . . it’s a big, long thing. But to do six half hours, I thought it would be a cinch. . . . It is very, very hard. And I just hope I don’t disappoint Amazon. . . . I don't watch any of those television series, really, so I don’t know what I am doing. . . . I expect this to be a cosmic embarrassment when it comes out.”

If there was one subject that Allen seemed even more comfortable discussing than his own professional shortcomings, it was his looming death, and how filmmaking has distracted him from it.

“Making films, you know there is no positive answer to the grim reality of life. No matter how much the philosophers or the priests or the psychiatrists talk to you, the bottom line after all the talk is that life has its own agenda and it runs right over you while you are prattling. . . . And the only way out of it, as an artist, is to try and come up with something where we can explain to people why life is worth living. And is a positive thing. And does have some meaning. Now, you can’t really do that without conning [moviegoers]. You can't be honest because in the end, it has no meaning.

“You are living in a random universe,” he continued. “You are living a meaningless life. And everything you create or you do is going to vanish with the sun burning out and the universe will be gone and it’s over. My conclusion is that the only possible way you can beat [this conclusion] even a little bit is through distraction. . . . Making movies is a wonderful distraction.”

While no journalist dared to ask Allen about any of his personal distractions—such as his purported controversies over the years, and decades’ worth of unsettling allegations that resurfaced last year—the filmmaker did, at one point, go on a curious tangent about morality.

When asked whether any of his Irrational Man characters were truly moral, Allen suggested that Stone’s character—who finds herself excited by the possibility of criminal behavior—would evolve morally as she ages.

“Her perspectives will change from what they were at the end of the movie,” he told press. “When she is in her 40s or 50s or 70s, the perspective will change and she won’t be as hard on herself on certain issues. And she will be much harder on herself on other issues.”

Stone appeared surprised by this information, and a reporter asked whether this was more character description than Allen had provided her during the entire course of filming. “Yeah, it's interesting!” she said. Turning to Allen, she pleaded, “Keep going! What about in her 80s?”

Allen, who enters that very decade in December, deadpanned, “In her 80s, she packs it in.” After journalists laughed, he added, “Her 80s is just all face work.”

When the press conference wrapped, about a dozen reporters immediately dropped what was left of their professional guard and flooded the podium with glossy photos and other film memorabilia for the director to autograph. At the Cannes Film Festival, for better or worse, the focus apparently really is the filmmaking.