Friday, November 29, 2013

The Disadvantages of an Elite Education





By William Deresiewicz


It didn’t dawn on me that there might be a few holes in my education until I was about 35. I’d just bought a house, the pipes needed fixing, and the plumber was standing in my kitchen. There he was, a short, beefy guy with a goatee and a Red Sox cap and a thick Boston accent, and I suddenly learned that I didn’t have the slightest idea what to say to someone like him. So alien was his experience to me, so unguessable his values, so mysterious his very language, that I couldn’t succeed in engaging him in a few minutes of small talk before he got down to work. Fourteen years of higher education and a handful of Ivy League degrees, and there I was, stiff and stupid, struck dumb by my own dumbness. “Ivy retardation,” a friend of mine calls this. I could carry on conversations with people from other countries, in other languages, but I couldn’t talk to the man who was standing in my own house.



It’s not surprising that it took me so long to discover the extent of my miseducation, because the last thing an elite education will teach you is its own inadequacy. As two dozen years at Yale and Columbia have shown me, elite colleges relentlessly encourage their students to flatter themselves for being there, and for what being there can do for them. The advantages of an elite education are indeed undeniable. You learn to think, at least in certain ways, and you make the contacts needed to launch yourself into a life rich in all of society’s most cherished rewards. To consider that while some opportunities are being created, others are being cancelled and that while some abilities are being developed, others are being crippled is, within this context, not only outrageous, but inconceivable.



I’m not talking about curricula or the culture wars, the closing or opening of the American mind, political correctness, canon formation, or what have you. I’m talking about the whole system in which these skirmishes play out. Not just the Ivy League and its peer institutions, but also the mechanisms that get you there in the first place: the private and affluent public “feeder” schools, the ever-growing parastructure of tutors and test-prep courses and enrichment programs, the whole admissions frenzy and everything that leads up to and away from it. The message, as always, is the medium. Before, after, and around the elite college classroom, a constellation of values is ceaselessly inculcated. As globalization sharpens economic insecurity, we are increasingly committing ourselves—as students, as parents, as a society—to a vast apparatus of educational advantage. With so many resources devoted to the business of elite academics and so many people scrambling for the limited space at the top of the ladder, it is worth asking what exactly it is you get in the end—what it is we all get, because the elite students of today, as their institutions never tire of reminding them, are the leaders of tomorrow.



The first disadvantage of an elite education, as I learned in my kitchen that day, is that it makes you incapable of talking to people who aren’t like you. Elite schools pride themselves on their diversity, but that diversity is almost entirely a matter of ethnicity and race. With respect to class, these schools are largely—indeed increasingly—homogeneous. Visit any elite campus in our great nation and you can thrill to the heartwarming spectacle of the children of white businesspeople and professionals studying and playing alongside the children of black, Asian, and Latino businesspeople and professionals. At the same time, because these schools tend to cultivate liberal attitudes, they leave their students in the paradoxical position of wanting to advocate on behalf of the working class while being unable to hold a simple conversation with anyone in it. Witness the last two Democratic presidential nominees, Al Gore and John Kerry: one each from Harvard and Yale, both earnest, decent, intelligent men, both utterly incapable of communicating with the larger electorate.



But it isn’t just a matter of class. My education taught me to believe that people who didn’t go to an Ivy League or equivalent school weren’t worth talking to, regardless of their class. I was given the unmistakable message that such people were beneath me. We were “the best and the brightest,” as these places love to say, and everyone else was, well, something else: less good, less bright. I learned to give that little nod of understanding, that slightly sympathetic “Oh,” when people told me they went to a less prestigious college. (If I’d gone to Harvard, I would have learned to say “in Boston” when I was asked where I went to school—the Cambridge version of noblesse oblige.) I never learned that there are smart people who don’t go to elite colleges, often precisely for reasons of class. I never learned that there are smart people who don’t go to college at all.



I also never learned that there are smart people who aren’t “smart.” The existence of multiple forms of intelligence has become a commonplace, but however much elite universities like to sprinkle their incoming classes with a few actors or violinists, they select for and develop one form of intelligence: the analytic. While this is broadly true of all universities, elite schools, precisely because their students (and faculty, and administrators) possess this one form of intelligence to such a high degree, are more apt to ignore the value of others. One naturally prizes what one most possesses and what most makes for one’s advantages. But social intelligence and emotional intelligence and creative ability, to name just three other forms, are not distributed preferentially among the educational elite. The “best” are the brightest only in one narrow sense. One needs to wander away from the educational elite to begin to discover this.



What about people who aren’t bright in any sense? I have a friend who went to an Ivy League college after graduating from a typically mediocre public high school. One of the values of going to such a school, she once said, is that it teaches you to relate to stupid people. Some people are smart in the elite-college way, some are smart in other ways, and some aren’t smart at all. It should be embarrassing not to know how to talk to any of them, if only because talking to people is the only real way of knowing them. Elite institutions are supposed to provide a humanistic education, but the first principle of humanism is Terence’s: “nothing human is alien to me.” The first disadvantage of an elite education is how very much of the human it alienates you from.



The second disadvantage, implicit in what I’ve been saying, is that an elite education inculcates a false sense of self-worth. Getting to an elite college, being at an elite college, and going on from an elite college—all involve numerical rankings: SAT, GPA, GRE. You learn to think of yourself in terms of those numbers. They come to signify not only your fate, but your identity; not only your identity, but your value. It’s been said that what those tests really measure is your ability to take tests, but even if they measure something real, it is only a small slice of the real. The problem begins when students are encouraged to forget this truth, when academic excellence becomes excellence in some absolute sense, when “better at X” becomes simply “better.”



There is nothing wrong with taking pride in one’s intellect or knowledge. There is something wrong with the smugness and self-congratulation that elite schools connive at from the moment the fat envelopes come in the mail. From orientation to graduation, the message is implicit in every tone of voice and tilt of the head, every old-school tradition, every article in the student paper, every speech from the dean. The message is: You have arrived. Welcome to the club. And the corollary is equally clear: You deserve everything your presence here is going to enable you to get. When people say that students at elite schools have a strong sense of entitlement, they mean that those students think they deserve more than other people because their SAT scores are higher.



At Yale, and no doubt at other places, the message is reinforced in embarrassingly literal terms. The physical form of the university—its quads and residential colleges, with their Gothic stone façades and wrought-iron portals—is constituted by the locked gate set into the encircling wall. Everyone carries around an ID card that determines which gates they can enter. The gate, in other words, is a kind of governing metaphor—because the social form of the university, as is true of every elite school, is constituted the same way. Elite colleges are walled domains guarded by locked gates, with admission granted only to the elect. The aptitude with which students absorb this lesson is demonstrated by the avidity with which they erect still more gates within those gates, special realms of ever-greater exclusivity—at Yale, the famous secret societies, or as they should probably be called, the open-secret societies, since true secrecy would defeat their purpose. There’s no point in excluding people unless they know they’ve been excluded.







One of the great errors of an elite education, then, is that it teaches you to think that measures of intelligence and academic achievement are measures of value in some moral or metaphysical sense. But they’re not. Graduates of elite schools are not more valuable than stupid people, or talentless people, or even lazy people. Their pain does not hurt more. Their souls do not weigh more. If I were religious, I would say, God does not love them more. The political implications should be clear. As John Ruskin told an older elite, grabbing what you can get isn’t any less wicked when you grab it with the power of your brains than with the power of your fists. “Work must always be,” Ruskin says, “and captains of work must always be….[But] there is a wide difference between being captains…of work, and taking the profits of it.”



The political implications don’t stop there. An elite education not only ushers you into the upper classes; it trains you for the life you will lead once you get there. I didn’t understand this until I began comparing my experience, and even more, my students’ experience, with the experience of a friend of mine who went to Cleveland State. There are due dates and attendance requirements at places like Yale, but no one takes them very seriously. Extensions are available for the asking; threats to deduct credit for missed classes are rarely, if ever, carried out. In other words, students at places like Yale get an endless string of second chances. Not so at places like Cleveland State. My friend once got a D in a class in which she’d been running an A because she was coming off a waitressing shift and had to hand in her term paper an hour late.



That may be an extreme example, but it is unthinkable at an elite school. Just as unthinkably, she had no one to appeal to. Students at places like Cleveland State, unlike those at places like Yale, don’t have a platoon of advisers and tutors and deans to write out excuses for late work, give them extra help when they need it, pick them up when they fall down. They get their education wholesale, from an indifferent bureaucracy; it’s not handed to them in individually wrapped packages by smiling clerks. There are few, if any, opportunities for the kind of contacts I saw my students get routinely—classes with visiting power brokers, dinners with foreign dignitaries. There are also few, if any, of the kind of special funds that, at places like Yale, are available in profusion: travel stipends, research fellowships, performance grants. Each year, my department at Yale awards dozens of cash prizes for everything from freshman essays to senior projects. This year, those awards came to more than $90,000—in just one department.



Students at places like Cleveland State also don’t get A-’s just for doing the work. There’s been a lot of handwringing lately over grade inflation, and it is a scandal, but the most scandalous thing about it is how uneven it’s been. Forty years ago, the average GPA at both public and private universities was about 2.6, still close to the traditional B-/C+ curve. Since then, it’s gone up everywhere, but not by anything like the same amount. The average GPA at public universities is now about 3.0, a B; at private universities it’s about 3.3, just short of a B+. And at most Ivy League schools, it’s closer to 3.4. But there are always students who don’t do the work, or who are taking a class far outside their field (for fun or to fulfill a requirement), or who aren’t up to standard to begin with (athletes, legacies). At a school like Yale, students who come to class and work hard expect nothing less than an A-. And most of the time, they get it.



In short, the way students are treated in college trains them for the social position they will occupy once they get out. At schools like Cleveland State, they’re being trained for positions somewhere in the middle of the class system, in the depths of one bureaucracy or another. They’re being conditioned for lives with few second chances, no extensions, little support, narrow opportunity—lives of subordination, supervision, and control, lives of deadlines, not guidelines. At places like Yale, of course, it’s the reverse. The elite like to think of themselves as belonging to a meritocracy, but that’s true only up to a point. Getting through the gate is very difficult, but once you’re in, there’s almost nothing you can do to get kicked out. Not the most abject academic failure, not the most heinous act of plagiarism, not even threatening a fellow student with bodily harm—I’ve heard of all three—will get you expelled. The feeling is that, by gosh, it just wouldn’t be fair—in other words, the self-protectiveness of the old-boy network, even if it now includes girls. Elite schools nurture excellence, but they also nurture what a former Yale graduate student I know calls “entitled mediocrity.” A is the mark of excellence; A- is the mark of entitled mediocrity. It’s another one of those metaphors, not so much a grade as a promise. It means, don’t worry, we’ll take care of you. You may not be all that good, but you’re good enough.



Here, too, college reflects the way things work in the adult world (unless it’s the other way around). For the elite, there’s always another extension—a bailout, a pardon, a stint in rehab—always plenty of contacts and special stipends—the country club, the conference, the year-end bonus, the dividend. If Al Gore and John Kerry represent one of the characteristic products of an elite education, George W. Bush represents another. It’s no coincidence that our current president, the apotheosis of entitled mediocrity, went to Yale. Entitled mediocrity is indeed the operating principle of his administration, but as Enron and WorldCom and the other scandals of the dot-com meltdown demonstrated, it’s also the operating principle of corporate America. The fat salaries paid to underperforming CEOs are an adult version of the A-. Anyone who remembers the injured sanctimony with which Kenneth Lay greeted the notion that he should be held accountable for his actions will understand the mentality in question—the belief that once you’re in the club, you’ve got a God-given right to stay in the club. But you don’t need to remember Ken Lay, because the whole dynamic played out again last year in the case of Scooter Libby, another Yale man.



If one of the disadvantages of an elite education is the temptation it offers to mediocrity, another is the temptation it offers to security. When parents explain why they work so hard to give their children the best possible education, they invariably say it is because of the opportunities it opens up. But what of the opportunities it shuts down? An elite education gives you the chance to be rich—which is, after all, what we’re talking about—but it takes away the chance not to be. Yet the opportunity not to be rich is one of the greatest opportunities with which young Americans have been blessed. We live in a society that is itself so wealthy that it can afford to provide a decent living to whole classes of people who in other countries exist (or in earlier times existed) on the brink of poverty or, at least, of indignity. You can live comfortably in the United States as a schoolteacher, or a community organizer, or a civil rights lawyer, or an artist—that is, by any reasonable definition of comfort. You have to live in an ordinary house instead of an apartment in Manhattan or a mansion in L.A.; you have to drive a Honda instead of a BMW or a Hummer; you have to vacation in Florida instead of Barbados or Paris, but what are such losses when set against the opportunity to do work you believe in, work you’re suited for, work you love, every day of your life?



Yet it is precisely that opportunity that an elite education takes away. How can I be a schoolteacher—wouldn’t that be a waste of my expensive education? Wouldn’t I be squandering the opportunities my parents worked so hard to provide? What will my friends think? How will I face my classmates at our 20th reunion, when they’re all rich lawyers or important people in New York? And the question that lies behind all these: Isn’t it beneath me? So a whole universe of possibility closes, and you miss your true calling.



This is not to say that students from elite colleges never pursue a riskier or less lucrative course after graduation, but even when they do, they tend to give up more quickly than others. (Let’s not even talk about the possibility of kids from privileged backgrounds not going to college at all, or delaying matriculation for several years, because however appropriate such choices might sometimes be, our rigid educational mentality places them outside the universe of possibility—the reason so many kids go sleepwalking off to college with no idea what they’re doing there.) This doesn’t seem to make sense, especially since students from elite schools tend to graduate with less debt and are more likely to be able to float by on family money for a while. I wasn’t aware of the phenomenon myself until I heard about it from a couple of graduate students in my department, one from Yale, one from Harvard. They were talking about trying to write poetry, how friends of theirs from college called it quits within a year or two while people they know from less prestigious schools are still at it. Why should this be? Because students from elite schools expect success, and expect it now. They have, by definition, never experienced anything else, and their sense of self has been built around their ability to succeed. The idea of not being successful terrifies them, disorients them, defeats them. They’ve been driven their whole lives by a fear of failure—often, in the first instance, by their parents’ fear of failure. The first time I blew a test, I walked out of the room feeling like I no longer knew who I was. The second time, it was easier; I had started to learn that failure isn’t the end of the world.



But if you’re afraid to fail, you’re afraid to take risks, which begins to explain the final and most damning disadvantage of an elite education: that it is profoundly anti-intellectual. This will seem counterintuitive. Aren’t kids at elite schools the smartest ones around, at least in the narrow academic sense? Don’t they work harder than anyone else—indeed, harder than any previous generation? They are. They do. But being an intellectual is not the same as being smart. Being an intellectual means more than doing your homework.



If so few kids come to college understanding this, it is no wonder. They are products of a system that rarely asked them to think about something bigger than the next assignment. The system forgot to teach them, along the way to the prestige admissions and the lucrative jobs, that the most important achievements can’t be measured by a letter or a number or a name. It forgot that the true purpose of education is to make minds, not careers.



Being an intellectual means, first of all, being passionate about ideas—and not just for the duration of a semester, for the sake of pleasing the teacher, or for getting a good grade. A friend who teaches at the University of Connecticut once complained to me that his students don’t think for themselves. Well, I said, Yale students think for themselves, but only because they know we want them to. I’ve had many wonderful students at Yale and Columbia, bright, thoughtful, creative kids whom it’s been a pleasure to talk with and learn from. But most of them have seemed content to color within the lines that their education had marked out for them. Only a small minority have seen their education as part of a larger intellectual journey, have approached the work of the mind with a pilgrim soul. These few have tended to feel like freaks, not least because they get so little support from the university itself. Places like Yale, as one of them put it to me, are not conducive to searchers.







Places like Yale are simply not set up to help students ask the big questions. I don’t think there ever was a golden age of intellectualism in the American university, but in the 19th century students might at least have had a chance to hear such questions raised in chapel or in the literary societies and debating clubs that flourished on campus. Throughout much of the 20th century, with the growth of the humanistic ideal in American colleges, students might have encountered the big questions in the classrooms of professors possessed of a strong sense of pedagogic mission. Teachers like that still exist in this country, but the increasingly dire exigencies of academic professionalization have made them all but extinct at elite universities. Professors at top research institutions are valued exclusively for the quality of their scholarly work; time spent on teaching is time lost. If students want a conversion experience, they’re better off at a liberal arts college.



When elite universities boast that they teach their students how to think, they mean that they teach them the analytic and rhetorical skills necessary for success in law or medicine or science or business. But a humanistic education is supposed to mean something more than that, as universities still dimly feel. So when students get to college, they hear a couple of speeches telling them to ask the big questions, and when they graduate, they hear a couple more speeches telling them to ask the big questions. And in between, they spend four years taking courses that train them to ask the little questions—specialized courses, taught by specialized professors, aimed at specialized students. Although the notion of breadth is implicit in the very idea of a liberal arts education, the admissions process increasingly selects for kids who have already begun to think of themselves in specialized terms—the junior journalist, the budding astronomer, the language prodigy. We are slouching, even at elite schools, toward a glorified form of vocational training.



Indeed, that seems to be exactly what those schools want. There’s a reason elite schools speak of training leaders, not thinkers—holders of power, not its critics. An independent mind is independent of all allegiances, and elite schools, which get a large percentage of their budget from alumni giving, are strongly invested in fostering institutional loyalty. As another friend, a third-generation Yalie, says, the purpose of Yale College is to manufacture Yale alumni. Of course, for the system to work, those alumni need money. At Yale, the long-term drift of students away from majors in the humanities and basic sciences toward more practical ones like computer science and economics has been abetted by administrative indifference. The college career office has little to say to students not interested in law, medicine, or business, and elite universities are not going to do anything to discourage the large percentage of their graduates who take their degrees to Wall Street. In fact, they’re showing them the way. The liberal arts university is becoming the corporate university, its center of gravity shifting to technical fields where scholarly expertise can be parlayed into lucrative business opportunities.



Finishing Up JFK

The month of JFK comes to an end.  I learned a lot reading about our 35th President.

First of all, I am not interested in assassination speculation.  It's all a waste of time.  Until hard evidence surfaces, a mysterious man named Lee Harvey Oswald shot and killed President Kennedy from the 6th floor of the then Texas School Book Depository.  He acted alone, though influenced by the pro-Cuba anti-Americanism of the times.  Case closed.

JFK saved the world by his handling of the Cuban Missile Crisis.  He learned to distrust the military and CIA during the Bay of Pigs disaster.  You have to think this played a part in how he handled the Cuban crisis.  He successfully jawboned the steel industry into lowering its prices.  He successfully handled the Berlin crisis.  The wall went up, but better a wall than a war.  He was a late-comer to the civil rights movement.  JFK was something of a pragmatic liberal, controlled by events rather than controlling them.  He mishandled the Oxford crisis.  If he had done the right things, there might not have been a riot and two people killed.  Conservatives try to claim him, but they are wrong.  He was not a heart-felt, passionate liberal, but he was liberal nonetheless.  He came into office totally focused on foreign affairs and lessening tensions with the Soviet Union, and without a domestic agenda.  I did not realize the extent of this until I began reading.

As time goes on, his stock will improve with professional historians, and deservedly so.  The sad thing is that he was so hated in the South because of race.  The country was not united behind his idealism; only part of the country was united.  This is one of the great misconceptions about his administration.

The Camelot stuff is fantasy.  Reality is rarely fantasy.

JFK was a rich kid who never had to work a real job a day in his life.  My reading makes clear how his domineering daddy bankrolled everything Kennedy ever did. 

He was witty; he was articulate; he was inspiring.  Too bad I was too young, and grew up in the South, and so missed the Kennedy mystique.

Tuesday, November 26, 2013

Larry Sabato - The Kennedy Half-Century

I think I've finished my JFK reading for the fall as we come to the end of November, remembering 11/22/63.  This is a humdrum book, a summary of JFK's life and the events leading up to the assassination and its aftermath.  Nothing new here, and nothing new anywhere regarding the assassination.  I don't get into the conspiracy nonsense.  Let it go, I've learned.

Monday, November 25, 2013

The Difference

Democrats work to help people who need help.  That other party, they work to help people who don't need help.  Right there is the difference between the two parties, and that's all there is to it.

-Harry Truman

Saturday, November 23, 2013

The Same People Who Hated Kennedy Hate Obama

22 Days of JFK


Hating Kennedy

By John Avlon, November 23rd, 2013, 2:10 pm

A look back at the voices who excoriated John F. Kennedy during his presidency gives perspective to the vitriol directed at Barack Obama today.

This week, America was fixated on the 50th anniversary of John F. Kennedy's assassination. Grief nourishes myth, and a new CNN poll registers JFK as our most admired ex-president from the past half century.

His brief 1,000 days in the Oval Office loom large in American memory because of his abrupt loss: a psychic wound that shaped a generation, symbolizing a collective loss of innocence.

Perhaps inevitably, we buy into the idea that President Kennedy was as beloved in life as he has been in death. Of course, this was not the case.

There are always cranks and conspiracy theorists who nourish themselves on the bile that comes from hating the president of the United States. Some are just obsessive-compulsive hyper-partisans, some nurse groupthink grievances, while others can be fairly classified as prejudiced or simply unhinged.

But a brief look back at the chorus of voices who hated President Kennedy offers some perspective on our own occasionally overheated political passions and on the vitriol directed at President Barack Obama.

During the 1960 campaign, Kennedy confronted the persistent strain of anti-Catholic bigotry that had surfaced repeatedly since the days of the Know-Nothing Party in the mid-19th century, which promised to purify American politics of Catholic or other immigrant influences deemed alien.

In Texas, the Baptist convention passed a resolution "cautioning members against voting for a Roman Catholic candidate" – a measure echoed across a handful of other states – buoyed by the argument that a Catholic president would put loyalty to the Pope ahead of loyalty to the United States. Just weeks after his election, a virulently anti-Catholic retired postal worker tried to assassinate Kennedy in Florida.

After the botched Bay of Pigs invasion, Kennedy became a curse word among many Cuban exiles who blamed the president for abandoning their brothers on the beaches to Fidel Castro. Even a half-century later, the community's anger continues.

"Kennedy betrayed the Cuban people," Vicente Blanco told the Orlando Sun Sentinel this month while another Bay of Pigs veteran named Carl Sudano remembered that after the assassination in Dallas, "I shed no tears."

Kennedy's initially tentative embrace of civil rights caused him to be hated by some in the South. When James Meredith integrated the University of Mississippi, he was escorted by 300 federal troops, while more than 2,000 students protested, chanting "Two, four, one, three, we hate Kennedy."

In Georgia, a movie theatre showing the film PT109 decorated its marquee with this message: "See how the Japs almost got Kennedy."

But it's worth remembering that Kennedy was not always beloved by the Left. He was never fully embraced by the liberal wing of the Democratic Party, and as a result of his hard line against Khrushchev during the Cuban Missile Crisis, British Nobel Prize winner and pacifist Bertrand Russell (who once described Kennedy as "much more wicked than Hitler") sent a telegram to Kennedy stating "Your actions desperate ... no conceivable justification. We will not have mass murder. End this madness."

On the flip side of the aisle, the far-right John Birch Society mouthpiece, American Opinion, accused Kennedy of "shameless intimidation, bribery, and blackmail" which compelled "weaklings in Congress to approve treasonable acts designed to disarm us and make us the helpless prey of the affiliated criminals and savages of the United Nations".

President Kennedy also confronted the forerunners of the modern "patriot group" militia movement, warning of "armed bands of civilian guerrillas that are more likely to supply local vigilantes than national vigilance".

And days before Kennedy's assassination, thousands of fliers were distributed in downtown Dallas, featuring a mugshot photo of Kennedy over the words "Wanted for Treason."

Among the charges:

- "Betraying the Constitution (which he swore to uphold): He is turning sovereignty of the US over to the communist controlled United Nations: He is betraying our friends and befriending our enemies."

- "He has given support and encouragement to the Communist-inspired racial riots."

- "He has illegally invaded a sovereign State with federal troops."

- "He has consistently appointed Anti-Christians to Federal office: Upholds the Supreme Court in its Anti-Christian rulings. Aliens and known Communists abound in federal offices."

Against the backdrop of history, it is sobering to learn that the US Secret Service, which protects presidents, investigated 34 threats on President Kennedy's life from the state of Texas alone.

Why recount these long-gone grievances from the fringes of the early 1960s? Because we hear the echoes of some of the unhinged ideas that distort our own debates – the idea that President Obama is un-American or anti-American, of questionable faith and loyalties, that he is hell-bent on betraying the constitution and surrendering sovereignty to the U.N.

The persistence of this paranoid style in American politics says more about its articulators than the political leaders they project upon. They will not look any better in the eyes of history.

Eric Foner - Lincoln Scholar

Last Thursday night I had the pleasure of hearing the esteemed historian Eric Foner of Columbia University speak at UAB on the legacy of Abraham Lincoln.  It was a highly rewarding experience for me.

The event was at UAB's Alumni Hall, a great new facility on the campus.  The place was packed with faculty, students, and the public.

Here are the notes I took of his talk.

Abraham Lincoln is the most iconic figure in American history.

You can find many Lincolns.  Everyone tries to claim Lincoln for their own.

There have been many Lincoln movies, not just the recent Spielberg one.  Foner is dismissive of Spielberg, calling him a "megalomaniac" for postponing the release of his Lincoln movie until after the 2012 election.  In general, Foner is dismissive of movies.

The end of slavery was a process, not a sudden event like the passage of the 13th Amendment.

From the beginning of the war slaves ran away to Union lines.

The key turning point was the Emancipation Proclamation (EP).  If there is one point Foner stresses above all, it is the importance of the EP.

There is no Lincoln document where he authoritatively summarizes his views on slavery.

If there is another Foner theme in addition to his emphasis on the importance of the EP, it is his belief in Lincoln's capacity for growth.  The liberal Foner sees Lincoln moving toward a liberal view on slavery and race.  This view comes through clearly in his Lincoln book.

Lincoln was not an abolitionist but he was influenced by the abolitionists and moved in that direction during the war.

Lincoln was a consummate politician in a time when politicians were admired, unlike today.

Lincoln rose to prominence through his oratory condemning slavery via the Declaration of Independence.

The baseline of L's opposition to slavery was economic: slavery was a theft of labor.

Each person has the right to enjoy the fruits of his own labor.

Lincoln spoke of natural rights embodied in the Declaration of Independence not political rights: the right to the fruits of one's labor as a natural right but not political and social equality.

Lincoln, Clay, and others could speak of ending slavery coupled with colonization, and thereby avoid the issue of what to do with freed former slaves in this country.

Lincoln did not degrade blacks as Jefferson did.

Foner takes a shot at Doris Kearns Goodwin.  She does not mention colonization in her book.  The subject doesn't fit the story she wishes to tell.

Before the war it seemed that colonization was necessary to deal with eliminating slavery.

Lincoln's plan to eliminate slavery: 1) gradual, 2) compensated, 3) colonization.  But the war changed everything.

Lincoln was not an abolitionist.  He was not a "secret abolitionist."

The natural rights of the DOI but not political rights.

Colonization was a way to end slavery without dealing with its consequences in this country.

The most important Lincoln act was the Emancipation Proclamation.

Military emancipation was common in this time.

The South heard "ultimate extinction."

The South clearly thought slavery was threatened which triggered secession.

Foner does not see how slavery could have peacefully ended.

The border states rejected gradual, compensated emancipation.

Circumstances forced Lincoln to issue the Emancipation Proclamation, the largest emancipation in history.  Lincoln acted under his war powers.  If the Union won, the slaves freed by the EP would be free.

There was no moral sentiment in the EP.

The EP ended gradual and compensated emancipation.  According to Foner, it also ended colonization.

Lincoln's greatness was his capacity to change.  In the last two years of his life he started thinking about a biracial society.

In his last public speech he asserted that some blacks should have the right to vote.  He was pushing the envelope.

The second inaugural was a humble speech.  It was only 8 minutes long.  All knew the war was about slavery.  The entire nation was at fault for the sin of slavery.

We are still in this country confronting the legacy of slavery.

The speaker invites questions after the lecture.  I get to ask the last question.

"If you could go back in time to April of 1865 and could ask Lincoln one question, what would that question be?"

Foner chuckles and I don't know if he is just amused by the question or if he thinks it a dumb question.  He responds, "I suppose I would ask what his plans were for Reconstruction."

Remembering JFK

How We Should Remember John F. Kennedy

To honor his memory, it's important to place him in his era of bold optimism and hard realism—and alongside other great leaders of that era.

Newton N. Minow

Nov 21 2013, 8:00 AM ET



President John F. Kennedy once described himself as an “idealist without illusions.” Now in my 88th year as one of the dwindling number of surviving members of Kennedy’s administration, I often think of him. As the nation marks the 50th anniversary of his death, we should remember not only the man himself, great as he was and could have been. More important is the exceptional era he embodied and championed, an era of both bold optimism and hard realism.



Kennedy described and also prescribed the time perfectly in his inaugural address when he spoke of “a new generation of Americans—born in this century, tempered by war, disciplined by a hard and bitter peace, proud of our ancient heritage, and unwilling to witness or permit the slow undoing of those human rights to which this nation has always been committed.”

Kennedy was not alone in his conviction, and as we reflect on his tragic death, we might think as well of four other transformational men and women who died in a six-year period between 1962 and 1968. Eleanor Roosevelt and Pope John XXIII died in 1962 and 1963, respectively, after long and productive lives. Six years later, Martin Luther King Jr. and Robert Kennedy, like Jack Kennedy, died as young men at the hands of assassins. All five died having transformed the way we think about ourselves as citizens and as people. All changed my life profoundly, as they did the world at large, and I had the good fortune to know three of them.



None were perfect—they were all in some way flawed—but they were leaders of a kind we have not seen again, idealists without illusions. They could be skeptics, but never cynics, and they had steady faith in the future. They believed deeply in human rights and the process and the judgments of democracy. That all five died within such a short span, leaving the legacies they did, is what I think about on this anniversary.



Eleanor Roosevelt, godmother of the women’s struggle for equal rights, forever changed the role of the first lady, leaving a record of and invitation to public service that her successors would follow. During FDR’s presidency, she gave visibility to human-rights issues at home and abroad, emphasizing the rights of women and children, the poor and African-Americans when few others did. Like her husband, she was skilled at using mass media to promote her causes, particularly in her popular syndicated column, “My Day.”



After FDR’s death in 1945, Roosevelt told reporters she was leaving the public eye, but she never did. She served as chair of the United Nations’ Human Rights Commission, and in that role forged consensus among disagreeing delegates to produce the 1948 Universal Declaration of Human Rights.





Eleanor Roosevelt with Nell, Mary, and Martha Minow (Courtesy Newton Minow)

I experienced first-hand Roosevelt’s powers of persuasion in her quest for equal rights. I was serving in the Kennedy Administration as chairman of the Federal Communications Commission when she called me about the Reverend Robert L. T. Smith, a black candidate for Congress in 1962 in Jackson, Mississippi. Smith was being denied the opportunity to buy television time on local station WLBT solely because of his race. We were able to help, and Smith became the first black candidate for Congress ever to buy commercials or appear on TV in Mississippi. Myrlie Evers, whose husband Medgar Evers was murdered in Jackson a year later, said that seeing Smith on television “was like the lifting of a giant curtain. He was saying things that had never before been said by a Negro to whites in Mississippi.”



WLBT eventually lost its FCC license for its failure to serve the public interest. Roosevelt visited my FCC office, and my wife brought our three daughters, Nell, Martha, and Mary, to meet this great woman, who coaxed our three-year-old, Mary, to smile. Roosevelt died not long after her visit. When she died, Adlai Stevenson said, “She would rather light a candle than curse the darkness, and her glow has warmed the world.”



Pope John XXIII, elected to the papacy in 1958 at the age of 77, was expected to be a caretaker pope, but during his five-year papacy he transformed the Catholic Church for the modern age. He held passionate views on human dignity and equality. The Second Vatican Council, which he convened in 1962, changed the face Catholicism presented to the world, promoting ecumenism and kindness.



As a young nuncio in France, the Pope had been instrumental in saving refugees, most of them Jews, from the Nazis. As pope, John XXIII put to rest the idea that the Jews were responsible for the death of Christ, apologizing for the Church’s “many centuries of blindness.” Later this year, John XXIII is on course to be canonized along with John Paul II. His work and his leadership also affected me personally. At an international conference in 1961, the president of Notre Dame, Father Theodore Hesburgh, and I served together as members of the American delegation. After I left government service, Father Ted (now 96, and coincidentally born the same week as President Kennedy) invited me to become the first Jewish trustee of Notre Dame. I still serve today as a life trustee, after almost 50 years.



Martin Luther King was a newly arrived 26-year-old pastor of a Montgomery, Alabama, Baptist church in 1955 when Rosa Parks refused to give up her seat on a city bus. The 382-day bus boycott that followed propelled him to national attention. He became a founder of the Southern Christian Leadership Conference, went on to champion the 1960 lunch counter sit-ins that swept the South, to confront the infamous Bull Connor, and in 1963 to lead the March on Washington, at the time the largest protest ever assembled there. A year later King won the Nobel Peace Prize and saw President Kennedy’s successor, Lyndon Johnson, sign into law the Civil Rights Act. Four years after that he would be shot dead in Memphis.



King made no claim to being perfect. Some years after his death, I was seated next to his widow at a small dinner in Chicago. I told Coretta Scott King that I was with President Kennedy at O’Hare Airport in the 1960 campaign when he called her to offer help while her husband was in jail. She smiled and told me that when her husband came home from prison, they sat down in the kitchen to talk and he told her that as a Southern Baptist with a long history of prejudice against Catholics, this would be the first time in his life that he would vote for a Catholic for president. Sadly, the civil-rights movement in our own country has never again had a leader so visionary, strong, and effective.



I got to know Robert Kennedy before I met his brother. I met Bob in 1956 when we worked together as members of Stevenson’s presidential campaign staff. Bob and I were the same age and had children with similar ages. We quickly became good friends, and sometimes were roommates on campaign trips. As a young lawyer on the Senate Permanent Subcommittee on Investigations, Bob resigned his job in protest of Senator Joseph McCarthy’s investigative tactics. In 1966, he famously told a group of South African students that “each time a man stands up for an ideal, or acts to improve the lot of others, or strikes out against injustice, he sends forth a tiny ripple of hope, and … those ripples build a current that can sweep down the mightiest walls of oppression and resistance.”



When Kennedy ran for president in 1968, he ran to end the Vietnam War, which he had originally supported but came to believe was morally wrong. Sirhan Sirhan’s bullet killed that promise on June 5 of that year, and with Robert Kennedy’s life went the optimism and the promise of the era.



I will always remember the 1956 Stevenson campaign when we traveled to Springfield, Illinois. Bob asked if we could play hooky and visit Abraham Lincoln’s home and get back in time to board the campaign plane. The two of us walked to Lincoln’s home, talking about our children on the way. Bob said that when he was young there were three great influences on a child—the home, school, the church—but that he now saw a fourth great influence: television. This was the beginning of a conversation that four years later led to President Kennedy appointing me as chairman of the FCC, and to my lifelong work to promote quality children’s television and public-interest media.



Americans today remember John F. Kennedy as one of our most beloved presidents, along with Abraham Lincoln. Like King, he had been a mostly unfocused young man with moments of brilliance—his undergraduate thesis, Why England Slept, was published and sold 80,000 copies. He was a hero of World War II and of the generation that returned from the war determined to work for peace. Jack Kennedy first won a seat in Congress in 1946, a year in which Republicans won both houses, which gave him instant stature within the Democratic Party.



Six years later he moved to the Senate, and eight years after that he became the youngest man elected to the presidency and the first born in the 20th century. Kennedy was also the first president of the television age, and an early master of the medium. He told me more than once that he would not have been elected but for his four televised debates with then-Vice President Richard Nixon in 1960.



I went to Washington in 1961 to work in his administration, believing then as I do now that his election signaled a historic moment, a call to duty. I still believe that had he lived our country would have avoided some of the worst political and social turmoil of the 1960s, and that it would never have suffered the prolonged tragedy of Vietnam. Historian Arthur Schlesinger Jr. once wrote of Kennedy’s death that it was “as if Lincoln had been killed six months after Gettysburg or Franklin Roosevelt at the end of 1935 or Truman before the Marshall Plan.”



Historians most often remember Kennedy for his foreign-policy achievements, especially his handling of the Cuban missile crisis, but also the creation of the Peace Corps and the 1963 Limited Nuclear Test Ban treaty with Great Britain and the Soviet Union. During the Cuban crisis he gave me a challenging assignment: Voice of America’s broadcast signals to Cuba had been blocked by the Russians, and my job was to find a different way to get his speech on the crisis heard in Cuba. We succeeded, and the president later invited the American broadcasters who helped make it happen, together with me, to the White House to thank them for their unprecedented service to the country.



But Kennedy’s unfinished legacy was civil rights here at home. In the fall of 1962, he sent federal marshals to Oxford, Mississippi, to confront a governor and an institution, the University of Mississippi, dead set against the matriculation of James Meredith. After the 1963 March on Washington, in one of the last acts of his presidency, Kennedy sent the bill to Congress that became the 1964 Civil Rights Act, one of the most important pieces of legislation in the 20th century.





The Minow family visits John Kennedy in the Oval Office on May 29, 1963. Newton Minow is second from right. (Courtesy of Newton Minow)

When we were leaving Washington to return home, Kennedy invited my family to the Oval Office to say goodbye. My wife and our three daughters visited the president on May 29, 1963—his last birthday. A picture of us with him is one of our most cherished memories. A few short months later he was killed. Like all Americans, we were shattered and still are as we relive that awful day in our hearts.



When President Kennedy was shot that November day in 1963, our daughter Martha, then 8 years old, wrote a poem to Jackie Kennedy:



Slowly but surely a willow branch fell down,

While rain spread ‘round the town.

A sad day it was for me

For the willow is my favorite tree….



But if I take the branch right in,

The roots might then begin.

By spring I will know whether it will grow.



My little tree will grow again

I guess that’s the same way with men.



I have always shared Martha’s optimism. And I believe idealism without illusions will grow again.



__



I thank Associate Dean Craig LaMay of the Medill School at Northwestern University for his research and help on this essay.

Alan Brinkley on the Legacy of JFK

The Legacy of John F. Kennedy

Historians tend to rate JFK as a good president, not a great one. But Americans consistently give him the highest approval rating of any president since Franklin D. Roosevelt. Why?

Alan Brinkley

Sep 18 2013, 8:24 PM ET
Among the many monuments to John F. Kennedy, perhaps the most striking is the Sixth Floor Museum in Dallas, in the building that was once the Texas School Book Depository. Every year, nearly 350,000 people visit the place where Lee Harvey Oswald waited on November 22, 1963, to shoot at the president’s motorcade. The museum itself is an oddity because of its physical connection to the event it illuminates; the most memorable—and eeriest—moment of a visit to the sixth floor is when you turn a corner and face the window through which Oswald fired his rifle as Kennedy’s open car snaked through Dealey Plaza’s broad spaces below. The windows are cluttered once again with cardboard boxes, just as they had been on that sunny afternoon when Oswald hid there.



Visitors from all over the world have signed their names in the memory books, and many have written tributes: “Our greatest President.” “Oh how we miss him!” “The greatest man since Jesus Christ.” At least as many visitors write about the possible conspiracies that led to JFK’s assassination. The contradictory realities of Kennedy’s life don’t match his global reputation. But in the eyes of the world, this reticent man became a charismatic leader who, in his life and in his death, served as a symbol of purpose and hope.



President Kennedy spent less than three years in the White House. His first year was a disaster, as he himself acknowledged. The Bay of Pigs invasion of Communist Cuba was only the first in a series of failed efforts to undo Fidel Castro’s regime. His 1961 summit meeting in Vienna with the Soviet leader Nikita Khrushchev was a humiliating experience. Most of his legislative proposals died on Capitol Hill.

Yet he was also responsible for some extraordinary accomplishments. The most important, and most famous, was his adept management of the Cuban missile crisis in 1962, widely considered the most perilous moment since World War II. Most of his military advisers—and they were not alone—believed the United States should bomb the missile pads that the Soviet Union was stationing in Cuba. Kennedy, aware of the danger of escalating the crisis, instead ordered a blockade of Soviet ships. In the end, a peaceful agreement was reached. Afterward, both Kennedy and Khrushchev began to soften the relationship between Washington and Moscow.



Kennedy, during his short presidency, proposed many important steps forward. In an address at American University in 1963, he spoke kindly of the Soviet Union, thereby easing the Cold War. The following day, after almost two years of mostly avoiding the issue of civil rights, he delivered a speech of exceptional elegance, and launched a drive for a civil-rights bill that he hoped would end racial segregation. He also proposed a voting-rights bill and federal programs to provide health care to the elderly and the poor. Few of these proposals became law in his lifetime—a great disappointment to Kennedy, who was never very successful with Congress. But most of these bills became law after his death—in part because of his successor’s political skill, but also because they seemed like a monument to a martyred president.



Kennedy was the youngest man ever elected to the presidency, succeeding the man who, at the time, was the oldest. He symbolized—as he well realized—a new generation and its coming-of-age. He was the first president born in the 20th century, the first young veteran of World War II to reach the White House. John Hersey’s powerful account of Kennedy’s wartime bravery, published in The New Yorker in 1944, helped him launch his political career.



In shaping his legend, Kennedy’s personal charm helped. A witty and articulate speaker, he seemed built for the age of television. To watch him on film today is to be struck by the power of his presence and the wit and elegance of his oratory. His celebrated inaugural address was filled with phrases that seemed designed to be carved in stone, as many of them have been. Borrowing a motto from his prep-school days, putting your country in place of Choate, he exhorted Americans: “Ask not what your country can do for you—ask what you can do for your country.”



Another contributor to the Kennedy legend, something deeper than his personal attractiveness, is the image of what many came to call grace. He not only had grace, in the sense of performing and acting gracefully; he was also a man who seemed to receive grace. He was handsome and looked athletic. He was wealthy. He had a captivating wife and children, a photogenic family. A friend of his, the journalist Ben Bradlee, wrote a 1964 book about Kennedy called That Special Grace.



The Kennedys lit up the White House with writers, artists, and intellectuals: the famous cellist Pablo Casals, the poet Robert Frost, the French intellectual André Malraux. Kennedy had graduated from Harvard, and stocked his administration with the school’s professors. He sprinkled his public remarks with quotations from poets and philosophers.



The Kennedy family helped create his career and, later, his legacy. He could never have reached the presidency without his father’s help. Joseph Kennedy, one of the wealthiest and most ruthless men in America, had counted on his first son, Joe Jr., to enter politics. When Joe died in the war, his father’s ambitions turned to the next-oldest son. He paid for all of John’s—Jack’s—campaigns and used his millions to bring in supporters. He prevailed on his friend Arthur Krock, of The New York Times, to help Jack publish his first book, Why England Slept. Years later, when Kennedy wrote Profiles in Courage with the help of his aide Theodore Sorensen, Krock lobbied successfully for the book to win a Pulitzer Prize.



The Kennedy legacy has a darker side as well. Prior to his presidency, many of JFK’s political colleagues considered him merely a playboy whose wealthy father had bankrolled his campaigns. Many critics saw recklessness, impatience, impetuosity. Nigel Hamilton, the author of JFK: Reckless Youth, a generally admiring study of Kennedy’s early years, summed up after nearly 800 pages:



He had the brains, the courage, a shy charisma, good looks, idealism, money … Yet, as always, there was something missing—a certain depth or seriousness of purpose … Once the voters or the women were won, there was a certain vacuousness on Jack’s part, a failure to turn conquest into anything very meaningful or profound.

I. F. Stone, the distinguished liberal writer, observed in 1973: “By now he is simply an optical illusion.”



Kennedy’s image of youth and vitality is, to some degree, a myth. He spent much of his life in hospitals, battling a variety of ills. His ability to serve as president was itself a profile in courage.



Much has been written about Kennedy’s covert private life. Like his father, he was obsessed with the ritual of sexual conquest—before and during his marriage, before and during his presidency. While he was alive, the many women, the Secret Service agents, and the others who knew of his philandering kept it a secret. Still, now that the stories of his sexual activities are widely known, they have done little to tarnish his reputation.



Half a century after his presidency, the endurance of Kennedy’s appeal is not simply the result of a crafted image and personal charm. It also reflects the historical moment in which he emerged. In the early 1960s, much of the American public was willing, even eager, to believe that he was the man who would “get the country moving again,” at a time when much of the country was ready to move. Action and dynamism were central to Kennedy’s appeal. During his 1960 presidential campaign, he kept sniping at the Republicans for eight years of stagnation: “I have premised my campaign for the presidency on the single assumption that the American people are uneasy at the present drift in our national course … and that they have the will and the strength to start the United States moving again.” As the historian Arthur M. Schlesinger Jr., Kennedy’s friend and adviser, later wrote, “The capital city, somnolent in the Eisenhower years, had suddenly come alive … [with] the release of energy which occurs when men with ideas have a chance to put them into practice.”



Kennedy helped give urgency to the idea of pursuing a national purpose—a great American mission. In the 15 years since World War II, ideological momentum had been slowly building in the United States, fueled by anxieties about the rivalry with the Soviet Union and by optimism about the dynamic performance of the American economy.



When Kennedy won the presidency, the desire for change was still tentative, as his agonizingly thin margin over Richard Nixon suggests. But it was growing, and Kennedy seized the moment to provide a mission—or at least he grasped the need for one—even though it was not entirely clear what the mission was. Early in his tenure, a Defense Department official wrote a policy paper that expressed a curious mix of urgent purpose and vague goals:



The United States needs a Grand Objective … We behave as if our real objective is to sit by our pools contemplating the spare tires around our middles … The key consideration is not that the Grand Objective be exactly right, it is that we have one and that we start moving toward it.

This reflected John Kennedy’s worldview, one of commitment, action, movement. Those who knew him realized, however, that he was more cautious than his speeches suggested.



John F. Kennedy was a good president but not a great one, most scholars concur. A poll of historians in 1982 ranked him 13th out of the 36 presidents included in the survey. Thirteen such polls from 1982 to 2011 put him, on average, 12th. Richard Neustadt, the prominent presidential scholar, revered Kennedy during his lifetime and was revered by Kennedy in turn. Yet in the 1970s, he remarked: “He will be just a flicker, forever clouded by the record of his successors. I don’t think history will have much space for John Kennedy.”



But 50 years after his death, Kennedy is far from “just a flicker.” He remains a powerful symbol of a lost moment, of a soaring idealism and hopefulness that subsequent generations still try to recover. His allure—the romantic, almost mystic, associations his name evokes—not only survives but flourishes. The journalist and historian Theodore White, who was close to Kennedy, published a famous interview for Life magazine with Jackie Kennedy shortly after her husband’s assassination, in which she said:



At night, before we’d go to sleep, Jack liked to play some records; and the song he loved most came at the very end of this record. The lines he loved to hear were: Don’t let it be forgot, that once there was a spot, for one brief shining moment that was known as Camelot.

And thus a lyric became the lasting image of his presidency.



White, in his memoirs, recalled the reverence Kennedy had inspired among his friends:



I still have difficulty seeing John F. Kennedy clear. The image of him that comes back to me … is so clean and graceful—almost as if I can still see him skip up the steps of his airplane in that half lope, and then turn, flinging out his arm in farewell to the crowd, before disappearing inside. It was a ballet movement.

Friends were not the only ones enchanted by the Kennedy mystique. He was becoming a magnetic figure even during his presidency. By the middle of 1963, 59 percent of Americans surveyed claimed that they had voted for him in 1960, although only 49.7 percent of voters had actually done so. After his death, his landslide grew to 65 percent. In Gallup’s public-opinion polls, he consistently has the highest approval rating of any president since Franklin D. Roosevelt.



The circumstances of Kennedy’s death turned him into a national obsession. A vast number of books have been published about his assassination, most of them rejecting the Warren Commission’s conclusion that Lee Harvey Oswald acted alone. After the assassination, even Robert F. Kennedy, the president’s brother, spent hours—perhaps days—phoning people to ask whether there had been a conspiracy, until he realized that his inquiries could damage his own career. To this day, about 60 percent of Americans believe that Kennedy fell victim to a conspiracy.



“There was a heroic grandeur to John F. Kennedy’s administration that had nothing to do with the mists of Camelot,” David Talbot, the founder of Salon, wrote several years ago. His book Brothers: The Hidden History of the Kennedy Years, more serious than most Kennedy conspiracy theories, suggested that the president’s bold, progressive goals—and the dangers he posed to entrenched interests—inspired a plot to take his life.



There are many reasons to question the official version of Kennedy’s murder. But there is little concrete evidence to prove any of the theories—that the Mafia, the FBI, the CIA, or even Lyndon B. Johnson was involved. Some people say his death was a result of Washington’s covert efforts to kill Castro. For many Americans, it stretches credulity to accept that an event so epochal can be explained as the act of a still-mysterious loner.



Well before the public began feasting on conspiracy theories, Kennedy’s murder reached mythic proportions. In his 1965 book, A Thousand Days, Schlesinger used words so effusive that they seem unctuous today, though at the time they were not thought excessive or mawkish: “It was all gone now,” he wrote of the assassination: “the life-affirming, life-enhancing zest, the brilliance, the wit, the cool commitment, the steady purpose.”



Like all presidents, Kennedy had successes and failures. His administration was dominated by a remarkable number of problems and crises—in Berlin, Cuba, Laos, and Vietnam; and in Georgia, Mississippi, and Alabama. Some of these, he managed adroitly and, at times, courageously. Many, he could not resolve. He was a reserved, pragmatic man who almost never revealed passion.



Yet many people saw him—and still do—as an idealistic and, yes, passionate president who would have transformed the nation and the world, had he lived. His legacy has only grown in the 50 years since his death. That he still embodies a rare moment of public activism explains much of his continuing appeal: He reminds many Americans of an age when it was possible to believe that politics could speak to society’s moral yearnings and be harnessed to its highest aspirations. More than anything, perhaps, Kennedy reminds us of a time when the nation’s capacities looked limitless, when its future seemed unbounded, when Americans believed that they could solve hard problems and accomplish bold deeds.



Alan Brinkley, a professor of American history at Columbia University, is the author of John F. Kennedy (2012) and Liberalism and Its Discontents (1998), which served as sources for this article.

Friday, November 22, 2013

50 Years

The road from Dallas to 2013 leads thru Selma and Birmingham. It detours thru the Ho Chi Minh Highway and the Mekong Delta. It leads to a dozen what-ifs and dozens of bogus conspiracy theories. It leads back 50 years to 1963 and a world that doesn't exist anymore. The road from Dallas leads to the minds & hearts of those alive on 11/22/63 and our personal thoughts. Johnny will not come marching home again.




Wednesday, November 20, 2013

The State of Obamacare

by Paul Krugman


November 20, 2013, 11:02 am


I haven’t been writing about the healthcare.gov thing, for the simple reason that I have nothing to say. What’s going on isn’t a policy question: we know from the states with working exchanges (including California) that the underlying structure of the law is workable. Instead, it’s about an implementation botch, which is an incredible mess, and reflects very badly on Obama. But the future of the reform depends not on policy per se but on whether the IT issues can be fixed well enough soon enough, a subject on which I have zero expertise.



Of course, that hasn’t stopped other people from breathlessly commenting on every twist and turn in the polls, every meaningless vote in the House, and so on. Hey, it’s a living.



But at this point there’s enough information coming in to make semi-educated guesses — and it looks to me as if this thing is probably going to stumble through to the finish line. State-run enrollments are mostly going pretty well; Medicaid expansion is going very well (and it’s expanding even in states that have rejected the expansion, because more people are learning they’re eligible.) And healthcare.gov, while still pretty bad, is starting to look as if it will be good enough in a few weeks for large numbers of people to sign up, either through the exchanges or directly with insurers.



If all this is right, by the time open enrollment ends in March, millions of previously uninsured Americans will in fact have received coverage under the law, and reform will be irreversible. Obama personally may never recover his reputation; Democratic hopes of a wave election in 2014 are probably gone, although you never know. But anyone counting on Obamacare to collapse is probably making a very bad bet.

Tuesday, November 19, 2013

Gettysburg Anniversary

This is the 150th anniversary of Lincoln delivering the Gettysburg Address. I get chills every time I read it. The precision of the language summarizing the American experiment is exhilarating. Lincoln said more in 3 minutes than today's politicians could say in 3 hours. As Mark Twain said, few people get saved after the first 20 minutes of the sermon. Eric Foner, our best contemporary Lincoln scholar, will be in town Thursday night and I plan to attend his talk on Lincoln's legacy. I wish I could walk in wearing a stovepipe hat.


Monday, November 18, 2013

HS Reunion

Fred Hudson will be going to his high school reunion next month. We're talking about 45 years ago when he received his HS diploma. Panic will soon set in. Will he recognize anyone at the reunion? Will anyone recognize/remember him? Name tags anyone? Should he do anything special to prepare? It's too late for a facelift. Senior moments are uncontrollable. It's not too late to start making up some lies, but what's the point? There's nothing wrong with having lived a life of obscurity and mediocrity, is there? I've never served time. I'm not dealing with a probation officer. I'm not in witness protection. Hey---give credit where credit is due.


Sunday, November 17, 2013

JFK

The books and the online postings are almost overwhelming.  It's hard to keep up with it all.  I am not much interested in assassination speculation because speculation is all it is.  I am mostly interested in his presidency, its relevance to today, and JFK's place and rating in history. 

Friday, November 15, 2013

This Time of Year

.There's something about this time of year after the time change when suddenly it's dark at 5 and the city lights seem brighter than usual with the approach of the holidays that seems to stir the heart if not the soul. We eat more than we should, plans for the season dance in our heads like sugar plums, and the hustle & bustle of life ticks up a notch. At some point we secretly wish it were January and it was all over for another year but it's too early for that yet. For the present stick to your plan if you have one. Otherwise start making a list and check it twice just like Santa is doing. We'll all make it thru somehow like we always do.


Wednesday, November 13, 2013

November 11, 1963

On November 11, 1963, Veterans Day, President John F. Kennedy visited Arlington National Cemetery to commemorate the occasion. He brought along his son John-John, who was soon to turn three years old.


The President has seemed gloomy in recent days, probably due to the death two months earlier of his two-day-old son Patrick and to the unraveling situation in South Vietnam.

A strange mood of contentment always seems to come over the President when he visits Arlington. A veteran himself, the President has told others that he would like to be buried at Arlington.

John-John is playful and delights everyone with his antics, running around all over the place.

As he leaves the cemetery, JFK tells Congressman Hale Boggs, “This is one of the really beautiful places on earth. I could stay here forever.”

The President’s car drives away as John, Jr. waves to the crowd.

Bill Minutaglio & Steven L. Davis - Dallas 1963

Reading this book kept me in chills and a cold sweat.  When President Kennedy visited Dallas in November of 1963, he went into the capital of Right Wing Crazy of the early 60's.  Dallas was the national headquarters for ALL of the right-wing anti-communist nuttiness, and the racism, then sweeping the country, with Cuba a big focus after Castro seized power in 1959 and the civil rights movement burgeoning.

I did not realize before reading this book just how crazy Dallas was.  The place was a loony bin according to this book.  Adlai Stevenson had been physically roughed up on a prior visit.  Even LBJ and Lady Bird were treated horribly on a previous visit.  JFK was warned repeatedly not to go to Dallas, but there was no way he was not going.  It was critical to keep Texas in the Democratic column in 1964, and it was necessary to heal a breach between the liberal and conservative wings of the party.  Dallas itself hated President Kennedy.  Dallas voted for Nixon in 1960 though the Democrats carried the state.  Ironically, Nixon flew out of Dallas the morning JFK arrived.

The book is hypnotic as it moves toward the inevitable conclusion that those of us who lived through that time all know so well.  Read about the power structure in Dallas.  Read about the hate-filled local newspaper.  Read about the traitorous Gen. Edwin Walker, truly a crazy man, who whipped up right-wing frenzy at will.  Though Dallas showed an outpouring of adoration to the Kennedys when they arrived, the undertone of right-wing craziness was there.  A man named Lee Harvey Oswald was a product of that hatred.

The reader cannot help but compare Dallas 1963 to the present time and the same type of hatred toward President Obama.  The right wing hatred and paranoia we see today is nothing new.

Monday, November 11, 2013

Reading Makes You More Empathetic? So What!



Should Literature Be Useful?

Posted by Lee Siegel



Two recent studies have concluded that serious literary fiction makes people more empathetic, and humanists everywhere are clinking glasses in celebration. But I wonder whether this is a victory for humanism’s impalpable enrichments and enchantments, or for the quantifying power of social science.



The two studies, one by a pair of social psychologists at the New School, and another conducted by researchers in the Netherlands, divided participants into several groups. The methodology was roughly the same in both studies. In the New School experiment, one group read selected examples of literary fiction (passages by Louise Erdrich, Don DeLillo, and others); another read commercial fiction, and another was given serious non-fiction or nothing at all. The subjects were asked either to describe their emotional states, or instructed, among other tests, to look at photographs of people’s eyes and try to derive from these pictures what the people were feeling when the photographs were taken.



The results were heartening to every person who has ever found herself, throughout her freshman year of college, passionately quoting to anyone within earshot Kafka’s remark that great literature is “an axe to break the frozen sea inside us.” The subjects who had read literary fiction either reported heightened emotional intelligence or demonstrated, in the various tests administered to them, that their empathy levels had soared beyond their popular- and non-fiction-reading counterparts.



The studies’ conclusions are also particularly gratifying in light of the new Common Core Standards, hastily being adopted by school districts throughout the country, which emphasize non-fiction, even stressing the reading of train and bus schedules over imaginative literature. Here at last, it seemed, was a proper debunking of that skewed approach to teaching the art of reading.



There is another way to look at the studies’ conclusions, however. Instead of proclaiming the superiority of fiction to the practical skills allegedly conferred by reading non-fiction, the studies implied that practical effects are an indispensable standard by which to judge the virtues of fiction. Reading fiction is good, according to the studies, because it makes you a more effective social agent. Which is pretty much what being able to read a train schedule does for you, too.



Americans have always felt uncomfortable about any cultural activity that does not lead to concrete results. “He that wastes idly a groat’s worth of his time per day, one day with another, wastes the privilege of using one hundred pounds each day”: though Benjamin Franklin was fairly indifferent to money himself, the sentiment he expressed in that bit of advice became a hallmark of the national character. Idleness is still anathema in American life. (Kim Kardashian, who has restlessly turned her idle time into a profitable industry, is a Puritan at heart.) And the active daydream of writing and reading fiction is idleness in its purest state, neither promising nor leading to any practical or concrete result. From the didactic McGuffey Readers that lasted from the middle of the nineteenth century to the middle of the twentieth century to William Bennett’s “Book of Virtues” in our own time (a liberal response, “A Call to Character” by Colin Greer and Herbert Kohl, was published a few years later), the American impulse to make room for literature by harnessing it to a socially useful purpose has taken many forms. You might even say that the two archetypal fictional American characters, Huck Finn and Tom Sawyer, invented by the country’s most scathing satirist, are essentially arguments for the superiority of idleness over any morally, socially or financially useful American activity.



Perhaps it is appropriate, in our moment of ardent quantifying—page views, neurobiological aperçus, the mining of personal data, the mysteries of monetization and algorithms—that fiction, too, should find its justification by providing a measurably useful social quality such as empathy. Yet while the McGuffey Readers and their descendants used literature to try to inculcate young people with religious and civic morality, the claim that literary fiction strengthens empathy is a whole different kettle of fish.



Though empathy has become something like the celebrity trait of emotional intelligence, it doesn’t necessarily have anything to do with the sensitivity and gentleness popularly attributed to it. Some of the most empathetic people you will ever meet are businesspeople and lawyers. They can grasp another person’s feelings in an instant, act on them, and clinch a deal or win a trial. The result may well leave the person on the other side feeling anguished or defeated. Conversely, we have all known bookish, introverted people who are not good at puzzling out other people, or, if they are, lack the ability to act on what they have grasped about the other person.



To enter a wholly different realm, empathy characterizes certain sadists. Discerning the most refined degrees of discomfort and pain in another person is the fulcrum of the sadist’s pleasure. The empathetic gift can lead to generosity, charity, and self-sacrifice. It can also enable someone to manipulate another person with great subtlety and finesse.



Literature may well have taught me about the complex nature of empathy. There is, for example, no more empathetic character in the novel or on the stage than Iago, who is able to detect the slightest fluctuation in Othello’s emotional state. Othello, on the other hand, is a noble and magnanimous creature—if vain and bombastic as well—who is absolutely devoid of the gift of being able to apprehend another’s emotional states. If he were half as empathetic as Iago, he would be able to recognize the jealousy that is consuming his treacherous lieutenant. The entire play is an object lesson in the emotional equipment required to vanquish other people, or to protect yourself from other people’s machinations. But no one—and no study—can say for sure whether the play produces more sympathetic people, or more Iagos.



Indeed, what neither of the two studies did was to measure whether the empathetic responses led to sympathetic feeling. Empathetic identification with the ordeals suffered by Apuleius’s golden ass, Defoe’s Moll Flanders, Shakespeare’s King Lear—a play Dr. Johnson wanted to be performed with a revised, happy ending because he said its spectacle of suffering was too much to endure—Dostoevsky’s Raskolnikov, Alyosha, or Prince Myshkin, Emma Bovary, not to mention the protagonists of misanthropic modernists like Céline, Gide, Kafka, Mann, et al.—empathetic sharing of these characters’ emotions could well turn a person inward, away from humanity altogether. Yet even if empathy were always the benign, beneficent, socially productive trait it is celebrated as, the argument that producing empathy is literature’s cardinal virtue is a narrowing of literary art, not an exciting new expansion of it.



Fiction’s lack of practical usefulness is what gives it its special freedom. When Auden wrote that “poetry makes nothing happen,” he wasn’t complaining; he was exulting. Fiction might make people more empathetic—though I’m willing to bet that the people who respond most intensely to fiction possess a higher degree of empathy to begin with. But what it does best is to do nothing particular or specialized or easily formulable at all.



Fiction’s multifarious nature is why so many people have attributed so many effects to imaginative literature, some of them contradictory: catharsis (Aristotle); dangerous corruption of the spirit (Plato); feverish loosening of morals (Rousseau); redemptive escape from personality (Eliot); empowering creation beyond the boundaries of morality (Joyce). Fiction ruined Don Quixote, young Werther, and Emma Bovary, but it saved Cervantes, Flaubert, and Goethe.



It’s safe to say that, like life itself, fiction’s properties are countless and unquantifiable. If art is made ex nihilo—out of nothing—then reading is done in nihilo, or into nothing. Fiction unfolds through your imagination in interconnected layers of meaning that lift the heavy weight of unyielding facts from your shoulders. It speaks its own private language of endless nuance and inflection. A tale is a reassuringly mortalized, if you will, piece of the oceanic infinity out of which we came, and back into which we will go. That is freedom, and that is joy—and then it is back to the quotidian challenge, to the daily grind, and to the necessity of attaching a specific meaning to what people are thinking and feeling, and to the urgency of trying, for the sake of love or money, to profit from it.



Lee Siegel is the author of two collections of criticism, “Falling Upwards: Essays in Defense of the Imagination,” and “Not Remotely Controlled: Notes on Television.”



Inconsistencies in the Warren Report

The Kennedy Assassination, 50 Years Later: Inconsistencies Haunt Official Record of Kennedy's Death

by Marcus D. Rosenbaum

November 10, 2013 | Weekend Edition Sunday

Jacqueline Kennedy (center), with Edward and Robert Kennedy on either side, watches the coffin of President John F. Kennedy pass on Nov. 25, 1963. (Keystone/Getty Images)

The first thing T. Jeremy Gunn says when you ask him about President John F. Kennedy's assassination is, "I'm not a conspiracy theorist. I don't have a theory about what happened."

But he knows a lot about it from his work for the Assassination Records Review Board, established by Congress in 1992, about a year after Oliver Stone's film JFK reignited questions about the assassination. The film rejected the official 1964 conclusion of the Warren Commission, which placed guilt on Lee Harvey Oswald alone. Instead, Stone proposed a vast government conspiracy linked to the CIA.

The Warren Commission in 1964 had hoped that its well-written, 450-plus-page report would put questions about the assassination to rest. Of course, it didn't. Immediately after the report, nearly 90 percent of the public believed it. But by 1966, only 36 percent of Americans believed Oswald acted alone. Today, just 24 percent think so.


To set the record straight, Congress set up the Review Board to release still-classified government material related to the assassination. Gunn, as its director of research and general counsel, and later as its executive director, read everything he could find in the government's files and questioned dozens of doctors and former officials, many of them under oath.

In the end, the Review Board released thousands — if not tens of thousands — of documents: "Almost everything that was substantively related to the assassination," Gunn says.

Yet for Gunn and much of the public, the record was still not set straight. "There were many things that were disturbing," Gunn says.

A Blood-Stained Report Destroyed

The Warren Commission delivers its report on Kennedy's assassination to President Lyndon B. Johnson in the Cabinet Room of the White House on Sept. 24, 1964. From left: lawyer John McCloy, General Counsel J. Lee Rankin, Sen. Richard Russell, Rep. Gerald Ford, Chief Justice Earl Warren, President Johnson, former CIA Director Allen Dulles, Sen. John Sherman Cooper, and Rep. Hale Boggs. (Francis Miller/Time Life Pictures/Getty Images)

When Gunn pored over the material, what stuck out most for him was the medical evidence, like what he learned in his 1996 deposition of James Joseph Humes. Humes, who died three years later, was one of the doctors who performed the autopsy on Kennedy's body.

For one thing, Humes told Gunn that the autopsy was not performed strictly by the book; some procedures were left out, such as removing and weighing all the organs. Then, Humes made an eye-opening revelation.

"Dr. Humes admitted that the supposedly original handwritten version of the autopsy that is in the National Archives is in fact not the original version," Gunn says. He says Humes had never said that publicly before, even to the Warren Commission.

In the deposition, Humes explained that when he took the material home after the autopsy was completed, he began thinking about how he had once seen the bloodstained chair Abraham Lincoln had been sitting in when he was shot.

"I thought this was the most macabre thing I ever saw in my life," Humes said. "It just made a terrible impression on me. And when I noticed that these bloodstains were on this document that I had prepared, I said nobody's ever going to get these documents. So I copied them ... and burned the original notes in the fireplace."

Exhibit 1, from Kennedy's autopsy report, is the bloodstained document Dr. James Joseph Humes did not destroy. (Apic/Getty Images)

When Gunn asked him whether there was anything in the original document that was not in the copy, Humes replied, "I don't think so."

Then Gunn showed Humes another document from the autopsy, a two-page document Gunn had marked as Exhibit No. 1. It, too, had blood stains, but Humes had not destroyed it. Why? Humes said it was because the document had been prepared by another doctor at the autopsy.

"I didn't — wouldn't — have the habit of destroying something someone else prepared," Humes told Gunn during questioning.

Imperfect Pictures

After that, Gunn turned to the official autopsy photographs, the ones that are kept in the National Archives. Humes had never handled them before; the Warren Commission had never shown them to him. In fact, when Humes testified before the Warren Commission, he complained that the artist who drew the schematics he was using for his testimony was not allowed to see the photos.

When Humes did get a close look at the pictures — in his Review Board deposition — he said he found it hard to tell what was what in the pictures.

In fact, Gunn says, it's hard for anyone to tell what's what in the pictures, especially such important details as how many bullet wounds there are, and whether they are entry wounds or exit wounds.

So Gunn turned to a retired Navy warrant officer, Sandra Spencer, who, according to government records, had processed the autopsy film. She had not been questioned by the Warren Commission.

The president is struck by a bullet as he travels through Dallas in a motorcade Nov. 22, 1963. Next to him in the car is his wife, Jacqueline, and in the front seat is Texas Gov. John Connally. (Three Lions/Hulton Archive/Getty Images)

When Gunn showed her the official photos from the National Archives during her deposition in 1997, she said they were not the pictures she remembered processing.

Spencer, who died about a year ago, was in her 20s when she worked in Washington, D.C., at the Navy's central photo lab. At that time, the lab processed all official White House photographs.

For questioning, she brought with her some pictures she had printed just a few days before Kennedy was murdered. She explained that the lab bought huge quantities of photographic paper, so the markings on the back of the prints she brought would certainly match the autopsy photos she processed. But they didn't, suggesting they were printed at a different time or a different place.

What's more, the official pictures weren't anything like the ones she remembered.

"The prints that we printed did not have the massive head damages that is visible here," she told Gunn. "... The face, the eyes were closed and the face, the mouth was closed, and it was more of a rest position than these show."

The National Archives' photos seemed to be taken in a bright, medical setting. The body was bloody. Spencer said the pictures she had processed seemed to be taken in a darkened room with a flash. She called them "pristine." "There was no blood or opening cavities ... or anything of that nature. It was quite reverent in how they handled it," she said.

Of course, Gunn interviewed Spencer 30 years after the event, and that's a long time to remember every detail. But still, why didn't she recognize any of the official autopsy photos? Why are they on different paper from what she was using at the time? And whatever happened to the pictures she did remember processing?

Tracking Oswald

Gunn, who now teaches history at Al Akhawayn University in Morocco, can recite a litany of other unresolved questions surrounding the Kennedy assassination — ones the Warren Commission failed to answer.

For example, in New Orleans in 1963, Oswald came in contact with the FBI. When he was arrested after a scuffle at a demonstration, he asked to meet with the FBI.

"Why would Oswald ask to see someone from the FBI?" Gunn asks. "But an FBI agent went and interviewed Oswald, came back and wrote a memo on it, put it in the file."



Lee Harvey Oswald is led down a corridor of the Dallas police station for questioning in connection with Kennedy's assassination on Nov. 23, 1963. (AP)

A few days later, Gunn says, someone at FBI headquarters in Washington saw that Oswald was engaged in pro-Fidel Castro activities and instructed the field office to interview him. But the field office didn't reply that it had already interviewed him. Instead, it said that it would see what it could do.

All the FBI files were in order, Gunn says. Nothing seemed to be missing, so why wouldn't they mention the previous interview? "It doesn't make sense," Gunn says.

Oswald's trip to Mexico City a few weeks before the assassination also raises unanswered questions.

"Mexico City in the 1960s was probably the spy capital of the Western Hemisphere," Gunn says. Naturally, the CIA had operations watching all the embassies in Mexico, tapping their phones and photographing comings and goings.

While he was there, Oswald attempted to obtain visas from the Cuban consulate and the Soviet embassy. In one taped conversation, Oswald — or someone saying he was Oswald — called the Soviet embassy.

Then-FBI Director J. Edgar Hoover listened to the tape and told President Lyndon Johnson that it wasn't Oswald's voice.

Whose voice was it? No one knows. The tape has disappeared.

Nor are there any photos of the Soviet embassy the day Oswald supposedly went there. The CIA told the Warren Commission that its camera wasn't working that day.

Why did Oswald ask to meet with the FBI in New Orleans? Why did he go to Mexico City? What happened to the evidence of his visit? More mysteries.

'It's Too Late'

So what's the truth about the assassination?

"For me, it's quite simple," Gunn says. "I don't know what happened."

President John F. Kennedy aboard the "Honey Fitz" off Hyannis Port, Mass., on Aug. 31, 1963. (Cecil Stoughton/UPI/Landov)

"There is substantial evidence that points toward Oswald and incriminates Oswald," he says, "and the only person we can name where there is evidence is Oswald. But there's also rather important exculpatory evidence for Oswald, suggesting he didn't do it, and that he was framed."

Some believe the Warren Commission did not resolve the mysteries because it was part of a giant cover-up, perhaps to hide a conspiracy that reached deep inside government itself.

Others point to a more benign explanation. The new president, Johnson, was pushing the Warren Commission to come to a conclusion quickly. He wanted to move the country forward, not dwell on its traumatic past.

Besides, Gunn says, the panel genuinely believed that Oswald had killed Kennedy. "So they wanted to write the document in a way that would reassure the American public that it was a single gunman acting alone, somebody who's a little bit unstable, and that that's the explanation for what happened." Since the facts aren't clear, though, that document can look like a whitewash.

For the Warren Commission, transparency had its own difficulties. "There are serious problems with the forensics evidence, with the ballistics evidence, with the autopsy evidence," Gunn says. "And, in my opinion, if they had said that openly, it would have not put the issue to rest."



Dr. T. Jeremy Gunn served as executive director of the Assassination Records Review Board. (Courtesy of T. Jeremy Gunn)

Faced with that, the Warren Commission went with what it believed.

Gunn says that wasn't enough. It's not that he thinks all the loose ends needed to be tied up. "It wouldn't be unusual if Oswald had done the crime — or not done the crime — to have evidence that's inconsistent," he says.

It's the big mysteries that cause him the most trouble.

"If the president had been killed as part of a conspiracy, that needed to be known," he says.

"The institution that had the opportunity to best get to the bottom of this, as much as it was possible, was the Warren Commission, and they didn't do it," he says. "Now it's too late to do what should have been done originally."

Marcus D. Rosenbaum is a former senior editor and producer at NPR. He now works as a freelance journalist and teacher.