Monday, October 31, 2011

About Steve Jobs (2)

The Daily Beast
Steve Jobs, Unvarnished
Oct 24, 2011 6:58 AM EDT
Walter Isaacson’s hugely anticipated biography of the Apple visionary gives us a balanced look at the complicated life of a tech genius—from his counterculture youth to his years outside the company and even his insensitivity as father and CEO.

The Steve Jobs backlash has begun, sparked by a new biography that paints a balanced but often unflattering portrait of Apple’s visionary cofounder and CEO.

Steve Jobs, by Walter Isaacson, contains no huge revelations—which is no surprise, considering many other biographies have already covered much of the Jobs life story—but does deliver some key nuggets of information that will surprise and perhaps disappoint even those who have followed Jobs closely during his career.

Unlike other biographers, Isaacson had access to Jobs himself, interviewing the CEO some 40 times and speaking to most of the people who knew him well. Jobs first sought out Isaacson to write the biography in 2004, not long after his initial diagnosis of cancer. Isaacson demurred. But in 2009, knowing that his health had grown worse and that he did not have long to live, Jobs asked Isaacson again. At that point Isaacson, who previously had written biographies of Benjamin Franklin and Albert Einstein, signed on for the project.

Jobs’s death on Oct. 5 prompted an outpouring of grief, with Apple fans creating makeshift shrines outside Apple stores and industry luminaries delivering tributes to his genius and portraying him as a kind of selfless messiah, a silicon saint who had led the world out of darkness and into the light.

The Isaacson book paints a more complicated picture, one that Jobs’s wife, Laurene Powell Jobs, encouraged the author to present, saying that “there are parts of his life and personality that are extremely messy, and that’s the truth. You shouldn’t whitewash it. He’s good at spin, but he also has a remarkable story, and I’d like to see that it’s all told truthfully.”

The picture here is of a brilliant, successful yet amazingly narrow, limited, and ungenerous man, who, even as he was dying, could not let go of his desire to outdo his enemies.

She might not have expected, however, that Isaacson would include comments from Andy Hertzfeld, one of the original Macintosh developers and a longtime friend of the Jobs family who lives near them in Palo Alto, Calif., claiming that Laurene’s story about meeting Steve by accident at an event at Stanford, where she was a graduate student, was not entirely true—that in fact she had schemed to meet him. “Laurene is nice, but she can be calculating,” Hertzfeld says, “and I think she targeted him from the beginning. Her college roommate told me that Laurene had magazine covers of Steve and vowed she was going to meet him. If it’s true that Steve was manipulated, there is a fair amount of irony there.”

Laurene Jobs tells Isaacson that this is not the case. But still, there it is, in the book.

Also much addressed is Jobs’s notorious mean streak and willingness to be rude and belittling even to those closest to him. Jony Ive, Apple’s head of design and one of the people closest to Jobs during his career at Apple, recounts examples of Jobs being hurtful, something that puzzled Ive, since Jobs was also a very sensitive person. “His way to achieve catharsis is to hurt somebody. And I think he feels he has a liberty and a license to do that. The normal rules of social engagement, he feels, don’t apply to him. Because of how very sensitive he is, he knows exactly how to efficiently and effectively hurt someone. And he does do that.”

Jobs is painted as a stubborn egomaniac who refused to get treatment for his cancer when he was first diagnosed despite entreaties from close friends like Intel CEO Andy Grove, who is a cancer survivor himself, and Genentech chairman Arthur Levinson, who is an Apple director. “That’s not how cancer works,” Levinson recalls telling Jobs when he first set out to cure his disease with a vegan diet, carrot juice, acupuncture, and visits to a psychic. “You cannot solve this without surgery and blasting it with toxic chemicals.”

Eventually Jobs did have surgery, but by then it was too late—the cancer had spread beyond his pancreas. The Isaacson book also reveals that for the last several years of his life Jobs knew his health situation was far worse than he let on to Apple and the public. For years some critics have complained that Jobs and Apple were not being forthcoming enough with shareholders about the true nature of his condition. The portrait presented by Isaacson is of a man who claimed he had been “cured” even when he knew this was not the case.

Jobs comes across as hypercompetitive and vengeful, a man who, even as he was dying, remained obsessed with Google’s Android operating system, which he considered to be a rip-off of the software in Apple’s iPhone. “I will spend my last dying breath if I have to, and I will spend every penny of Apple’s $40 billion in the bank, to right this wrong,” Jobs says.

Isaacson also recounts Jobs’s fury when Jon Rubinstein, a key Apple engineer, left the company and later joined Palm, which was building a rival to the iPhone. Rubinstein was pushed out of Apple by Ive, who had clashed with Rubinstein and gave Jobs an ultimatum—he goes or I go. So Rubinstein left, and yet when he took a position at Palm, Jobs went ballistic, and even called rock star Bono, whose investment company, Elevation Partners, owned part of Palm, and asked him to intercede. Bono told Jobs to chill out. Jobs did, eventually, and later says, “The fact that they completely failed salves that wound.”

Jobs in his dying days seems not to have gained any special wisdom or perspective. He has unkind things to say about Rubinstein, Bill Gates, Steve Ballmer, Barack Obama, Google chairman Eric Schmidt, and others. In the case of Obama, Jobs refused to meet with the president unless Obama called him personally to ask for the meeting. When the pair finally met, Jobs comes across like a version of Montgomery Burns on The Simpsons, suggesting, among other things, that the U.S. should be more like China when it came to regulating (or not regulating) companies as they built factories, that the president should get rid of teachers’ unions, and that schools should stay in session until 6 p.m. and operate 11 months out of the year.

Perhaps the most disappointing side of Jobs involves his family. Isaacson delves into the story of Lisa Brennan-Jobs, the daughter Jobs fathered when he was 23 years old but refused to acknowledge, forcing her mother to live for a time on welfare and food stamps, and later on a meager subsidy even while Jobs became one of the wealthiest people in America. Jobs later acknowledged Lisa, and at age 14 she moved in with Jobs and his new family, after some kind of difficult situation that Isaacson hints at but, for reasons that are unclear, decides not to divulge completely. Even then, though, Jobs and his daughter had a rocky, on-and-off relationship, with the tech titan often cutting his daughter off when they had disagreements, forcing her to borrow money from Hertzfeld to pay for her tuition at Harvard, for example.

Jobs had three children with Laurene—a son, Reed, and two daughters, Erin and Eve. According to Isaacson, “Jobs developed a strong relationship with Reed, but with his daughters he was more distant. As he would with others, he would occasionally focus on them, but just as often would completely ignore them when he had other things on his mind.” Says Laurene: “He focuses on his work, and at times he has not been there for the girls.”

Isaacson says Jobs told him one reason he wanted to have a biography written was so that his children would know him. He said he hadn’t been around much for his kids, and he wanted them to understand why that was. This must be one of the saddest things I have ever read. The picture here is of a brilliant, successful yet amazingly narrow, limited, and ungenerous man, who, even as he was dying, could not let go of his desire to outdo his enemies and could not imagine anything more fulfilling to do with his limited time on earth than building more new gadgets and gizmos, a man who put work ahead of his family and was often appallingly hurtful to the people closest to him. He was, in other words, a man of his time, a symbol of all that is great and all that is wrong with our culture.

The Steve Jobs Biography (3)

I think it appropriate that I purchase an iPad 2 today while I am reading the new biography of Steve Jobs. Mr. Jobs was one crazy dude. Much more about this later.

Sunday, October 30, 2011

The Genius of Jobs

The Genius of Jobs
By WALTER ISAACSON
Published: October 29, 2011

ONE of the questions I wrestled with when writing about Steve Jobs was how smart he was. On the surface, this should not have been much of an issue. You’d assume the obvious answer was: he was really, really smart. Maybe even worth three or four reallys. After all, he was the most innovative and successful business leader of our era and embodied the Silicon Valley dream writ large: he created a start-up in his parents’ garage and built it into the world’s most valuable company.

But I remember having dinner with him a few months ago around his kitchen table, as he did almost every evening with his wife and kids. Someone brought up one of those brainteasers involving a monkey’s having to carry a load of bananas across a desert, with a set of restrictions about how far and how many he could carry at one time, and you were supposed to figure out how long it would take. Mr. Jobs tossed out a few intuitive guesses but showed no interest in grappling with the problem rigorously. I thought about how Bill Gates would have gone click-click-click and logically nailed the answer in 15 seconds, and also how Mr. Gates devoured science books as a vacation pleasure. But then something else occurred to me: Mr. Gates never made the iPod. Instead, he made the Zune.

So was Mr. Jobs smart? Not conventionally. Instead, he was a genius. That may seem like a silly word game, but in fact his success dramatizes an interesting distinction between intelligence and genius. His imaginative leaps were instinctive, unexpected, and at times magical. They were sparked by intuition, not analytic rigor. Trained in Zen Buddhism, Mr. Jobs came to value experiential wisdom over empirical analysis. He didn’t study data or crunch numbers but like a pathfinder, he could sniff the winds and sense what lay ahead.

He told me he began to appreciate the power of intuition, in contrast to what he called “Western rational thought,” when he wandered around India after dropping out of college. “The people in the Indian countryside don’t use their intellect like we do,” he said. “They use their intuition instead ... Intuition is a very powerful thing, more powerful than intellect, in my opinion. That’s had a big impact on my work.”

Mr. Jobs’s intuition was based not on conventional learning but on experiential wisdom. He also had a lot of imagination and knew how to apply it. As Einstein said, “Imagination is more important than knowledge.”

Einstein is, of course, the true exemplar of genius. He had contemporaries who could probably match him in pure intellectual firepower when it came to mathematical and analytic processing. Henri PoincarĂ©, for example, first came up with some of the components of special relativity, and David Hilbert was able to grind out equations for general relativity around the same time Einstein did. But neither had the imaginative genius to make the full creative leap at the core of their theories, namely that there is no such thing as absolute time and that gravity is a warping of the fabric of space-time. (O.K., it’s not that simple, but that’s why he was Einstein and we’re not.)

Einstein had the elusive qualities of genius, which included that intuition and imagination that allowed him to think differently (or, as Mr. Jobs’s ads said, to Think Different.) Although he was not particularly religious, Einstein described this intuitive genius as the ability to read the mind of God. When assessing a theory, he would ask himself, Is this the way that God would design the universe? And he expressed his discomfort with quantum mechanics, which is based on the idea that probability plays a governing role in the universe by declaring that he could not believe God would play dice. (At one physics conference, Niels Bohr was prompted to urge Einstein to quit telling God what to do.)

Both Einstein and Mr. Jobs were very visual thinkers. The road to relativity began when the teenage Einstein kept trying to picture what it would be like to ride alongside a light beam. Mr. Jobs spent time almost every afternoon walking around the studio of his brilliant design chief Jony Ive and fingering foam models of the products they were developing.

Mr. Jobs’s genius wasn’t, as even his fanboys admit, in the same quantum orbit as Einstein’s. So it’s probably best to ratchet the rhetoric down a notch and call it ingenuity. Bill Gates is super-smart, but Steve Jobs was super-ingenious. The primary distinction, I think, is the ability to apply creativity and aesthetic sensibilities to a challenge.

In the world of invention and innovation, that means combining an appreciation of the humanities with an understanding of science — connecting artistry to technology, poetry to processors. This was Mr. Jobs’s specialty. “I always thought of myself as a humanities person as a kid, but I liked electronics,” he said. “Then I read something that one of my heroes, Edwin Land of Polaroid, said about the importance of people who could stand at the intersection of humanities and sciences, and I decided that’s what I wanted to do.”

The ability to merge creativity with technology depends on one’s ability to be emotionally attuned to others. Mr. Jobs could be petulant and unkind in dealing with other people, which caused some to think he lacked basic emotional awareness. In fact, it was the opposite. He could size people up, understand their inner thoughts, cajole them, intimidate them, target their deepest vulnerabilities, and delight them at will. He knew, intuitively, how to create products that pleased, interfaces that were friendly, and marketing messages that were enticing.

In the annals of ingenuity, new ideas are only part of the equation. Genius requires execution. When others produced boxy computers with intimidating interfaces that confronted users with unfriendly green prompts that said things like “C:\>,” Mr. Jobs saw there was a market for an interface like a sunny playroom. Hence, the Macintosh. Sure, Xerox came up with the graphical desktop metaphor, but the personal computer it built was a flop and it did not spark the home computer revolution. Between conception and creation, T. S. Eliot observed, there falls the shadow.

In some ways, Mr. Jobs’s ingenuity reminds me of that of Benjamin Franklin, one of my other biography subjects. Among the founders, Franklin was not the most profound thinker — that distinction goes to Jefferson or Madison or Hamilton. But he was ingenious.

This depended, in part, on his ability to intuit the relationships between different things. When he invented the battery, he experimented with it to produce sparks that he and his friends used to kill a turkey for their end of season feast. In his journal, he recorded all the similarities between such sparks and lightning during a thunderstorm, then declared “Let the experiment be made.” So he flew a kite in the rain, drew electricity from the heavens, and ended up inventing the lightning rod. Like Mr. Jobs, Franklin enjoyed the concept of applied creativity — taking clever ideas and smart designs and applying them to useful devices.

China and India are likely to produce many rigorous analytical thinkers and knowledgeable technologists. But smart and educated people don’t always spawn innovation. America’s advantage, if it continues to have one, will be that it can produce people who are also more creative and imaginative, those who know how to stand at the intersection of the humanities and the sciences. That is the formula for true innovation, as Steve Jobs’s career showed.

Barry Goldwater - The Conscience of a Conservative

The backlash against the modern civil rights movement that fueled the rise of the Republican Party in the South started in 1964, when the Republicans nominated Barry Goldwater for president. Goldwater swept the South---he voted against the 1964 Civil Rights Act, and it was clearly understood that Southerners voted for him to fight integration---but lost overwhelmingly to Lyndon Johnson in the election. Still, 39% of the people nationwide voted for Goldwater.

Goldwater was Mr. Conservative before Ronald Reagan. You do not hear his name mentioned today. He stated his conservative views in an influential book published in 1960 called THE CONSCIENCE OF A CONSERVATIVE. I thought it would be interesting to read this book in retrospect.

In a foreword to my edition of the book, George Will writes:

"THE CONSCIENCE OF A CONSERVATIVE is one of the most consequential political writings in American history." Well, maybe so, but if so it's amazing how today nobody reads it anymore.

At the end of his foreword, Will is correct in this assertion:

"And so continues an American political argument about how much government we want, and how much we are willing to pay for it in the coin of constricted freedom." This is indeed the discussion that we should be having in this country. For Republicans, universal health insurance like every other industrialized country on the planet can be viewed as a restriction of freedom. For Democrats, it is an expansion of freedom for American citizens to live their lives not having to worry about going bankrupt over unpayable medical bills. For Republicans, the startling explosion in income inequality in this country is not a problem in the name of freedom. For Democrats, it IS a big problem and should be addressed. And so on and so on to other issues.

Goldwater had his conservative principles. Of that fact there can be no doubt. The problem is that he is wrong on almost all counts.

He was against federal aid to education. Remnants of that view are still around today when Michele Bachmann favors eliminating the Department of Education. Never mind that the department, established in 1979, has operated under every administration since, including Ronald Reagan's. Being against federal aid to education is, on any rational basis, hopelessly out of date now.

It is outright hysterically funny to read Goldwater's chapter attacking labor unions. Writing against the backdrop of the 1950s, Goldwater thought that overly powerful labor unions were a threat to our democracy. What would he say today, when union membership is at an all-time low and it's the big corporations that have their thumb on labor unions? My guess is that he would not be concerned. Goldwater wanted a balance between capital and labor. It's bad if the balance favors labor (in his view), but I'm sure it would be OK with him if the balance favored capital.

It is equally funny to read Goldwater's critique of the welfare state. He acknowledges the concept of the common good but thinks government should play no part in pursuing it; instead, the common good should be left to private charity. What planet were you living on, Barry? Certainly not planet Earth in the United States of America.

Saturday, October 29, 2011

Gene Chizik - All In (9)

"By my estimate, 60 per cent of our players come from single-parent homes.

The word family gets thrown around a lot among sports teams, but for our team it is more than a cliche. We truly consider ourselves a family."

P. 155

Friday, October 28, 2011

Critics See ‘Chilling Effect’ in Alabama Immigration Law

By Campbell Robertson
New York Times
27 October 2011

ALABASTER, Ala. — The champions of Alabama’s far-reaching immigration law have said that it is intended to drive illegal immigrants from the state by making every aspect of their life difficult. But they have taken a very different tone when it comes to the part of the law concerning schools.

“No child will be denied an education based on unlawful status,” the state attorney general, Luther Strange, argued in a court filing.

The man who wrote the schools provision says the same thing, that it is not meant as a deterrent — at least not yet. It is, however, a first step in a larger and long-considered strategy to topple a 29-year-old Supreme Court ruling that all children in the United States, regardless of their immigration status, are guaranteed a public education.

The provision, which is known as Section 28, requires primary and secondary schools to record the immigration status of incoming students and their parents and pass that data on to the state.

Critics say it is a simple end in itself, an attempt to circumvent settled law and to scare immigrants away from school now, not at some point in the future. Weeks of erratic school attendance figures and a spike in withdrawals show that this has worked, they argue. And indeed, a federal appeals court on Oct. 14 blocked the provision pending an appeal by the Justice Department, though the court did not rule on the merits.

Michael M. Hethmon, general counsel for the Immigration Reform Law Institute in Washington, who wrote the provision, insists that its goal is much more ambitious.

The eventual target, he said, is the 1982 Supreme Court decision Plyler v. Doe. The case concerned a Texas statute that withheld funds for the education of illegal immigrants and allowed districts to bar them from enrollment, as well as one Texas school district’s plan to charge illegal immigrants tuition.

The court ruled that this violated the Constitution’s equal protection clause, saying that the statute “imposes a lifetime hardship on a discrete class of children not accountable” for their immigration status. In the decision, the court also said that the state had not presented evidence showing it was substantially harmed by giving these children — as distinct from any other children — a free public education.

Over the ensuing decades, measures have been passed in defiance of this ruling, most notably California’s Proposition 187, but they have been repeatedly struck down in the courts. Mr. Hethmon said the problem with these challenges is that they have not taken the trouble to gather the evidence the court found missing in Plyler.

“The toughest question has been obtaining reliable — and I mean reliable for peer-reviewed research purposes — censuses of the number of illegal alien students enrolled in school districts,” he said. “That information could be compared with other sorts of performance or resource allocation issues.”

The Alabama law directs schools to ascertain the immigration status of incoming students, through a birth certificate, other official documents or an affidavit by the child’s parents (the law also directs schools to determine the immigration status of an enrolling child’s parents, but gave no mechanism by which to do so).

That information is then passed on to the State Board of Education not only to prepare an annual report with the data but also to “contract with reputable scholars and research institutions” to determine the costs, fiscal and otherwise, of educating illegal immigrants.

Because no one is actually barred from attending school and the data is not passed on to law enforcement, the provision passes constitutional muster, Mr. Hethmon said.

But it also potentially enables a fresh challenge to Plyler v. Doe, and the idea that schools are obligated to provide a free education to illegal immigrants.

Critics dismiss this as a ruse.

They say that the law instills such fear in immigrant families with schoolchildren, leading predictably to such erratic attendance figures, that it belies any claim that the state is seriously attempting an accurate measurement.

“This seems to be really a transparent attempt at a pretext to try to justify discriminatory law,” said Lucas Guttentag, a professor of immigration law at Yale Law School and senior counsel of the Immigrants’ Rights Project of the American Civil Liberties Union. “The idea that they’re somehow going to collect this data and show anything that’s conceivably relevant is a fantasy.”

In Plyler v. Doe, Professor Guttentag said, the court found that the state’s actions were unconstitutional on a number of grounds. The state’s failure to show the impact of illegal immigration on schools was only a part of the decision, he said, and a nuanced one at that. The likelihood that the data collected by this law would lead to that decision’s being overturned, Professor Guttentag said, was extremely low.

Though there has not been a direct census, there are estimates of illegal immigrants in Alabama schools. According to American Community Survey data, a little less than one-half of 1 percent of the 800,000 children in Alabama schools are in the country illegally; of the 34,000 Hispanic children in Alabama schools, according to Pew Hispanic Center estimates, roughly two thirds are American citizens.

The law also requires schools to track the enrollment of illegal immigrants in remedial English programs (though this part, too, was ignored in the state’s actual execution of the law). There is existing data about the national origin of such students, at least at the district level.

Here in Shelby County, which has one of the fastest-growing Hispanic populations in the state, there are about 1,400 remedial English students, out of roughly 18,700 statewide. They came into the schools here speaking 52 languages, including Chinese and Arabic, though the majority came in as Spanish speakers, said Leah Dobbs Black, the English as a Second Language program supervisor for the county.

Ten years ago, she said, as many as 9 out of 10 students in need of remedial English were born outside the country, a fact students already report on language assessment forms. Now, she said, “It’s at least 50-50.”

Ms. Black added that Shelby schools spend about $4 million on the program out of an annual budget of $281 million, though she acknowledged that parents tend to complain more about the money spent on that program than others.

Whether the critics are correct in arguing that the law has created a “chilling effect,” inducing families to pull their children out of school, is harder to measure than it may seem.

While daily absences by Hispanic students ranged as high as 5,143, or 15 percent of the Hispanic student population, they had dropped to 1,230 the day before the provision was blocked, said a spokeswoman for the state Department of Education (on a normal day, she said, around 1,000 absences can be expected). Statewide data has not been compiled as to how many students have fully withdrawn, though interviews in several districts suggest that number could be in the hundreds.

Several parents in Shelby County who are in the country illegally said in interviews that they were less frightened about Section 28 than about other parts of the law. Their children were all United States citizens by birth, they said, and school officials had so far been reassuring.

The antagonism in schools now, they said, is mainly coming from other children.

“A little girl in my daughter’s class asked when she was going to go to Mexico because she was illegal,” said a 27-year-old woman who gave her name as Arelly, the mother of a fourth grader. “I think they hear their parents talking.”

It Didn't Have to be This Way

Op-Ed Columnist
The Path Not Taken
By PAUL KRUGMAN
Published: October 27, 2011



Financial markets are cheering the deal that emerged from Brussels early Thursday morning. Indeed, relative to what could have happened — an acrimonious failure to agree on anything — the fact that European leaders agreed on something, however vague the details and however inadequate it may prove, is a positive development.

But it’s worth stepping back to look at the larger picture, namely the abject failure of an economic doctrine — a doctrine that has inflicted huge damage both in Europe and in the United States.

The doctrine in question amounts to the assertion that, in the aftermath of a financial crisis, banks must be bailed out but the general public must pay the price. So a crisis brought on by deregulation becomes a reason to move even further to the right; a time of mass unemployment, instead of spurring public efforts to create jobs, becomes an era of austerity, in which government spending and social programs are slashed.

This doctrine was sold both with claims that there was no alternative — that both bailouts and spending cuts were necessary to satisfy financial markets — and with claims that fiscal austerity would actually create jobs. The idea was that spending cuts would make consumers and businesses more confident. And this confidence would supposedly stimulate private spending, more than offsetting the depressing effects of government cutbacks.

Some economists weren’t convinced. One caustic critic referred to claims about the expansionary effects of austerity as amounting to belief in the “confidence fairy.” O.K., that was me.

But the doctrine has, nonetheless, been extremely influential. Expansionary austerity, in particular, has been championed both by Republicans in Congress and by the European Central Bank, which last year urged all European governments — not just those in fiscal distress — to engage in “fiscal consolidation.”

And when David Cameron became Britain’s prime minister last year, he immediately embarked on a program of spending cuts in the belief that this would actually boost the economy — a decision that was greeted with fawning praise by many American pundits.

Now, however, the results are in, and the picture isn’t pretty. Greece has been pushed by its austerity measures into an ever-deepening slump — and that slump, not lack of effort on the part of the Greek government, was the reason a classified report to European leaders concluded last week that the existing program there was unworkable. Britain’s economy has stalled under the impact of austerity, and confidence from both businesses and consumers has slumped, not soared.

Maybe the most telling thing is what now passes for a success story. A few months ago various pundits began hailing the achievements of Latvia, which in the aftermath of a terrible recession, nonetheless, managed to reduce its budget deficit and convince markets that it was fiscally sound. That was, indeed, impressive, but it came at the cost of 16 percent unemployment and an economy that, while finally growing, is still 18 percent smaller than it was before the crisis.

So bailing out the banks while punishing workers is not, in fact, a recipe for prosperity. But was there any alternative? Well, that’s why I’m in Iceland, attending a conference about the country that did something different.

If you’ve been reading accounts of the financial crisis, or watching film treatments like the excellent “Inside Job,” you know that Iceland was supposed to be the ultimate economic disaster story: its runaway bankers saddled the country with huge debts and seemed to leave the nation in a hopeless position.

But a funny thing happened on the way to economic Armageddon: Iceland’s very desperation made conventional behavior impossible, freeing the nation to break the rules. Where everyone else bailed out the bankers and made the public pay the price, Iceland let the banks go bust and actually expanded its social safety net. Where everyone else was fixated on trying to placate international investors, Iceland imposed temporary controls on the movement of capital to give itself room to maneuver.

So how’s it going? Iceland hasn’t avoided major economic damage or a significant drop in living standards. But it has managed to limit both the rise in unemployment and the suffering of the most vulnerable; the social safety net has survived intact, as has the basic decency of its society. “Things could have been a lot worse” may not be the most stirring of slogans, but when everyone expected utter disaster, it amounts to a policy triumph.

And there’s a lesson here for the rest of us: The suffering that so many of our citizens are facing is unnecessary. If this is a time of incredible pain and a much harsher society, that was a choice. It didn’t and doesn’t have to be this way.

About Steve Jobs

Op-Ed Columnist
Limits of Magical Thinking
By MAUREEN DOWD
Published: October 25, 2011


Steve Jobs, the mad perfectionist, even perfected his stare.

He wanted it to be hypnotic. He wanted the other person to blink first. He wanted it to be, like Dracula’s saturnine gaze, a force that could bend your will to his and subsume your reality in his.

There’s an arresting picture of Jobs staring out, challenging us to blink, on the cover of Walter Isaacson’s new biography, “Steve Jobs.” The writer begins the book by comparing the moody lord of Silicon Valley to Shakespeare’s Henry V — a “callous but sentimental, inspiring but flawed king.”

Certainly, Jobs created what Shakespeare called “the brightest heaven of invention.” But his life sounded like the darkest hell of volatility.

An Apple C.E.O. who jousted with Jobs wondered if he had a mild bipolarity.

“Sometimes he would be ecstatic, at other times he was depressed,” Isaacson writes. There were Rasputin-like seductions followed by raging tirades. Everyone was either a hero or bozo.

As Jobs’s famous ad campaign for Apple said, “Here’s to the crazy ones. ... They push the human race forward.”

The monstre sacrĂ© fancied himself an “enlightened being,” but he was capable of frightening coldness, even with his oldest collaborators and family. Yet he often sobbed uncontrollably.

Isaacson told me that Jobs yearned to be a saint; but one of the colleagues he ousted from Apple mordantly noted that the petulant and aesthetic Jobs would have made an excellent King of France.

His extremes left everyone around him with vertigo.

He embraced Zen minimalism and anti-materialism. First, he lived in an unfurnished mansion, then a house so modest that Bill Gates, on a visit, was astonished that the whole Jobs family could fit in it. And Jobs scorned security, often leaving his back door unlocked.

Yet his genius was designing alluring products that would create a country of technology addicts. He demanded laser-like focus from employees to create an A.D.D. world.

He was abandoned by parents who conceived him out of wedlock at 23, and he then abandoned, for many years, a daughter he conceived out of wedlock at 23.

Chrisann Brennan, the mother of Jobs’s oldest child, Lisa, told Isaacson that being put up for adoption left Jobs “full of broken glass.” He very belatedly acknowledged Lisa and their relationship was built, Isaacson says, on “layers of resentment.”

He could be hard on women. Two exes scrawled mean messages on his walls. As soon as he learned that his beautiful, willowy, blonde girlfriend, Laurene Powell, was pregnant in 1991, he began musing that he might still be in love with the previous beautiful, willowy, blonde girlfriend, Tina Redse.

“He surprised a wide swath of friends and even acquaintances by asking them what he should do,” Isaacson writes. “ ‘Who was prettier,’ he would ask, ‘Tina or Laurene?’ ” And “who should he marry?”

Isaacson notes that Jobs could be distant at times with the two daughters he had with Laurene (though not the son). When one daughter dreamed of going to the Oscars with him, he blew her off.

Andy Hertzfeld, a friend and former Apple engineer, lent Lisa $20,000 when she thought her father was not going to pay her Harvard tuition. Jobs paid it back to his friend, but Lisa did not invite him to her Harvard graduation.

“The key question about Steve is why he can’t control himself at times from being so reflexively cruel and harmful to some people,” Hertzfeld said. “That goes back to being abandoned at birth.”

He almost always wore black turtlenecks and jeans. (Early on, he scorned deodorant and went barefoot and had a disturbing habit of soaking his feet in the office toilet.)

Yet he sometimes tried to ply his exquisite taste to remake the women in his life.

When he was dating the much older Joan Baez — enthralled by her relationship with his idol, Bob Dylan — he drove her to a Ralph Lauren store in the Stanford mall to show her a red dress that would be “perfect” for her. But one of the world’s richest men merely showed her the dress, even after she told him she “couldn’t really afford it,” while he bought shirts.

When he met his sister, Mona Simpson, a struggling novelist, as an adult, he berated her for not wearing clothes that were “fetching enough” and then sent her a box of Issey Miyake pantsuits “in flattering colors,” she said.

He was a control freak, yet when he learned he had a rare form of pancreatic cancer that would respond to surgery, he ignored his wife, doctors and friends and put the surgery off for nine months, trying to heal himself with wacky fruit diets, hydrotherapy, a psychic and expressing his negative feelings. (As though he had to be encouraged.)

Addicted to fasting because he felt it produced euphoria and ecstasy, he refused to eat when he needed protein to fight his cancer.

The Da Vinci of Apple could be self-aware. “I know that living with me,” he told Isaacson as he was dying, “was not a bowl of cherries.”

Regarding Conservatism

Alan Wolfe
One Right
October 27, 2011
The Reactionary Mind: Conservatism from Edmund Burke to Sarah Palin
by Corey Robin
Oxford University Press, 304 pp., $36.75

AS THE REPUBLICAN Party lurches toward nominating a presidential candidate to run against Barack Obama, we are likely to hear talk of deep splits within the conservative movement. Tea Party activists, who hate state intervention into the economy, will be distinguished from social conservatives, who love state intervention into matters of sex. Ayn Rand’s militant atheism, so attractive to one half of the party leadership, will be contrasted to the equally warlike Christianity that appeals to the right’s other half. Pundits will discover that aggressive interventionists touched by neoconservatism are not the same thing as America-first nationalists influenced by isolationism. Some liberals will cheer. Long accustomed to divisions within their own ranks, they will for once take glee in the splits and bitter exchanges of their antagonists.

Don’t be fooled by any of this, argues Corey Robin. Against nearly all other leftists writing about rightists, Robin believes that there is only one kind of conservatism. Whether expressed in the lofty words of Burke or the rambling ravings of Palin, conservatism is always and everywhere a resentful attack on those who seek to make the world more fair. Take away the left and you destroy the rationale for the right. It is only because the modern world takes justice seriously, at least in theory, that we have thinkers and activists determined to put their bodies on the gears to stop the machinery from moving forward.

Robin treats conservatives as activists rather than as stand-patters. “Conservatism,” he writes, “has been a forward movement of restless and relentless change, partial to risk taking and ideological adventurism, militant in posture and populist in its bearings, friendly to upstarts and insurgents, outsiders and newcomers alike.” Burke, in Robin’s view, began this tradition, and figures such as de Maistre, de Bonald, and Sorel carried it forward. If we take all of them as the genuine articles, there is no need to draw a line between conservatives and reactionaries: all conservatives are reactionary. Conservatives are unified, and united in their rage. Their most passionate hate is directed at those they believe were assigned by God or nature to second-class status but still insist on their full rights as human beings.

For Robin, what began in the late eighteenth century has reached a kind of culmination in the early twenty-first century. Republicans in love with Ayn Rand express the same romantic protest against modern complexity as evangelical Christians lamenting for families of yore. Whatever their differences, both movements are counter-cultural, even counter-revolutionary. That is why they are the rightful heirs of all the European thinkers whom Robin evokes. Everything about these contemporary right-wing activists—their militant theatrics, their artificial populism, their refusal to compromise—was anticipated two centuries ago. “Far from being a recent innovation of the Christian Right or the Tea Party movement, reactionary populism runs like a red thread throughout conservative discourse from the very beginning.”

Robin adds a distinctive wrinkle to the common claim of Burke’s responsibility for modern conservatism. He says that it was not his Reflections on the Revolution in France but his Philosophical Enquiry into Our Ideas of the Sublime and the Beautiful that deserves the most attention. Power, as Robin summarizes Burke, “should never aspire to be—and can never actually be—beautiful. What great power needs is sublimity.” Owing to this emphasis on the sublime, Burke ought not to be read as a defender of the old regime. Not only had the Bourbons lost both their beauty and their sublimity, they had also become pathetic and decadent, lacking the capacity to justify themselves (and thus requiring thinkers such as Burke to carry out the thankless task).

Conservatives, says Robin, long for an imagined world too rarified ever to survive; they are theorists of loss. That is why, no matter how small the circle of privilege they defend, they have a certain appeal to the much larger collection of ordinary people whom they otherwise hold in contempt. Who has not experienced loss? Who would not want to return to an ideal world? The sacred is always more appealing than the profane. Try to make the world a more just place and you eliminate the sublime from it.

“The sublime,” Burke wrote, “is the sensation we feel in the face of extreme pain, danger or terror.” For all the emphasis on stability and tradition, conservatives admire revolutionaries because the terror they unleash gives us a glimpse of precisely such wonders. As Robin correctly points out, de Maistre preferred zealous if misguided Jacobins to lazy and self-satisfied nobles. Owing to its militancy, conservatism is zealously promoted by outsiders: Burke was Irish, de Maistre a Savoyard, Disraeli a Jew, Hamilton a West Indian. The same tendency can be witnessed today. It was not WASPs who revived the contemporary right but Jews and, downplayed by Robin, Catholics, who “helped transform the Republican Party from a cocktail party in Darien into the party of Scalia, d’Souza [sic] Gonzalez, and Yoo.”

Just as de Maistre could barely hide his Jacobin sympathies, the contemporary American right, in Robin’s account, is lock, stock, and barrel a product of the 1960s. “It’s time for God’s people to come out of the closet,” a Texas evangelist declares in Robin’s pages—a near perfect expression of the extent to which reaction against the gains of the 1960s could only be expressed in the language of the movement being denounced. Abbie Hoffman prepared the way for Michele Bachmann. Mere economic protest does not get you the characters that constitute the Republican base today. For that you need people who genuinely believe that the world is coming to an end.

No other contemporary American figure captures this conservative combination of resentment and activism better than Antonin Scalia, the subject of one of Robin’s most interesting chapters. Despite talk of being faithful to texts, Robin argues, Scalia uses his power on the court to impose on the country the classic conservative mantra: the world is falling apart, and so only the obedience to rules, no matter how seemingly arbitrary and unfair, can save it from doom. “No Plato for him,” Robin writes of this intemperate and deeply reactionary judge. “He’s with Nietzsche all the way.” This at first does not seem quite right: Nietzsche is hardly a theorist of obedience to rules. But once we realize that for Scalia rule-following is only for the masses, while those on top get to do all the rule-writing, Robin’s take on the man strikes me as warranted. There are times when Scalia goes out of his way to remind us of how cruel the world can be—and how helpless we are in the face of these very cruelties. Scalia has buried himself deep inside the right-wing counterculture where winners, calling themselves victims, are given rights, while losers are instructed never to complain even as their rights are stripped from them.

I confess to being one of those who likes to divide conservatives into their parts as opposed to treating them as a whole. Robin makes a vigorous case that I am wrong, and I am tempted by his analysis—as far as it goes. To be sure, Robin exaggerates, and all too easily dismisses exceptions to his generalizations: he quotes Michael Oakeshott, and a bit too frequently, yet finally he has no choice but to throw him off the conservative bus. The very existence of such a thinker suggests that conservatism need not always be either as reactionary or as angry as Robin claims. Still, at least as regards reactionaries such as Scalia and Palin, a little rhetorical provocation seems justified. Robin is an engaging writer, and just the kind of broad-ranging public intellectual all too often missing in academic political science.

The real problem of persuasion lies elsewhere. In this book, Robin has chosen to republish essays, albeit with a comprehensive introduction, rather than to make a sustained argument. I cannot blame him for that; I have done the same myself. But at least one of the essays is so out of date that Robin repudiates it, and the entire second half of the book, while containing interesting asides on terror in Latin America or reactions to September 11, is only marginally related to the first half. Thus was lost an opportunity to develop an arresting theme, shape it with original and fresh examples, acknowledge its limits, and then make it part of our national conversation. Robin’s arguments deserve widespread attention. But the way he has presented them almost ensures that they will not get it.

Thursday, October 27, 2011

Spending More Doesn’t Make Us Healthier

By Ezekiel J. Emanuel
New York Times
27 October 2011

If you have heard it once, you have heard it hundreds of times. “The United States spends too much on health care.” This is not a partisan point. You can hear this from Republicans as well as Democrats. “We know that our families, our economy and our nation itself will not succeed in the 21st century if we continue to be held down by the weight of rapidly rising health care costs,” President Obama said in 2009. Representative Paul D. Ryan, Republican of Wisconsin, agrees: “There is no serious dispute — on either side of the aisle.”

Unfortunately, few people really understand how much we spend on health care, how much we need to spend to provide quality care, and the difference between the two. Do we spend too much? Would cutting costs require rationing, or worse, death panels?

Let’s begin with the costs. In 2010, the United States spent $2.6 trillion on health care, over $8,000 per American. This is such an enormous amount of money, it’s difficult to grasp.

Consider this: If we stacked single dollar bills on top of one another, $2.6 trillion would reach more than 170,000 miles — nearly three-quarters of the way to the moon. Or, compare our spending to that of other countries. France has the fifth largest economy in the world, with a gross domestic product of nearly $2.6 trillion. The United States spends on health care alone what the 65 million people in France spend on everything: education, defense, the environment, scientific research, vacations, food, housing, cars, clothes and health care. In other words, our health care spending is the fifth largest economy in the world.
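As a rough back-of-the-envelope check of that stack-of-bills image, the arithmetic can be run in a few lines of Python. This is only a sketch; the bill thickness of about 0.0043 inches is an assumption of mine, not a figure given in the column.

# Rough check of the "$2.6 trillion stacked in dollar bills" claim.
# Assumption (not stated in the column): a U.S. bill is about 0.0043 inches thick.
bill_thickness_in = 0.0043
total_dollars = 2.6e12          # $2.6 trillion in single bills

height_miles = total_dollars * bill_thickness_in / 12 / 5280
print(f"Stack height: about {height_miles:,.0f} miles")        # roughly 176,000 miles

moon_distance_miles = 238_900   # average Earth-moon distance
print(f"Fraction of the way to the moon: {height_miles / moon_distance_miles:.2f}")  # about 0.74

Under that assumption the stack comes out to roughly 176,000 miles, about three-quarters of the average distance to the moon, consistent with the column's figures.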

Or compare it to the second largest economy in the world, China. China’s G.D.P. is $5.9 trillion (compared to America’s $14.6 trillion). So the United States, with a population a quarter of the size of China’s, spends just on health care slightly less than half of what China spends on everything.

It is not just how much the United States spends on health care that is important, it is also how fast that amount is growing. For more than 30 years, health care costs have been growing 2 percent faster than the general economy. That means every year we spend ever more on health care and therefore have to spend less on other things — or borrow money to pay for the extra health care.

If we continue this rate of growth, health care will be roughly one-third of the entire economy by 2035 — one of every three dollars will go to health care — and nearly half by 2080.

This level of spending on health care is high, but is it worth it? Does it make us healthier?

The fact is that when it comes to health care, the United States is on another planet. The United States spends around 40 percent more per person than the next highest-spending countries, Switzerland and Norway.

Some economists say this comparison is inaccurate because it does not correct for the facts that brand name drugs cost more and doctors and nurses earn more in the United States than they do in other countries. But even correcting for these differences in prices, the United States still spends 15 percent more than the next highest-spending country — and about a quarter more than countries with some of the best health care systems in the world, like Germany and France.

What this means is that there is so much money in the American health care system, we can control spending without having to ration care. No one seriously claims that France or Germany ration care. We could get down to their level of spending without forcing Americans to wait in lines for heart surgery or cataract removal.

The truth is, the United States is not getting 20, 30, much less 40 percent better health care or results than other countries. While there are peaks of greatness, especially at some of America’s leading academic health centers and in integrated health care plans, the quality is uneven. And quality is a problem that affects all of us, rich and poor. Almost no matter how we measure it — whether by life expectancy or by survival for specific diseases like asthma, heart disease or some cancers; by the rate of medical errors; or simply by satisfaction with health services — the United States is actually doing worse than a number of countries, like France and Germany, that spend considerably less.

Even if you do not like comparing the United States with Europe, it is widely acknowledged that within the United States there is no clear link between higher spending on health care and longer life, less disability or better quality of life. A 2003 study published in the Annals of Internal Medicine found that Medicare patients who lived in areas with higher health care spending did not get better results. In some cases, more spending even appears to equal poorer health. A 2004 study in Health Affairs found that there was actually worse care in states with higher Medicare spending.

The $2.6 trillion the United States is spending on health care is too much, and we can reduce it without rationing or sacrificing quality.

Wednesday, October 26, 2011

Concerning Mr. Madison

Conservatives revere Madison because of his turn toward states' rights in the latter years of his life under the influence of Thomas Jefferson.


The Inventor of Our Politics
Jack Rakove

October 26, 2011
James Madison
by Richard Brookhiser
Basic Books, 287 pp., $26.99

GO TO THE homepage of the Federalist Society, and you will discover that its logo is a profile of James Madison. Whether Madison (as opposed to, say, Hamilton) is the best icon for this celebrated consociation of conservative lawyers and law students could be subject to some dispute. Madison’s revealing proposal, in 1787, to give Congress the power to negate state laws, which he wanted to use to protect individual and minority rights, could just as easily qualify him as a trademark for the ACLU. His criticisms in the 1790s of presidential abuse of the powers of war and diplomacy hardly accord with neo-conservative doctrine or the take-no-prisoners constitutionalism of Dick Cheney and his legal saber, David Addington. Yet Madison’s profound awareness of the difficulty of constitution-making reveals a conservative sensitivity to the dangers of the experiment he had just pioneered. Some of Madison’s writings on representation echo themes that we associate with Burke, whom intellectually grounded conservatives so deeply admire, even while American conservatism now appears to be plunging into a know-nothing vacuum that its modern pioneers, such as the late William Buckley, would have abhorred.

With his long association with Buckley’s National Review, Richard Brookhiser might seem the best writer available to explain why Madison might be a conservative icon. Brookhiser has become a major player in the literary-historical cult of “founders’ chic,” and by my count this is his eighth contribution to the trade. Yet portraying Madison as a conservative of foundational stature is not in fact the path that Brookhiser takes. For one thing, interpretation in any serious sense of that term is at best a modest feature of this book: much of it is simply a narrative into which Madison is made to fit. Reading this book is another reminder of the differences between the respective approaches of journalists and scholars to the same life. Brookhiser’s biography may be the quickest-paced biography I have ever read. Knowing the background (as a scholar must) left me either gasping or gaping at how quickly Brookhiser can reduce significant points to the shortest statement possible. Indeed at points his biography struck me as a sort of adult version of the Landmark Books I was reading by third or fourth grade (back around the time that Mr. Cub started playing shortstop at Wrigley Field).

Brookhiser saves his central argument for his final pages, and it might have been wiser to bring it out more explicitly much earlier, the better to explain the latent emphasis of his effort. Madison left two legacies, Brookhiser suggests. One is the manifest legacy—the monument of “American constitutionalism,” Brookhiser calls it, just as Sir Christopher Wren made St. Paul’s Cathedral, his burial site, his monument. This means not merely the primal documents (the Constitution, the Bill of Rights, The Federalist) with which Madison is associated, but also “the laws of doing and not doing, and all the debate and revisions they have generated.” Here Madison is the author of the documents and modes of constitutional argument that give our tradition its underlying form. But Madison’s “other monument, coequal if not greater,” Brookhiser concludes, “is American politics,” meaning “the behavior that makes constitutionalism work.” This is the Madison who, even more than his ally Jefferson, constructed the first political party system of the 1790s, and who came to understand that the dominant force in republican politics was a public opinion that one could both educate and manipulate. This Madison is manifestly a political actor, making contingent decisions good and bad, and not simply a serious intellectual—arguably America’s greatest political thinker—whose legacy lies in the Constitution, its first ten amendments, and his twenty-nine essays in The Federalist.

Between these two points, there seems little doubt which one Brookhiser prefers as his real subject of interest. His treatment of Madison’s dominant role as the constitutional founder of the 1780s is, if not perfunctory, quickly dispatched. The major disputes that scholars still agonize over virtually disappear from his glib account. Almost exactly a century ago, Charles A. Beard and, in a different way, the early pluralist social scientists, made Madison’s Federalist 10 the ur-text of American constitutional thinking. That interpretation was effectively refuted by Douglass Adair in two famous essays written at mid-century. Ever since, platoons of historians and political scientists have made the republican Madison—and not Madison the co-chair of the Republican party—the creative genius of American constitution-making. This was the Madison who argued, against “the celebrated Montesquieu,” that large republics would prove more resistant to the “mischief of faction” than smaller ones, and that republican citizens, though hardly wallowing in depravity, would possess the same interests, passions, and self-confirming opinions as other ordinary mortals. Many conservative writers, following the late Martin Diamond, emphasize Federalist 10 as the model of a commercial republic, a society in which the ancient republican quest for virtuous, self-denying citizens yielded to a more modern conception that took the pursuit of self-interest as a more accurate norm.

Little if any of this interpretive stuff surfaces to disrupt Brookhiser’s narrative. Since Brookhiser does not carefully examine the broader context from which the Constitution emerged, or Madison’s efforts to make sense of the politics of the 1780s, his account cannot accurately describe how and why Madison’s analysis really mattered. To take one example, Brookhiser rightly emphasizes Madison’s interest in having Spain open the Mississippi to American navigation in the 1780s and in gaining American control of Florida later. But nowhere does he explain that the sharply sectional divisions which arose over John Jay’s efforts to negotiate a commercial treaty with Spain at the expense of southern expansion led Madison not only to worry that the Union might devolve into two or three regional confederacies, but also to start speculating about the problem of “factious majorities” that proved so essential to his constitutional thinking in 1787. Nor does Brookhiser have much to say about Madison’s disillusionment with state-based lawmaking in the 1780s, which was equally important in formulating his constitutional agenda. “What could he do with such clods?” Gordon Wood once quipped about Madison’s view of his fellow Virginia legislators. That theme is largely absent from Brookhiser’s book.

Yet as Brookhiser rightly observes, Madison was not (to quote Federalist 37) “an ingenious theorist” speculatively producing an ideal constitution “in his closet, or in his imagination.” Brookhiser’s engaged Madison could be “wrong and stubborn” on particular points, but “these are flyspecks against his patience and energy, learning and savvy. He lost arguments and he changed his mind.” Brookhiser is absolutely right to portray Madison as a politician whose leading ideas were driven by events, who did his best thinking precisely because he was responding to events, decisions, personalities, and elections.

Madison did not think of himself as a political writer. The very fact that he contributed to The Federalist was an accident of his decision to remain in New York as a delegate to Congress rather than return to lead the ratification campaign in Virginia. Brookhiser calls attention to the very different, slimmed-down essays that Madison wrote as a tentative platform for the Republican opposition in 1792. He errs in claiming that “Madison scholars spend relatively little time” on these essays, especially after Colleen Sheehan’s excellent book on Madison’s ideas of public opinion that places these “party essays” front and center. Yet Brookhiser also demonstrates the extent to which Madison the party-founder of the 1790s was actively conjuring up avenues of political competition that he might have abhorred only a few years earlier.

In assessing American politics after 1789, Brookhiser balances a residual profound respect for Madison’s general abilities with ample criticism of many of his positions. Like most commentators, he thinks Hamilton had a much better grasp of economic and foreign policies than his main detractors, Madison and Jefferson. When it comes to assessing their claims, in 1798, for a residual state role in protecting the Constitution against Federalist infringements, Brookhiser unleashes his editorial quill. Madison’s position in the Virginia Resolutions, alleging that the Federalists were edging toward monarchy, “was, to speak plainly, nuts,” while Jefferson’s Kentucky Resolutions read the Constitution “like a divorce lawyer combing through a prenup.” Nice quips, but judgments such as these can never provide an adequate account of why people in the past acted as they did. Nor does it do much good to characterize their positions as “paranoia,” or to describe their attitudes toward Britain as Anglophobia. Those are descriptions, not explanations; and while they give the narrative a lift, they do little to clarify our understanding of the politics Brookhiser wants us to take seriously. The real challenge is to explain how the apparent abuse of executive power in the 1790s would have suggested that Federalist theories of executive power had monarchical elements, or to understand why the Virginians could have plausibly believed that Britain was a predatory power, even if Jefferson and Madison allowed the country to be snookered by the French.

For all its journalistic flourishes, the book brings to every chapter an admirable appreciation of the political realities in which Madison operated, and that appreciation in turn reflects Brookhiser’s own incisive grasp of Madison’s significance. “We will not find complete consistency in his career,” Brookhiser concludes, and “we should not look for it.” Politics is always too surprising, too dynamic, to allow that to happen. But so what? “It should be enough for us that a great mind gave it his best thoughts for as long as Madison did.”

Jack Rakove is writing a book called A Politician Thinking: The Creative Mind of James Madison.

Tuesday, October 25, 2011

The Steve Jobs Biography (2)

I have started the book, but expect to have things to say about Barry Goldwater before I post on Steve Jobs.

Sunday, October 23, 2011

Candice Millard - Destiny of the Republic (2)

Candice Millard – Destiny of the Republic

This is a most enjoyable book, a great example of narrative history by a great story teller.

James Garfield had a rough upbringing on the Ohio frontier. He showed promise from the beginning: he graduated from Williams College, became a college president at the ripe age of 27, and rose to major general in the Civil War. As an Ohio congressman he was the surprise Republican nominee for President in 1880 and was elected. The Republican Party was split between the Stalwarts (machine politicians who defended the spoils system) and the Half-Breeds (reformers). Garfield was firmly in the reform camp. He seemingly had great potential as chief executive but was gunned down by a “crazy” man shortly after taking office.

“James A. Garfield sprung from the people,” a reporter marveled. P. 78

For President Garfield the presidency was untenable because he didn’t have time to read and think. P. 88

“When Dr. Johnson defined patriotism as the last refuge of a scoundrel, he was unconscious of the then undeveloped capabilities and uses of the word ‘Reform.’” (Roscoe Conkling) P. 89

Despite the assassination of Lincoln, which was attributed to the exigencies of war, Americans did not take the threat of a President being murdered seriously. P. 90

The barbaric medical procedures used on the wounded President Garfield---this in 1881, before X-rays and MRIs---are truly horrifying. Shortly after the shooting, a Dr. Bliss probed into Garfield’s wound with his fingers and with long unsterilized probes that only made the situation worse. P. 142

“It is one of the precious mysteries of sorrow that it finds solace in unselfish thought.” President Garfield P. 143

The President was taken from the train station back to the White House. There were no hospitals in those days. P. 146

In 1881 thousands of homes had telephones, but the phone was not yet part of the everyday life of the nation. P. 150

The city of Washington D.C. was in chaos after the shooting. P. 150-51

Alexander Graham Bell knew it was barbaric to search for the bullet with fingers and probes. Science had to find a better way. P. 151

It is fascinating to read how AGB fits into the story. Bell figured out how to use his telephone invention to determine the location of the bullet in President Garfield’s body via an induction method which I do not understand. P. 162

Garfield lived for weeks after being shot and for a time there was genuine hope that he would recover. P. 192

It is likely that Garfield’s doctors caused his death. The international medical community condemned Dr. Bliss’s methods. Specifically, they condemned the repeated, unsterilized probing of the President’s wounds. The resulting infection is what killed Garfield, who might have lived if the doctors had just left him alone. Dr. Bliss never recovered his reputation. P. 253-54

Shot on July 2, the President died on September 19.

Alexander Graham Bell had an unbelievably productive life. I did not know this before reading this book. His induction balance invention did not help Garfield because he was forced to examine the wrong side of Garfield’s body. Joseph Lister would live to see his discovery of antiseptic surgery widely accepted; it came much too late to help James Garfield. The 20th President’s children were all successful. Civil service reform, in the form of the Pendleton Act, came shortly after Garfield’s death. Chester Arthur shed his association with Roscoe Conkling and became a respected President. The American people were briefly united in mourning the chief executive’s brutal death. His killer was hanged.

Saturday, October 22, 2011

The Flat Tax Fraud

Robert Reich-Chancellor's Professor of Public Policy, University of California at Berkeley
The Flat-Tax Fraud, and the Necessity of a Truly Progressive Tax
Posted: 10/22/11 10:47 AM ET

Herman Cain's bizarre 9-9-9 plan would replace much of the current tax code with a 9 percent individual income tax, a 9 percent business tax, and a 9 percent sales tax. He calls it a "flat tax."

Next week Rick Perry is set to announce his own version of a flat tax. Former House majority leader Dick Armey -- now chairman of Freedom Works, a major backer of the Tea Party funded by the Koch Brothers and other portly felines (I didn't say "fat cats") -- predicts this will give Perry "a big boost." Steve Forbes, one of America's richest billionaires, who's on the board of the Freedom Works foundation, is delighted. He's been pushing the flat tax for years.

The flat tax is a fraud. It raises taxes on the poor and lowers them on the rich.

We don't know exactly what Perry will propose, but the non-partisan Tax Policy Center estimates that Cain's plan (the only one out there so far) would lower the after-tax incomes of poor households (incomes below $30,000) by 16 to 20 percent, while increasing the incomes of wealthier households (incomes above $200,000) by 5 to 22 percent, on average.

Under Cain's plan, fully 95 percent of households with more than $1 million in income would get an average tax cut of $487,300. And capital gains (a major source of income for the very rich) would be tax free.

The details of flat-tax proposals vary, of course. But all of them end up benefitting the rich more than the poor for one simple reason: Today's tax code is still at least moderately progressive. The rich usually pay a higher percent of their incomes in income taxes than do the poor. A flat tax would eliminate that slight progressivity.

Nowadays most low-income households pay no federal income tax at all -- a fact that sends many regressives into spasms of indignation. They conveniently ignore the fact that poor households pay a much larger share of their incomes in payroll taxes, sales taxes, and property taxes (directly, if they own their homes; indirectly, if they rent) than do people with high incomes.

Flat-taxers pretend a flat tax is good public policy, for two reasons.

First, they say, it would simplify paying taxes. Baloney. Flat-tax proposals don't eliminate popular deductions. (I'll be surprised if Perry's plan eliminates the popular mortgage-interest deduction, for example.) So most taxpayers would still have to fill out lots of forms.

Second, they say a flat tax is fairer than the current system because, in Cain's words, a flat tax "treats everyone the same."

The truth is the current tax code treats everyone the same. It's organized around tax brackets. Everyone whose income reaches the same bracket is treated the same as everyone else whose income reaches that bracket (apart from various deductions, exemptions, and credits, of course).

For example, no one pays any income taxes on the first $20,000 or so of their income (the exact amount depends on whether the person is married and eligible for tax credits like the Earned Income Tax Credit or the Family Tax Credit).

People in higher brackets pay a higher rate only on the portion of their income that hits that bracket -- not on their entire incomes.

So when Barack Obama calls for ending the Bush tax cut on incomes over $250,000, he's only talking about the portion of people's incomes that exceeds $250,000. He's not proposing to tax their entire incomes at the higher rate that prevailed under Bill Clinton.

Republicans have tried to sow confusion about this. They want Americans to believe, for example, that if the Bush tax cut ended, small business owners with incomes of $251,000 a year would suddenly have to pay 39 percent of their entire incomes in taxes rather than 35 percent. Wrong. They'd only have to pay the 39 percent rate on $1,000 -- the portion of their incomes over $250,000.

Get it? We already have a flat tax -- flat within each bracket.
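(To make that bracket arithmetic concrete, here is a minimal Python sketch of how marginal rates work: income is walked through a set of brackets, and each rate applies only to the slice of income that falls inside its bracket. The thresholds and rates below are illustrative stand-ins loosely echoing the figures in this article, not an actual IRS tax table.)

# Illustrative brackets only: (upper bound of bracket, marginal rate).
BRACKETS = [
    (20000, 0.00),         # roughly the income on which no federal income tax is owed
    (250000, 0.25),        # stand-in for the middle rates
    (375000, 0.33),        # stand-in for the next-to-top rate
    (float("inf"), 0.35),  # top rate, starting around $375,000
]

def income_tax(income):
    # Tax each slice of income at the rate of the bracket it falls in.
    tax, lower = 0.0, 0.0
    for upper, rate in BRACKETS:
        if income <= lower:
            break
        tax += (min(income, upper) - lower) * rate
        lower = upper
    return tax

# A filer at $251,000 pays the higher rate only on the $1,000 above $250,000:
print(income_tax(251000) - income_tax(250000))  # extra tax owed on that last $1,000

Adding a higher bracket above, say, $5 million (as proposed at the end of this piece) would only change the last entries in the table; the slice-by-slice logic stays the same.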

The real problem is the top brackets are set too low relative to where the money is. The top-most bracket starts at $375,000 a year. People with incomes higher than that pay 35 percent -- again, only on that portion of their incomes exceeding $375,000.

This is absurd. It means a professional who's making, say, $380,000 a year pays the same income-tax rate as a plutocrat pulling in $2 billion or $20 billion.

Our current flat tax at the top is treating the nation's professional class exactly the same as it treats super-rich plutocrats. My doctor pays the same rate as Steve Forbes.

Actually, it's worse than that because the plutocrats get most of their income in the form of capital gains, which are taxed at only 15 percent. That's why America's 400 richest people -- who earned an average of $300 million last year, and who have more wealth than the bottom 150 million Americans put together -- now pay at a 17 percent rate (according to the IRS).
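(The 17 percent figure follows directly from that income mix. A back-of-the-envelope check in Python, assuming a purely hypothetical split of 90 percent capital gains and 10 percent ordinary income for such a filer:)

# Hypothetical income mix for a top-400 filer: 90% capital gains, 10% ordinary income.
cap_gains_share, ordinary_share = 0.90, 0.10
effective_rate = cap_gains_share * 0.15 + ordinary_share * 0.35
print(f"{effective_rate:.0%}")  # about 17%, in line with the IRS figure cited above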

The Republicans' push for a flat tax masks what's really going on.

Remember: The top 1 percent is now raking in over 20 percent of the nation's total income and owns over 35 percent of the nation's wealth. Under almost anyone's view of fairness, these are grotesque portions. They're especially large relative to what they were as recently as thirty years ago, when the top 1 percent raked in under 10 percent. And these huge portions at the top continue to increase.

Meanwhile, the top tax bracket is now 35 percent -- the lowest it's been in three decades. Between the end of World War II and 1980 it never fell below 70 percent.

Simple fairness requires three things: More tax brackets at the top, higher rates in each bracket, and the treatment of all sources of income (capital gains included) exactly the same.

Not only does fairness demand it; so does fiscal prudence. A truly progressive tax would bring in tens of billions of dollars a year from the people at the top, who are in the best position to afford it.

Regressives are pushing the flat tax as a smokescreen. They'd rather not have anyone talk about the unfairness and fiscal absurdity of the current system.

Rather than merely oppose the flat tax, sensible people should push for a truly progressive tax -- starting with a top rate of 70 percent on that portion of anyone's income exceeding $5 million, from whatever source.

Gene Chizik - All In (8)

"Playing football at Auburn is not a right; it is a privilege."

Gene Chizik p. 132

The Steve Jobs Biography

Books of The Times
Making the iBio for Apple’s Genius
By JANET MASLIN
Published: October 21, 2011

After Steve Jobs anointed Walter Isaacson as his authorized biographer in 2009, he took Mr. Isaacson to see the Mountain View, Calif., house in which he had lived as a boy. He pointed out its “clean design” and “awesome little features.” He praised the developer, Joseph Eichler, who built more than 11,000 homes in California subdivisions, for making an affordable product on a mass-market scale. And he showed Mr. Isaacson the stockade fence built 50 years earlier by his father, Paul Jobs.

STEVE JOBS

By Walter Isaacson

Illustrated. 630 pages. Simon & Schuster. $35.

“He loved doing things right,” Mr. Jobs said. “He even cared about the look of the parts you couldn’t see.”

Mr. Jobs, the brilliant and protean creator whose inventions so utterly transformed the allure of technology, turned those childhood lessons into an all-purpose theory of intelligent design. He gave Mr. Isaacson a chance to play by the same rules. His story calls for a book that is clear, elegant and concise enough to qualify as an iBio. Mr. Isaacson’s “Steve Jobs” does its solid best to hit that target.

As a biographer of Albert Einstein and Benjamin Franklin, Mr. Isaacson knows how to explicate and celebrate genius: revered, long-dead genius. But he wrote “Steve Jobs” as its subject was mortally ill, and that is a more painful and delicate challenge. (He had access to members of the Jobs family at a difficult time.) Mr. Jobs promised not to look over Mr. Isaacson’s shoulder, and not to meddle with anything but the book’s cover. (Boy, does it look great.) And he expressed approval that the book would not be entirely flattering. But his legacy was at stake. And there were awkward questions to be asked. At the end of the volume, Mr. Jobs answers the question “What drove me?” by discussing himself in the past tense.

Mr. Isaacson treats “Steve Jobs” as the biography of record, which means that it is a strange book to read so soon after its subject’s death. Some of it is an essential Silicon Valley chronicle, compiling stories well known to tech aficionados but interesting to a broad audience. Some of it is already quaint. Mr. Jobs’s first job was at Atari, and it involved the game Pong. (“If you’re under 30, ask your parents,” Mr. Isaacson writes.) Some, like an account of the release of the iPad 2, is so recent that it is hard to appreciate yet, even if Mr. Isaacson says the device comes to life “like the face of a tickled baby.”

And some is definitely intended for future generations. “Indeed,” Mr. Isaacson writes, “its success came not just from the beauty of the hardware but from the applications, known as apps, that allowed you to indulge in all sorts of delightful activities.” One that he mentions, which will be as quaint as Pong some day, features the use of a slingshot to shoot down angry birds.

So “Steve Jobs,” an account of its subject’s 56 years (he died on Oct. 5), must reach across time in more ways than one. And it does, in a well-ordered, if not streamlined, fashion. It begins with a portrait of the young Mr. Jobs, rebellious toward the parents who raised him and scornful of the ones who gave him up for adoption. (“They were my sperm and egg bank,” he says.)

Although Mr. Isaacson is not analytical about his subject’s volatile personality (the word “obnoxious” figures in the book frequently), he raises the question of whether feelings of abandonment in childhood made him fanatically controlling and manipulative as an adult. Fortunately, that glib question stays unanswered.

Mr. Jobs, who founded Apple with Stephen Wozniak and Ronald Wayne in 1976, began his career as a seemingly contradictory blend of hippie truth seeker and tech-savvy hothead.

“His Zen awareness was not accompanied by an excess of calm, peace of mind or interpersonal mellowness,” Mr. Isaacson says. “He could stun an unsuspecting victim with an emotional towel-snap, perfectly aimed,” he also writes. But Mr. Jobs valued simplicity, utility and beauty in ways that would shape his creative imagination. And the book maintains that those goals would not have been achievable in the great parade of Apple creations without that mean streak.

Mr. Isaacson takes his readers back to the time when laptops, desktops and windows were metaphors, not everyday realities. His book ticks off how each of the Apple innovations that we now take for granted first occurred to Mr. Jobs or his creative team. “Steve Jobs” means to be the authoritative book about those achievements, and it also follows Mr. Jobs into the wilderness (and to NeXT and Pixar) after his first stint at Apple, which ended in 1985.

With an avid interest in corporate intrigue, it skewers Mr. Jobs’s rivals, like John Sculley, who was recruited in 1983 to be Apple’s chief executive and fell for Mr. Jobs’s deceptive show of friendship. “They professed their fondness so effusively and often that they sounded like high school sweethearts at a Hallmark card display,” Mr. Isaacson writes.

Of course the book also tracks Mr. Jobs’s long and combative rivalry with Bill Gates. The section devoted to Mr. Jobs’s illness, which suggests that his cancer might have been more treatable had he not resisted early surgery, describes the relative tenderness of their last meeting.

“Steve Jobs” greatly admires its subject. But its most adulatory passages are not about people. Offering a combination of tech criticism and promotional hype, Mr. Isaacson describes the arrival of each new product right down to Mr. Jobs’s theatrical introductions and the advertising campaigns. But if the individual bits of hoopla seem excessive, their cumulative effect is staggering. Here is an encyclopedic survey of all that Mr. Jobs accomplished, replete with the passion and excitement that it deserves.

Mr. Jobs’s virtual reinvention of the music business with iTunes and the iPod, for instance, is made to seem all the more miraculous (“He’s got a turn-key solution,” the music executive Jimmy Iovine said.) Mr. Isaacson’s long view basically puts Mr. Jobs up there with Franklin and Einstein, even if a tiny MP3 player is not quite the theory of relativity. The book emphasizes how deceptively effortless Mr. Jobs’s ideas now seem because of their extreme intuitiveness and foresight. When Mr. Jobs, who personally persuaded musician after musician to accept the iTunes model, approached Wynton Marsalis, Mr. Marsalis was rightly more impressed with Mr. Jobs than with the device he was being shown.

Mr. Jobs’s love of music plays a big role in “Steve Jobs,” like his extreme obsession with Bob Dylan. (Like Mr. Dylan, he had a romance with Joan Baez. Her version of Mr. Dylan’s “Love Is Just a Four-Letter Word” was on Mr. Jobs’s own iPod.) So does his extraordinary way of perceiving ordinary things, like well-made knives and kitchen appliances. That he admired the Cuisinart food processor he saw at Macy’s may sound trivial, but his subsequent idea that a molded plastic covering might work well on a computer does not. Years from now, the research trip to a jelly bean factory to study potential colors for the iMac case will not seem as silly as it might now.

Skeptic after skeptic made the mistake of underrating Steve Jobs, and Mr. Isaacson records the howlers who misjudged an unrivaled career. “Sorry Steve, Here’s Why Apple Stores Won’t Work,” Business Week wrote in a 2001 headline. “The iPod will likely become a niche product,” a Harvard Business School professor said. “High tech could not be designed and sold as a consumer product,” Mr. Sculley said in 1987.

Mr. Jobs got the last laugh every time. “Steve Jobs” makes it all the sadder that his last laugh is over.

Thursday, October 20, 2011

Candice Millard - Destiny of the Republic

I am reading this narrative account of the assassination of President Garfield in 1881, an event in American history that is not much talked about. The book is riveting.

Sunday, October 16, 2011

THIS SAYS IT ALL

If you want to know why I am a Democrat read this article. If you wish to know why I am a Progressive, read this article. This says it all---the BEST summary of the Progressive world view that I have ever read. Thanks, Professor Reich!



October 16, 2011


Robert Reich
The Rise of the Regressive Right and the Reawakening of America
Posted: 10/16/11 05:44 PM ET

A fundamental war has been waged in this nation since its founding, between progressive forces pushing us forward and regressive forces pulling us backward.

We are going to battle once again.

Progressives believe in openness, equal opportunity, and tolerance. Progressives assume we're all in it together: We all benefit from public investments in schools and health care and infrastructure. And we all do better with strong safety nets, reasonable constraints on Wall Street and big business, and a truly progressive tax system. Progressives worry when the rich and privileged become powerful enough to undermine democracy.

Regressives take the opposite positions.

Eric Cantor, Paul Ryan, Rick Perry, Michele Bachmann and the other tribunes of today's Republican right aren't really conservatives. Their goal isn't to conserve what we have. It's to take us backwards.

They'd like to return to the 1920s -- before Social Security, unemployment insurance, labor laws, the minimum wage, Medicare and Medicaid, worker safety laws, the Environmental Protection Act, the Glass-Steagall Act, the Securities and Exchange Act, and the Voting Rights Act.

In the 1920s Wall Street was unfettered, the rich grew far richer and everyone else went deep into debt, and the nation closed its doors to immigrants.

Rather than conserve the economy, these regressives want to resurrect the classical economics of the 1920s -- the view that economic downturns are best addressed by doing nothing until the "rot" is purged out of the system (as Andrew Mellon, Herbert Hoover's Treasury Secretary, so decorously put it).

In truth, if they had their way we'd be back in the late nineteenth century -- before the federal income tax, antitrust laws, the Pure Food and Drug Act, and the Federal Reserve. A time when robber barons -- railroad, financial, and oil titans -- ran the country. A time of wrenching squalor for the many and mind-numbing wealth for the few.

Listen carefully to today's Republican right and you hear the same Social Darwinism Americans were fed more than a century ago to justify the brazen inequality of the Gilded Age: Survival of the fittest. Don't help the poor or unemployed or anyone who's fallen on bad times, they say, because this only encourages laziness. America will be strong only if we reward the rich and punish the needy.

The regressive right has slowly consolidated power over the last three decades as income and wealth have concentrated at the top. In the late 1970s the richest 1 percent of Americans received 9 percent of total income and held 18 percent of the nation's wealth; by 2007, they had more than 23 percent of total income and 35 percent of America's wealth. CEOs of the 1970s were paid 40 times the average worker's wage; now CEOs receive 300 times the typical worker's wage.

This concentration of income and wealth has generated the political heft to deregulate Wall Street and halve top tax rates. It has bankrolled the so-called Tea Party movement, and captured the House of Representatives and many state governments. Through a sequence of presidential appointments it has also overtaken the Supreme Court.

Scalia, Alito, Thomas, and Roberts (and, all too often, Kennedy) claim they're conservative jurists. But they're judicial activists bent on overturning 75 years of jurisprudence by resurrecting states' rights, treating the 2nd Amendment as if America still relied on local militias, narrowing the Commerce Clause, and calling money speech and corporations people.

Yet the great arc of American history reveals an unmistakable pattern. Whenever privilege and power conspire to pull us backward, the nation eventually rallies and moves forward. Sometimes it takes an economic shock like the bursting of a giant speculative bubble; sometimes we just reach a tipping point where the frustrations of average Americans turn into action.

Look at the Progressive reforms between 1900 and 1916; the New Deal of the 1930s; the Civil Rights struggle of the 1950s and 1960s; the widening opportunities for women, minorities, people with disabilities, and gays; and the environmental reforms of the 1970s.

In each of these eras, regressive forces reignited the progressive ideals on which America is built. The result was fundamental reform.

Perhaps this is what's beginning to happen again across America.

Our Primal Scream

America’s ‘Primal Scream’
By NICHOLAS D. KRISTOF
Published: October 15, 2011


IT’S fascinating that many Americans intuitively understood the outrage and frustration that drove Egyptians to protest at Tahrir Square, but don’t comprehend similar resentments that drive disgruntled fellow citizens to “occupy Wall Street.”

There are differences, of course: the New York Police Department isn’t dispatching camels to run down protesters. Americans may feel disenfranchised, but we do live in a democracy, a flawed democracy — which is the best hope for Egypt’s evolution in the coming years.

Yet my interviews with protesters in Manhattan’s Zuccotti Park seemed to rhyme with my interviews in Tahrir earlier this year. There’s a parallel sense that the political/economic system is tilted against the 99 percent. Al Gore, who supports the Wall Street protests, described them perfectly as a “primal scream of democracy.”

The frustration in America isn’t so much with inequality in the political and legal worlds, as it was in Arab countries, although those are concerns too. Here the critical issue is economic inequity. According to the C.I.A.’s own ranking of countries by income inequality, the United States is more unequal a society than either Tunisia or Egypt.

Three factoids underscore that inequality:

The 400 wealthiest Americans have a greater combined net worth than the bottom 150 million Americans.

The top 1 percent of Americans possess more wealth than the entire bottom 90 percent.

In the Bush expansion from 2002 to 2007, 65 percent of economic gains went to the richest 1 percent.

As my Times colleague Catherine Rampell noted a few days ago, in 1981, the average salary in the securities industry in New York City was twice the average in other private sector jobs. At last count, in 2010, it was 5.5 times as much. (In case you want to gnash your teeth, the average is now $361,330.)

More broadly, there’s a growing sense that lopsided outcomes are a result of tycoons’ manipulating the system, lobbying for loopholes and getting away with murder. Of the 100 highest-paid chief executives in the United States in 2010, 25 took home more pay than their company paid in federal corporate income taxes, according to the Institute for Policy Studies.

Living under Communism in China made me a fervent enthusiast of capitalism. I believe that over the last couple of centuries banks have enormously raised living standards in the West by allocating capital to more efficient uses. But anyone who believes in markets should be outraged that banks rig the system so that they enjoy profits in good years and bailouts in bad years.

The banks have gotten away with privatizing profits and socializing risks, and that’s just another form of bank robbery.

“We have a catastrophically bad misregulation of the financial system,” said Amar BhidĂ©, a finance expert at the Fletcher School of Law and Diplomacy at Tufts University. “Its consequences led to a taint of the entire system of modern enterprise.”

Economists used to believe that we had to hold our noses and put up with high inequality as the price of robust growth. But more recent research suggests the opposite: inequality not only stinks, but also damages economies.

In his important new book, “The Darwin Economy,” Robert H. Frank of Cornell University cites a study showing that among 65 industrial nations, the more unequal ones experience slower growth on average. Likewise, individual countries grow more rapidly in periods when incomes are more equal, and slow down when incomes are skewed.

That’s certainly true of the United States. We enjoyed considerable equality from the 1940s through the 1970s, and growth was strong. Since then inequality has surged, and growth has slowed.

One reason may be that inequality is linked to financial distress and financial crises. There is mounting evidence that inequality leads to bankruptcies and to financial panics.

“The recent global economic crisis, with its roots in U.S. financial markets, may have resulted, in part at least, from the increase in inequality,” Andrew G. Berg and Jonathan D. Ostry of the International Monetary Fund wrote last month. They argued that “equality appears to be an important ingredient in promoting and sustaining growth.”

Inequality also leads to early deaths and more divorces — a reminder that we’re talking not about data sets here, but about human beings.

Some critics think that Occupy Wall Street is simply tapping into the public’s resentment and covetousness, nurturing class warfare. Sure, there’s a dollop of envy. But inequality is also a cancer on our national well-being.

I don’t know whether the Occupy Wall Street movement will survive once Zuccotti Park fills with snow and the novelty wears off. But I do hope that the protesters have lofted the issue of inequality onto our national agenda to stay — and to grapple with in the 2012 election year.

Gordon Wood - Empire of Liberty (2)

Gordon S. Wood – Empire of Liberty

In the early days of the Republic, nowhere in the world were business and working for profit more celebrated, which made slavery all the more anomalous. Slavery was condemned, but it not only survived, it flourished. P. 2-3

By 1815 the U.S. was the most evangelical nation in the world. P. 3

By all accounts the new country was an experiment in republicanism. P. 5

Early on the people had a sense of exceptionalism. P. 7

The purpose of the Constitution was to temper the democratic excesses of the state legislatures. P. 16 (Wood seems to give scant notice to Shays’s Rebellion)

The interests of the localities were destroying the interests of the people as a whole. P. 17

Once passed in 1791, the Bill of Rights was forgotten and lay dormant until the 20th Century. P. 72 (amazing)

There were expectations amongst some that the country could have become an elective monarchy like Poland. P. 74

It never ceases to amaze me to read about how George Washington invented the office of President. P. 76

According to Adams, GW was the best actor of the presidency we’ve ever had. P. 78

I’ve yet to get a real feel for the supposed tendencies toward monarchism in the 1790s. P. 85

Hamilton sought to copy 18th century England. P. 93

Some Federalists hoped the states would revert to administrative units of the federal government. P. 97

Hamilton’s program laid the groundwork for the supremacy of the national over the state governments. P. 103

The Federalists were elitists, but this was, in my opinion, necessary in the 1790s to cement the country together. Hamilton felt, rightly, that the success of the new government depended on attracting the support of the people of means. The rest of the people would follow. Hierarchies were inevitable. P. 105-06

Federalism failed because of the capitalistic and democratic society that emerged after the Federalists established the stability of the new national government. P. 110

American independence was a disaster for Indians. P. 125

Indians had to submit to removal or destruction, or conform to the white man’s ways. P. 128

Hamilton argued for the first national bank based on the famous “necessary and proper” clause in Article 1 Section 8. P. 144-45

Hamilton had a different vision of the country than Jefferson and Madison. I say both were right and both were wrong in different ways. P. 147

Mostly Jefferson was wrong, especially in his vision of the country as a continuing confederation of independent states. P. 148

That Madison switched from being a nationalist to a states-righter is one of American history’s most debated topics. P. 148

The Federalists and the Jeffersonian Republicans were like today’s Democrats and Republicans: each believes that the other party is out to destroy the country. P. 161

Some historians believe that the Republican party’s main purpose was to protect slavery in the South. P. 166

Were the Federalists really “monarchical”? P. 172

It’s hard to figure John Adams. I like him even though some historians like to laugh at him. Chapter 6

The so-called crisis of 1798-99, with the XYZ Affair and the risk of war with France (the Citizen Genet affair came earlier, in 1793), is a part of American history that I will never totally understand. Chapter 7

Wood seems to think there really was a Jeffersonian Revolution starting with his election in 1800. I suppose he is right, but I think less and less of Jefferson the more I read about him. Chapter 8

The expansion of the country west during Jefferson’s reign does not greatly interest me. Chapter 10

Strict constructionists always find a way to do what they want to do. Hence, they invoked the “necessary and proper” clause of the Constitution to justify the Louisiana Purchase. P. 371

Lewis and Clark started from St. Louis. P. 378

The Marbury decision was the first case in which the Marshall Court declared an act of Congress unconstitutional. The next time this happened wasn’t until the 1857 Dred Scott decision. P. 442

Jefferson’s record on slavery is horrific. Although he always condemned slavery, he was one of the biggest slaveholders in Virginia. He sold hundreds of slaves. He thought that by the age of 10 a boy should be treated like a man. There was no such thing as a kind slaveholder. P. 514-15

The Revolutionary leaders mistakenly thought that slavery was on its way to extinction. In their eyes there was evidence, obviously wrong, that slavery was dying out. P. 518-19

No matter what is said, everybody at the time knew in their hearts that slavery was wrong. P. 520

Stopping the slave trade did not end slavery. P. 523

In 1799 Washington had 317 slaves. P. 524

The Founders had a naĂ¯ve faith in the future. The best thing was not to rock the slavery boat. P. 525

From the beginning the nation moved in two different directions with slavery slowly disappearing in the North but persisting and eventually growing in the South. P. 526

Jefferson’s Republicans were the South’s party and the Federalists were the North’s party. There is no way you can tell me that Jefferson’s party wasn’t at least partly formed to defend slavery. So-called states’ rights and race go hand in hand and always have. P. 527

The frustrations between the sections began during the Revolution and only got worse as time went on. P. 531

In 1787-1788 Federalists accepted the three-fifths compromise as necessary to keep the South in the Union. Eventually they realized their mistake: that Jefferson owed his election to the three-fifths provision, that his Republican party was Southern-based and rested on slavery, and that through the three-fifths provision the South would dominate the federal government. P. 532

Wood devotes three chapters to life during the Republican Era. Chapters 15, 16, & 17

I will never fully understand the War of 1812. Wood devotes a chapter to it. Let it be. Chapter 18

One of Wood’s chief points is that the Founders were elitists who had exalted notions of the disinterested motivations of political leaders---the way it should be---but that they set in motion notions of egalitarianism that made the nation unrecognizable to them by the early years of the 19th century. Chapter 19

Saturday, October 15, 2011

Gene Chizik - All In (7)

Every new coach tries to change things up when he assumes his position. Every coach wants to do things his way. This was certainly the case when Gene Chizik took over the Auburn job. He says so very clearly in his book.

The players weren't close, so the coach did things to bring them together. He is obviously big on motivation. He does everything he can to motivate his players.

Players have to trust one another. They have to help one another. Let us hope that foundation has been laid and will continue for the duration of the Chizik era.