Obamacare, The Unknown Ideal
by Paul Krugman
No, I haven’t lost my mind — or suddenly become an Ayn Rand disciple. It’s not my ideal; in a better world I’d call for single-payer, and a significant role for the government in directly providing care.
But Ross Douthat, in the course of realistically warning his fellow conservatives that Obamacare doesn’t seem to be collapsing, goes on to tell them that they’re going to have to come up with a serious alternative.
But Obamacare IS the conservative alternative, and not just because it was originally devised at the Heritage Foundation. It’s what a health-care system that does what even conservatives say they want, like making sure that people with preexisting conditions can get coverage, has to look like if it isn’t single-payer.
I don’t really think one more repetition of the logic will convince many people, but here we go again. Suppose you want preexisting conditions covered. Then you have to impose community rating — insurers must offer the same policies to people regardless of medical history. But just doing that causes a death spiral, because people wait until they’re sick to buy insurance. So you also have to have a mandate, requiring healthy people to join the risk pool. And to make buying insurance possible for people with lower incomes, you have to have subsidies.
And what you’ve just defined are the essentials of ObamaRomneyCare. It’s a three-legged stool that needs all three legs. If you want to cover preexisting conditions, you must have the mandate; if you want the mandate, you must have subsidies. If you think there’s some magic market-based solution that obviates the stuff conservatives don’t like while preserving the stuff they like, you’re deluding yourself.
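The death-spiral logic is mechanical enough to simulate. Here is a toy sketch (mine, not Krugman's, with invented numbers) of a community-rated insurance pool with no mandate: each year the healthiest enrollees find the premium a bad deal and leave, and the insurer re-prices on the sicker pool that remains.

```python
# Toy adverse-selection simulation. Community rating with no mandate:
# the insurer charges everyone the pool's average expected cost, the
# healthiest enrollees find that price a bad deal and exit, and the
# premium is re-priced on the sicker pool that remains.
# All numbers are invented for illustration.
import random

random.seed(0)
# Expected annual claim cost per person, heavily skewed: most people
# are cheap to cover, a few are very expensive.
pool = [random.paretovariate(1.5) * 1000 for _ in range(10_000)]

premium = sum(pool) / len(pool)  # community rating: one price for all
STAY = 1.5  # a person stays insured while premium < 1.5x their own cost

for year in range(1, 8):
    # No mandate, so the healthy are free to drop coverage.
    pool = [cost for cost in pool if premium < STAY * cost]
    if not pool:
        print(f"year {year}: market collapsed")
        break
    premium = sum(pool) / len(pool)  # re-price on a sicker pool
    print(f"year {year}: {len(pool):>6,} enrolled, premium ${premium:,.0f}")
```

In this sketch, a mandate amounts to deleting the exit step, and subsidies amount to lowering the premium a low-income enrollee actually faces; either intervention halts the spiral, which is the three-legged-stool argument in miniature.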
What this means in practice is that any notion that Republicans will go beyond trying to sabotage the law and come up with an alternative is fantasy. Again, Obamacare is the conservative alternative, and you can’t move further right without doing no reform at all.
Monday, March 31, 2014
Sunday, March 30, 2014
About Book Readers
Books are losing the war for our attention. Here’s how they could fight back.
By Matt McFarland, March 19 at 9:13 am
The joy of reading books has become less common. (Hannibal Hanschke/EPA)
Technology has reshaped everything from how we communicate to how we find a mate or a job. Yet the experience of reading books remains largely untransformed, and the popularity of books has suffered in the face of flashier media formats that are perfected for our busy world.
The number of non-book readers has tripled since 1978. And e-books aren’t saving the day. According to the Association of American Publishers, e-book sales were flat or in decline for most of 2013.
What’s happening to books? They’ve contained a wealth of human knowledge for generations. Books such as “Uncle Tom’s Cabin” and “The Jungle” have helped inspire movements and advanced human culture. But books struggle to compete with new forms of media that better grab our attention. Our smartphones and tablets chirp or vibrate when we receive e-mails and text messages. Alerts from social networks and apps force their way onto our devices as pop-up notifications. As books sit silently on shelves, our culture risks having great voices and ideas drowned out.
“The form that is long-form text, a couple hundred pages of words, has been well-honed by humans over the course of centuries,” said Russ Grandinetti, vice president of Amazon Kindle content. “There are things that you can learn in a novel or in a well-written biography that you can’t learn from a movie or learn the same way. That’s why the form has evolved the way that it has.”
Books can still open our eyes to a world of knowledge, but their reserved nature hurts them.
“Most people walk around with some kind of device or have access to some kind of device that allows them to choose how to use their time,” Grandinetti said. “In a world with that much ubiquitous choice, books need to continue to evolve to compete for someone’s time and interest.”
Our current notion of a digital book may be too close to that of a physical book. The modern struggles of books and e-books suggest we need more reinvention to retain readers. But how?
One possibility would be to integrate more multimedia into e-books, which don’t offer the dynamic storytelling of HTML and the Web. No e-books being published today capture the blend of prose, photos, videos and graphics of The New York Times’ “Snow Fall” project.
For masterpieces that have already been written, embracing multimedia isn’t a solution. Plato’s “Republic” or the collected works of Shakespeare can’t be sexed up with digital graphics and short videos. But Spritz could help them.
Spritz is a Boston-based start-up devoted to making reading easier and faster. Its focus is on-the-go reading. Words flash rapidly, which helps hold the attention of readers. Our eyes are naturally drawn to movement and change. Plus, because words flash faster than the rate at which most people read, we should all be able to get more reading done.
“We’re all pressed for time because it’s an information overload,” said Spritz founder and CEO Frank Waldman, who only finds the time to read books while flying. “The reason people aren’t reading e-books is because they’re so busy reading e-mails or catching up on the news.”
He compares reading with Spritz to running on a treadmill rather than outdoors: fewer distractions.
“You go out on the road and you have to watch out for road conditions, weather conditions. It’s up and downhill and much harder to do,” Waldman said. “When reading on the printed page, I feel distracted by all the other things around it. I have to swipe and swipe and swipe. It’s not as easy as settling in and having words stream to brain.”
Waldman describes Spritz’s use of the optimal recognition point as the company’s secret sauce (it’s the red letter you see in the demo video above). Words are positioned on the Spritz display so that the optimal recognition point remains steady and our eyes never have to move. Waldman points to a 2005 research paper, which found that for languages read from left to right, the optimal viewing position is between the beginning and middle of a word. Positioning words this way means no time is wasted scanning for the optimal recognition point of the next word.
In limited testing, Spritz has found comprehension isn’t negatively affected at up to 400 words per minute. For people who struggle to find time to read books, Spritz could be a game-changer. At 400 words a minute, “The Catcher in the Rye” could be read in three hours and four minutes.
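The mechanics are simple enough to sketch. The following toy RSVP reader (an illustration of the idea, not Spritz's actual algorithm) streams words at a fixed rate and left-pads each one so a crudely estimated optimal recognition point always lands in the same screen column. The closing arithmetic assumes roughly 73,000 words for Salinger's novel, a figure the article doesn't supply.

```python
# Toy RSVP ("spritzing") reader: an illustration of the idea, not
# Spritz's actual algorithm. Words stream at a fixed rate, and each is
# left-padded so its rough optimal-recognition-point (ORP) letter
# always lands in the same screen column, so the eyes never move.
import sys
import time

def orp_index(word: str) -> int:
    # Crude heuristic: the ORP sits between the beginning and the
    # middle of the word, per the 2005 finding cited above.
    return min(len(word) - 1, (len(word) + 2) // 4)

def spritz(text: str, wpm: int = 400, pivot_col: int = 12) -> None:
    delay = 60.0 / wpm                        # seconds each word is shown
    for word in text.split():
        pad = " " * max(0, pivot_col - orp_index(word))
        sys.stdout.write("\r" + " " * 60)     # blank the previous word
        sys.stdout.write("\r" + pad + word)   # ORP letter at pivot_col
        sys.stdout.flush()
        time.sleep(delay)
    print()

# The article's arithmetic, assuming ~73,000 words for the novel:
print(f"{73_000 / 400:.1f} minutes at 400 wpm")  # ~182.5 min, about 3 hours

spritz("Books are losing the war for our attention", wpm=400)
```

Raising the wpm parameter simply shrinks the per-word delay; Spritz's bet is that pinning the eye to the pivot column makes those higher rates comfortable.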
So far, the response to Spritz has been largely positive: 20,000 people have registered to receive its software development kit. It doesn’t plan to release its own app. At some point, it plans to charge users a fee along the lines of WhatsApp, which charges $0.99 a year after a year of use.
Time will tell if it can turn the tide on the rise of non-book readers.
Saturday, March 29, 2014
18 Things Highly Creative People Do Differently
Huffington Post
26 March 2014
Creativity works in mysterious and often paradoxical ways. Creative thinking is a stable, defining characteristic in some personalities, but it may also change based on situation and context. Inspiration and ideas often arise seemingly out of nowhere and then fail to show up when we most need them, and creative thinking requires complex cognition yet is completely distinct from the thinking process.
Neuroscience paints a complicated picture of creativity. As scientists now understand it, creativity is far more complex than the right-left brain distinction would have us think (the theory being that left brain = rational and analytical, right brain = creative and emotional). In fact, creativity is thought to involve a number of cognitive processes, neural pathways and emotions, and we still don't have the full picture of how the imaginative mind works.
And psychologically speaking, creative personality types are difficult to pin down, largely because they're complex, paradoxical and tend to avoid habit or routine. And it's not just a stereotype of the "tortured artist" -- artists really may be more complicated people. Research has suggested that creativity involves the coming together of a multitude of traits, behaviors and social influences in a single person.
"It's actually hard for creative people to know themselves because the creative self is more complex than the non-creative self," Scott Barry Kaufman, a psychologist at New York University who has spent years researching creativity, told The Huffington Post. "The things that stand out the most are the paradoxes of the creative self ... Imaginative people have messier minds."
While there's no "typical" creative type, there are some tell-tale characteristics and behaviors of highly creative people. Here are 18 things they do differently.
They daydream.
Creative types know, despite what their third-grade teachers may have said, that daydreaming is anything but a waste of time.
According to Kaufman and psychologist Rebecca L. McMillan, who co-authored a paper titled "Ode To Positive Constructive Daydreaming," mind-wandering can aid in the process of "creative incubation." And of course, many of us know from experience that our best ideas come seemingly out of the blue when our minds are elsewhere.
Although daydreaming may seem mindless, a 2012 study suggested it could actually involve a highly engaged brain state -- daydreaming can lead to sudden connections and insights because it's related to our ability to recall information in the face of distractions. Neuroscientists have also found that daydreaming involves the same brain processes associated with imagination and creativity.
They observe everything.
The world is a creative person's oyster -- they see possibilities everywhere and are constantly taking in information that becomes fodder for creative expression. As Henry James is widely quoted, a writer is someone on whom "nothing is lost."
The writer Joan Didion kept a notebook with her at all times, and said that she wrote down observations about people and events as, ultimately, a way to better understand the complexities and contradictions of her own mind:
"However dutifully we record what we see around us, the common denominator of all we see is always, transparently, shamelessly, the implacable 'I,'" Didion wrote in her essay "On Keeping a Notebook." "We are talking about something private, about bits of the mind’s string too short to use, an indiscriminate and erratic assemblage with meaning only for its marker."
They work the hours that work for them.
Many great artists have said that they do their best work either very early in the morning or late at night. Vladimir Nabokov started writing immediately after he woke up at 6 or 7 a.m., and Frank Lloyd Wright made a practice of waking up at 3 or 4 a.m. and working for several hours before heading back to bed. Whatever the hour, individuals with high creative output often figure out when their minds start firing up, and structure their days accordingly.
They take time for solitude.
"In order to be open to creativity, one must have the capacity for constructive use of solitude. One must overcome the fear of being alone," wrote the American existential psychologist Rollo May.
Artists and creatives are often stereotyped as being loners, and while this may not actually be the case, solitude can be the key to producing their best work. For Kaufman, this links back to daydreaming -- we need to give ourselves the time alone to simply allow our minds to wander.
"You need to get in touch with that inner monologue to be able to express it," he says. "It's hard to find that inner creative voice if you're ... not getting in touch with yourself and reflecting on yourself."
They turn life's obstacles around.
Many of the most iconic stories and songs of all time have been inspired by gut-wrenching pain and heartbreak -- and the silver lining of these challenges is that they may have been the catalyst for great art. The emerging field of post-traumatic growth suggests that many people are able to use their hardships and early-life trauma for substantial creative growth. Specifically, researchers have found that trauma can help people to grow in the areas of interpersonal relationships, spirituality, appreciation of life, personal strength, and -- most importantly for creativity -- seeing new possibilities in life.
"A lot of people are able to use that as the fuel they need to come up with a different perspective on reality," says Kaufman. "What's happened is that their view of the world as a safe place, or as a certain type of place, has been shattered at some point in their life, causing them to go on the periphery and see things in a new, fresh light, and that's very conducive to creativity."
They seek out new experiences.
Creative people love to expose themselves to new experiences, sensations and states of mind -- and this openness is a significant predictor of creative output.
"Openness to experience is consistently the strongest predictor of creative achievement," says Kaufman. "This consists of lots of different facets, but they're all related to each other: Intellectual curiosity, thrill seeking, openness to your emotions, openness to fantasy. The thing that brings them all together is a drive for cognitive and behavioral exploration of the world, your inner world and your outer world."
They "fail up."
Resilience is practically a prerequisite for creative success, says Kaufman. Doing creative work is often described as a process of failing repeatedly until you find something that sticks, and creatives -- at least the successful ones -- learn not to take failure so personally.
"Creatives fail and the really good ones fail often," Forbes contributor Steven Kotler wrote in a piece on Einstein's creative genius.
They ask the big questions.
Creative people are insatiably curious -- they generally opt to live the examined life, and even as they get older, maintain a sense of curiosity about life. Whether through intense conversation or solitary mind-wandering, creatives look at the world around them and want to know why, and how, it is the way it is.
They people-watch.
Observant by nature and curious about the lives of others, creative types often love to people-watch -- and they may generate some of their best ideas from it.
"[Marcel] Proust spent almost his whole life people-watching, and he wrote down his observations, and it eventually came out in his books," says Kaufman. "For a lot of writers, people-watching is very important ... They're keen observers of human nature."
They take risks.
Part of doing creative work is taking risks, and many creative types thrive off of taking risks in various aspects of their lives.
"There is a deep and meaningful connection between risk taking and creativity and it's one that's often overlooked," contributor Steven Kotler wrote in Forbes. "Creativity is the act of making something from nothing. It requires making public those bets first placed by imagination. This is not a job for the timid. Time wasted, reputation tarnished, money not well spent -- these are all by-products of creativity gone awry."
They view all of life as an opportunity for self-expression.
Nietzsche believed that one's life and the world should be viewed as a work of art. Creative types may be more likely to see the world this way, and to constantly seek opportunities for self-expression in everyday life.
"Creative expression is self-expression," says Kaufman. "Creativity is nothing more than an individual expression of your needs, desires and uniqueness."
They follow their true passions.
Creative people tend to be intrinsically motivated -- meaning that they're motivated to act from some internal desire, rather than a desire for external reward or recognition. Psychologists have shown that creative people are energized by challenging activities, a sign of intrinsic motivation, and the research suggests that simply thinking of intrinsic reasons to perform an activity may be enough to boost creativity.
"Eminent creators choose and become passionately involved in challenging, risky problems that provide a powerful sense of power from the ability to use their talents," write M.A. Collins and T.M. Amabile in The Handbook of Creativity.
They get out of their own heads.
Kaufman argues that another purpose of daydreaming is to help us to get out of our own limited perspective and explore other ways of thinking, which can be an important asset to creative work.
"Daydreaming has evolved to allow us to let go of the present," says Kaufman. "The same brain network associated with daydreaming is the brain network associated with theory of mind -- I like calling it the 'imagination brain network' -- it allows you to imagine your future self, but it also allows you to imagine what someone else is thinking."
Research has also suggested that inducing "psychological distance" -- that is, taking another person's perspective or thinking about a question as if it was unreal or unfamiliar -- can boost creative thinking.
They lose track of the time.
Creative types may find that when they're writing, dancing, painting or expressing themselves in another way, they get "in the zone," or what's known as a flow state, which can help them create at their highest level. Flow is a mental state in which an individual transcends conscious thought to reach a heightened state of effortless concentration and calmness. When someone is in this state, they're practically immune to any internal or external pressures and distractions that could hinder their performance.
You get into the flow state when you're performing an activity you enjoy that you're good at, but that also challenges you -- as any good creative project does.
"[Creative people] have found the thing they love, but they've also built up the skill in it to be able to get into the flow state," says Kaufman. "The flow state requires a match between your skill set and the task or activity you're engaging in."
They surround themselves with beauty.
Creatives tend to have excellent taste, and as a result, they enjoy being surrounded by beauty.
A study recently published in the journal Psychology of Aesthetics, Creativity, and the Arts showed that musicians -- including orchestra musicians, music teachers, and soloists -- exhibit a high sensitivity and responsiveness to artistic beauty.
They connect the dots.
If there's one thing that distinguishes highly creative people from others, it's the ability to see possibilities where others don't -- or, in other words, vision. Many great artists and writers have said that creativity is simply the ability to connect the dots that others might never think to connect.
In the words of Steve Jobs:
"Creativity is just connecting things. When you ask creative people how they did something, they feel a little guilty because they didn't really do it, they just saw something. It seemed obvious to them after a while. That's because they were able to connect experiences they've had and synthesize new things."
They constantly shake things up.
Diversity of experience, more than anything else, is critical to creativity, says Kaufman. Creatives like to shake things up, experience new things, and avoid anything that makes life more monotonous or mundane.
"Creative people have more diversity of experiences, and habit is the killer of diversity of experience," says Kaufman.
They make time for mindfulness.
Creative types understand the value of a clear and focused mind -- because their work depends on it. Many artists, entrepreneurs, writers and other creative workers, such as David Lynch, have turned to meditation as a tool for tapping into their most creative state of mind.
And science backs up the idea that mindfulness really can boost your brain power in a number of ways. A 2012 Dutch study suggested that certain meditation techniques can promote creative thinking. And mindfulness practices have been linked with improved memory and focus, better emotional well-being, reduced stress and anxiety, and improved mental clarity -- all of which can lead to better creative thought.
Friday, March 28, 2014
The Price of Slavery
‘The Problem of Slavery in the Age of Emancipation,’ by David Brion Davis
Reviewed by Brenda Wineapple, March 28, 2014
In 1862, when Nathaniel Hawthorne headed south from New England to see the Civil War firsthand, he came upon a group of former slaves trudging northward. “They seemed a kind of creature by themselves, not altogether human,” he wrote, “but perhaps quite as good, and akin to the fauns and rustic deities of olden times.” “Whoever may be benefited by the results of this war,” he added, “it will not be the present generation of negroes.”
Hawthorne’s stunning comparison of real men and women to half-human creatures, even if kindly intended, gets to the heart of David Brion Davis’s “The Problem of Slavery in the Age of Emancipation,” the richly textured final volume in his exceptional trilogy about slavery in the Western Hemisphere. “I have long interpreted the problem of slavery,” he writes in his introduction, “as centering on the impossibility of converting humans into the totally compliant, submissive, accepting chattel symbolized by Aristotle’s ideal of the ‘natural slave.’ ”
An ad posted by the slave trader William F. Talbott of Lexington, Ky., in 1853. (Hulton Archive/Getty Images)
Less a political historian than a moral philosopher, Davis focuses here on 19th-century trans-Atlantic abolitionism and, in particular, the intellectual and theological origins of the antislavery movement in America. Borrowing from Freud and Descartes, he suggests that slaveholders projected onto their chattels the qualities they repressed in themselves. Particularly in America, the black population represented to white people “the finitude, imperfections, sensuality, self-mockery and depravity of human nature, thereby amplifying the opposite qualities in the white race.”
As a consequence, an American dream of freedom and opportunity was inseparable from a white illusion of superiority, bolstered by the subjugation and “animalization” of black people. That is, slaves were considered domesticated savages who would, if given the chance, revert to murder and mayhem. To many whites, particularly pro-slavery Southerners, this seemed the lesson of the violent and ultimately successful Haitian Revolution, which represented, as Davis puts it, “the unleashing of pure Id.”
But the ironies of history are boundless. Although Haiti’s slaves did win their freedom, a prolonged civil war damaged the country’s economy. Seizing this opportunity, planters elsewhere in the Caribbean and in the American South increased production, which meant they needed to acquire more slaves. In 1803, South Carolina reopened its slave trade, importing 40,000 Africans in the next four years. Yet Great Britain, having lost as many as 50,000 soldiers and seamen in Haiti, responded differently, emancipating 800,000 colonial slaves in 1834 without spilling a drop of blood.
After Haiti, many well-meaning American reformers wanted to expunge the black “Id” peacefully by recolonizing free blacks in Africa. While Davis’s use of such Freudianism may seem overbearing at times, his analysis of the underpinnings of the American Colonization Society, founded in 1816, is subtle, wide-ranging and consistently judicious. Refusing to dismiss colonization out of hand, he places it within the context of the Exodus narrative of deliverance, which he then persuasively connects to American messianism. He also shows that many supporters of colonization did not consider black people to be inherently inferior. Rather, in the first decades of the 19th century, evangelical reformers argued that slavery and prejudice had so completely dehumanized the African-American that he could never escape what the New England clergyman Leonard Bacon termed “the abyss of his degradation” without being relocated to a less corrupt environment.
But, as Davis cogently observes, not only did the pompous style of many colonization speeches reinforce the “self-justificatory power of the myth of America,” the rhetoric was “so abstract and grandiose that it almost precluded serious discussion of capital investment, technological assistance, labor skills and markets.” Without money or training, how were the new colonists to establish a country, much less a prosperous one? What about the commercial networks they would endanger or the native people they would displace? As Davis makes clear, “the glaring defect in the colonizationist ideology was the refusal to recognize the vital contributions that blacks had made and would continue to make to American civilization.”
Black leaders, understandably resistant to the colonization movement, were crucial to its demise. Except for William Lloyd Garrison’s antislavery newspaper, The Liberator, initially financed by the wealthy black Philadelphian James Forten, for almost 10 years most of the white press refused to print the denunciations of colonization issued by black organizations. But the African-American newspaper Freedom’s Journal roundly criticized the American Colonization Society as perpetuating rather than eradicating slavery. And in 1829, David Walker’s radical “Appeal to the Coloured Citizens of the World” called on nonwhites to unite, to prove “that we are men, and not brutes, as we have been represented, and by millions treated.” Smuggled into the South, the “Appeal” caused so much consternation that when Walker died of consumption in 1830, it was widely believed that he had been murdered.
Black abolitionists like James Forten, Richard Allen and Samuel Cornish demanded an integrated American society, and by the 1830s the American abolition movement had evolved into an energetic biracial entity. Davis argues that the white abolitionists, however paternalistic, were sincerely inspired by Anglo-American Christianity and by the efforts of religious women. The Anti-Slavery Convention of American Women, which first met in New York in 1837, published several significant pamphlets calling for racial and gender equality without any patronizing blather, although by today’s standards their emphasis on the “elevation” of the black population might sound sanctimonious and condescending. That’s part of Davis’s larger argument: Abolitionism was not monolithic in makeup or in motivation. Black emigrationists were not the same as white colonizationists, nor was black nationalism the same as white nationalism.
Additionally, as Davis demonstrates, every movement contains nuances and paradoxes. When British reformers linked abolition to their crusade against wage slavery, Frederick Douglass replied that unlike exploited workers, chattel slaves did not even have “the privilege of saying ‘myself.’ ” Yet a correlation between these forms of bondage, as Davis points out, helps us to extend “the historically successful moral condemnation of slavery to other forms of coerced labor and exploitation,” from Nazi concentration camps to 21st-century sex trafficking.
“Moral progress seems to be historical, cultural and institutional,” Davis concludes, “not the result of a genetic improvement in human nature.” The passage of the Fugitive Slave Law and the Supreme Court’s Dred Scott decision, as well as the South’s increasing belligerence, suggested to many abolitionists that slavery could not be ended without violence. But although the Civil War was truly catastrophic, the Emancipation Proclamation and the 13th Amendment could not have been predicted at its start — or, for that matter, the subsequent end of slavery in Cuba and Brazil, which resulted partly because of Anglo-American abolition. Moral progress may be historical, cultural and institutional, but it isn’t inevitable. All the more reason this superb book should be essential reading for anyone wishing to understand our complex and contradictory past.
THE PROBLEM OF SLAVERY IN THE AGE OF EMANCIPATION
By David Brion Davis
422 pp. Alfred A. Knopf. $30
Does Reading Make You Healthier?
by Leo Robson from The New Statesman
Living life by the book: why reading isn't always good for you
Somewhere along the line, an orthodoxy hardened: cigarettes will kill you and Bon Jovi will give you a migraine, but reading – the ideal diet being Shakespeare and 19th-century novels, plus the odd modernist – will make you healthier, stronger, kinder. But is that true?
by Leo Robson, published 20 March 2014, 10:00
Books on Books (2003) by Jonathan Wolstenholme. (Private Collection/Bridgeman Art Library)
The Unexpected Professor: an Oxford Life in Books
John Carey
Faber & Faber, 353pp, £18.99
Reading and the Reader
Philip Davis
Oxford University Press, 147pp, £12.99
Why I Read: the Serious Pleasure of Books
Wendy Lesser
Farrar, Straus & Giroux, 226pp, £17.99
The Road to Middlemarch: My Life With George Eliot
Rebecca Mead
Granta Books, 296pp, £16.99
There is a series of postcards by the Dutch cartoonist Joost Swarte that applies the alarmist tone usually reserved for smoking to scenes of people reading. A sunbathing woman is going purple and the caption, set in black on white with a black border, says: “Reading causes ageing of the skin.” In other scenarios a man ignores the naked woman lying beside him (“Reading may reduce the blood flow and cause impotence”) and a mother pours huge quantities of salt into a meal (“Reading seriously harms you and others around you”). What makes the cartoons so flat and pointless, apart from Swarte’s winsome draftsmanship, is their apparent belief that the benevolence of reading is a stable fact, ripe for comic inversion, rather than a social attitude that we are free to dispute. It is the same ostensive irony that underpins George Orwell’s exercise in amateur accountancy, “Books v Cigarettes”.
Still, you can see where Swarte’s confusion came from. Reading has the best PR team in the business. Or perhaps it’s just that devoted readers have better access to the language of advocacy and celebration than chain-smokers or, say, power-ballad enthusiasts. Either way, somewhere along the line, an orthodoxy hardened: cigarettes will kill you and Bon Jovi will give you a migraine, but reading – the ideal diet being Shakespeare and 19th-century novels, plus the odd modernist – will make you healthier, stronger, kinder. With the foundation of Sex and Love Addicts Anonymous in 1976, reading became the last thing you can never do too often. Even the much-made argument that works of literature – Northanger Abbey, Madame Bovary – insist on the dangers of literature redounds to literature’s benefit, and provides yet another reason for reading.
But a serious, non-circular opposition case has been made, if not against reading, then against the idea that the western canon is morally improving or good for the soul. Shakespeare, most canonical of all, became a magnet for 1980s iconoclasts, who disparaged him as an imperial stooge (post-colonial theory), a tool of national power (cultural materialism) and a product of the same social/ideological energies as such putatively non-literary texts as James I’s Counterblaste to Tobacco (new historicism). Conducted for the most part in postgraduate seminar rooms and the pages of academic texts (the collection Political Shakespeare being perhaps the best-known English example), the debate was finally settled in the public sphere, where the cultural warriors, keen to alter reputations and revise the agenda, were greeted with indifference or derision.
At the turn of the 21st century, with the debate dying off and the future uncertain, Harold Bloom, in How to Read and Why, and Frank Kermode, in Shakespeare’s Language, tried to reassert the old agenda by teaching lessons that had been standard in their youth but had faded amid the chatter.
The project has since split in two, with reading primers teaching us “how” to read and reading memoirs providing testimony as to “why”, both in positive rather than implicitly combative terms. There is no longer any need to write “in defence of” reading, or, if there is, the defence is against forces such as “distraction” and “technology” that are indifferent to reading literature, not actively ranged against it. Even those memoirs that hinge on grisly challenges – a book a day (Tolstoy and the Purple Chair) or all 51 volumes of the Harvard Classics (The Whole Five Feet) – make no reference to “book addiction” or “hyper-literacy”. If a downside emerges, it does so between the lines.
In the penultimate sentence of his new book, John Carey says that reading “is freedom”, yet he provides more than enough evidence to the contrary. The Unexpected Professor is an autobiography (postwar austerity, grammar school, national service, Oxford, Oxford, Oxford) that doubles as a “selective and opinionated” history of English literature, and a glories-of-reading memoir that doubles as an anti-reading memoir. Carey notes that people like him often prefer reading things to seeing them – typically, his example comes not from his own life but from a poem by Wordsworth – and reflects: “So living your truest life in books may deaden the real world for you as well as enliven it.” But how, judging by this account, does reading enliven things?
Carey confesses to feeling guilty that as an undergraduate he could read all day, while “out in the real world” (there it is again) people were “slogging away”. But it doesn’t seem all that different from his life in the non-real world: “I secured a copy from Hammersmith Public Library . . . and slogged through all sixteen thousand lines of it. It was unspeakably boring” (Layamon’s Brut). “I slogged through it of course, because my aim was to learn, not to have fun” (Johnson’s Lives of the Poets). Even Wordsworth, who showed that reading can spoil you for experience, is read “as a kind of atonement”, in a “microscopically printed” edition that proves “not exactly an On-First-Looking-into-Chapman’s-Homer experience”. Once he had squinted his way through English literature, Carey was free to gorge on European novels, yet even that sounds like a mixed experience. Dostoevsky he found “hard going” and though there were other writers he enjoyed a good deal more – Zola, Tolstoy, Thomas Mann – he still “forced myself to make notes on the endpapers”. If there’s any enlivening going on, it’s not being enacted on life by literature but the other way around: playing cricket at other schools “made me understand better that bit in the Book of Numbers where the Israelites send out spies to size up the opposition . . .”
Living life by the book: why reading isn't always good for you
Somewhere along the line, an orthodoxy hardened: cigarettes will kill you and Bon Jovi will give you a migraine, but reading – the ideal diet being Shakespeare and 19th-century novels, plus the odd modernist – will make you healthier, stronger, kinder. But is that true?
by Leo Robson Published 20 March, 2014 - 10:00
Books on Books (2003) by Jonathan Wolstenholme/Private Collection/Bridgeman Art Library
The Unexpected Professor: an Oxford Life in Books
John Carey
Faber & Faber, 353pp, £18.99
Reading and the Reader
Philip Davis
Oxford University Press, 147pp, £12.99
Why I Read: the Serious Pleasure of Books
Wendy Lesser
Farrar, Straus & Giroux, 226pp, £17.99
The Road to Middlemarch: My Life With George Eliot
Rebecca Mead
Granta Books, 296pp, £16.99
There is a series of postcards by the Dutch cartoonist Joost Swarte that applies the alarmist tone usually reserved for smoking to scenes of people reading. A sunbathing woman is going purple and the caption, set in black on white with a black border, says: “Reading causes ageing of the skin.” In other scenarios a man ignores the naked woman lying beside him (“Reading may reduce the blood flow and cause impotence”) and a mother pours huge quantities of salt into a meal (“Reading seriously harms you and others around you”). What makes the cartoons so flat and pointless, apart from Swarte’s winsome draftsmanship, is their apparent belief that the benevolence of reading is a stable fact, ripe for comic inversion, rather than a social attitude that we are free to dispute. It is the same ostensive irony that underpins George Orwell’s exercise in amateur accountancy, “Books v Cigarettes”.
Still, you can see where Swarte’s confusion came from. Reading has the best PR team in the business. Or perhaps it’s just that devoted readers have better access to the language of advocacy and celebration than chain-smokers or, say, power-ballad enthusiasts. Either way, somewhere along the line, an orthodoxy hardened: cigarettes will kill you and Bon Jovi will give you a migraine, but reading – the ideal diet being Shakespeare and 19th-century novels, plus the odd modernist – will make you healthier, stronger, kinder. With the foundation of Sex and Love Addicts Anonymous in 1976, reading became the last thing you can never do too often. Even the much-made argument that works of literature – Northanger Abbey, Madame Bovary – insist on the dangers of literature redounds to literature’s benefit, and provides yet another reason for reading.
But a serious, non-circular opposition case has been made, if not against reading, then against the idea that the western canon is morally improving or good for the soul. Shakespeare, most canonical of all, became a magnet for 1980s iconoclasts, who disparaged him as an imperial stooge (post-colonial theory), a tool of national power (cultural materialism) and a product of the same social/ideological energies as such putatively non-literary texts as James I’s Counterblaste to Tobacco (new historicism). Conducted for the most part in postgraduate seminar rooms and the pages of academic texts (the collection Political Shakespeare being perhaps the best-known English example), the debate was finally settled in the public sphere, where the cultural warriors, keen to alter reputations and revise the agenda, were greeted with indifference or derision.
At the turn of the 21st century, with the debate dying off and the future uncertain, Harold Bloom, in How to Read and Why, and Frank Kermode, in Shakespeare’s Language, tried to reassert the old agenda by teaching lessons that had been standard in their youth but had faded amid the chatter.
The project has since split in two, with reading primers teaching us “how” to read and reading memoirs providing testimony as to “why”, both in positive rather than implicitly combative terms. There is no longer any need to write “in defence of” reading, or, if there is, the defence is against forces such as “distraction” and “technology” that are indifferent to reading literature, not actively ranged against it. Even those memoirs that hinge on grisly challenges – a book a day (Tolstoy and the Purple Chair) or all 51 volumes of the Harvard Classics (The Whole Five Feet) – make no reference to “book addiction” or “hyper-literacy”. If a downside emerges, it does so between the lines.
In the penultimate sentence of his new book, John Carey says that reading “is freedom”, yet he provides more than enough evidence to the contrary. The Unexpected Professor is an autobiography (postwar austerity, grammar school, national service, Oxford, Oxford, Oxford) that doubles as a “selective and opinionated” history of English literature, and a glories-of-reading memoir that doubles as an anti-reading memoir. Carey notes that people like him often prefer reading things to seeing them – typically, his example comes not from his own life but from a poem by Wordsworth – and reflects: “So living your truest life in books may deaden the real world for you as well as enliven it.” But how, judging by this account, does reading enliven things?
Carey confesses to feeling guilty that as an undergraduate he could read all day, while “out in the real world” (there it is again) people were “slogging away”. But it doesn’t seem all that different from his life in the non-real world: “I secured a copy from Hammersmith Public Library . . . and slogged through all sixteen thousand lines of it. It was unspeakably boring” (Layamon’s Brut). “I slogged through it of course, because my aim was to learn, not to have fun” (Johnson’s Lives of the Poets). Even Wordsworth, who showed that reading can spoil you for experience, is read “as a kind of atonement”, in a “microscopically printed” edition that proves “not exactly an On-First-Looking-into-Chapman’s-Homer experience”. Once he had squinted his way through English literature, Carey was free to gorge on European novels, yet even that sounds like a mixed experience. Dostoevsky he found “hard going” and though there were other writers he enjoyed a good deal more – Zola, Tolstoy, Thomas Mann – he still “forced myself to make notes on the endpapers”. If there’s any enlivening going on, it’s not being enacted on life by literature but the other way around: playing cricket at other schools “made me understand better that bit in the Book of Numbers where the Israelites send out spies to size up the opposition . . .”
In What Good Are the Arts?, Carey wrote that the non-literary arts are “locked in inarticulacy”. But literature, in his version, is locked in articulacy, forever making pronouncements and cases and claims. His lifetime of reading, as recounted in this book, has given him nothing, other than the occasional ringing phrase, that he could not have found in some form of pamphlet. In Carey’s account, reading provides an opportunity to engage with writers who share your convictions and to reject the ones who don’t: Milton’s anti-royalism “put me on his side”, “what I liked most fiercely was Jonson’s exposure of rampaging luxury”, “What The Faerie Queene does is mythicise political power, attributing supernatural status to a dictatorial regime, and this makes it, at heart, crass and false”. A telling example of Carey’s picture of literature-as-logic comes when he quotes a well-known passage from George Eliot’s novel Middlemarch, a reflection on “that element of tragedy which lies in the very fact of frequency”:
If we had a keen vision and feeling of all ordinary human life, it would be like hearing the grass grow and the squirrel’s heart beat, and we should die of that roar which lies on the other side of silence. As it is, the quickest of us walk about well wadded with stupidity.
Although this is the passage Carey uses to support his view of Eliot as “the most intelligent of English novelists”, all he says is that she “is unusual in using poetry in the service of thinking . . . The tenderness of the heartbeat and the shock of the roar would be marvellous simply as a poetic moment. But it is also part of an argument.”
It comes down to a vision of language and how it relates to ideas. Carey writes that D H Lawrence “tries to make us see that, if he could, he’d communicate in some other way, freed from the limitations of thought”. But for Philip Davis, in his treatise-like Reading and the Reader, literature allows just such freedom. According to Davis, Eliot is not putting poetry to the service of “thinking”, in Carey’s op-ed sense of the word, but doing the kind of not-quite-thinking enabled by literary language. “Try counting the thoughts in a powerful paragraph in a realist novel,” he writes, after quoting the same passage from Middlemarch: “they are no longer separate units.” Earlier in the book he asserts that, “at its deepest”, an idea possesses more than “just a statable content”.
Carey is blithely confident about the meaning of literary texts but in the past has dismissed efforts to bring aesthetic response into the realm of scientific knowledge. Davis, by contrast, surrenders to literature’s indeterminacy but believes that its impact shows up on a brain scan. He quotes the example of cognitive scientists, his collaborators at the centre for reading research that he runs at the University of Liverpool, who have demonstrated “how a dramatically compressed Shakespearean coinage such as ‘this old man godded me’ excites the brain in a way that ‘this old man deified me’ . . . does not”. Davis claims that science shows “how” a Shakespearean coinage does this – but how literature achieves the effect is exactly what resists not just scientific decoding, but verbal description. “I cannot just talk about reading,” he writes, “when that is precisely not what I shall claim to be a literary way of thinking” (as if a vet used only man-made tools).
One result of Davis’s aversion to the general is a certain overexuberance with regard to quotations. He is constantly offering “a different instance”. When he writes “I can think of a hundred examples . . .” you are justified in fearing he will list them. Shakespeare is likened to “existential physics” and “process philosophy”, and a Shakespearean allusion renders a nonsensical proposition more nonsensical still: “In the readiness of all, the words themselves seem ready when they do come.” Equally forbidding though no more instructive is the sentence that begins: “It is fashionable to talk, after Csikszentmihalyi, of being ‘in the flow’ . . .” Though Davis has none of Carey’s semi-conscious misgivings about reading, he unwittingly exposes one of its greatest dangers. Biron, attacking study at the start of Love’s Labour’s Lost, claims that “light seeking light doth light of light beguile” (in which “light” means respectively the mind, enlightenment, sight and eyes). It might be said that Davis has read too much to write a readable book about reading.
However, Davis’s idea of what literature uniquely offers to the reader is a powerful one, and is shared to some extent by Wendy Lesser, the essayist and literary editor, in her warmer but no less erudite or sophisticated Why I Read, a tribute to what she calls “the serious pleasure of books”. Just as Davis likes writing in which language is used “as a sign of approximation to point to more than itself”, so Lesser admires writers who meet our desire for order “only halfway” (Eça de Queiroz) or give us “only a small part of what is really there” (Penelope Fitzgerald). But Lesser differs from Davis and also from Carey in taking a degree of responsibility: literature is grounded in the capricious reader, not in the permanent present of the text. Carey first read War and Peace in the 1960s but if his feelings about it have changed, he doesn’t tell us, whereas Lesser explains how it overtook Anna Karenina in her affections. And the reader’s shimmying perspective – the reader as human being – is treated as a topic in its own right by the journalist Rebecca Mead in The Road to Middlemarch, in which she traces how a novel that once gratified her teenage “aspirations to maturity and learnedness” has become “a melancholy dissection of the resignations that attend middle age, the paths untrodden and the choices unmade”.
Lesser and Mead treat the reader to a more attractive vision of reading, no less valuable for being far less dutiful, no less “salutary” for accommodating the kinds of pleasures that Lesser describes as “cellulose-based”. Carey’s distinctions between learning and having fun, between life and literature, are cleanly resolved. Just as reading the classics is not slog-work, so the library is not the unreal or anti-real world. “The library had been a place for studying,” Mead writes, of her rather jollier time at Oxford, “but it had also been a place for everything else; seeing friends, watching strangers, flirting and falling in love. Life happened in the library.” Without making the connection, she promotes a similarly unhermetic vision of her engagement with literature, which is not, she writes, just “a form of escapism” but a first-hand mode of existence – as Dickens more than implied when he wrote that reading Eliot’s Adam Bede had taken its place “among the actual experiences and endurances of my life”. When you are “grasped” by a book, Mead argues, “reading . . . feels like an urgent, crucial dimension of life itself”. And you can do it while you smoke.
Bookstore Illusions
by Rebecca Mead
from The New Yorker
Martin Amis, in his 1995 novel, “The Information,” delivered a memorable riff on the reading habits of transatlantic airplane travellers as divided by class. “In Coach the laptop literature was pluralistic, liberal, and humane: Daniel Deronda, trigonometry, Lebanon, World War 1, Homer, Diderot, Anna Karenina,” he wrote. In business class, however, “they were reading outright junk. Fat financial thrillers, chunky chillers and tublike tinglers.” But even that is better than what was being consumed in first class, filled with snoozing plutocrats, where nobody was reading anything, “except for a lone seeker who gazed, with a frown of mature skepticism, at a perfume catalogue.”
If Amis’s airplane taxonomy has been rendered obsolete by the development of the seat-back entertainment system—now travellers of all classes are watching “Skyfall,” while hoping it doesn’t—his analysis came to mind this morning, in the light of a story in the Times about the disappearance of bookstores from Manhattan. The paper reported that at last count there were a hundred and six Manhattan bookstores, down from a hundred and fifty in 2000. (If a hundred and six bookstores still sounds like more bookstores than there actually are in Manhattan, that’s because outlets like the Hudson News kiosks in Penn Station and Grand Central are included in the total.)
Every New Yorker of a couple of decades’ standing can cite her or his late, lamented local bookstore: for many years, mine was Spring Street Books, in Soho—until, in the nineties, it became a shoe store, a retail segment for which it seems there is an infinite need. In the Times, Esther Newberg, the literary agent, decried the transformation of Manhattan into “an outlet mall for rich people,” citing Fifth Avenue as her evidence. Manhattan—at least the hyper-affluent core of it—seems to have gone the way of Amis’s first-class cabin, becoming a place in which people who are too rich to read can stretch out and indulge themselves.
Thank goodness, then, for cattle class: the outer boroughs. As Emily Gould, the outer-borough novelist, tweeted this morning, “that bookstores article could just as easily be titled ‘independent bookstores thrive in Brooklyn and Jersey City.’ ” Word Bookstore, a successful independent in Greenpoint, is indeed opening a second store in Jersey City. Greenlight Bookstore, in Fort Greene—my current local—is such an exemplar of Brooklyn literary energy that entering its doors can feel like wandering into a Noah Baumbach movie, one centered around two young freelancers who meet at a reading group, their relationship sparked by an argument over whether next month’s choice should be Jonathan Lethem or Jennifer Egan.
But therein lies a problem, too. Those of us who cherish our local bookstores do so not simply because they are convenient—how great to be able to run out for milk and also pick up the new Karl Ove Knausgaard!—but also because we feel a duty to support them, because we believe in their mission. When books can be bought so cheaply online, or at one of the dwindling number of discount retailers, paying more to shop at a local bookstore feels virtuous, like buying locally sourced organic vegetables, or checking to see if a T-shirt is made in the U.S.A. It can be gratifying to the point of smugness to feel that one is being pluralistic, liberal, and humane; shopping at an independent bookstore may be one of the diminishing opportunities to experience that feeling in first-class New York City. Still, when I consider the vanished bookstores of Manhattan, I mourn not just their passing but the loss of a certain kind of book-buying innocence—a time when where one bought a book did not constitute a political statement, and reading it did not feel like participating in a requiem.
Better Watch It
Best protect yourself these days. There is no such thing as an innocent cough. A pandemic could be in the offing. Stay away from creepy people. It's not worth it to hang around creepy people. Something bad could rub off on you. Avoid sociopaths. Trust me: they don't have your best interests at heart. Is there hope for the future? Best just get thru today and worry about that question at a... later date. Don't answer the phone if you don't recognize the area code. I routinely receive calls from New Hampshire and Idaho and I don't know anybody there so I don't answer. Seize the moment if you desire, but realize that it may not be your moment that you're seizing. It could be your worst enemy's moment in disguise.
Sunday, March 23, 2014
The Closing of the Progressive Mind
Black Pathology and the Closing of the Progressive Mind
How Jonathan Chait and other Obama-era progressives misunderstand the role of white supremacy in America's history and present
Ta-Nehisi Coates
Mar 21 2014, 4:52 PM ET
Among opinion writers, Jonathan Chait is outranked in my esteem only by Hendrik Hertzberg. This lovely takedown of Robert Johnson is a classic of the genre, one I studied incessantly when I was sharpening my own sword. The sharpening never ends. With that in mind, it is a pleasure to engage Chait in the discussion over President Obama, racism, culture, and personal responsibility. It's good to debate a writer of such clarity—even when that clarity has failed him.
On y va.
Chait argues that I've conflated Paul Ryan's view of black poverty with Barack Obama's. He is correct. I should have spent more time disentangling these two notions, and illuminating their common roots—the notion that black culture is part of the problem. I have tried to do this disentangling in the past. I am sorry I did not do it in this instance and will attempt to do so now.
Arguing that poor black people are not "holding up their end of the bargain," or that they are in need of moral instruction is an old and dubious tradition in America. There is a conservative and a liberal rendition of this tradition. The conservative version eliminates white supremacy as a factor and leaves the question of the culture's origin ominously unanswered. This version can never be regarded seriously. Life is short. Black life is shorter.
On y va.
The liberal version of the cultural argument points to "a tangle of pathologies" haunting black America born of oppression. This argument—which Barack Obama embraces—is more sincere, honest, and seductive. Chait helpfully summarizes:
The argument is that structural conditions shape culture, and culture, in turn, can take on a life of its own independent of the forces that created it. It would be bizarre to imagine that centuries of slavery, followed by systematic terrorism, segregation, discrimination, a legacy wealth gap, and so on did not leave a cultural residue that itself became an impediment to success.
The "structural conditions" Chait outlines above can be summed up under the phrase "white supremacy." I have spent the past two days searching for an era when black culture could be said to be "independent" of white supremacy. I have not found one. Certainly the antebellum period, when one third of all enslaved black people found themselves on the auction block, is not such an era. And surely we would not consider postbellum America, when freedpeople were regularly subjected to terrorism, to be such an era.
We certainly do not find such a period during the Roosevelt-Truman era, when this country erected a racist social safety net, leaving the NAACP to quip that the New Deal was "like a sieve with holes just big enough for the majority of Negroes to fall through." Nor do we find it during the 1940s, '50s and '60s, when African-Americans—as a matter of federal policy—were largely excluded from the legitimate housing market. Nor during the 1980s when we began the erection of a prison-industrial complex so vast that black males now comprise 8 percent of the world's entire incarcerated population.
And we do not find an era free of white supremacy in our times either, when the rising number of arrests for marijuana are mostly borne by African-Americans; when segregation drives a foreclosure crisis that helped expand the wealth gap; when big banks busy themselves baiting black people with "wealth-building seminars" and instead offering "ghetto loans" for "mud people"; when studies find that black low-wage applicants with no criminal record "fared no better than a white applicant just released from prison"; when, even after controlling for neighborhoods and crime rates, my son finds himself more likely to be stopped and frisked. Chait's theory of independent black cultural pathologies sounds reasonable. But it can't actually be demonstrated in the American record, and thus has no applicability.
What about the idea that white supremacy necessarily "bred a cultural residue that itself became an impediment to success"? Chait believes that it's "bizarre" to think otherwise. I think it's bizarre that he doesn't bother to see if his argument is actually true. Oppression might well produce a culture of failure. It might also produce a warrior spirit and a deep commitment to attaining the very things which had been so often withheld from you. There is no need for theorizing. The answers are knowable.
There certainly is no era more oppressive for black people than their 250 years of enslavement in this country. Slavery encompassed not just forced labor, but a ban on black literacy, the vending of black children, the regular rape of black women, and the lack of legal standing for black marriage. Like Chait, 19th-century Northern white reformers coming South after the Civil War expected to find "a cultural residue that itself became an impediment to success."
In his masterful history, Reconstruction, the historian Eric Foner recounts the experience of the progressives who came to the South as teachers in black schools. The reformers "had little previous contact with blacks" and their views were largely cribbed from Uncle Tom's Cabin. They thus believed blacks to be culturally degraded and lacking in family instincts, prone to lie and steal, and generally opposed to self-reliance:
Few Northerners involved in black education could rise above the conviction that slavery had produced a "degraded" people, in dire need of instruction in frugality, temperance, honesty, and the dignity of labor ... In classrooms, alphabet drills and multiplication tables alternated with exhortations to piety, cleanliness, and punctuality.
In short, white progressives coming South expected to find a black community suffering the effects of not just oppression but its "cultural residue."
Here is what they actually found:
During the Civil War, John Eaton, who, like many whites, believed that slavery had destroyed the sense of family obligation, was astonished by the eagerness with which former slaves in contraband camps legalized their marriage bonds. The same pattern was repeated when the Freedmen's Bureau and state governments made it possible to register and solemnize slave unions. Many families, in addition, adopted the children of deceased relatives and friends, rather than see them apprenticed to white masters or placed in Freedmen's Bureau orphanages.
By 1870, a large majority of blacks lived in two-parent family households, a fact that can be gleaned from the manuscript census returns but also "quite incidentally" from the Congressional Ku Klux Klan hearings, which recorded countless instances of victims assaulted in their homes, "the husband and wife in bed, and … their little children beside them."
The point here is rich and repeated in American history—it was not "cultural residue" that threatened black marriages. It was white terrorism, white rapacity, and white violence. And the commitment among freedpeople to marriage mirrored a larger commitment to the reconstitution of family, itself necessary because of systemic white violence.
"In their eyes," wrote an official from the Freedmen's Bureau in 1865, "the work of emancipation was incomplete until the families which had been dispersed by slavery were reunited."
White people at the time noted a sudden need in black people to travel far and wide. "The Negroes," reports one observer, "are literally crazy about traveling." Why were the Negroes "literally crazy about traveling?" Part of it was the sheer joy of mobility granted by emancipation. But there was something more: "Of all the motivations for black mobility," writes Foner, "none was more poignant than the effort to reunite families separated during slavery."
This effort continued as late as the onset of the 20th century, when you could still find newspapers running ads like this:
During the year 1849, Thomas Sample carried away from this city, as his slaves, our daughter, Polly, and son …. We will give $100 each for them to any person who will assist them … to get to Nashville, or get word to us of their whereabouts.
Nor had the centuries-long effort to destroy black curiosity and thirst for education yielded much effect:
Perhaps the most striking illustration of the freedmen's quest for self-improvement was their seemingly unquenchable thirst for education .... The desire for learning led parents to migrate to towns and cities in search of education for their children, and plantation workers to make the establishment of a school-house "an absolute condition" of signing labor contracts ...
Contemporaries could not but note the contrast between white families seemingly indifferent to education and blacks who "toil and strive, labour and endure in order that their children 'may have a schooling'." As one Northern educator remarked: "Is it not significant that after the lapse of one hundred and forty-four years since the settlement [of Beaufort, North Carolina], the Freedmen are building the first public school-house ever erected here."
"All in all," Foner concludes, "the months following the end of the Civil War were a period of remarkable accomplishment for Southern blacks." This is not especially remarkable, if you consider the time. Education, for instance, was not merely a status marker. Literacy was protection against having your land stolen or being otherwise cheated. Perhaps more importantly, it gave access to the Bible. The cultural fruits of oppression are rarely predictable merely through theorycraft. Who would have predicted that oppression would make black people hungrier for education than their white peers? Who could predict the blues?
And culture is not exclusive. African-Americans are Americans, and have been Americans longer than virtually any other group of white Americans. There is no reason to suppose that enslavement cut African-Americans off from broader cultural values. More likely, African-Americans contributed to the creation and maintenance of those values.
The African-Americans who endured enslavement were subject to two and a half centuries of degradation and humiliation. Slavery lasted twice as long as Jim Crow and was more repressive. If you were going to see evidence of a "cultural residue" which impeded success, you would see it there. Instead you find black people desperate to reconstitute their families, desperate to marry, and desperate to be educated. Progressives who advocate the 19th-century line must specifically name the "cultural residue" that afflicts black people, and then offer evidence of it. Favoring abstract thought experiments over research will not cut it.
Nor will pretending that old debates are somehow new. For some reason there is an entrenched belief among many liberals and conservatives that discussions of American racism should begin somewhere between the Moynihan Report and the Detroit riots. Thus Chait dates our dispute to the fights in the '70s between liberals. In fact, we are carrying on an argument that is at least a century older.
The passage of time is important because it allows us to assess how those arguments have fared. I contend that my arguments have been borne out, and the arguments of progressives like Chait and the president of the United States have not. Either Booker T. Washington was correct when he urged black people to forgo politics in favor of eliminating "the criminal and loafing element of our people" or he wasn't. Either W.E.B. Du Bois was correct when he claimed that correcting "the immorality, crime and laziness among the Negroes" should be the "first and primary" goal or he was not. The track record of progressive moral reform in the black community is knowable.
And it's not just knowable from Eric Foner. It can be gleaned from reading the entire Moynihan Report—not just the "tangle of pathologies" section—and then comparing it with Herb Gutman's The Black Family in Slavery and Freedom. It can be gleaned from Isabel Wilkerson's history of the Great Migration, The Warmth of Other Suns. One of the most important threads in this book is Wilkerson's dismantling of the liberal theory of cultural degradation.
I want to conclude by examining one important element of Chait's argument—the role of the president of the United States who also happens to be a black man:
If I'm watching a basketball game in which the officials are systematically favoring one team over another (let's call them Team A and Team Duke) as an analyst, the officiating bias may be my central concern. But if I'm coaching Team A, I'd tell my players to ignore the biased officiating. Indeed, I'd be concerned the bias would either discourage them or make them lash out, and would urge them to overcome it. That's not the same as denying bias. It's a sensible practice of encouraging people to concentrate on the things they can control.
Obama's habit of speaking about this issue primarily to black audiences is Obama seizing upon his role as the most famous and admired African-American in the world to urge positive habits and behavior.
Chait's metaphor is incorrect. Barack Obama isn't the coach of "Team Negro," he is the commissioner of the league. Team Negro is very proud that someone who served on our staff has risen (for the first time in history!) to be commissioner. And Team Negro, which since the dawn of the league has endured biased officiating and whose every game is away, hopes that the commissioner's tenure among them has given him insight into the league's problems. But Team Negro is not—and should not be—confused about the commissioner's primary role.
"I'm not the president of black America," Barack Obama has said. "I'm the president of the United States of America."
Precisely.
And the president of the United States is not just an enactor of policy for today, he is the titular representative of his country's heritage and legacy. In regards to black people, America's heritage is kleptocracy—the stealing and selling of other people's children, the robbery of the fruits of black labor, the pillaging of black property, the taxing of black citizens for schools they can not attend, for pools in which they can not swim, for libraries that bar them, for universities that exclude them, for police who do not protect them, for the marking of whole communities as beyond the protection of the state and thus subject to the purview of outlaws and predators.
The bearer of this unfortunate heritage feebly urging "positive habits and behavior" while his country imprisons some ungodly number of black men may well be greeted with applause in some quarters. It must never be so among those of us whose love of James Baldwin is true, whose love of Ida B. Wells is true, whose love of Harriet Tubman and our ancestors who fought for the right of family is true. In that fight America has rarely been our ally. Very often it has been our nemesis.
Obama-era progressives view white supremacy as something awful that happened in the past and the historical vestiges of which still afflict black people today. They believe we need policies—though not race-specific policies—that address the affliction. I view white supremacy as one of the central organizing forces in American life, whose vestiges and practices afflicted black people in the past, continue to afflict black people today, and will likely afflict black people until this country passes into the dust.
There is no evidence that black people are less responsible, less moral, or less upstanding in their dealings with America nor with themselves. But there is overwhelming evidence that America is irresponsible, immoral, and unconscionable in its dealings with black people and with itself. Urging African-Americans to become superhuman is great advice if you are concerned with creating extraordinary individuals. It is terrible advice if you are concerned with creating an equitable society. The black freedom struggle is not about raising a race of hyper-moral super-humans. It is about all people garnering the right to live like the normal humans they are.
Saturday, March 22, 2014
Bill Clinton on Leadership
The former President distills his wisdom for Fortune.
What does leadership mean to you?
Leadership means bringing people together in pursuit of a common cause, developing a plan to achieve it, and staying with it until the goal is achieved. If the leader holds a public or private position with other defined responsibilities, leadership also requires the ability to carry out those tasks and to respond to unforeseen problems and opportunities when they arise. It is helpful to be able to clearly articulate a vision of where you want to go, develop a realistic strategy to get there, and attract talented, committed people with a wide variety of knowledge, perspectives, and skills to do what needs to be done. In the modern world, I believe lasting positive results are more likely to occur when leaders practice inclusion and cooperation rather than authoritarian unilateralism. Even those who lead the way don't have all the answers.
What attributes do leaders share?
Steadfastness in pursuit of a goal, flexibility in determining how best to achieve it. The courage to make a hard decision, and the confidence to stay with it and explain it. The common sense to listen to others and involve them. And the strength to admit it when you make a mistake or when a given policy is not working. You have to be able to trust others, and trust your instincts as well as your intellect. Finally, if the objective is to get something done on a matter that is both important and controversial, you have to be able to compromise as well as know the lines you can't cross.
How did you learn to be a leader?
I learned when I was very young to respect the human dignity of everyone I met, to observe them closely and listen to them carefully. From the adults in my extended family I learned that everybody has a story but not everyone can tell it. I learned that most of life's greatest wounds are self-inflicted, that trying and failing is far better than not trying at all, that everyone makes mistakes but most people are basically good. As a boy growing up in the civil rights years, then during Vietnam, I came to see politics as a way to help other people make their own life stories better. All along the way I learned a lot from other leaders, especially those who befriended me and shared their own experiences. Yitzhak Rabin reminded me that you don't make peace with your friends. Nelson Mandela told me and showed me that you can't be a great leader if you're driven by resentment and hatred, no matter how justified those feelings are. To be free to lead, you have to let a lot of things go. I'm grateful to them and everyone else who taught me to look for the dreams and hurts, hopes and fears, in the eyes of everyone I met.
Who are the great leaders in your mind?
There are too many to mention so I'll stick with a few. Nelson Mandela and Yitzhak Rabin were great for the reasons I mentioned and many more. Helmut Kohl oversaw the reunification of Germany, the expansion of the European Union, and the creation of the eurozone.
Bill and Melinda Gates have built their amazing foundation, which is saving and lifting countless lives, driven by the principle that every life has equal value. They've selflessly given their money, time, and know-how to help solve global health and development problems. Muhammad Yunus and Fazle Abed have empowered huge numbers of poor people to live more productive lives.
Aung San Suu Kyi's dignified determination helped open her country to the world and inspired women and girls across the world.
This story is from the April 7, 2014 issue of Fortune.
Friday, March 21, 2014
You Can't Beat this Advice
I came along too late to relate strongly to The Beats, but one of them, it might have been Jack, it might have been Lawrence, it might have been Allen, I'm not sure, but one of them said in some context, "Quit thinking!" Good advice for a weekend. That's me starting today at 5 till Sunday, and that isn't just typing, Truman.
Fighting Republican Voting Suppression
Thursday, Mar 20, 2014 05:04 PM CDT
by Joe Conason
The secret to defeating GOP’s restrictive voting laws: Debunk “voter fraud”
GOP has discouraged minorities and the poor from voting. Can Bill Clinton's new initiative expose their grift?
Growing up in Jim Crow Arkansas, Bill Clinton saw how the state’s dominant political and racial elite maintained power by suppressing the rights of minority voters who threatened its authority — and as a young activist, worked to bring down that illegitimate power structure. So when Clinton says, “There is no greater assault on our core values than the rampant efforts to restrict the right to vote” — as he does in a new video released by the Democratic National Committee — the former president knows of what he speaks.
In the segregationist South of Clinton’s youth, the enemies of the universal franchise were Democrats, but times have changed. Not just below the Mason-Dixon Line but across the country, it is Republicans who have sought to limit ballot access and discourage participation by minorities, the poor, the young and anyone else who might vote for a Democratic candidate.
No doubt this is why, at long last, the Democratic Party has launched a national organizing project, spearheaded by Clinton, to educate voters, demand reforms, and push back against restrictive laws. Returning to his role as the nation’s “explainer-in-chief,” Clinton may be able to draw public attention to the travesty of voter ID requirements and all the other tactics of suppression used by Republicans to shrink the electorate.
His first task is to debunk the claims of “voter fraud,” fabricated by Republican legislators and right-wing media outlets, as the rationale for restrictive laws. Lent a spurious credibility by the legendary abuses of old-time political machines, those claims make voter suppression seem respectable and even virtuous.
Some years ago, the Brennan Center for Justice, based at New York University and led by former Clinton speechwriter Michael Waldman, issued a 45-page report on voter fraud that remains definitive. “There have been a handful of substantiated cases of individual ineligible voters attempting to defraud the election system,” the report noted. “But by any measure, voter fraud is extraordinarily rare.” And because fraud is so unusual, GOP countermeasures, such as voter ID, do much more harm than good.
As the Brennan Center study noted, even some Republicans know their leaders have exaggerated stories of fraud for partisan advantage. In 2007, the Houston Chronicle quoted Royal Masset, the former political director of the Texas Republican Party, who observed that among Republicans, it is "an article of religious faith that voter fraud is causing us to lose elections." Masset admitted that suspicion is false but said he believed that requiring voters to provide photo ID could reduce participation by legitimate Democratic voters enough to add 3 percent to Republican tallies.
More recently, one of the dimmer lights in the Pennsylvania Republican Party — the majority leader of the state House of Representatives, in fact — boasted that the voter ID statute he had rammed through the legislature would “allow Governor Romney to win the election” in November 2012. Although Mike Turzai later insisted that “there has been a history of voter fraud in Pennsylvania,” the state government conceded in court that it could cite no evidence showing that “in-person voter fraud has in fact occurred in Pennsylvania or elsewhere.”
Clinton can also consult the President’s Commission on Election Administration, a bipartisan panel appointed by President Barack Obama to improve the country’s voting systems. In its final report issued last January, the commission forthrightly acknowledged that true voter fraud is “rare.” It was a singular admission by a group whose co-chairs included Benjamin Ginsberg, an aggressive Republican election attorney who bears the burden of responsibility for the outcome of Bush-Gore 2000.
If he is in a bipartisan mood, as he often is, Clinton would surely find the commission's report uplifting — especially its recommendations to make voting more modern, more efficient, and above all, more accessible. For both parties to improve and expand the democratic rights of citizens would be uplifting indeed.
But Clinton is more likely to find himself feeling less kindly toward the Republicans, as they continue to promote outrageous suppression while feigning outrage over “fraud.” The Democrats may be equally motivated by partisan self-interest — but so long as they defend the rights of the intimidated and the disenfranchised, their moral force will be undiminished.
Wednesday, March 19, 2014
Bingo!
MARCH 16, 2014
by Paul Krugman
There are many negative things you can say about Paul Ryan, chairman of the House Budget Committee and the G.O.P.'s de facto intellectual leader. But you have to admit that he's a very articulate guy, an expert at sounding as if he knows what he's talking about.
So it’s comical, in a way, to see Mr. Ryan trying to explain away some recent remarks in which he attributed persistent poverty to a “culture, in our inner cities in particular, of men not working and just generations of men not even thinking about working.” He was, he says, simply being “inarticulate.” How could anyone suggest that it was a racial dog-whistle? Why, he even cited the work of serious scholars — people like Charles Murray, most famous for arguing that blacks are genetically inferior to whites. Oh, wait.
Just to be clear, there’s no evidence that Mr. Ryan is personally a racist, and his dog-whistle may not even have been deliberate. But it doesn’t matter. He said what he said because that’s the kind of thing conservatives say to each other all the time. And why do they say such things? Because American conservatism is still, after all these years, largely driven by claims that liberals are taking away your hard-earned money and giving it to Those People.
Indeed, race is the Rosetta Stone that makes sense of many otherwise incomprehensible aspects of U.S. politics.
We are told, for example, that conservatives are against big government and high spending. Yet even as Republican governors and state legislatures block the expansion of Medicaid, the G.O.P. angrily denounces modest cost-saving measures for Medicare. How can this contradiction be explained? Well, what do many Medicaid recipients look like — and I’m talking about the color of their skin, not the content of their character — and how does that compare with the typical Medicare beneficiary? Mystery solved.
Or we’re told that conservatives, the Tea Party in particular, oppose handouts because they believe in personal responsibility, in a society in which people must bear the consequences of their actions. Yet it’s hard to find angry Tea Party denunciations of huge Wall Street bailouts, of huge bonuses paid to executives who were saved from disaster by government backing and guarantees. Instead, all the movement’s passion, starting with Rick Santelli’s famous rant on CNBC, has been directed against any hint of financial relief for low-income borrowers. And what is it about these borrowers that makes them such targets of ire? You know the answer.
One odd consequence of our still-racialized politics is that conservatives are still, in effect, mobilizing against the bums on welfare even though both the bums and the welfare are long gone or never existed. Mr. Santelli’s fury was directed against mortgage relief that never actually happened. Right-wingers rage against tales of food stamp abuse that almost always turn out to be false or at least greatly exaggerated. And Mr. Ryan’s black-men-don’t-want-to-work theory of poverty is decades out of date.
In the 1970s it was still possible to claim in good faith that there was plenty of opportunity in America, and that poverty persisted only because of cultural breakdown among African-Americans. Back then, after all, blue-collar jobs still paid well, and unemployment was low. The reality was that opportunity was much more limited than affluent Americans imagined; as the sociologist William Julius Wilson has documented, the flight of industry from urban centers meant that minority workers literally couldn’t get to those good jobs, and the supposed cultural causes of poverty were actually effects of that lack of opportunity. Still, you could understand why many observers failed to see this.
But over the past 40 years good jobs for ordinary workers have disappeared, not just from inner cities but everywhere: adjusted for inflation, wages have fallen for 60 percent of working American men. And as economic opportunity has shriveled for half the population, many behaviors that used to be held up as demonstrations of black cultural breakdown — the breakdown of marriage, drug abuse, and so on — have spread among working-class whites too.
These awkward facts have not, however, penetrated the world of conservative ideology. Earlier this month the House Budget Committee, under Mr. Ryan’s direction, released a 205-page report on the alleged failure of the War on Poverty. What does the report have to say about the impact of falling real wages? It never mentions the subject at all.
And since conservatives can’t bring themselves to acknowledge the reality of what’s happening to opportunity in America, they’re left with nothing but that old-time dog whistle. Mr. Ryan wasn’t being inarticulate — he said what he said because it’s all that he’s got.
Tuesday, March 18, 2014
Got a Second?
A big-bang theory gets a big boost: Evidence that vast cosmos was created in split second
Video: Researchers say they've found evidence of what happened at the very first moment of the Big Bang. They say the universe grew so quickly, it left ripples in patterns of light, visible in the very far reaches of the universe.
By Joel Achenbach, Published: March 17
In the beginning, the universe got very big very fast, transforming itself in a fraction of an instant from something almost infinitesimally small to something imponderably vast, a cosmos so huge that no one will ever be able to see it all.
This is the premise of an idea called cosmic inflation — a powerful twist on the big-bang theory — and Monday it received a major boost from an experiment at the South Pole called BICEP2. A team of astronomers led by John Kovac of the Harvard-Smithsonian Center for Astrophysics announced that it had detected ripples from gravitational waves created in a violent inflationary event at the dawn of time.
"We're very excited to present our results because they seem to match the prediction of the theory so closely," Kovac said in an interview. "But it's the case that science can never actually prove a theory to be true. There could always be an alternative explanation that we haven't been clever enough to think of."
The reaction in the scientific community was cautiously exultant. The new result was hailed as potentially one of the biggest discoveries of the past two decades.
Cosmology, the study of the universe on the largest scales, has already been roiled by the 1998 discovery that the cosmos is not merely expanding but doing so at an accelerating rate, because of what has been called “dark energy.” Just as that discovery has implications for the ultimate fate of the universe, this new one provides a stunning look back at the moment the universe was born.
“If real, it’s magnificent,” said Harvard astrophysicist Lisa Randall.
Lawrence Krauss, an Arizona State University theoretical physicist, said of the new result, “It gives us a new window on the universe that takes us back to almost the very beginning of time, allowing us to turn previously metaphysical questions about our origins into scientific ones.”
The measurement, however, is a difficult one. The astronomers chose the South Pole for BICEP2 and earlier experiments because the air is exceedingly dry, almost devoid of water vapor and ideal for observing subtle quirks in the ancient light pouring in from the night sky. They spent four years building the telescope, and then three years observing and analyzing the data. Kovac, 43, who has been to the South Pole 23 times, said of the conditions there, “It’s almost like being in space.”
The BICEP2 instrument sorts through the cosmic microwave background (CMB), looking for polarization of the light in a pattern that reveals the ripples of gravitational waves. The gravitational waves distort space itself, squishing and tugging the fabric of the universe. This is the first time that anyone has announced the detection of gravitational waves from the early universe.
There are other experiments by rival groups trying to detect these waves, and those efforts will continue in an attempt to confirm the results announced Monday.
“I would say it’s very likely to be correct that we are seeing a signal from inflation,” said Adrian Lee, a University of California at Berkeley cosmologist who is a leader of PolarBear, an experiment based on a mountaintop in Chile that is also searching for evidence of inflation. “But it’s such a hard measurement that we really would like to see it measured with different experiments, with different techniques, looking at different parts of the sky, to have confidence that this is really a signal from the beginning of the universe.”
The fact that the universe is dynamic at the grandest scale, and not static as it appears to be when we gaze at the “fixed stars” in the night sky, has been known since the late 1920s, when astronomer Edwin Hubble revealed that the light from galaxies showed that they were moving away from one another.
This led to the theory that the universe, once compact, is expanding. Scientists in recent years have been able to narrow down the age of the universe to about 13.8 billion years. Multiple lines of evidence, including the detection of the CMB exactly 50 years ago, have bolstered the consensus model of modern cosmology, which shows that the universe was initially infinitely hot and dense, literally dimensionless. There was no space, no time.
Then something happened. The universe began to expand and cool. This was the big bang.
Cosmic inflation throws gasoline on that fire. It makes the big bang even bangier right at the start. Instead of a linear expansion, the universe would have undergone an exponential growth.
In 1979, theorist Alan Guth, then at Stanford, seized on a potential explanation for some of the lingering mysteries of the universe, such as the remarkable homogeneity of the whole place — the way distantly removed parts of the universe had the same temperature and texture even though they had never been in contact with each other. Perhaps the universe did not merely expand in a stately manner but went through a much more dramatic, exponential expansion, essentially going from microscopic in scale to cosmically huge in a tiny fraction of a second.
It is unclear how long this inflationary epoch lasted. Kovac calculated that in that first fraction of a second the volume of the universe increased by a factor of 10 to the 26th power, going from subatomic to cosmic.
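To get a feel for that arithmetic: cosmologists usually quote inflation in "e-folds," the natural logarithm of the total growth factor. Here is a minimal sketch in Python, taking the factor of 10 to the 26th power quoted above at face value and assuming an illustrative duration of 10^-32 seconds (the article itself notes the actual duration is unknown):

```python
import math

# Inflation models exponential growth: size(t) = size(0) * exp(H * t).
# The number of "e-folds" N is the natural log of the total growth factor.
growth_factor = 1e26                 # factor quoted in the article
e_folds = math.log(growth_factor)
print(f"e-folds of expansion: {e_folds:.1f}")   # ~60, a commonly cited figure

# If that growth took ~1e-32 seconds (an assumed, illustrative figure),
# the universe doubled in size roughly every:
duration = 1e-32                     # seconds
doublings = e_folds / math.log(2)    # ~86 doublings
print(f"approximate doubling time: {duration / doublings:.1e} seconds")
```

The familiar "60 e-folds" figure in inflation papers is just this logarithm of the growth factor.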
This is obviously difficult terrain for theorists, and the question of why there is something rather than nothing creeps into realms traditionally governed by theologians. But theoretical physicists say that empty space is not empty, that the vacuum crackles with energy and that quantum physics permits such mind-boggling events as a universe popping up seemingly out of nowhere.
“Inflation — the idea of a very big burst of inflation very early on — is the most important idea in cosmology since the big bang itself,” said Michael Turner, a University of Chicago cosmologist. “If correct, this burst is the dynamite behind our big bang.”
Princeton University astrophysicist David Spergel said after Monday’s announcement, “If true, this has revolutionary impacts for our understanding of the physics of the early universe and gives us insight into physics on really small scales.”
Spergel added, “We will soon know if this result is revolutionary or due to some poorly understood systematics.”
The inflationary model implies that our universe is vastly larger than what we currently observe, a scale that is already humbling. Moreover, the vacuum energy that drove the inflationary process would presumably imply the existence of a larger cosmos, or “multiverse,” of which our universe is but a granular element.
“These ideas about the multiverse become interesting to me only when theories come up with testable predictions based on them,” Kovac said Monday. “The powerful thing about the basic inflationary paradigm is that it did offer us this clear, testable prediction: the existence of gravitational waves which are directly linked to the exponential expansion that’s intrinsic to the theory.”
The cosmological models favored by scientists do not permit us to have contact with other potential universes. The multiverse is, for now, conjectural, because it is not easily subject to experimental verification and is unobservable — from the South Pole or from anywhere else.
Sunday, March 16, 2014
Alabama needed St. Patrick
The legend is that St. Patrick drove the snakes out of Ireland. We coulda used him in Alabama.
Saturday, March 15, 2014
Dreaming of Lizzie
Last night I dreamed I was at a costume party and Lizzie Borden was there dressed 19th century but wearing a Beyonce mask. She was walking around with a plastic hatchet, tapping at everyone. I must say she was the life of the party, everyone wanting to get tapped with that play hatchet, wanting to ask her the one overwhelming question. So I did ask her, "Did you do it? They say you didn't do it and yet you were the only one who could have done it." She threw her head back and laughed, her Beyonce mask slipping for a moment. "Women have their ways!" She laughed again and vanished and I woke up dazed. I have never liked costume parties anyway.
Friday, March 14, 2014
31 Numbers
After Cosmos: The universe in 31 numbers
By Michael West, Friday, March 14, 7:34 PM
Michael West is director of the Maria Mitchell Observatory on Nantucket.
8.5 million: The number of people who watched the premiere Sunday of “Cosmos: A Spacetime Odyssey” on Fox and affiliated networks.
13.3 million: The number of people who watched the premiere of “Resurrection,” ABC’s new drama about loved ones mysteriously returning from the dead, which aired at the same time.
17.9 million: The number of people who watched CBS’s sitcom “The Big Bang Theory” last week.
1916: The year Albert Einstein published his General Theory of Relativity, which laid the foundation for big bang cosmology.
0: The number of Nobel Prizes that Einstein won for his General Theory of Relativity.
240: The number of pieces Einstein’s brain was cut into for research purposes after he died.
100 billion: The estimated number of human beings who have ever lived on Earth.
400 billion: The estimated number of stars in our Milky Way galaxy.
38 percent: The share of atoms in the human body that are heavier than hydrogen and hence were made inside stars.
2: The number of golf balls left on the moon in 1971 by astronaut Alan Shepard.
4: The number of people in a family photograph left on the moon’s surface by astronaut Charles Duke in 1972.
8: The number of minutes it takes light from the sun to reach Earth.
328: The number of minutes it takes light from the sun to reach Pluto.
5 million: The number of minutes it will take the New Horizons spacecraft, which launched in 2006, to travel from Earth to Pluto.
3.5 percent: The amount of funding for NASA compared with that of the U.S. military in President Obama’s proposed 2015 budget.
7.3 billion: Dollar amount in the president’s proposed 2015 budget for the National Science Foundation, which funds research in astronomy, biology, chemistry, geology, mathematics, physics and other sciences.
12.9 billion: The cost in dollars of the Navy’s newest aircraft carrier, the USS Gerald R. Ford, christened in November.
26 percent: The share of American adults who think that the sun revolves around the Earth, according to a February study by the National Science Foundation.
9: The approximate number of years it would take to walk nonstop to the moon if you could.
3,536: The approximate number of years it would take to walk to the sun.
177: How many years it would take to drive to the sun at 60 miles per hour.
49 million: How many years it would take to drive to the next nearest star, Proxima Centauri. (A rough sanity check of these travel times follows the list.)
9.3 billion: Approximate number of years the universe had existed before Earth formed.
50: The distance in miles from which the Hubble Space Telescope could discern the color of your eyes.
1.5 billion: Estimated total cost in dollars to build the European Extremely Large Telescope, the largest optical telescope ever, the construction of which will soon begin in northern Chile.
2.5 billion: Total cost in dollars to buy one tall Starbucks caffe latte for every man, woman and child in the European Union.
8.8 billion: Total cost in dollars to build the successor to the Hubble Space Telescope, the James Webb Space Telescope, which is scheduled for launch in 2018.
5 million: How many tons of matter the sun converts into energy every second.
1.3 million: The number of Earths that could fit inside the sun if it were hollow.
2/3: The fraction of Americans who can no longer see the Milky Way at night because of light pollution where they live.
Infinite: The universe’s potential to fascinate and inspire people of all ages.
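Several of the travel-time and size figures above reduce to one-line arithmetic. Here is a rough sanity check in Python; the walking pace, driving speed, and distances are common round values I have assumed, not the author's stated inputs, so the results land near rather than exactly on the quoted numbers:

```python
# Rough sanity checks for a few figures in the list above.
MILES_TO_MOON = 238_900          # average Earth-moon distance (assumed)
MILES_TO_SUN = 93_000_000        # one astronomical unit (assumed)
MILES_PER_LIGHT_YEAR = 5.88e12
LY_TO_PROXIMA = 4.24             # distance to Proxima Centauri (assumed)
HOURS_PER_YEAR = 24 * 365.25

def years_at(mph, miles):
    """Years of nonstop travel over a distance at a constant speed."""
    return miles / mph / HOURS_PER_YEAR

print(f"walk to the moon, 3 mph:  {years_at(3, MILES_TO_MOON):.1f} years")    # ~9
print(f"walk to the sun, 3 mph:   {years_at(3, MILES_TO_SUN):,.0f} years")    # ~3,536
print(f"drive to the sun, 60 mph: {years_at(60, MILES_TO_SUN):.0f} years")    # ~177
proxima_miles = LY_TO_PROXIMA * MILES_PER_LIGHT_YEAR
print(f"drive to Proxima, 60 mph: {years_at(60, proxima_miles):.1e} years")   # ~4.7e7

# Sunlight travel time: distance / speed of light (186,282 miles per second).
print(f"sunlight to Earth: {MILES_TO_SUN / 186_282 / 60:.1f} minutes")        # ~8

# Earths inside a hollow sun: the cube of the ratio of radii.
print(f"Earths in the sun: {(432_450 / 3_959) ** 3:.2e}")                     # ~1.3 million
```

The drive-to-Proxima result comes out near 47 million years rather than 49 million, which suggests the author used a slightly different distance; the rest match the list closely.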