Saturday, March 5, 2011

I Can't Think!

I Can’t Think! by Sharon Begley from Newsweek

The Twitterization of our culture has revolutionized our lives, but with an unintended consequence—our overloaded brains freeze when we have to make decisions.

Imagine the most mind-numbing choice you’ve faced lately, one in which the possibilities almost paralyzed you: buying a car, choosing a health-care plan, figuring out what to do with your 401(k). The anxiety you felt might have been just the well-known consequence of information overload, but Angelika Dimoka, director of the Center for Neural Decision Making at Temple University, suspects that a more complicated biological phenomenon is at work. To confirm it, she needed to find a problem that overtaxes people’s decision-making abilities, so she joined forces with economists and computer scientists who study “combinatorial auctions,” bidding wars that bear almost no resemblance to the eBay version. Bidders consider a dizzying number of items that can be bought either alone or bundled, such as airport landing slots. The challenge is to buy the combination you want at the lowest price—a diabolical puzzle if you’re considering, say, 100 landing slots at LAX. As the number of items and combinations explodes, so does the quantity of information bidders must juggle: passenger load, weather, connecting flights. Even experts become anxious and mentally exhausted. In fact, the more information they try to absorb, the fewer of the desired items they get and the more they overpay or make critical errors.
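
To get a sense of why the options explode, here is a rough back-of-the-envelope sketch in Python (an illustration added for this post, not part of Dimoka’s study): with n items that can be bought alone or in any bundle, a bidder in principle faces every non-empty subset of the items, and that count doubles with each item added.

```python
# Back-of-the-envelope sketch (illustration only, not from Dimoka's study):
# with n items that can be bought alone or bundled, the number of possible
# non-empty bundles is 2**n - 1, so the options explode far faster than the
# item count itself.

def bundle_count(n_items: int) -> int:
    """Number of distinct non-empty bundles (subsets) of n_items."""
    return 2 ** n_items - 1

for n in (3, 10, 30, 100):
    print(f"{n:>3} items -> {bundle_count(n):,} possible bundles")

# 3 items give 7 bundles, 10 items give 1,023, 30 items give about a billion,
# and 100 items give roughly 1.3 x 10**30 -- far more than any bidder (or any
# brute-force program) could evaluate, which is why the information to juggle
# grows so much faster than the number of landing slots.
```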

This is where Dimoka comes in. She recruited volunteers to try their hand at combinatorial auctions, and as they did she measured their brain activity with fMRI. As the information load increased, she found, so did activity in the dorsolateral prefrontal cortex, a region behind the forehead that is responsible for decision making and control of emotions. But as the researchers gave the bidders more and more information, activity in the dorsolateral PFC suddenly fell off, as if a circuit breaker had popped. “The bidders reach cognitive and information overload,” says Dimoka. They start making stupid mistakes and bad choices because the brain region responsible for smart decision making has essentially left the premises. For the same reason, their frustration and anxiety soar: the brain’s emotion regions—previously held in check by the dorsolateral PFC—run as wild as toddlers on a sugar high. The two effects build on one another. “With too much information,” says Dimoka, “people’s decisions make less and less sense.”

So much for the ideal of making well-informed decisions. For earlier generations, that meant simply the due diligence of looking things up in a reference book. Today, with Twitter and Facebook and countless apps fed into our smart phones, the flow of facts and opinion never stops. That can be a good thing, as when information empowers workers and consumers, not to mention whistle-blowers and revolutionaries. You can find out a used car’s accident history, a doctor’s malpractice record, a restaurant’s health-inspection results. Yet research like Dimoka’s is showing that a surfeit of information is changing the way we think, not always for the better. Maybe you consulted scores of travel websites to pick a vacation spot—only to be so overwhelmed with information that you opted for a staycation. Maybe you were this close to choosing a college, when suddenly older friends swamped your inbox with all the reasons to go somewhere else—which made you completely forget why you’d chosen the other school. Maybe you had the Date From Hell after being so inundated with information on “matches” that you chose at random. If so, then you are a victim of info-paralysis.

The problem has been creeping up on us for a long time. In the 17th century Leibniz bemoaned the “horrible mass of books which keeps on growing,” and in 1729 Alexander Pope warned of “a deluge of authors cover[ing] the land,” as James Gleick describes in his new book, The Information. But the consequences were thought to be emotional and psychological, chiefly anxiety about being unable to absorb even a small fraction of what’s out there. Indeed, the Oxford English Dictionary added “information fatigue” in 2009. But as information finds more ways to reach us, more often, more insistently than ever before, another consequence is becoming alarmingly clear: trying to drink from a firehose of information has harmful cognitive effects. And nowhere are those effects clearer, and more worrying, than in our ability to make smart, creative, successful decisions.

The research should give pause to anyone addicted to incoming texts and tweets. The booming science of decision making has shown that more information can lead to objectively poorer choices, and to choices that people come to regret. It has shown that an unconscious system guides many of our decisions, and that it can be sidelined by too much information. And it has shown that decisions requiring creativity benefit from letting the problem incubate below the level of awareness—something that becomes ever-more difficult when information never stops arriving.

Decision science has only begun to incorporate research on how the brain processes information, but the need for answers is as urgent as the stakes are high. During the BP oil-well blowout last year, Coast Guard Adm. Thad Allen, the incident commander, estimates that he got 300 to 400 pages of emails, texts, reports, and other messages every day. It’s impossible to know whether less information, more calmly evaluated, would have let officials figure out sooner how to cap the well, but Allen tells NEWSWEEK’s Daniel Stone that the torrent of data might have contributed to what he calls the mistake of failing to close off air space above the gulf on day one. (There were eight near midair collisions.) A comparable barrage of information assailed administration officials before the overthrow of the Egyptian government, possibly producing at least one misstep: CIA Director Leon Panetta told Congress that Hosni Mubarak was about to announce he was stepping down—right before the Egyptian president delivered a defiant, rambling speech saying he wasn’t going anywhere. “You always think afterwards about what you could have done better, but there isn’t time in the moment to second-guess,” said White House Communications Director Dan Pfeiffer. “You have to make your decision and go execute.” As scientists probe how the flow of information affects decision making, they’ve spotted several patterns. Among them:

Total Failure to Decide
Every bit of incoming information presents a choice: whether to pay attention, whether to reply, whether to factor it into an impending decision. But decision science has shown that people faced with a plethora of choices are apt to make no decision at all. The clearest example of this comes from studies of financial decisions. In a 2004 study, Sheena Iyengar of Columbia University and colleagues found that the more information people confronted about a 401(k) plan, the more participation fell: from 75 percent to 70 percent as the number of choices rose from two to 11, and to 61 percent when there were 59 options. People felt overwhelmed and opted out. Those who participated chose lower-return options—worse choices. Similarly, when people are given information about 50 rather than 10 options in an online store, they choose lower-quality options. Although we say we prefer more information, in fact more can be “debilitating,” argues Iyengar, whose 2010 book The Art of Choosing comes out in paperback in March. “When we make decisions, we compare bundles of information. So a decision is harder if the amount of information you have to juggle is greater.” In recent years, businesses have offered more and more choices to cater to individual tastes. For mustard or socks, this may not be a problem, but the proliferation of choices can create paralysis when the stakes are high and the information complex.

Many Diminishing Returns
If we manage to make a decision despite info-deluge, it often comes back to haunt us. The more information we try to assimilate, the more we tend to regret the many forgone options. In a 2006 study, Iyengar and colleagues analyzed job searches by college students. The more sources and kinds of information (about a company, an industry, a city, pay, benefits, corporate culture) they collected, the less satisfied they were with their decision. They knew so much, consciously or unconsciously, they could easily imagine why a job not taken would have been better. In a world of limitless information, regret over the decisions we make becomes more common. We chafe at the fact that identifying the best feels impossible. “Even if you made an objectively better choice, you tend to be less satisfied with it,” says Iyengar.

A key reason for information’s diminishing or even negative returns is the limited capacity of the brain’s working memory. It can hold roughly seven items (which is why seven-digit phone numbers were a great idea). Anything more must be processed into long-term memory. That takes conscious effort, as when you study for an exam. When more than seven units of information land in our brain’s inbox, argues psychologist Joanne Cantor, author of the 2009 book Conquer Cyber Overload and an emerita professor at the University of Wisconsin, the brain struggles to figure out what to keep and what to disregard. Ignoring the repetitious and the useless requires cognitive resources and vigilance, a harder task when there is so much information.

It isn’t only the quantity of information that knocks the brain for a loop; it’s the rate. The ceaseless influx trains us to respond instantly, sacrificing accuracy and thoughtfulness to the false god of immediacy. “We’re being trained to prefer an immediate decision even if it’s bad to a later decision that’s better,” says psychologist Clifford Nass of Stanford University. “In business, we’re seeing a preference for the quick over the right, in large part because so many decisions have to be made. The notion that the quick decision is better is becoming normative.”

‘Recency’ Trumps Quality
The brain is wired to notice change over stasis. An arriving email that pops to the top of your BlackBerry qualifies as a change; so does a new Facebook post. We are conditioned to give greater weight in our decision-making machinery to what is latest, not what is more important or more interesting. “There is a powerful ‘recency’ effect in decision making,” says behavioral economist George Loewenstein of Carnegie Mellon University. “We pay a lot of attention to the most recent information, discounting what came earlier.” Getting 30 texts per hour up to the moment when you make a decision means that most of them make all the impression of a feather on a brick wall, whereas Nos. 29 and 30 assume outsize importance, regardless of their validity. “We’re fooled by immediacy and quantity and think it’s quality,” says Eric Kessler, a management expert at Pace University’s Lubin School of Business. “What starts driving decisions is the urgent rather than the important.”

Part of the problem is that the brain is really bad at giving only a little weight to a piece of information. When psychologist Eric Stone of Wake Forest University had subjects evaluate the vocabulary skills of a hypothetical person, he gave them both highly predictive information (the person’s education level) and less predictive information (how often the person read a newspaper). Subjects gave the less predictive information more weight than it deserved. “Our cognitive systems,” says Stone, “just aren’t designed to take information into account only a little.”

The Neglected Unconscious
Creative decisions are more likely to bubble up from a brain that applies unconscious thought to a problem, rather than going at it in a full-frontal, analytical assault. So while we’re likely to think creative thoughts in the shower, it’s much harder if we’re under a virtual deluge of data. “If you let things come at you all the time, you can’t use additional information to make a creative leap or a wise judgment,” says Cantor. “You need to pull back from the constant influx and take a break.” That allows the brain to subconsciously integrate new information with existing knowledge and thereby make novel connections and see hidden patterns. In contrast, a constant focus on the new makes it harder for information to percolate just below conscious awareness, where it can combine in ways that spark smart decisions.

One of the greatest surprises in decision science is the discovery that some of our best decisions are made through unconscious processes. When subjects in one study evaluated what psychologist Ap Dijksterhuis of the Radboud University of Nijmegen in the Netherlands calls a “rather daunting amount of information” about four hypothetical apartments for rent—size, location, friendliness of the landlord, price, and eight other features—those who decided unconsciously which to rent did better. (“Better” meant they chose the one that had objectively better features.) The scientists made sure the decision was unconscious by having the subjects do a memory and attention task, which tied up their brains enough that they couldn’t contemplate, say, square footage.
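
As a rough sketch of what “objectively better” means in studies like this one (the attribute values below are invented for illustration, not Dijksterhuis’s actual materials), each option can be coded as a set of positive and negative features and scored by a simple count:

```python
# Toy sketch of "objectively better" in choice experiments like this one:
# each apartment gets 12 attributes, coded here only as positive (True) or
# negative (False); the objectively best option is simply the one with the
# most positive attributes. The specific values are invented for illustration.

apartments = {
    "Apartment A": [True] * 8 + [False] * 4,   # mostly positive features
    "Apartment B": [True] * 6 + [False] * 6,
    "Apartment C": [True] * 4 + [False] * 8,
    "Apartment D": [True] * 4 + [False] * 8,
}

scores = {name: sum(features) for name, features in apartments.items()}
best = max(scores, key=scores.get)

print(scores)                           # counts of positive attributes
print("Objectively best option:", best)
```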

There are at least two ways an info-glut can impair the unconscious system of decision making. First, when people see that there is a lot of complex information relevant to a decision, “they default to the conscious system,” says psychologist Maarten Bos of Radboud. “That causes them to make poorer choices.” Second, the unconscious system works best when it ignores some information about a complex decision. But here’s the rub: in an info tsunami, our minds struggle to decide if we can ignore this piece … or that one … but how about that one? “Especially online,” says Cantor, “it is so much easier to look for more and more information than sit back and think about how it fits together.”

Even experience-based decision making, in which you use a rule of thumb rather than analyze pros and cons, can go off the rails with too much information. “This kind of intuitive decision making relies on distilled expertise,” says Kessler. “More information, by overwhelming and distracting the brain, can make it harder to tap into just the core information you need.” In one experiment, M.B.A. students choosing a (make-believe) stock portfolio were divided into two groups, one that was inundated with information from analysts and the financial press, and another that saw only stock-price changes. The latter reaped more than twice the returns of the info-deluged group, whose analytical capabilities were hijacked by too much information; the deluged students wound up buying and selling on every rumor and tip—a surefire way to lose money in the market. The more data they got, the more they struggled to separate wheat from chaff.


Which brings us back to the experimental subjects Angelika Dimoka has put in an fMRI scanner. The prefrontal cortex that waves a white flag under an onslaught of information plays a key role in your gut-level, emotional decision-making system. It hooks up feelings about various choices with the output of the rational brain. If emotions are shut out of the decision-making process, we’re likely to overthink a decision, and that has been shown to produce worse outcomes on even the simplest tasks. In one classic experiment, when volunteers focused on the attributes of various strawberry jams they had just rated, it completely scrambled their preferences, and they wound up giving a high rating to a jam they disliked and a low rating to one they had found delicious.

How can you protect yourself from having your decisions warped by excess information? Experts advise dealing with emails and texts in batches, rather than in real time; that should let your unconscious decision-making system kick in. Avoid the trap of thinking that a decision requiring you to assess a lot of complex information is best made methodically and consciously; you will do better, and regret less, if you let your unconscious turn it over by removing yourself from the info influx. Set priorities: if a choice turns on only a few criteria, focus consciously on those. Some people are better than others at ignoring extra information. These “satisficers” are able to say enough: they channel-surf until they find an acceptable show and then stop, whereas “maximizers” never stop surfing, devouring information, and so struggle to make a decision and move on. If you think you’re a maximizer, the best prescription for you might be the “off” switch on your smart phone.
