by Nicholas Carr
The informavore in its cage
November 06, 2009
Edge is featuring, in "The Age of the Informavore," a fascinating interview with Frank Schirrmacher, the influential science and culture editor at Frankfurter Allgemeine Zeitung. "The question I am asking myself," Schirrmacher says, "[which] arose through work and through discussion with other people, and especially watching other people, watching them act and behave and talk, [is] how technology, the Internet and the modern systems, has now apparently changed human behavior, the way humans express themselves, and the way humans think in real life ... And you encounter this not only in a theoretical way, but when you meet people, when suddenly people start forgetting things, when suddenly people depend on their gadgets, and other stuff, to remember certain things. This is the beginning, it's just an experience. But if you think about it and you think about your own behavior, you suddenly realize that something fundamental is going on."
Tell me about it.
Later in the interview, Schirrmacher wonders what the effects will be as companies collect ever more behavioral data and apply ever more sophisticated predictive algorithms to it:
You have a generation — in the next evolutionary stages, the child of today — which [is adapting] to systems such as the iTunes "Genius", which not only know which book or which music file they like, [but] which [go] farther and farther in [predicting] certain things, like predicting whether the concert I am watching tonight is good or bad. Google will know it beforehand, because they know how people talk about it.
What will this mean for the question of free will? Because, in the bottom line, there are, of course, algorithms, who analyze or who calculate certain predictabilities ... The question of prediction will be the issue of the future and such questions will have impact on the concept of free will. We are now confronted with theories by psychologist John Bargh and others who claim there is no such thing as free will. This kind of claim is a very big issue here in Germany and it will be a much more important issue in the future than we think today. The way we predict our own life, the way we are predicted by others, through the cloud, through the way we are linked to the Internet, will be matters that impact every aspect of our lives. And, of course, this will play out in the work force — the new German government seems to be very keen on this issue, to at least prevent the worst impact on people, on workplaces.
It's very important to stress that we are not talking about cultural pessimism. What we are talking about is that a new technology which is in fact a technology which is a brain technology, to put it this way, which is a technology which has to do with intelligence, which has to do with thinking, that this new technology now clashes in a very real way with the history of thought in the European way of thinking.
The interview has drawn many responses (including one from me), the most recent of which is from John Bargh, who heads Yale's Automaticity in Cognition, Motivation and Evaluation Lab. He picks up on Schirrmacher's comments on prediction and describes how recent research in brain science is opening up powerful new possibilities for manipulating human behavior:
Schirrmacher is quite right to worry about the consequences of a universally available digitized knowledge base, especially if it concerns predicting what people will do. And most especially if artificial intelligence agents can begin to search and put together the burgeoning data base about what situation (or prime) X will cause a person to do. The discovery of the pervasiveness of situational priming influences for all of the higher mental processes in humans does say something fundamentally new about human nature (for example, how tightly tied and responsive is our functioning to our particular physical and social surroundings). It removes consciousness or free will as the bottleneck that exclusively generates choices and behavioral impulses, replacing it with the physical and social world itself as the source of these impulses. ...
It is because priming studies are so relatively easy to perform that this method has opened up research on the prediction and control of human judgment and behavior, 'democratized' it, basically, because studies can be done much more quickly and efficiently, and done well even by relatively untrained undergraduate and graduate students. This has indeed produced (and is still producing) an explosion of knowledge of the IF-THEN contingencies of human responses to the physical and social environment. And so I do worry with Schirrmacher on this score, because we [are] so rapidly building a database or atlas of unconscious influences and effects that could well be exploited by ever-faster computing devices, as the knowledge is accumulating at an exponential rate. ...
More frightening to me still is Schirrmacher's postulated intelligent artificial agents who can, as in the Google Books example, search and access this knowledge base so quickly, and then integrate it to be used in real-time applications to manipulate the target individual to think or feel or behave in ways that suit the agent's (or its owner's) agenda of purposes.
The Web has been called a "database of intentions." The bigger that database grows, and the more deeply it is mined, the more difficult it may become to discern whether those intentions are our own or ones that have been implanted in us.
1 comment:
Sometimes I think we are already heavily shaped by outside influences; we just don't realize or acknowledge it. From that perspective, I don't know that what Carr is talking about really changes things much.