Is Google Making Students Stupid?
Outsourcing menial tasks to machines can seem liberating, but it may be robbing a whole generation of certain basic mental abilities.
Today, computers often play both roles, liberating their users even as they confine them. Nicholas Carr, the author of the 2008 Atlantic cover story “Is Google Making Us Stupid?”, confronts this paradox in his new book, The Glass Cage: Automation and Us, analyzing the many contemporary fields in which software assists human cognition, from medical diagnostic aids to architectural modeling programs. As its title suggests, the book also takes a stand on whether such technology imprisons or liberates its users. We are increasingly encaged, he argues, but the invisibility of our high-tech snares gives us the illusion of freedom.

As evidence, he cites the case of Inuit hunters in northern Canada. Older generations could track caribou through the tundra with astonishing precision by noticing subtle changes in winds, snowdrift patterns, stars, and animal behavior. Once younger hunters began using snowmobiles and GPS units, their navigational prowess declined. They began trusting the GPS devices so completely that they ignored blatant dangers, speeding over cliffs or onto thin ice. And when a GPS unit broke or its batteries froze, young hunters who had never developed and practiced the wayfinding skills of their elders were uniquely vulnerable.
Carr includes other case studies: He describes doctors who become so reliant on decision-assistance software that they overlook subtle signals from patients or dismiss improbable but accurate diagnoses. He interviews architects whose drawing skills decay as they transition to digital platforms. And he recounts frightening instances when commercial airline pilots fail to perform simple corrections in emergencies because they are so used to trusting the autopilot system. Carr is quick to acknowledge that these technologies often do enhance and assist human skills. But he makes a compelling case that our relationship with them is not as positive as we might think.
All of this has unmistakable implications for the use of technology in classrooms: When do technologies free students to think about more interesting and complex questions, and when do they erode the very cognitive capacities they are meant to enhance? The effect of ubiquitous spell-check and AutoCorrect software is a revealing example. Psychologists studying the formation of memories have found that the act of generating a word in your mind strengthens your capacity to remember it. When a computer automatically corrects a spelling mistake or offers a drop-down menu of options, we’re no longer forced to generate the correct spelling in our minds.
This might not seem very important. If writers don’t have to clutter their minds with the often bizarre conventions of English spelling, perhaps they have more energy left for interesting questions of style and structure. But the software is not just supplementing spelling skills; it’s also eroding them. When students find themselves without automated spelling assistance, they don’t face the prospect of freezing to death, as the Inuit hunters did when their GPS units malfunctioned, but they are more likely to make errors.
The solution might seem to be improving battery life and making spelling assistance even more pervasive, but this creates a vicious cycle: The more we use the technology, the more we come to need it, in every circumstance. Suddenly, our position as masters of the technology starts to seem precarious.
Relying on calculators to perform arithmetic has had similar risks and benefits. Automating the time-consuming work of multiplying and dividing large numbers by hand can allow students to spend time and energy on more complex mathematical subjects. But depending on calculators in classrooms can also lead students to forget how to do the operations that the machines perform. Once again, a tool meant merely to expedite a task winds up becoming indispensable.
The phenomenon is not specific to modern technologies; the same concern appears in Plato’s Phaedrus, where a character in the dialogue worries about the effects of the invention of writing: “This discovery … will create forgetfulness in the learners' souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves.” Automating almost any task can rob us of an ability.
The difference today is the sheer breadth of mental tasks that have been outsourced to machines. Carr describes a 2004 study in which two groups of subjects played a computer game based on the logic puzzle Missionaries and Cannibals. Solving the puzzle required figuring out how to transport five missionaries and five cannibals across a river in a boat that could hold only three passengers. The cannibals, for self-evident reasons, could not outnumber missionaries on either of the riverbanks or in the boat.
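For readers who want to see the puzzle’s mechanics concretely, here is a minimal breadth-first-search sketch, written in Python for illustration rather than drawn from the study itself, of the variant described above: five missionaries, five cannibals, a boat that carries at most three, and the rule that cannibals may never outnumber missionaries wherever missionaries are present, whether on a bank or in the boat. Running it prints the sequence of bank states from the starting position to the goal.

```python
# Breadth-first search over the puzzle's states: (missionaries on the left
# bank, cannibals on the left bank, which side the boat is on).
from collections import deque

TOTAL_M, TOTAL_C, BOAT_CAP = 5, 5, 3

def safe(m, c):
    """A group is safe if it has no missionaries, or at least as many missionaries as cannibals."""
    return m == 0 or m >= c

def solve():
    start, goal = (TOTAL_M, TOTAL_C, 0), (0, 0, 1)   # boat side: 0 = left, 1 = right
    parents = {start: None}
    queue = deque([start])
    while queue:
        state = queue.popleft()
        if state == goal:
            # Walk back through the parents to reconstruct the crossing sequence.
            path = []
            while state is not None:
                path.append(state)
                state = parents[state]
            return path[::-1]
        ml, cl, boat = state
        # People available to board the boat on its current bank.
        avail_m = ml if boat == 0 else TOTAL_M - ml
        avail_c = cl if boat == 0 else TOTAL_C - cl
        sign = -1 if boat == 0 else 1   # a crossing removes people from the boat's bank
        for m in range(min(avail_m, BOAT_CAP) + 1):
            for c in range(min(avail_c, BOAT_CAP - m) + 1):
                if m + c == 0 or not safe(m, c):     # boat needs a rower and must itself be safe
                    continue
                nml, ncl = ml + sign * m, cl + sign * c
                nmr, ncr = TOTAL_M - nml, TOTAL_C - ncl
                if not (safe(nml, ncl) and safe(nmr, ncr)):
                    continue                         # a bank would become unsafe
                nxt = (nml, ncl, 1 - boat)
                if nxt not in parents:
                    parents[nxt] = state
                    queue.append(nxt)
    return None

if __name__ == "__main__":
    for ml, cl, boat in solve():
        print(f"left bank: {ml}M {cl}C | right bank: {TOTAL_M - ml}M {TOTAL_C - cl}C | "
              f"boat on the {'left' if boat == 0 else 'right'}")
```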
The first group of players used a sophisticated software program that offered prompts and guidance on permissible moves in given scenarios. The second group used a simple program that gave no assistance. Initially, those using the helpful software made rapid progress, but over time those using the more basic software made fewer wrong moves and solved the puzzle more efficiently. The psychologist running the study concluded that those who received less assistance were more likely to develop a better understanding of the game’s rules and strategize accordingly.
As all good teachers know, students need to experience confusion and struggle in order to internalize certain principles. That’s why teachers avoid rushing in to assist students at the first hint of incomprehension. It’s neither necessary nor possible to abolish calculators and spell-check programs in classrooms, but periodically removing these tools can help ensure that students use technology to free their minds for more interesting tasks, not because they can’t spell or compute without assistance.
Carr notes that the word “robot” derives from robota, a Czech term for servitude. His book is a valuable reminder that if we don’t carefully examine the process that makes us dependent on technology, our position in the master-servant relationship can become the opposite of what we imagine.