Friday, January 25, 2013

Al Gore on the Internet

Al Gore on How the Internet is Changing the Way We Think

By Al Gore

In an excerpt from his new book, The Future, the Nobel Prize winner and former vice president talks global networks, Marshall McLuhan, and how computing is changing what it means to be human.


Technology and the "World Brain"



Writers have used the human nervous system to describe electronic communication since the invention of the telegraph. In 1851, only seven years after Samuel Morse received the message "What hath God wrought?" Nathaniel Hawthorne wrote: "By means of electricity, the world of matter has become a great nerve vibrating thousands of miles in a breathless point of time. The round globe is a vast brain, instinct with intelligence." Less than a century later, H. G. Wells modified Hawthorne's metaphor when he proposed developing a "world brain" -- which he described as a commonwealth of all the world's information, accessible to all the world's people as "a sort of mental clearinghouse for the mind: a depot where knowledge and ideas are received, sorted, summarized, digested, clarified and compared." In the way Wells used the phrase "world brain," what began as a metaphor is now a reality. You can look it up right now on Wikipedia or search the World Wide Web on Google for some of the estimated one trillion web pages.



Since the nervous system connects to the human brain and the brain gives rise to the mind, it was understandable that one of the twentieth century's greatest theologians, Pierre Teilhard de Chardin, would modify Hawthorne's metaphor yet again. In the 1950s, he envisioned the "planetization" of consciousness within a technologically enabled network of human thoughts that he termed the "Global Mind." And while the current reality may not yet match Teilhard's expansive meaning when he used that provocative image, some technologists believe that what is emerging may nevertheless mark the beginning of an entirely new era. To paraphrase Descartes, "It thinks; therefore it is." [1]



The supercomputers and software in use have all been designed by human beings, but as Marshall McLuhan once said, "We shape our tools, and thereafter, our tools shape us." Since the global Internet and the billions of intelligent devices and machines connected to it -- the Global Mind -- represent what is arguably the most powerful tool that human beings have ever used, it should not be surprising that it is beginning to reshape the way we think in ways both trivial and profound -- but in every case sweeping and ubiquitous.



In the same way that multinational corporations have become far more efficient and productive by outsourcing work to other countries and robosourcing work to intelligent, interconnected machines, we as individuals are becoming far more efficient and productive by instantly connecting our thoughts to computers, servers, and databases all over the world. Just as radical changes in the global economy have been driven by a positive feedback loop between outsourcing and robosourcing, the spread of computing power and the increasing number of people connected to the Internet are mutually reinforcing trends. Just as Earth Inc. is changing the role of human beings in the production process, the Global Mind is changing our relationship to the world of information.





The change being driven by the wholesale adoption of the Internet as the principal means of information exchange is simultaneously disruptive and creative. The futurist Kevin Kelly says that our new technological world -- infused with intelligence -- more and more resembles "a very complex organism that often follows its own urges." In this case, the large complex system includes not only the Internet and the computers, but also us.



Consider the impact on conversations. Many of us now routinely reach for smartphones to find the answers to questions that arise at the dinner table by searching the Internet with our fingertips. Indeed, many now spend so much time on their smartphones and other mobile Internet-connected devices that oral conversation sometimes almost ceases. As Sherry Turkle, a distinguished philosopher of the Internet, recently wrote, we are spending more and more time "alone together."



The deeply engaging and immersive nature of online technologies has led many to ask whether their use might be addictive for some people. The Diagnostic and Statistical Manual of Mental Disorders (DSM), when it is updated in May 2013, will include "Internet Use Disorder" in its appendix for the first time, as a category targeted for further study. There are an estimated 500 million people in the world now playing online games at least one hour per day. In the United States, the average person under the age of twenty-one now spends almost as much time playing online games as they spend in classrooms from the sixth through twelfth grades. And it's not just young people: the average online social games player is a woman in her mid-forties. An estimated 55 percent of those playing social games in the U.S. -- and 60 percent in the U.K. -- are women. (Worldwide, women also generate 60 percent of the comments and post 70 percent of the pictures on Facebook.)



Of Memory, "Marks," and the Gutenberg Effect



Although these changes in behavior may seem trivial, the larger trend they illustrate is anything but. One of the most interesting debates among experts who study the relationship between people and the Internet is over how we may be adapting the internal organization of our brains -- and the nature of consciousness -- to the amount of time we are spending online.



Human memory has always been affected by each new advance in communications technology. Psychological studies have shown that when people are asked to remember a list of facts, those told in advance that the facts will later be retrievable on the Internet do not remember the list as well as a control group not informed that the facts could be found online. Similar studies have shown that regular users of GPS devices begin to lose some of their innate sense of direction.



The implication is that many of us use the Internet -- and the devices, programs, and databases connected to it -- as an extension of our brains. This is not a metaphor; the studies indicate that it is a literal reallocation of mental energy. In a way, it makes sense to conserve our brain capacity by storing only the meager data that will allow us to retrieve facts from an external storage device. Or at least Albert Einstein thought so, once remarking: "Never memorize what you can look up in books."



For half a century neuroscientists have known that specific neuronal pathways grow and proliferate when used, while the disuse of neuron "trees" leads to their shrinkage and gradual loss of efficacy. Even before those discoveries, McLuhan described the process metaphorically, writing that when we adapt to a new tool that extends a function previously performed by the mind alone, we gradually lose touch with our former capacity because a "built-in numbing apparatus" subtly anesthetizes us to accommodate the attachment of a mental prosthetic connecting our brains seamlessly to the enhanced capacity inherent in the new tool.



In Plato's Phaedrus, when the Egyptian god Theuth tells one of the kings of Egypt, Thamus, that the new communications technology of the age -- writing -- will allow people to remember much more than previously, the king disagrees, saying, "It will implant forgetfulness in their souls: they will cease to exercise memory because they rely on that which is written, calling things to remembrance no longer from within themselves, but by means of external marks." [2]



So this dynamic is hardly new. What is profoundly different about the combination of Internet access and mobile personal computing devices is that the instantaneous connection between an individual's brain and the digital universe is so easy that a habitual reliance on external memory (or "exomemory") can become an extremely common behavior. The more common this behavior becomes, the greater one comes to rely on exomemory -- and the less one relies on memories stored in the brain itself. What becomes more important instead are the "external marks" referred to by Thamus 2,400 years ago. Indeed, one of the new measures of practical intelligence in the twenty-first century is the ease with which someone can quickly locate relevant information on the Internet.



Human consciousness has always been shaped by external creations. What makes human beings unique among, and dominant over, life-forms on Earth is our capacity for complex and abstract thought. Since the emergence of the neocortex in roughly its modern form around 200,000 years ago, however, the trajectory of human dominion over the Earth has been defined less by further developments in human physical evolution and more by the evolution of our relationship to the tools we have used to augment our leverage over reality.



Scientists disagree over whether the use of complex speech by humans emerged rather suddenly with a genetic mutation or whether it developed more gradually. But whatever its origin, complex speech radically changed the ability of humans to use information in gaining mastery over their circumstances by enabling us for the first time to communicate more intricate thoughts from one person to others. It also arguably represented the first example of the storing of information outside the human brain. And for most of human history, the spoken word was the principal "information technology" used in human societies.



The long hunter-gatherer period is associated with oral communication. The first use of written language is associated with the early stages of the Agricultural Revolution. The progressive development and use of more sophisticated tools for written language -- from stone tablets to papyrus to vellum to paper, from pictograms to hieroglyphics to phonetic alphabets -- is associated with the emergence of complex civilizations in Mesopotamia, Egypt, China, India, the Mediterranean, and Central America.



The perfection by the ancient Greeks of the alphabet first devised by the Phoenicians led to a new way of thinking that explains the sudden explosion in Athens during the fifth and fourth centuries BCE of philosophical discourse, dramatic theater, and sophisticated concepts like democracy. Compared to hieroglyphics, pictographs, and cuneiform, the abstract shapes that made up the Greek alphabet -- like those that make up all modern Western alphabets -- have no more inherent meaning in themselves than the ones and zeros of digital code. But when they are arranged and rearranged in different combinations, they can be assigned gestalt meanings. The internal reorganization of the brain necessary to adapt to this new communications tool has been associated with the distinctive difference historians find in the civilization of ancient Greece compared to all of its predecessors.
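
That analogy to the ones and zeros of digital code is exact: a string of bits has no inherent meaning either, until a shared convention assigns it one. A minimal sketch in Python (using the ASCII/Unicode convention; the example itself is illustrative, not drawn from the text) makes the point:

    # A bit pattern means nothing in itself; a shared convention
    # such as ASCII assigns it a meaning.
    bits = "01000001"
    value = int(bits, 2)   # read as a number: 65
    char = chr(value)      # read under ASCII/Unicode: 'A'
    print(bits, "->", value, "->", char)

    # Rearranged combinations of the same two symbols are assigned
    # entirely different meanings, just as letters combine into words.
    word = ["01000111", "01101111", "01110010", "01100101"]
    print("".join(chr(int(b, 2)) for b in word))  # prints: Gore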



The use of this new form of written communication led to an increased ability to store the collective wisdom of prior generations in a form that was external to the brain but nonetheless accessible. Later advances -- particularly the introduction of the printing press in the fourteenth century (in Asia) and the fifteenth century (in Europe) -- were also associated with a further expansion of the amount of knowledge stored externally and a further increase in the ease with which a much larger percentage of the population could gain access to it. With the introduction of print, the exponential curve that measures the complexity of human civilization suddenly bent upward at a sharply steeper angle. Our societies changed; our culture changed; our commerce changed; our politics changed.



Prior to the emergence of what McLuhan described as the Gutenberg Galaxy, most Europeans were illiterate. Their relative powerlessness was driven by their ignorance. Most libraries consisted of a few dozen hand-copied books, sometimes chained to the desks, written in a language that for the most part only the monks could understand. Access to the knowledge contained in these libraries was effectively restricted to the ruling elites in the feudal system, which wielded power in league with the medieval church, often by force of arms. The ability conferred by the printing press to capture, replicate, and distribute en masse the collected wisdom of preceding ages touched off the plethora of advances in information sharing that led to the modern world.



Less than two generations after Gutenberg's press came the Voyages of Discovery. When Columbus returned from the Bahamas, eleven print editions of the account of his journey captivated Europe. Within a quarter century sailing ships had circumnavigated the globe, bringing artifacts and knowledge from North, South, and Central America, Asia, and previously unknown parts of Africa.



In that same quarter century, the mass distribution of the Christian Bible in German and then other popular languages led to the Protestant Reformation (which was also fueled by Martin Luther's moral outrage over the print-empowered bubble in the market for indulgences, including the exciting new derivatives product: indulgences for sins yet to be committed). Luther's Ninety-Five Theses, nailed to the door of the church in Wittenberg in 1517, were written in Latin, but thousands of copies distributed to the public were printed in German. Within a decade, more than six million copies of various Reformation pamphlets had been printed, more than a quarter of them written by Luther himself.



The proliferation of texts in languages spoken by the average person triggered a series of mass adaptations to the new flow of information, setting off a wave of literacy that began in Northern Europe and moved southward. In France, as the wave began to crest, the printing press was denounced as "the work of the Devil." But as popular appetites grew for the seemingly limitless information that could be conveyed in the printed word, the ancient wisdom of the Greeks and Romans became accessible. The resulting explosion of thought and communication stimulated the emergence of a new way of thinking about the legacy of the past and the possibilities of the future.



The mass distribution of knowledge about the world of the present began to shake the foundations of the feudal order. The modern world that is now being transformed in kind rather than degree rose out of the ruins of the civilization that we might say was creatively destroyed by the printing press. The Scientific Revolution began less than a hundred years after Gutenberg's Bible, with the publication of Nicolaus Copernicus's On the Revolutions of the Heavenly Spheres (a copy of which he received fresh from the printer on his deathbed). Less than a century later Galileo confirmed heliocentrism. A few years after that came Descartes's "Clockwork Universe." And the race was on.



Challenges to the primacy of the medieval church and the feudal lords became challenges to the absolute rule of monarchs. Merchants and farmers began to ask why they could not exercise some form of self-determination based on the knowledge now available to them. A virtual "public square" emerged, within which ideas were exchanged by individuals. The Agora of ancient Athens and the Forum of the Roman Republic were physical places where the exchange of ideas took place, but the larger virtual forum created by the printing press mimicked important features of its predecessors in the ancient world.



Improvements to the printing press led to lower costs and the proliferation of printers looking for material to publish. Entry barriers were very low, both for obtaining the printed works of others and for contributing one's own thoughts. Soon the demand for knowledge led to modern works -- from Cervantes and Shakespeare to journals and then newspapers. Ideas that found resonance with large numbers of people attracted a larger audience still -- in the manner of a Google search today.



In the Age of Enlightenment that ensued, knowledge and reason became a source of political power that rivaled wealth and force of arms. The possibility of self-governance within a framework of representative democracy was itself an outgrowth of this new public square created within the information ecosystem of the printing press. Individuals with the freedom to read and communicate with others could make decisions collectively and shape their own destiny.



At the beginning of January 1776, Thomas Paine -- who had migrated from England to Philadelphia with no money, no family connections, and no source of influence other than an ability to express himself clearly in the printed word -- published Common Sense, the pamphlet that helped propel the American colonies to declare their independence that July. The theory of modern free market capitalism, codified by Adam Smith in the same year, operated according to the same underlying principles. Individuals with free access to information about markets could freely choose to buy or sell -- and the aggregate of all their decisions would constitute an "invisible hand" to allocate resources, balance supply with demand, and set prices at an optimal level to maximize economic efficiency. It is fitting that the first volume of Gibbon's Decline and Fall of the Roman Empire was also published in the same year. Its runaway popularity was a counterpoint to the prevailing exhilaration about the future. The old order was truly gone; the present generation was busy making the world new again, with new ways of thinking and new institutions shaped by the print revolution.



It should not surprise us, then, that the Digital Revolution, which is sweeping the world much faster and more powerfully than the Print Revolution did in its time, is ushering in another wave of new societal, cultural, political, and commercial patterns that are beginning to make our world new yet again. As dramatic as the changes wrought by the Print Revolution were (and as were those wrought earlier by the introduction of complex speech, writing, and phonetic alphabets), none of these previous waves of change remotely compares with what we are now beginning to experience as a result of today's emergent combination of nearly ubiquitous computing and access to the Internet. Computers have been roughly doubling in processing power (per dollar spent) every eighteen to twenty-four months for the last half-century. This remarkable pattern -- which follows Moore's Law -- has continued in spite of periodic predictions that it would soon run its course. Though some experts believe that Moore's Law may finally expire over the next decade, others believe that new advances such as quantum computing will lead to continued rapid increases in computing power.
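
The cumulative effect of such doubling is easy to understate. As a back-of-the-envelope sketch (assuming an idealized, perfectly constant doubling period, which real hardware only approximates), the compounding can be worked out in a few lines of Python:

    # Idealized Moore's Law compounding: processing power per dollar
    # doubles once every fixed interval.
    def moores_law_factor(years, months_per_doubling=24.0):
        doublings = years * 12.0 / months_per_doubling
        return 2.0 ** doublings

    # Fifty years at one doubling every two years yields roughly a
    # 33-million-fold increase; at one doubling every eighteen months,
    # roughly ten-billion-fold.
    print(f"{moores_law_factor(50, 24):,.0f}")   # 33,554,432
    print(f"{moores_law_factor(50, 18):.2e}")    # ~1.08e+10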



Our societies, culture, politics, commerce, educational systems, ways of relating to one another -- and our ways of thinking -- are all being profoundly reorganized with the emergence of the Global Mind and the growth of digital information at exponential rates. The annual production and storage of digital data by companies and individuals is 60,000 times more than the total amount of information contained in the Library of Congress. By 2011, the amount of information created and replicated had grown by a factor of nine in just five years. (The amount of digital storage capacity did not surpass analog storage until 2002, but within only five years the percentage of information stored digitally grew to 94 percent of all stored information.) Two years earlier, the volume of data transmitted from mobile devices had already exceeded the total volume of all voice data transmitted. Not coincidentally, from 2003 to 2010, the average telephone call grew shorter by almost half, from three minutes to one minute and forty-seven seconds.
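
For perspective, growth "by a factor of nine in just five years" implies a compound rate of roughly 55 percent per year. A short calculation (assuming a constant rate, purely for illustration) makes that explicit:

    # Implied compound annual growth rate from "a factor of nine
    # in five years," assuming the rate is constant.
    factor, years = 9.0, 5.0
    annual_rate = factor ** (1.0 / years) - 1.0
    print(f"{annual_rate:.0%} per year")  # 55% per year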



The number of people worldwide connected to the Internet doubled between 2005 and 2010 and in 2012 reached 2.4 billion users globally. By 2015, there will be as many mobile devices as there are people in the world. The number of mobile-only Internet users is expected to increase 56-fold over the next five years. Aggregate information flow using smartphones is projected to increase 47-fold over the same period. Smartphones already have captured more than half of the mobile phone market in the United States and many other developed countries.



But this is not just a phenomenon in wealthy countries. Although computers and tablets are still more concentrated in advanced nations, the reduction in the cost of computing power and the proliferation of smaller, more mobile computing devices is spreading access to the Global Mind throughout the world. More than 5 billion of the 7 billion people in the world now have access to mobile phones. In 2012, there were 1.1 billion active smartphone users worldwide -- still under one fifth of the global market. While smartphones capable of connecting to the Internet are still priced beyond the reach of the majority of people in developing countries, the same relentless cost reductions that have characterized the digital age since its inception are now driving the migration of smart features and Internet connectivity into affordable versions of low-end smartphones that will soon be nearly ubiquitous.



Already, the perceived value of being able to connect to the Internet has led to the labeling of Internet access as a new "human right" in a United Nations report. Nicholas Negroponte has led one of two competing global initiatives to provide an inexpensive ($100 to $140) computer or tablet to every child in the world who does not have one. This effort to close the "information gap" also follows a pattern that began in wealthy countries. For example, the United States dealt with concerns in the 1990s about a gap between "information haves" and "information have-nots" by passing a new law that subsidized the connection of every school and library to the Internet.



The behavioral changes driven by the digital revolution in developed countries also have at least some predictive value for the changes now in store for the world as a whole. According to a survey by Ericsson, 40 percent of smartphone owners connect to the Internet immediately upon awakening -- even before they get out of bed. And that kick-starts a behavioral pattern that extends throughout their waking hours. While they are driving to work in the morning, for example, they encounter one of the new hazards to public health and safety: the use of mobile communications devices by people who email, text, play games, and talk on the phone while simultaneously trying to operate their cars and trucks.



In one extreme example of this phenomenon, a commercial airliner flew ninety minutes past its scheduled destination because both the pilot and copilot were absorbed with their personal laptops in the cockpit, oblivious as more than twelve air traffic controllers in three different cities tried to get their attention -- and as the North American Aerospace Defense Command readied fighter jets to intercept the plane -- before the distracted pilots finally disengaged from their computers.



The popularity of the iPhone and the amount of time people spend communicating over its videoconferencing feature, FaceTime, have caused a few people to modify the appearance of their faces in order to adapt to the new technology. Plastic surgeon Robert K. Sigal reported that "patients come in with their iPhones and show me how they look on FaceTime. The angle at which the phone is held, with the caller looking downward into the camera, really captures any heaviness, fullness and sagging of the face and neck. People say, 'I never knew I looked like that! I need to do something!' I've started calling it the 'FaceTime Facelift' effect. And we've developed procedures to specifically address it."





--------------------------------------------------------------------------------



[1] There is considerable debate and controversy over when -- and even whether -- artificial intelligence will reach a stage of development at which its ability to truly "think" is comparable to that of the human brain. The analysis presented in this chapter is based on the assumption that such a development is still speculative and will probably not arrive for several decades at the earliest. Resolving the disagreement over whether it will arrive at all requires a level of understanding about the nature of consciousness that scientists have not yet reached. Supercomputers have already demonstrated some capabilities that are far superior to those of human beings and are effectively making some important decisions for us -- handling high-frequency algorithmic trading on financial exchanges, for example -- and discerning previously hidden complex relationships within very large amounts of data.



[2] The memory bank of the Internet is deteriorating through a process that Vint Cerf, a close friend who is often described as a "father of the Internet" (along with Robert Kahn, with whom he co-developed the TCP/IP protocol that allows computers and devices on the Internet to link with one another), calls "bit rot" -- information disappears either because newer software can't read older, complex file formats or because the URL that the information is linked to is not renewed. Cerf calls for a "digital vellum" -- a reliable and survivable medium to preserve the Internet's memory.
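
One facet of bit rot, the decay of URLs, is easy to observe directly. The sketch below (standard-library Python; the URL shown is a placeholder, not one cited by Cerf) simply asks whether a link still answers at all:

    # Minimal link-rot probe: does a URL still resolve and respond?
    # Illustrative only; archival-grade preservation checks are far
    # more involved than this.
    import urllib.error
    import urllib.request

    def is_link_alive(url, timeout=5.0):
        try:
            request = urllib.request.Request(url, method="HEAD")
            with urllib.request.urlopen(request, timeout=timeout):
                return True
        except (urllib.error.URLError, OSError):
            return False

    print(is_link_alive("https://example.com/"))  # placeholder URL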



From the book THE FUTURE, by Al Gore, to be published by Random House this month. Copyright © 2013 by Albert Gore, Jr. Reprinted by arrangement with Random House. All rights reserved.






