SEVENTY-FOUR THOUSAND YEARS ago, humanity nearly went extinct. A super-volcano at what's now Lake Toba, in Sumatra, erupted with a strength more than a thousand times that of Mount St. Helens in 1980. Some 800 cubic kilometers of ash filled the skies of the Northern Hemisphere, lowering global temperatures and pushing a climate already on the verge of an ice age over the edge. Some scientists speculate that as the Earth went into a deep freeze, the population of Homo sapiens may have dropped to as low as a few thousand families.
The Mount Toba incident, although unprecedented in magnitude, was part of a broad pattern. For a period of 2 million years, ending with the last ice age around 10,000 B.C., the Earth experienced a series of convulsive glacial events. This rapid-fire climate change meant that humans couldn't rely on consistent patterns to know which animals to hunt, which plants to gather, or even which predators might be waiting around the corner.
How did we cope? By getting smarter. The neurophysiologist William Calvin argues persuasively that modern human cognition, including sophisticated language and the capacity to plan ahead, evolved in response to the demands of this long age of turbulence. According to Calvin, the reason we survived is that our brains changed to meet the challenge: we transformed the ability to target a moving animal with a thrown rock into a capability for foresight and long-term planning. In the process, we may have developed syntax and formal structure from our simple language.
Our present century may not be quite as perilous for the human race as an ice age in the aftermath of a super-volcano eruption, but the next few decades will pose enormous hurdles that go beyond the climate crisis. The end of the fossil-fuel era, the fragility of the global food web, growing population density, and the spread of pandemics, as well as the emergence of radically transformative bio- and nanotechnologies: each of these threatens us with broad disruption or even devastation. And as good as our brains have become at planning ahead, we're still biased toward looking for near-term, simple threats. Subtle, long-term risks, particularly those involving complex, global processes, remain devilishly hard for us to manage.
But here's an optimistic scenario for you: if the next several decades are as bad as some of us fear they could be, we can respond, and survive, the way our species has done time and again: by getting smarter. But this time, we don't have to rely solely on natural evolutionary processes to boost our intelligence. We can do it ourselves.
Most people don't realize that this process is already under way. In fact, it's happening all around us, across the full spectrum of how we understand intelligence. It's visible in the hive mind of the Internet, in the powerful tools for simulation and visualization that are jump-starting new scientific disciplines, and in the development of drugs that some people (myself included) have discovered let them study harder, focus better, and stay awake longer with full clarity. So far, these augmentations have largely been outside of our bodies, but they're very much part of who we are today: they're physically separate from us, but we and they are becoming cognitively inseparable. And advances over the next few decades, driven by breakthroughs in genetic engineering and artificial intelligence, will make today's technologies seem primitive. The nascent jargon of the field describes this as "intelligence augmentation." I prefer to think of it as "You+."
Scientists refer to the 12,000 years or so since the last ice age as the Holocene epoch. It encompasses the rise of human civilization and our co-evolution with tools and technologies that allow us to grapple with our physical environment. But if intelligence augmentation has the kind of impact I expect, we may soon have to start thinking of ourselves as living in an entirely new era. The focus of our technological evolution would be less on how we manage and adapt to our physical world, and more on how we manage and adapt to the immense amount of knowledge we've created. We can call it the Noöcene epoch, from Pierre Teilhard de Chardin's concept of the Noösphere, a collective consciousness created by the deepening interaction of human minds. As that epoch draws closer, the world is becoming a very different place.
OF COURSE, WE'VE been augmenting our ability to think for millennia. When we developed written language, we significantly increased our functional memory and our ability to share insights and knowledge across time and space. The same thing happened with the invention of the printing press, the telegraph, and the radio. The rise of urbanization allowed a fraction of the populace to focus on more-cerebral tasks, a fraction that grew inexorably as more-complex economic and social practices demanded more knowledge work, and industrial technology reduced the demand for manual labor. And caffeine and nicotine, of course, are both classic cognitive-enhancement drugs, primitive though they may be.
With every technological step forward, though, has come anxiety about the possibility that technology harms our natural ability to think. These anxieties were given eloquent expression in these pages by Nicholas Carr, whose essay "Is Google Making Us Stupid?" (July/August 2008 Atlantic) argued that the information-dense, hyperlink-rich, spastically churning Internet medium is effectively rewiring our brains, making it harder for us to engage in deep, relaxed contemplation.
Carr's fears about the impact of wall-to-wall connectivity on the human intellect echo cyber-theorist Linda Stone's description of "continuous partial attention," the modern phenomenon of having multiple activities and connections under way simultaneously. We're becoming so accustomed to interruption that we're starting to find focusing difficult, even when we've achieved a bit of quiet. It's an induced form of ADD, a "continuous partial attention-deficit disorder," if you will.
There's also just more information out there, because with the Internet, unlike with previous information media, creating material is nearly as easy as consuming it. And it's easy to mistake more voices for more noise. In reality, though, the proliferation of diverse voices may actually improve our overall ability to think. In Everything Bad Is Good for You, Steven Johnson argues that the increasing complexity and range of media we engage with have, over the past century, made us smarter, rather than dumber, by providing a form of cognitive calisthenics. Even pulp-television shows and video games have become extraordinarily dense with detail, filled with subtle references to broader subjects, and more open to interactive engagement. They reward the capacity to make connections and to see patterns, precisely the kinds of skills we need for managing an information glut.
Scientists describe these skills as our "fluid intelligence": the ability to find meaning in confusion and to solve new problems, independent of acquired knowledge. Fluid intelligence doesn't look much like the capacity to memorize and recite facts, the skills that people have traditionally associated with brainpower. But building it up may improve the capacity to think deeply that Carr and others fear we're losing for good. And we shouldn't let the stresses associated with a transition to a new era blind us to that era's astonishing potential. We swim in an ocean of data, accessible from nearly anywhere, generated by billions of devices. We're only beginning to explore what we can do with this knowledge-at-a-touch.
Moreover, the technology-induced ADD that's associated with this new world may be a short-term problem. The trouble isn't that we have too much information at our fingertips, but that our tools for managing it are still in their infancy. Worries about "information overload" predate the rise of the Web (Alvin Toffler coined the phrase in 1970), and many of the technologies that Carr worries about were developed precisely to help us get some control over a flood of data and ideas. Google isn't the problem; it's the beginning of a solution.
In any case, there's no going back. The information sea isn't going to dry up, and relying on cognitive habits evolved and perfected in an era of limited information flow (and limited information access) is futile. Strengthening our fluid intelligence is the only viable approach to navigating the age of constant connectivity.