Cathy N. Davidson's Now You See It: Do the young really rule in the Internet era?

Reading between the lines.
Aug. 22 2011 6:47 AM

Who's Afraid of Digital Natives?

Let's not get intimidated by kids and their Internet savvy.


There is the essential sameness, first of all, of the neural architecture of all humans, both young and old. According to Davidson, our habits of attention are entirely learned. Pre-Google adults learned to attend to static words printed on paper; our digital-age children have learned to attend to hypertext and multimedia. She fails to acknowledge, however, that we're all using the same basic equipment, shaped in specific ways by evolution. "What does it mean to say that we learn to pay attention?" Davidson writes. "It means no one is born with innate knowledge of how to focus or what to focus on. Infants track just about anything and everything and have no idea that one thing counts as more worthy of attention than another. They eventually learn because we teach them, from the day they are born, what we consider to be important enough to focus on."

This is simply wrong. Experiments by Andrew Meltzoff of the University of Washington and many others have shown that newborns only minutes old choose to focus on human faces over inanimate objects—a preference found across time and across cultures—and will even imitate simple gestures like sticking out the tongue. A study published earlier this summer found that the brains of infants as young as three months of age show greater activity when they hear sounds made by people than when they hear equally familiar sounds made by toys or water—evidence of the early development of a brain region specialized for inter-human communication. Throughout our lives, we focus on the prospects of danger, sex, and food, not because our parents taught us these things are important but because that's how we've been built to survive.

There are many other more subtle biases of the evolved human brain—its tendency to focus on the thing that changes rather than the thing that's constant, for example, or its predisposition to remember stories and pictures over abstract ideas—and their pervasive influence in shaping the way all of us focus and pay attention makes the idea that young people are "wired" completely differently seem rather facile. Our parents grew up listening to the radio, and we grew up watching TV; if our children grow up Facebook-ing and YouTube-ing, that's less a radical break than the elaboration of a technology that, after all, we created and shared with them.


Davidson neglects, too, the research showing the ways in which our attention is constrained by the cognitive capacities of our brains. She dismisses the value of single-minded focus and the concerns of students and workers who struggle to cope with the multiplying demands on their attention. "If we feel distracted, that is not a problem," she declares. "It is an awakening." What we call distractions, Davidson says, are actually "opportunities for learning." We must give up the old, 20th-century way of paying attention, suited to a vanishing industrial era, and adopt a new, 21st-century way of paying attention, tailored to a digital epoch. Her position ignores the inflexible and near-universal limits on our working memory, which allow us to hold only a few items of information in our consciousness at a time, and the work of researchers like Clifford Nass of Stanford University. "Human cognition is ill-suited both for attending to multiple input streams and for simultaneously performing multiple tasks," Nass has written. In other words, people are inherently lousy at multitasking. Contrary to the notion that those who've grown up multitasking a lot have learned to do it well, Nass's research has found that heavy multitaskers are actually less effective than others at filtering out irrelevant information and at shifting their attention among tasks.

If it seems that some young people are more adept than their elders at handling multiple streams of information—at, say, doing their homework while also emailing, texting, Googling, Digging, iTuning, and Angry Birding—that may be a developmental difference rather than a cultural one. As we grow older, our brains change in predictable ways. We're less able to block out distractions, less able to hold many facts in our working memory. Members of the Internet generation aren't some exotic new breed of human, in other words. They're simply the young of the same species. And they won't be young forever. The digital age has brought all of us new and exciting tools that will surely continue to alter the way we learn and work. But focusing one's attention, gathering and synthesizing evidence, and constructing a coherent argument are skills as necessary as they were before—in fact, more necessary than ever, given the swamp of baseless assertion and outright falsehood that is much of the Web. Some day not too far in the future, the digital natives may find themselves turning down the music, shutting off the flickering screen, silencing the buzzing phone and sitting down to do just one thing at a time.

Annie Murphy Paul is a fellow at the New America Foundation and the author of the forthcoming book Brilliant: The Science of How We Get Smarter.
