The Mysteriously Memorable 20s
Why do we remember more from young adulthood than from any other time of our lives?
Twentysomethings are having a moment. They’re inspiring self-help guides (see Meg Jay’s The Defining Decade: Why Your Twenties Matter—And How To Make the Most of Them Now), hit television shows, Tumblrs-turned-handbooks, and lyrical New Yorker think pieces. “What is it about twentysomethings?” Robin Henig asked in the New York Times Magazine not too long ago. In part, she was talking about the current crop of young adults. They are dreamy—they have their own fairy tales!—but also deflated and recession-squeezed; peculiarly savvy and adrift, connected and lonely, knowing and naïve. But she was also voicing a perennial obsession. What is it about twentysomethings in general? Why are we so fixated on the no-man’s-land between childhood and stable adulthood?
A little-known but robust line of research shows that there really is something deeply, weirdly meaningful about this period. It plays an outsize role in how we structure our expectations, stories, and memories. The basic finding is this: We remember more events from late adolescence and early adulthood than from any other stage of our lives. This phenomenon is called the reminiscence bump.
Memory researchers have been wrestling with the reminiscence bump since at least the 1980s, when studies began turning up evidence that memory has a peculiar affinity for events that happen during the third decade of life. They still aren’t completely sure what causes us to drench those years with special import, whether it’s the intrinsic qualities of events that happen within that time frame, a consequence of the way our 20-year-old brains encode information, or a recall strategy that arbitrarily favors milestones from our salad days.
Autobiographical memories are not distributed equally across the lifespan. Instead, people tend to experience a period of childhood amnesia between birth and age 5, a reminiscence bump between age 10 and age 30 (with a particular concentration of memories in the early 20s), and, at any age, a vivid period of recency extending from the present back to the end of the reminiscence bump.
At first, researchers proposed that the reminiscence bump coincided with a phase of developing mental firepower. Young adults encoded more information about the world because they were using state-of-the-art biological equipment, the theory went—relatively fresh and agile minds. As cognitive function declined with age, the flood of recorded memories would naturally slow to a trickle, though recent experiences would remain accessible.
Going deeper into the mechanisms of recall, scientists also noted that the brain transcribes novel experiences more readily than mundane ones. For instance, a 1988 study found that 93 percent of vivid life memories concern unique or first-time events. Might the reminiscence bump reflect the fact that late adolescence and early adulthood are suffused with “firsts” (first relationship, first time leaving home, first job, first marriage, first child)?
I called up Joshua Foer, author of Moonwalking With Einstein: The Art and Science of Remembering Everything, for his take. The effect “does seem related to how adulthood is structured,” he allowed. One’s 20s form “the period that’s the most varied and exciting; that’s when you’re hitchhiking across the country, going on lots of dates, having interesting encounters and learning about things for the first time.”
“You’re going to remember your trip hiking across Peru,” Foer continued, “more than the year you spent sitting in your office doing the same job you’d been doing for the past five years.”
Yet the cognitive account of the reminiscence bump leaves many questions unanswered. It doesn’t explain why only a small portion of the memories that constitute the bump relate to novel experiences. Nor can a hypothesis grounded in mnemonic processes say much about the results of a 2010 study by Annette Bohn and Dorthe Berntsen, who created a form of reminiscence bump in schoolchildren without asking them to remember a thing. They asked a large group of students, aged 10 to 14, to write their life stories. Most of the future events the kids dreamed up clustered around young adulthood. If the reminiscence bump were merely an offshoot of how our brains store memories, the researchers argued, the children wouldn’t have also privileged their 20s when projecting ahead.
Such findings lend credence to an alternate theory about the bump, one soaked in what’s become known as the “narrative perspective.” This approach focuses not on the mechanics of memory but on its underlying motivational factors. It suggests that we organize remembered events in ways that help us understand who we are.
In 2002, following in the footsteps of the narrative theorists, Berntsen and David Rubin advanced a “life script” account of the reminiscence bump. They defined the life script as a culturally conditioned storyline of events that make up a skeletal life course—and claimed that people often consult such a template when asked to remember their pasts. According to the researchers, the benchmark moments in life scripts (called “slots”) have two relevant features: They’re usually happy—childbirth, one’s wedding day—and they occur with greater frequency during one’s 20s and late teens. (Why happy? Perhaps because joyful scenarios are more fun to rehearse and also help us regulate our moods. Or perhaps because we are reluctant, collectively, to acknowledge that our time on earth will contain suffering as well as pleasure.) Most of the memories that arise during the reminiscence bump also have a positive valence; such results would seem to support the idea that our recall strategies depend at least partially on socially inflected hunches about what the typical life entails.
Except—what about the recollections of events that don’t fall within the parameters of the life script, the sad or unexpected ones? Why do these memories sometimes return with such laser-cut sharpness? A survey of older Bangladeshi men and women taken in 2000 revealed two reminiscence bumps: one in the typical 10-30 age range and one between the ages of 35 and 55. The scientists linked the second bump to a sorrow-tinged accident of history that the participants’ life scripts failed to predict: the Bangladesh Liberation War, when West Pakistani forces clamped down on protesters in what was then East Pakistan. The strength of the wartime memories couldn’t be entirely explained by the fact that those years were dramatic and frightening. Something else made the survivors’ painful memories so persistent—and that something might shed light on the standard reminiscence bump in ways the life script couldn’t.
The Bangladesh study set the stage for yet another explanation of our memory processes, one based on “self-defining episodes.” It could be that we favor recollections, whatever their emotional charge, that reinforce who we think we are. Integral to this account is the notion that identity and memory are interfused: Our self-image hangs on the experiences we salvage from the past, and we select certain moments for safekeeping because of how they anchor our self-image. Maybe the fact that the Bangladeshi survey takers remembered a period of political strife so lucidly meant that they found it personally formative. Likewise, if we recall more experiences from young adulthood, perhaps that’s because our sense of self glimmers between those early milestones.
To test this theory, a team of scientists from England’s University of Leeds devised a clever experiment. Noting that developmental psychologists have isolated the second and third decades as times of identity formation, they gathered a group of volunteers and tried to map the emergence of their self-perceptions. Participants were asked to complete 20 “I am” statements (e.g., “I am quick-tempered”; “I am a mother”). Then they were instructed to pick three statements and come up with 10 memories that seemed relevant to each. Finally, the volunteers were told to pinpoint as best they could the ages at which their three personality traits surfaced. If it’s true that we remember more assiduously during bursts of self-making—and that these self-making periods tend to span our late teens and early 20s—a few things should happen, the researchers reasoned. First, participants should frequently date the unfurling of their “I am” statements to young adulthood. Second, the memories they summoned to support each “I am” statement should constellate around the age at which they believed the “I am” statement started to apply.
That was exactly what transpired. A majority of the memories associated with a particular self-image came from the very same year that the self-image developed. It seemed clear that the more salient a past experience was to your identity, the more luminous it grew in your memory. And what turned out to be the median age at which all these traits and self-concepts were acquired? 22.9.
In the pilot of Girls, Lena Dunham’s Hannah Horvath delivers one of the 2012 fall TV season’s immortal lines. She can’t hang out with her parents, she explains exasperatedly, because “I have work, and then I have a dinner thing, and then I am busy—trying to become who I am.” Memory research supports this notion that our twentysomething years are gardens of self-creation (or ramen noodle cups full of self-creation, if we’re honest). No wonder the decade is so, well, memorable.
A delightfully named duo of scientists, Judith Gluck and Susan Bluck, have proposed that the convergence of three qualities can make an event indelible in our minds. It is joyous. It allows us to exert control. And we perceive it to be highly influential over the course of our lives. They add that all of these qualities fit into a narrative, identity-based account of the reminiscence bump, since we are eager to thread our autobiographies with positive experiences, to assert our agency, and to array the flux of our being across scaffolds of cause and effect.
Foer, too, seems most persuaded by the identity-based approach to the reminiscence bump. “The fact is that in this period [one’s 20s], you are becoming the person you’re going to be,” he says. And he describes a study in which researchers found that most movie adaptations and remakes occur exactly 20 years after the originals come out. Apparently, whatever touches people as young adults looms so large for the rest of their lives that when they reach the age at which their generation starts to create the culture—around 40—books and screens fill up with the arcana of 20 years ago. “So look out for a new Teenage Mutant Ninja Turtles film any day now,” Foer finished. Consider yourself warned.
This is hardly a stopping place for research into such peculiar memory hotspots, of course. Many questions have yet to be answered. Do we have more experiences that fit Gluck and Bluck’s criteria in our 20s? Does something about being young make us relate the external world more intensely to our inner lives, so that we see everyday episodes as self-defining? But as the field continues to grow, perhaps we should at least stop worrying about how the glory days flash by. Quicksilver as they seem, our 20s are destined to stick with us for a long time.
Katy Waldman is a Slate assistant editor.