In 2002, following in the footsteps of the narrative theorists, Dorthe Berntsen and David Rubin advanced a “life script” account of the reminiscence bump. They defined the life script as a culturally conditioned storyline of events that make up a skeletal life course—and claimed that people often consult such a template when asked to remember their pasts. According to the researchers, the benchmark moments in life scripts (called “slots”) have two relevant features: They’re usually happy—childbirth, one’s wedding day—and they occur with greater frequency during one’s late teens and 20s. (Why happy? Perhaps because joyful scenarios are more fun to rehearse and also help us regulate our moods. Or perhaps because we are reluctant, collectively, to acknowledge that our time on earth will contain suffering as well as pleasure.) Most of the memories that arise during the reminiscence bump also have a positive valence; such results would seem to support the idea that our recall strategies depend at least partially on socially inflected hunches about what the typical life entails.
Except—what about the recollections of events that don’t fall within the parameters of the life script, the sad or unexpected ones? Why do these memories sometimes return with such laser-cut sharpness? A survey of older Bangladeshi men and women taken in 2000 revealed two reminiscence bumps: one in the typical 10-30 age range and one between the ages of 35 and 55. The scientists linked the second bump to a sorrow-tinged accident of history that the participants’ life scripts failed to predict: the Bangladesh Liberation War, when West Pakistani forces clamped down on protesters in what was then East Pakistan. The strength of the wartime memories couldn’t be entirely explained by the fact that those years were dramatic and frightening. Something else made the survivors’ painful memories so persistent—and that something might shed light on the standard reminiscence bump in ways the life script couldn’t.
The Bangladesh study set the stage for yet another explanation of our memory processes, one based on “self-defining episodes.” It could be that we favor recollections, whatever their emotional charge, that reinforce who we think we are. Integral to this account is the notion that identity and memory are interfused: Our self-image hangs on the experiences we salvage from the past, and we select certain moments for safekeeping because of how they anchor our self-image. Maybe the fact that the Bangladeshi survey participants remembered a period of political strife so lucidly meant that they found it personally formative. Likewise, if we recall more experiences from young adulthood, perhaps that’s because our sense of self glimmers between those early milestones.
To test this theory, a team of scientists from England’s University of Leeds devised a clever experiment. Noting that developmental psychologists have isolated the second and third decades as times of identity formation, they gathered a group of volunteers and tried to map the emergence of their self-perceptions. Participants were asked to complete 20 “I am” statements (e.g., “I am quick-tempered”; “I am a mother”). Then they were instructed to pick three statements and come up with 10 memories that seemed relevant to each. Finally, the volunteers were told to pinpoint as best they could the ages at which their three personality traits surfaced. If it’s true that we remember more assiduously during bursts of self-making—and that these self-making periods tend to span our late teens and early 20s—a few things should happen, the researchers reasoned. First, participants should frequently date the unfurling of their “I am” statements to young adulthood. Second, the memories they summoned to support each “I am” statement should constellate around the age at which they believed the “I am” statement started to apply.
That was exactly what transpired. A majority of the memories associated with a particular self-image came from the very same year that the self-image developed. It seemed clear that the more salient a past experience was to your identity, the more luminous it grew in your memory. And what turned out to be the median age at which all these traits and self-concepts were acquired? 22.9.
In the pilot of Girls, Lena Dunham’s Hannah Horvath delivers one of the 2012 fall TV season’s immortal lines. She can’t hang out with her parents, she explains exasperatedly, because “I have work, and then I have a dinner thing, and then I am busy—trying to become who I am.” Memory research supports this notion that our twentysomething years are gardens of self-creation (or ramen noodle cups full of self-creation, if we’re honest). No wonder the decade is so, well, memorable.
A delightfully named duo of scientists, Judith Glück and Susan Bluck, have proposed that the convergence of three qualities can make an event indelible in our minds. It is joyous. It allows us to exert control. And we perceive it to be highly influential over the course of our lives. They add that all of these qualities fit into a narrative, identity-based account of the reminiscence bump, since we are eager to thread our autobiographies with positive experiences, to assert our agency, and to array the flux of our being across scaffolds of cause and effect.
Foer, too, seems most persuaded by the identity-based approach to the reminiscence bump. “The fact is that in this period [one’s 20s], you are becoming the person you’re going to be,” he says. And he describes a study in which researchers found that most movie adaptations and remakes occur exactly 20 years after the originals come out. Apparently, whatever touches people as young adults looms so large for the rest of their lives that when they reach the age at which their generation starts to create the culture—around 40—books and screens fill up with the arcana of 20 years ago. “So look out for a new Teenage Mutant Ninja Turtles film any day now,” Foer finishes. Consider yourself warned.
This is hardly a stopping place for research into such peculiar memory hotspots, of course. Many questions have yet to be answered. Do we have more experiences that fit Glück and Bluck’s criteria in our 20s? Does something about being young make us relate the external world more intensely to our inner lives, so that we see everyday episodes as self-defining? But as the field continues to grow, perhaps we should at least stop worrying about how the glory days flash by. Quicksilver as they seem, our 20s are destined to stick with us for a long time.