Lexicon Valley
A Blog About Language

Dec. 2 2016 9:00 AM

The Dad-ification of Language Shows That Masculinity Is Still Evolving

Media outlets groaned at the “dad jokes” President Obama cracked at his final turkey pardon this Thanksgiving. In the run-up to the election, BuzzFeed fawned over “how dad” vice presidential candidate Tim Kaine was. And as Hello Giggles giggled over the incident: “President Obama yelling at Bill Clinton to hurry up is the most dad thing that ever dadded.” We, or at least nostalgic Democrats, are having a major dad moment right now—and it might not just be signaling a change in our language, but also a change in our ideas about masculinity.

The Oxford English Dictionary dates “dad jokes” to 1987, citing a defense of the wince-inducing wit in a Gettysburg Times op-ed. The OED also finds “uncoordinated dad-dancing” in a 1996 Scottish Sunday paper. But dad has been proliferating in mainstream slang recently. Last year, we pedestaled the “dad bod.” This year, we bought up Steph Curry’s “dad shoes,” plain white sneakers that seem more at home mowing the lawn than driving the lane. Men air-guitar a solo on their favorite “dadrock” hit or let their paunches hang out, arms akimbo, when assuming the “dad stance.” Loose-fitting “dad jeans” and beat-up “dad caps” are must-haves to pull off the fashionably uncool “dad style,” a trend some are dubbing “dadcore.”

BuzzFeed has been further expanding the linguistic possibilities of dad. One listicle, “17 Pictures That Show Just How Dad Tim Kaine Is,” features the politician unabashedly grinning as he eats brunch, plays harmonica, and performs public duties with a contagious excitement. As the piece concludes: “It’s possible that Tim Kaine could be the most dad VP in American history.” (Slate would agree, if Kaine’s “dad gestures” are any measure.) BuzzFeed has previously employed dad in this way. A 2015 quiz, for instance, asked “How Dad Are You Actually?” and invited readers to tick off a long list of stereotypic dadlike behaviors ranging from “Got to the airport five hours early” to “Talked about which road to use to get to a place.” Respondents then learned whether they were “not very dad” or “extremely dad.”

Nov. 30 2016 9:30 AM

I Went Looking For “The Good Ol’ Days.” It Took Me Back 5,000 Years.

This essay is adapted from the Pessimists Archive podcast, a show about technology and the history of unfounded fears. Subscribe on iTunes, or listen to the full audio version.

Why Donald Trump won will be debated for generations, but we can all agree on one thing: Nostalgia is powerful. Just consider the elegance of “again” in “Make America Great Again.” It says that today’s changes ruined a glorious yesterday—that a golden age is ours to reclaim. “People want to believe that they are part of the greatest nation, that redemption is around the corner, that a perfect nation in which no suffering happens is possible,” says Alan Levinovitz, an assistant professor of religion at James Madison University. He calls tales like these “nostalgia narratives,” and says their appeal is their simplicity. “They identify clear villains in the causing of suffering,” he explains—which makes them the favored argument of demagogues.

But nostalgia narratives contain a fatal flaw: They harken back to real times. Real times full of records and histories. Real times that can be cross-referenced against the “golden age” story being told about them. So I began wondering what would happen if I did the fact-checking—looking into each generation’s tale of better bygone times, to see if anyone, at any time, really felt they were getting it right.

I started making calls. It took me back 5,000 years.

First, today. During the campaign, the Daily Show sent a bunch of correspondents to a Trump rally to ask when America was last great. One guy said the 1980s. Another said 1913, when we passed the 17th Amendment. (Correspondent: “So, back when women couldn’t vote?”) And then, of course, someone said the 1950s—the most popular answer. A recent survey by the nonprofit organization PRRI found that 51% of Americans believe our way of life is worse than it was in the 1950s.

But did the people of post-war America believe they were living in a golden age? No, says Doug McAdam, a professor of sociology and political science at Stanford University, and author of Deeply Divided: Racial Politics and Social Movements in Post-War America. Despite our current romanticizing of that time, tensions were high—about race, class, the threat of nuclear war, a political system that Americans felt had failed them, and a culture sapped of energy. “People talked about how mindless the students on college campuses were, only tracking towards a conformist, consumer-oriented way of life, without soul,” says McAdam. “So there was a lot of commentary on the deadening conformity.”

McAdam says that many people in post-war America pointed to the ’20s as a better time. But the newspapers of the ’20s reflect a culture full of anxiety and nostalgia. Radio, recorded music, and the automobile were reshaping American life, causing people to fear that our fundamental humanness was under attack. This 1923 New York Times headline sums it up nicely: “AMERICAN LIFE IS TOO FAST.” The story goes on to quote then-Secretary of State Charles Evans Hughes, who spoke for a generation of intellectuals when he said, “It is the day of the fleeting vision. Concentration, thoroughness, the quiet reflection that ripens judgment are more difficult than ever.” Two years later, the dean of Princeton University would declare that “the general effect of the automobile was to make the present generation look lightly at the moral code, and to decrease the value of the home.”

Certainly, those aren’t the words of people living in a golden age. So where to next? The characters in Downton Abbey, who lived in the ’20s, all sat around wistfully recalling the late 1800s. It was a time of progress in America as in England: We’d built the transcontinental railroad, conquered the frontier, and watched cities boom. But there was also a fast-spreading disease—something frightening called neurasthenia. An abbreviated list of symptoms: chronic headaches, insomnia, constipation, chronic diarrhea, impotence, amenorrhea, low spirits, constant anxiety, and chronic back pain. “Basically any type of condition that made life somewhat unpleasant was attributed to neurasthenia,” says David G. Schuster, author of a book called Neurasthenic Nation.

At the turn of the century, people believed that “nervous energy” kept us physically and mentally vibrant. But as life became busier, faster, and noisier, there was a pervasive fear that all our newly busy lives sapped our nervous energy. And when this happened, the thinking went, we got sick. “I think some people might have identified as neurasthenic because they were unhappy with some aspect of their life,” Schuster says. “Neurasthenia gave them a word to describe that: It’s not necessarily their fault, but they are a feather floating on larger breezes of modernity.”

Which is to say, neurasthenia was a nostalgia narrative transformed into a disease.

The people of the late 1800s wanted to return to purer, quieter times, so Schuster says they romanticized the decades before the Civil War. But according to Harry L. Watson, a professor of history at the University of North Carolina-Chapel Hill, the people of antebellum America were busily debating their fallen nation. A common complaint went like this: “The republic that the framers had created in the 1770s and 80s and 90s had decayed,” says Watson, “and somehow we’d strayed away from the will of the founders, and we ought to get back to it.” Today, of course, that’s a Tea Party rallying cry. But when Andrew Jackson led this charge, some of the founding fathers were actually still alive—and refuting it. James Madison essentially said, “I am a founding father, and you’re wrong about what I meant.” It didn’t work. But even back then, Watson says, facts took a backseat to what felt true.

So, antebellum America romanticized revolutionary America. And who were the revolutionaries romanticizing? “Jefferson would talk about the ancient Saxon constitution of England as being far superior to anything that existed in his lifetime,” Watson says, “and bemoan that we couldn't seem to get back there. Benjamin Franklin said similar things.”

This is a hefty temporal jump: The Anglo-Saxons inhabited Great Britain from roughly the years 500 to 1066. And though they may have written a hell of a constitution, the Vikings gave them little time to celebrate it. “You’re living in this world in which these brutal pagan invaders are constantly destroying your crops, killing your family, wrecking the religious institutions that define your life,” says Andrew Rabin, a professor of English at the University of Louisville, “and so there is this strong nostalgia for an age before the Vikings came.”

Nostalgia, in fact, was central to the Anglo-Saxon worldview. Their poems often contained what’s called an “ubi sunt” passage, which is when the narrator would start talking about how to Make The 10th Century Great Again. The poem “The Wanderer” contains a nice one, translated by Rabin from Old English: “Where has the horse gone? Where is the rider? Where is the giver of gold? Where are the seats of the feast? Where are the joys of the hall? Alas for the bright cup, alas for the mailed warrior, alas for the splendor of the prince.”

These poems don’t refer to a specific golden age, and we don’t know much about the Anglo-Saxons before the Vikings arrived. But the Renaissance thinkers, who would come along four centuries later, romanticized the ancient Romans. And the Romans romanticized, well, earlier Romans.

The Roman historian Tacitus captures the mood. He records Rome’s history from the founding of the Republic in 509 B.C. (which he says was full of glorious heroes) to his own time, around A.D. 100 (which he keeps apologizing for). “He’s constantly saying, ‘I’m sorry for telling you about yet more murders that the autocratic emperors have committed against their own subjects, and more rapes, and more sexual perversion, and more records of excessive dining, eating, and, you know, sumptuary practices,’” says Alex Dressler, an assistant professor of classics at the University of Wisconsin-Madison. But Romans before Tacitus said basically the same thing, Dressler says. The more money and power the Romans acquired, the more they felt like their nation was getting indulgent and lazy, and therefore the more they looked backwards to a time before they got what they wanted. The wanting, it seems, mattered more than the having.

So let’s just skip to the beginning—to Mesopotamia, where humanity first began to write. Eckart Frahm, a professor of Assyriology at Yale University, says a curious trend shows up in the old cuneiform tablets: At the beginning of writing, around 3500 B.C., nobody pens anything nostalgic. But after about two centuries, as records pile up and scholars read what their ancestors wrote, that changes. “Then,” he says, “there is this idea that there must have been an age where things were really perfect.”

There it is. As soon as we started telling our own story, we became seduced by it.

Of course, this can’t come as much of a surprise. Nostalgia seems hardwired, as human as fear and love. But having combed through 5,000 years of history to confirm an absence of the good ol’ days, what can we do with this information? Can these facts ever sway someone who looks backwards in longing, even if they aren’t sure exactly where they’re looking to? Levinovitz, the religion professor, says that’s not the right way to think about it. “Nostalgia narratives are often born of great pain,” he says. “And when you walk up to someone who is in great pain, and you rip away from them the key story that is keeping them from just dissolving into a puddle of suffering, you’re messing with people.”

Which loops right back to Make America Great Again. The again was never meant to be a specific moment in time; it’s an arrow pointing any which way, to whenever someone felt better, or remembers feeling better, or assumes they might have felt better. Which means forward-thinking Americans have their work cut out for them, pointing to a future that’s worth embracing over the past. “We either have to be patient and work slowly at the parts of the narratives that are most pernicious, and work gently and tactfully and lovingly with the people who believe them,” says Levinovitz, “or we have to be damn sure that when we rip that narrative away, we have something awesome to fill its place.”

It’s a hard story to tell, and it won’t be accomplished just by disproving the golden age. But if we can all be a little more aware of the stories we tell, and why we tell them, then at least that’s a start.

Nov. 29 2016 11:58 AM

The Redundant “Close Proximity” Is Way More Beloved Than It Should Be

This article originally appeared in Zócalo Public Square.

The warning echoes beneath the girdered ceiling of Boston’s South Station, and in the cramped bustle of New York’s Penn Station, on a TSA loop of repeating announcements: “Keep personal items in close proximity.”

Any prerecorded phrase, repeated often enough, can drive one mad. But that last two-word phrase is especially wretched.

“Close proximity” irks me viscerally, like chewing tinfoil. We are bombarded with it daily, not just by the TSA in train stations and airports, but also in the news, and even in literature.

For starters, it is redundant. Is there any variety of proximity other than one that is close? Far proximity, perhaps? Moderate proximity? There is not.

And it is wordy. “Stay close to your bags” works well, as does “keep your bags nearby.” If you like the sound of the word, you can use “keep your bags in proximity.”

It is hard to account for the popularity of the phrase. Maybe it’s pleasing to the ear. There’s the symmetrical arc of the five-syllable phrase, the two hard consonants kicking it off, accented on the unlikely third—that SIM sound made by “xim”—then trailing gently off. There’s a strong echo of the Latin root, proximitas, so proximity is a 50-cent word that suggests higher learning. The phrase, coming off our tongues, makes us feel smart, educated, erudite.

However pleasing it sounds, it’s a loathsome phrase for yet another reason: It seems we modify proximity with “close” because we do not trust the audience to understand proximity’s meaning when it stands alone. Anyone confused by “proximity,” we reason, will surely understand “close proximity.” It’s a patronizing attitude.

It is not surprising that the phrase has worked its way into bureaucratese, the hallmark of which is a redundant and long-winded surfeit of excessive verbiage. So it is easy to understand why TSA used the phrase in its announcement—“close proximity” carries the patina of officialdom, in a way that “nearby” or “close” alone do not.

But it is surprising how many good writers use the phrase. Robert Galbraith, a.k.a. J.K. Rowling, used the phrase four times in the lively and fun novel The Cuckoo’s Calling.

It sneaks past New Yorker editors, as in this piece on the paleo diet by Pulitzer Prize-winning author Elizabeth Kolbert: “And, by living in close proximity to their equally crowded farm animals, early agriculturalists helped to bring into being a whole set of diseases that jumped from livestock to people.”

John McPhee, master practitioner and teacher of nonfiction style, dropped it into an otherwise exciting passage in his book The Founding Fish: “In close proximity to the canoe, one shad was a good deal more vigorous than the other had been.”

And the eloquent Gay Talese used it in Thy Neighbor’s Wife: “During the day he strolled through the business districts, noting the close proximity of Woolworth’s and J.C. Penney to the local massage parlor and X-rated theater.”

Consider those three sentences for a moment. Would they read any differently if an editor had struck the word “close”? I think not. And it’s not a matter of degrees. When writers want to indicate a more notable nearness, they typically modify the phrase with such, as in “such close proximity.” (Here, again, “such proximity” has the exact same effect.)

The phrase even taints journalism—which is surprising because journalists and editors are always working to pare a few syllables to fit ever-shrinking print news holes. It has appeared more than 4,000 times in the New York Times, starting in 1852. In 1864, the paper used it to describe Sherman’s march: “Richmond papers of Monday last are received. They report Gen. SHERMAN to be moving on, and in close proximity to Savannah.”

The phrase seems appropriate there, the sort of thing you would expect to read in musty archives. Oddly, though, the antiquated phrase appeals equally to younger scribes. BuzzFeed has used the phrase more than 200 times in just three years, in articles such as “You Might be Cleaning Your Penis Wrong.”

For all the wrong reasons, the phrase is probably here to stay, and I’ll try to tune it out, in Penn Station and elsewhere. Anyway, it’s obvious there are other stilted redundancies cluttering our lexicon. Hell, it’s blatantly obvious.

Nov. 28 2016 3:37 PM

In Moments of Crisis, Should You Really “Run, Hide, Fight”?

As a terrifying knife-attack scenario unfolded at Ohio State University on Monday morning, emergency services tweeted typical instructions to the student body: “Run Hide Fight.” This direction, regularly promoted by the FBI and DHS, is intended to simplify the thought processes of people confronted with imminent danger. The message is ostensibly simple: Run if you can, hide if you can’t, and fight if there’s no other option on the table. Easy enough, right?

Nov. 22 2016 6:52 PM

There’s No Better Term for the Alt-Right Than Alt-Right

Tuesday afternoon, in the wake of this past weekend’s widely covered meeting of Richard Spencer’s white supremacist National Policy Institute, ThinkProgress published an editor’s note telling readers the site will no longer use the descriptor alt-right:

Nov. 21 2016 10:54 AM

Stress-Cleaning, Gilmore Girls Therapy, and the Wordplay of Post-Election Coping Strategies

Election stress has conceded to post-election stress. To deal with our new reality under President-elect Trump, many of us are “stress-eating” ice cream or seeking “workout therapy” at the gym. Wordplay, it seems, is emerging on social media as one of America’s coping mechanisms, and we’re getting through the election results with two words in particular: stress and therapy.

Life after Trump is putting stress in front of every verb you do. We are “stress-watching” TV, “stress-listening” to music, “stress-playing” guitar, “stress-reading” novels, “stress-exercising,” “stress-drinking,” “stress-texting,” “stress-cleaning,” and even having “stress-sex.” One tweeter stress-joked: “I feel like I’m carrying around a giant weight today. Mostly because of election results, partially from binge stress eating.” So many of us are “stress-living” now, as one user pointed out on Twitter. What’s going on with this stress- construction?

These stress- words are a special type of word formation linguists call synthetic compounding, which can function in several ways. Here, stress- is characterizing the purpose of the verb: Stress-eating, for instance, is a type of food consumption in response to stress. (Compare stress-jogging, say, to stress-relieving or stress-testing.) Speakers, liking economy, shortened the name of this behavior to stress-eating and then, in a process known as back-formation, rendered it a verb, stress-eat, to use more easily across other contexts.

Nov. 17 2016 2:02 PM

Why We Shouldn’t Talk About “Normalizing” Donald Trump

When Donald Trump won the presidency, our vocabularies didn’t bulge to accommodate the reality that this ignorant geyser of hate had ascended to the world’s yugest leadership position. We’re left pressing the same worn-out words into service, paradoxically reminding each other: This is not normal.

In an essay for the New York Times Magazine, Teju Cole wrote, of the days following Trump’s win, “All around were the unmistakable signs of normalization in progress. So many were falling into line without being pushed. It was happening at tremendous speed, like a contagion.” David Remnick told CNN, “We’ve normalized [the results] already. Less than a week after the election is over, suddenly Washington is going about its business talking about who’s going to get what jobs. You would think that Mitt Romney had won. It’s a hallucination.”

After a while one grows habituated to people explaining that Trump trespasses against all precedent and convention. With each new twist in the Trump saga—the uptick in hate crimes, the plan to appoint a racist as his chief strategist, the Twitter rants against the First Amendment, the seeking of security clearances for family members—we hear the same feeble-sounding plea. “He is not normal,” insisted John Oliver over the weekend. “He is abnormal.” Shouts of “normalization” have become normalized.

The frame we’re putting around the president-elect emphasizes how freakishly outside the mainstream his views and behavior lie. That’s useful, up to a point. But in appealing to what’s typical rather than what’s right or true, we’re missing an opportunity to make a stronger statement. Trump himself aims to center white men as “normal” and push everyone else to the periphery. If populist, white nationalist currents swept this demagogue into the White House, perhaps we shouldn’t denounce him by invoking the wisdom of crowds. Trump won the electoral college. Our country chose him. To more than 60 million of our fellow countrymen, Donald Trump is normal, even if it’s painful to admit that.

An outsider challenging establishment foes, Trump pledged to “make America great again”—essentially, to bring our country back to the days when white men ruled the roost. At the same time, he pledged to invert the meaning of normalcy in the United States circa 2016, bringing the fringe into the mainstream and expelling the elites to the margins. Liberals, in other words, don’t have a monopoly on the concept of normalcy. Trump’s candidacy was centered on his vision of what’s normal (the white working class) and what’s not (recent immigrants, our black president).

In this, Trump resembles Richard Nixon, who petitioned a “silent majority” of Americans to reassert their values during the turmoil of the late 1960s. And as James Taranto pointed out in the Wall Street Journal, he evokes Warren G. Harding, who campaigned for president in 1920 on the slogan “return to normalcy.” Both phrases carried within them a rejection of the upheavals reshaping U.S. society. They were conservative anthems, hostile to demographics newly empowered by the Great War (in Harding’s case) and emboldened by the women’s and civil rights movements (in Nixon’s).

While Nixon and Harding wielded the notion of normality against political outsiders, diplomats and advocates sought to normalize in a different sense—to solidify national alliances. A New York Times article from 1969 described “India’s desire to normalize relations with Pakistan.” In 1981, American diplomat Jeane Kirkpatrick vowed to “fully normalize” Chilean-American bonds. Over the next 20 years or so, normalize floated out of its foreign policy box. “Public consumption,” theorized the critic Michael Kimmelman in 2010, “normalizes all culture.” According to one expert, sexual education in schools should “normalize,” not “dramatize,” erotic feelings. A theater called the Infusionarium represents “the Children’s Hospital of Orange County’s latest effort to normalize” chemotherapy, per a 2014 column in the New York Times.

In conversations about social justice, normalization often exists in opposition to intolerance or bigotry. A law or cultural product may pathologize (portray as sick), demonize (portray as evil), or exoticize (portray as alien). You can fight these othering impulses by harnessing empathy and imagination to recast difference as commonality. For instance, we can normalize trans people by deploying gender-neutral pronouns and we can normalize those with disabilities by making sure our workplaces provide wheelchair ramps and accessible bathrooms.

When members of the media cry out against “normalizing” Trump, I suspect they are tapping into a parallel tradition, one with origins in critical theory. From Ezra Pound exhorting poets to “make it new” to Derrida expounding on “différance” to Walter Pater promoting “the addition of strangeness to beauty,” artists have long tried to shock and move their readers through defamiliarization. Investing the ordinary with weirdness commands attention and enhances perception. It compels audiences to look closer, to think and wonder and refuse to take the world for granted.

But de-normalizing (what we’re supposed to do with Trump) is more, well, normative than defamiliarizing. It presumes there’s an in-crowd to be venerated and an out-crowd to be shunned. And it makes that veneration and shunning a matter not of principle but of consensus.

We have excellent cause to shun Trump. He is a racist, sexist, Islamophobic liar. His many disqualifications for the office of the presidency could fill 100 copies of the “failing New York Times” and embarrass the ghost of every single founding father. We’d like to think that Trump’s cornucopia of hatreds and incompetencies place him outside our accepted norms. But railing against Trump’s “normalization” just plays into his grubby hands.

Here is a man who built his case to the nation on the idea that some human beings are “normal” and some are “other.” Yet our response to his political anointment is to harp on his distance from the mainstream. When we talk about whether Trump is or isn’t normal, we’re having the debate on his terms, and doing so in a way that spit-shines his rebel brand. Worse, in framing this as an issue of “normalization,” we’re engaging in wishful thinking: We want our fellow citizens to know and understand that Donald Trump is aberrant, just as we want countries to interact peacefully, and we want trans people to have the same rights as everyone else. But we can’t dream Trump away. We can’t deny that the United States drank his poison. The problem with Trump isn’t that he’s abnormal. It’s that he’s abominable.

Nov. 17 2016 11:05 AM

The Cowardice of Asking “What Happened” on Election Night

After Donald Trump won the presidency, a dazed chorus emerged: What happened? The stunned syllables headlined the news. “US election: What happened?” ran Larry Beinhart’s analysis in Al Jazeera. “What happened to America?” asked Griff Witte and Simon Denyer, reporting from abroad for the Washington Post. We voiced these bewildered words closer to home, too. But this little phrase what happened is much more than a simple question looking for a literal answer. It also gives us insight into how we think about Trump’s election, and it carries profound implications for what’s ahead.

In the immediate wake of Trump’s election, we didn’t need to specify what it was that happened. We all know what Vice News is referring to when it introduces a video: “What happened? Why did it happen? What happens next?” What else could it be? What happened is what happened, a new gestalt so all-consuming and self-evident, so unexpected and consequential, that any specification is required only for patients waking up from comas. It’s the vocabulary of collective experience, of a solidarity forged by an era-defining event. We are bound together, as divided a country as we are, by the sheer fact of what happened.

The phrase also telescopes complexity. “What Happened? How Pollsters, Pundits, and Politics Got It All Wrong,” NPR’s Shankar Vedantam wrote. What happened enfolds the grander, stranger narrative of the 2016 presidential campaign, of America at this historic moment. With what happened, we can nod to a thick nest of forces and factors that shaped Trump’s victory while admitting that we will be teasing out what, exactly, they are for a very long time to come.

We don’t use what happened for good news, of course. It expresses disbelief in the face of our own setbacks. “What happened?” we ask, open-mouthed, after an unannounced layoff, a breakup out of the blue, or learning that a friend we just had coffee with suddenly dropped dead. The phrase registers the dismay of devastation. It’s a threshold language, what happened, of paradigm shifts, sea changes, and rude awakenings. We thought we knew how the universe worked; we thought we understood its rules. But then something happened. An earthquake. A war. A Trump presidency. The world capsized our expectations, defied whatever logic, order, and conventional wisdom we vainly and arrogantly attempted to pin to it. We are reduced to uttering what happened, taking those three beats to register that we have entered some brave new world, that everything will now be different.

Yet while what happened acknowledges the seismic shifts like Trump’s election, it also hides from them. As Lynne Murphy, an American linguist who lives and works in England, observed: “Been asked for comment on election result several times today. ‘What happened yesterday’ is what they call it. New taboo.” Taboo, indeed, as what happened morphed into it happened: The Economist dissected Trump’s win with a piece titled “How It Happened.” Morocco World News published: “US Election: Yes, It Happened, Now What?” Variety noted this trend as well: “Donald Trump’s Victory Stuns the World: ‘It Happened’.” If we avoid invoking the words Trump won, perhaps we can keep all its evils contained in Pandora’s lexical box of what happened.

The language of what happened, finally, implies a cynical and defeatist passivity, a kind of disempowering victimization. The Trumpian Fates cut their cloth, the Wheel of Fortune turned. The stars aligned, the gods played on Olympus. And those of us fearful of this Trump presidency were powerless, cosmically and inevitably powerless, to do anything to stop it. What happened? How could this happen? Why did this happen? It happened. Each what, this, and it gobbles up our own agency and responsibility, occluding, in their verbal abstraction, that what happened, while so much larger than any individual act or actor, was still a human doing. As we seek to understand what happened, we’d be wise to remember that what will happen now is not a foregone conclusion beyond our control. It is ours to determine, a principle grounded in the core of democracy itself.

Nov. 9 2016 12:02 AM

On Social Media, We Are Broken Heartsick Wretches Right Now

I wrote Tuesday afternoon that the pro-Hillary corridors of Facebook and Twitter rang with cheer and confidence—but that, also, maybe, Democrats’ social media triumphalism concealed some anxiety. Yeah. Remember the “cheer and confidence” part of that equation?

Trump just won Florida. We are all haunted heartsick wretches.

Nov. 8 2016 3:12 PM

The Triumphant Tweeting and Freaked-Out Facebooking of Election Day

Here we all are, fellow Americans. On the internet. Where so much of this interminable election has played out. Reading, refreshing, typing things, reading the things we typed, trying to project sanity. On this final day (please, let it be the final day), we come to social media like great herds of deer converging on a watering hole, skittish and desperate, lapping at drops.

What’s it like signing onto Twitter or Facebook on the most important date for American democracy in recent memory? If you follow the people I follow, most of whom support Hillary Clinton, it’s a master class on how the human psyche responds to stress and uncertainty. I’m seeing two strands of coping with the tantalizing dream of a Hillary presidency, the unlikely-yet-still-flickering prospect of a Trump regime. The first: triumphalism. Celebration. These are the friends uploading photos of themselves in pantsuits, applying lipstick or tugging on heels. Let’s smash the glass ceiling! they write. Let’s watch the shards fall! Some people are posting elated images of themselves outside polling stations or in phone bank offices. They’re “so inspired.” They’re baking Hillary-themed cupcakes. They’re swinging at Trump-shaped piñatas. They’re sharing giddy articles about the glorious stirrings in the soil of American politics and culture, the magnificent, progressive, human rights–revering beanstalk about to erupt and carry us into the sky.
I love these posts! I could spend all day reading them (and probably will). They are invigorating suggestions that Clinton’s victory is essentially a sure thing; they nod to the historicity of the moment. Their encapsulating gesture is the Hillary Shoulder Shimmy, a roguishly confident expression of delight in the successes ahead. I scroll through, and sisterhood blooms in me like a flower, and the future looks bright and verdant and full of promise, and …

My hands are shaking. I just thought you should know that. I was typing that utopian dawn-of-justice stuff and my hands were literally shaking. That’s because all this cheerful, can-do rationalism disguises the roiling dread that many Americans might be able to force down but not extinguish. Trump, relegated to the fever swamps of the repressed unconscious, keeps rattling the bars of his cage.

Sometimes he even escapes. For every two or three jubilant “Let’s do this!” posts in my feed, there’s an urgent entreaty stressing the stakes of the election and imploring readers to vote or else. For every service-y reminder on how to find your local polling station, there is a plainspoken, earnest effort to convey—at last, for perhaps you had not understood it until just now—the true depths of Trump’s intolerability.

The complicated tango of uplift, patriotic pride, and soul-curdling dread unfolding on my Facebook feed right now might be summed up by one friend’s blithe posting of a sunrise, with the caption: “Morning has broken, and today we elect the first female president of the United States!” High five! I watched the “likes” gather in real time—ding, ding, ding, like a stairway to heaven. Then a comment: Please be right. Then another comment: What’s the best website for up-to-date results?

On Twitter, too, our forced rhapsodies may conceal how we’re all freaking the hell out inside. When we’re not shooing each other toward “election detox” content (like this photo essay of children reading to lonely shelter dogs, you’re welcome!) or cracking wise about our apprehension, we’re nervously reiterating the various institutions and ideals that hang in the balance. We’ve greeted Election Day with weariness and grim resolve and philosophical solemnity and abstract impatience. Now we’re casting ballots for the children we were and the children we have or want. Our every choice drips with meaning, purpose: We are exercising our voting rights for our grandmothers, in suffragette white, and we are doing it draped in our mothers’ jewelry. Once we get our stickers, though, we feel unmoored again, so we tell our friends and followers what we’ve done and why we’ve done it and what we think will happen next. We are like sharks that might die if they stop swimming, except that in this case the swimming is scrolling and clicking and ruminating and stress-eating and polysyndeton.

Perhaps it is the give-and-take of such an asymmetrical contest: On one hand, Clinton’s qualifications and temperament so outshine Trump’s that her victory feels relatively assured. On the other hand, Trump’s staggering unfitness for the presidency means that even the slightest hint of a GOP win provokes extreme anxiety. The superego soothes and celebrates; the id cringes with terror. Reason obviously prevails, except I still can’t stop biting my nails and I’ve remade my ponytail four times in the past 30 minutes. If Facebook and Twitter are any indication, we’ll all be toggling between those two poles for the next six or seven hours, leaving scrips and scraps of verbiage in our distracted wake. Are you ready? Me too! Hurrah! Gulp.