We’re Finally Winning the Battle Against the Phrase “Battle With Cancer”
When the iconic theater actor and director Roger Rees died earlier in July, many reports quoted a gently worded press release written, it seemed, by his family: Rees, the release said, had “passed away … after a brief journey with cancer.” The diction gave me pause, even as I admired it. It was a clear step away from the familiar description of a dearly departed’s “battle” with the disease. But did this euphemism attenuate cancer in a way that felt cruel to the victim or untrue to the actual experience of, well, dying? I imagined a man walking slowly into the sunset, hand in hand with an adumbral figure. It seemed strange that the two silhouettes were moving in the same direction.
A journey “through” cancer might have been easier to visualize. There goes Rees, gracefully picking his way across the changing landscape, its rocks and eddies of sand and occasional sloping idylls. You are now entering cancer, reads the road sign behind him. Do not expect to enjoy your stay. But that preposition might be inapt, given that not every itinerant reaches the other side of the imperial malady. With, then, not through.
While this was the first time I’d encountered a journey with cancer, such quests crisscross the Web. Bloggers relate their “Journey With Inflammatory Breast Cancer,” their “Personalized Journey With Ovarian Cancer,” their “far-from-perfect journey with cancer” and “healing journey with cancer” and “beautiful journey with cancer” and “brave journey with cancer.” Sometimes, cancer is not a fellow traveler or pilgrim but a modifier clarifying the nature of the trip. There are “cancer journeys” and “long cancer journeys” and “personal cancer journeys,” all arcing across our feeds to converge, perhaps, in what Clive James called “the empty regions,” the endpoint of every mortal trek.
Taylor Swift, Waka Flocka, and the Roots of #Squad
In the centuries before automatic weapons, when armies clashed along the Anatolian coast or at the base of medieval castles, foot soldiers fought in square formations. The compact shape repelled enemy forces on horseback, a desperate armor wrought from geometry. The Latin word for square gave us squadron—a military unit—and then, in the 1640s, squad.
Squads, unlike divisions or battalions, contained a relatively small number of infantrymen, lending the term an underdog tinge. (It retains that echo; consider the upcoming Suicide Squad, a hyped-at-Comic-Con movie about supervillains conscripted to carry out hopeless assignments for the government.) And since they often formed for specialized tasks (rifle squad, first aid squad) or a one-time mission (rescue squad), the word acquired a glimmer of rakish expertise. (The Mission: Impossible crew, meshing spectacular skill with the breezy assurance that this will never work, could well be the Platonic form of the squad, if not the most mod.)
But squad has always meant solidarity most of all. When the rise of nationalism whipped Europe into a martial fever, army deserters, mutineers, and traitors were condemned, for symbolic reasons, to die by firing squad. The optics of those deaths—evil loners facing trusty comrades, us triumphing over them—wouldn’t be out of place in a grim remake of Taylor Swift’s "Bad Blood" video.
Activists Want to Replace “Car Accident” With “Car Crash.” Not So Fast.
A campaign is under way to ban the word accident from descriptions of car crashes. The New York City nonprofits Transportation Alternatives and Families for Safe Streets have teamed up to create a pledge, a hashtag, and a series of vigils reminding people that, while wrecks and injuries on the road may be unintentional, they are far from random, unpredictable, or unavoidable. “Accident is the transportation equivalent of ¯\_(ツ)_/¯,” explained Gizmodo last week, “immediately [exonerating] everyone involved.” But if you want to reduce the 30,000 or so automobile-related deaths that stain American roadways each year, you can take concrete steps: Don’t drive drunk. Obey traffic laws. Governments can focus on enforcing those laws, pushing for safe and effective street design, regulating vehicle licenses, and lowering the speed limit.
Advocates say the shift from accident to crash prevents negligent or reckless drivers from absolving themselves of responsibility. (There’s a reason 16-year-olds tell their parents they were “in an accident”—the construction demands the passive voice, and sounds antiseptically vague—rather than that they “totaled the van.”) According to Vox’s Joseph Stromberg, newspapers in the 1910s and ’20s tended to paint street collisions in lurid terms: Cars were a relatively new and threatening technology, and they emerged in the media (and in novels like The Great Gatsby) as dangerous villains, the murderous lackeys of the heedless rich.
A Brief Interview With the Director of Do I Sound Gay?
Do I Sound Gay?—a documentary from director David Thorpe about what is sometimes called the "gay voice" and his own effort to change the way he talks—opened this month in New York, Los Angeles, and other select cities. J. Bryan Lowder reviewed the film in Slate on July 10. I recently spoke with Thorpe about his insecurity over not sounding "masculine enough" and about training with a vocal coach. This interview has been edited and condensed for clarity.
When Did Feminism Get So “Sneaky”?
Feminism has been sneaking around. Don’t believe me? A recent New York profile of TV host Katie Nolan hailed the “woman bringing a sneaky feminism to Fox sports.” A few days later, the New York Times went long on Amy Schumer’s boisterous feminism, which it characterized as her “sneaky power.” Like Broad City (another purveyor of “sneak-attack feminism”), Schumer’s work is something of a trysting spot for furtive sisterhood; last year in Slate Willa Paskin declared Inside Amy Schumer the “most sneakily feminist show on TV.”
Psst! Do you know what else is “sneakily feminist”? Showtime’s The Affair. Meanwhile the Hugh Dancy and Maggie Gyllenhaal flick Hysteria is “slyly feminist,” as is Pixar’s fable Inside Out (which, according to a separate review on Slate, accomplishes a “subtle but surprisingly feminist” swerve). Plus, the show Trophy Wife has bloomed, like some nocturnal desert flower, into “secretly one of the most feminist shows on TV.” Sundance chose the “top ten secretly feminist films” of all time (with Thelma and Louise at the mist-shrouded apex). Spy is “secretly a feminist attack on the patriarchy.” Not even academic books prove immune to such subtlety, secrecy, surprise: In a chapter on Ursula Le Guin’s invented folklore, scholar Jarold Ramsey notes that the “slyly feminist … appropriation of the mystique of ‘Old Man Coyote’ can be illustrated by the beginning of a Kesh myth about a war between bears and humans.”
When Should You Use Come vs. Cum? It Depends What Kind of Sex You’re Describing.
A note on reading this post: Please assume that every double entendre you encounter is intentional, unless it is not funny, in which case get your mind out of the gutter, perv.
“When it comes to the spelling of cum,” wrote sex columnist Maureen O’Connor a few days ago, “I defer to the Strunk and White of filth, the Vice style guide. Come is the verb, cum the resulting substance.”
Not so fast. Over at the Hairpin, Haley Mlotek jumped on O’Connor’s throwaway disclaimer. Her solution is to flush cum down the toilet in a Bounty square.
An Oxford English Dictionary for the Millennial Set, Fo’ Shizzle
Note: All dates in parentheses are for the earliest OED citation. Bold type indicates entries that are new or newly defined in the dictionary as of June 2015.
The much-vaunted, ever-expanding Oxford English Dictionary announced its latest update last month with, for salivating word lovers like us, a press release that read like a late-night infomercial for a lexicographic breakthrough. Five hundred new words! More than 900 revised and updated entries!! But wait, there's more!!! If you act now, we'll throw in 2,400 new “senses” of existing words at no extra cost.
Grammar Sticklers and Illegal Parkers, Rejoice!
Grammar sticklers and illegal parkers, rejoice! An Ohio woman has successfully wiggled her way out of a summons on the peculiar grounds of absentee punctuation. While many a linguist (including John McWhorter in Slate) would argue that commas don’t matter much, to a state appeals court judge the missing mark in question was reason enough to dismiss a parking ticket.
Why Do Secular Scientists Keep Talking About Animal Sacrifice?
If I were to guess the modern profession that earnestly uses the religious word sacrifice, I never would've said scientific research. According to a 2009 Pew survey, 41 percent of American scientists identify as atheists—10 times the proportion of atheists in the general public. Given this secularism, I would've expected scientists to use euthanize, put to sleep, or even terminate to refer to killing animals for research. But no—the modern American scientist talks and writes about sacrificing rats, mice, fruit flies, and even plant seeds. So just how did sacrifice enter science?
Like many scientific discoveries, the English word sacrifice came from France. According to linguist Anne Curzan at the University of Michigan, the word was borrowed from French in the Middle English period between the 11th and 15th centuries. During that time it had just one meaning, which Curzan describes as "an offering, usually an animal, to God or a deity."
But an offering implies that you're depriving yourself of something you'd rather keep. And over time, this implication subsumed sacrifice's original meaning. By the Renaissance, sacrifice meant surrendering something valuable for the sake of a greater, more pressing claim. Which is how, in 1597, Shakespeare could have Lord Capulet lament that Romeo and Juliet were "Poore Sacrifices to our Enmitie."
Like science, sacrifice has kept changing since Shakespeare's time. The major meaning nowadays, says linguist John McWhorter at Columbia University, "is 'to give something up.' " But he points out that a word can retain its old meaning while drifting into a new one. Given this still-drifting meaning, it's possible to see how scientists perceive research animals as "sacrifices to the greater good of—or pressing need for—scientific discovery," Curzan says.
The Oxford English Dictionary reports that sacrifice was first used to refer to killing research animals in 1903. In a research paper for the Journal of Physiology, Liverpool scientists C.S. Sherrington and E.E. Laslett drily report, "Animal sacrificed 30 days after the 2nd lesion." Even though sacrificed appears 12 times in their paper, it's never defined—suggesting that the word was so common in science that Sherrington and Laslett didn't need to explain what they meant. More than a century after this 1903 paper, a search for "sacrificed" on the research database PubMed pulls up 35,627 results. On Google Scholar, the phrase "rats were sacrificed" brings up about 68,100 results, while "mice were sacrificed" garners 108,000. For modern scientists, sacrifice is as much a part of our vocabulary as model, control, and theory.
Which isn't to say it'll stay there. Words, like science and religion and every organism on this planet, are constantly changing. But now that we know where sacrifice has been, can we figure out where it is going?
One hint is sacrifice's diminutive slang: sack. Sack is an unsurprising, inevitable form of sacrifice, says McWhorter, because "part of something being used a lot is you shorten it." Some gallows humor might also be involved, given sack's slang meaning of firing someone. And this dark humor and informality might explain why Google Scholar yields no results for "rats were sacked" and only 26 for "mice were sacked."
As a secular scientist, I don't like sack. I can understand why, as McWhorter suggests, scientists might use sack to distance themselves emotionally from killing research animals. But I don't think scientists should hide from the emotions and ethics and quandaries of animal research. I plan to keep using the word sacrifice, with its religious roots and evolving implications, so long as scientists continue to reluctantly kill animals for the greater good of human understanding and medicine.
The Incredible Shrinking Zeitgeist: How Did This Great Word Lose Its Meaning?
Not long ago, the New York Times crowned Tyler Brûlé, a sleekly sophisticated design mogul, “Mr. Zeitgeist.” But the throne was occupied: A different NYT piece had already declared Marie Antoinette queen of the ever-shifting zeitgeist. Before her, the paper had proclaimed Rosa Parks a “zeitgeist warrior,” and ascribed zeitgeist-whispering powers to Peaches and Pixie Geldof, Bionic Woman, the phrase “bonuses are back” (BAB), and Al Gore. The feistiest recent use of the term comes courtesy of Lindsay Zoladz, who compared Amy Schumer to “a comet streaking gloriously across the Zeitgeist, leaving a tail of smudged mascara and Fireball aftertaste in her wake.”
For a wisp of language compounded of ghostliness and time, the zeitgeist is sure making its presence felt. But what exactly do people mean when they invoke it today? A prevailing opinion about kale chips? Backlash against a guacamole recipe? How did the word zeitgeist come to feel so small?
A zeitgeist used to be a formidable thing. Matthew Arnold introduced the term into English in 1848 to capture the spirit of social unrest that suffused Victorian England. In 1933, Aldous Huxley wrote in a letter that the zeitgeist “is a most dismal animal and I wish to heaven one cd escape from its clutches.” Implored W.H. Auden: “May we worship neither the flux of chance, nor the wheel of fortune, nor the spiral of the zeitgeist.” This threatening creature was capricious in its moods and careless about tradition. It was sinister—powerful enough to convince individuals that they were not responsible for their own choices, that they were merely carried along by the romantic gust of the now. In Bismarck’s Germany, the terrifying phantom of Volk nationalism absolved people of any need to resist the pull of consensus and think for themselves. But if we used to talk about the Roaring ’20s or the Flower Power ’60s, great sweeps of history distilled into luminescent symbols, now we get a 5-minute-long zeitgeist consisting of, say, TV shows about white girls in Brooklyn. Somehow, the zeitgeist—once so historical and grand—has become an anemic, trivial little sprite.
The media coverage of Girls was arguably the first harbinger of our zeitgeist-saturated zeitgeist. “So zeitgeisty it hurts,” wrote Alexandra Petri of the show in 2012, because zeitgeists, like mirrors, can be cruel. Creator Lena Dunham was hailed/slagged as the “zeitgeist queen,” “the maven of the millennial zeitgeist,” a zeitgeist devourer, a zeitgeist seizer, a “zeitgeist figurehead,” and the darling of “media outlets desperate to ride the zeitgeist.” Since then, the term has continued to spiral. In 2013, Joseph Burgo wrote that Lady Gaga’s song "Born This Way" speaks to the “anti-shame zeitgeist”—but the zeitgeist was also accused of dabbling in fat-shaming and slut-shaming. Kanye vowed to “pop a wheelie on the Zeitgeist” in 2014. And now we are drowning in zeitgeist: Mindy Kaling said last month of Parks and Rec, “Because it’s zeitgeisty, it would be considered a hit [on cable].” (What if zeitgeist just equals “a raft of things that would be a hit on cable”?) We are supposed to believe that the contemporary zeitgeist is full of schadenfreude. And iPhones. And cheese.
On Twitter, a hoverboard might ride the crest of the zeitgeist and you may think you are original until you realize that “you’re just another spirit in the zeitgeist’s realm.” Elle magazine has its very own “zeitgeist” rubric, which entails articles on everything from the “ultimate girl crush” to “11 times Taylor Swift looked exactly like an emoji” to “6 things women should totally stop apologizing for” (including, perhaps, looking like an emoji).
Clearly this is a time in which we are constantly in the throes of some swift, consuming moment—an age of obsession, wherein we get “hysterically excited about very good but not hugely original cultural products seemingly every other month,” as Willa Paskin put it in Slate last year. Serial. True Detective. High Maintenance. There is always some new fad to point to, as totalizing as it is transient. So zeitgeist feels like a useful word for capturing this overwhelming illusion of cultural consensus. But what we are calling zeitgeisty these days has none of the inclusive significance of a true movement. From within our media bunkers, we imagine that the entire world is as transfixed by The Latest Thing as we are, but all we’ve done is type #BroadCity into Twitter.
Either the mini-zeitgeist guts our perception of just how diverse the culture is, or it offers us little incentive to care. The modifier zeitgeisty used to mark something out as widespread, a unifying force (Sgt. Pepper was zeitgeisty)—now it serves as a password into specific echelons of cool. The word often refers to the niche-ily aspirational: Marie Kondo, or Pilates. It’s as if, since we can’t knit our fractured universe of tastes back together again, we’ve settled for paying lip service to the choicest fare.
Science-fiction writer William Gibson has claimed that the present moment is defined by “atemporality,” a “new and strange state of the world in which, courtesy of the Internet, all eras seem to exist at once.” The contemporary zeitgeist, then, has to do with not having a zeitgeist, or having an infinite number of zeitgeists, an undifferentiated Gesundheit of zeitgeists. This notion is hardly peculiar to 2015—in his 1910 tract The Spirit of Romance, Ezra Pound intoned that “all ages are contemporaneous.” If that’s true, surely the way to honor such simultaneity is not to parse it into fingernail slivers of fleeting obsession. Instead, let’s reserve this term for those startling, rare moments of clarity when an entire culture rises up as one, to support civil rights or condemn bigotry or mourn the dead. Either that, or zeitgeist should just give up the ghost.