Lexicon Valley
A Blog About Language

July 23 2015 3:25 PM

Activists Want to Replace “Car Accident” With “Car Crash.” Not So Fast.

A campaign is under way to ban the word accident from descriptions of car crashes. The New York City nonprofits Transportation Alternatives and Families for Safe Streets have teamed up to create a pledge, a hashtag, and a series of vigils reminding people that, while wrecks and injuries on the road may be unintentional, they are far from random, unpredictable, or unavoidable. “Accident is the transportation equivalent of ¯\_(ツ)_/¯,” explained Gizmodo last week, “immediately [exonerating] everyone involved.” But if you want to reduce the 30,000 or so automobile-related deaths that stain American roadways each year, you can take concrete steps: Don’t drive drunk. Obey traffic laws. Governments can focus on enforcing those laws, pushing for safe and effective street design, regulating vehicle licenses, and lowering the speed limit.

Advocates say the shift from accident to crash prevents negligent or reckless drivers from absolving themselves of responsibility. (There’s a reason 16-year-olds tell their parents they were “in an accident”—the construction demands the passive voice, and sounds antiseptically vague—rather than that they “totaled the van.”) According to Vox’s Joseph Stromberg, newspapers in the 1910s and ’20s tended to paint street collisions in lurid terms: Cars were a relatively new and threatening technology, and they emerged in the media (and in novels like The Great Gatsby) as dangerous villains, the murderous lackeys of the heedless rich.

But as vehicle ownership rose, automakers and other industry collectives gained power. They were able to agitate for new laws that limited the standing of pedestrians in court: Where once drivers were automatically blamed when casualties occurred, suddenly roads were considered to be “for cars,” and walkers imprudent interlopers. Manufacturers like Ford had already wised up to accident as the lawsuit equivalent of garlic breath on a date—during the Industrial Revolution, they learned to classify worker deaths as “accidents” to avoid the impression of fault. As the years cruised by, that precedent plus smart PR on the part of the auto-industrial complex—the National Automobile Chamber of Commerce even had its own wire service, in which wrecks were unfailingly described as bolts of bad luck—cemented the interchangeability of car crash and car accident.

It’s easy to see why accident caught on with the general public. Highway pileups can be shocking and graphic. (For proof of the elemental power of crumpled metal—call it the vehicular sublime—look no further than a Fast and Furious movie, or Mad Max.) The assertion that at least no harm was meant when one 2-ton chunk of speeding steel, iron, and glass wreaks havoc on another shrinks the destruction down to a manageable size. Maybe that’s the best argument for campaigns like Crash Not Accident: The more we use language to sugarcoat our terror of dying behind the wheel, the less action we’ll take to prevent automobile-related deaths.

That said, I don’t buy the argument that car accident is always an inaccurate phrase. Yes, sometimes an accident means an “event that happens unexpectedly, without a deliberate plan or cause.” (And yes, that definition jars when applied to crashes: One might reasonably “expect” vehicular mayhem to result if a driver is speeding and texting; one would certainly assign that mayhem a “cause.”) But the word can also simply describe “a happening that occurs unintentionally.” That seems to be the obvious spirit in which most traffic reports use accident today, and why not? Our justice system distinguishes between negligence and criminal intent for good reason. You could even assert that baked into the prevalence of accident is the fundamentally American idea of “innocent until proven guilty.” Ascribing bloodthirsty motives to a careless motorist feels as problematic as suggesting that she bears no responsibility for the pain she’s sown.

In classical philosophy, accidents are the opposite of occurrences that “happen without a cause.” In fact, they are precisely events that are contingent upon other events, rather than expressions of telos or inner nature. From that perspective, accident seems like the perfect word for a mishap that unfolds not necessarily from a person’s core being or values, but from his stupid lapse in judgment. (At the extreme edge of this claim lies drunk driving, which represents a choice and perhaps a deeper pathology.) You forgot to turn on your lights. No one repaired the pothole. These qualify as blameworthy errors with foreseeable consequences—exactly the sort of thing that might cause an accident.

July 22 2015 3:58 PM

A Brief Interview With the Director of Do I Sound Gay?

Do I Sound Gay?—a documentary from director David Thorpe about what is sometimes called the "gay voice" and his own effort to change the way he talks—opened this month in New York, Los Angeles, and other select cities. J. Bryan Lowder reviewed the film in Slate on July 10. I recently spoke with Thorpe about his insecurity over not sounding "masculine enough" and about training with a vocal coach. This interview has been edited and condensed for clarity.

July 21 2015 10:00 AM

When Did Feminism Get So “Sneaky”?

Feminism has been sneaking around. Don’t believe me? A recent New York profile of TV host Katie Nolan hailed the “woman bringing a sneaky feminism to Fox Sports.” A few days later, the New York Times went long on Amy Schumer’s boisterous feminism, which it characterized as her “sneaky power.” Like Broad City (another purveyor of “sneak-attack feminism”), Schumer’s work is something of a trysting spot for furtive sisterhood; last year in Slate Willa Paskin declared Inside Amy Schumer the “most sneakily feminist show on TV.”

Psst! Do you know what else is “sneakily feminist”? Showtime’s The Affair. Meanwhile the Hugh Dancy and Maggie Gyllenhaal flick Hysteria is “slyly feminist,” as is Pixar’s fable Inside Out (which, according to a separate review on Slate, accomplishes a “subtle but surprisingly feminist” swerve). Plus, the show Trophy Wife has bloomed, like some nocturnal desert flower, into “secretly one of the most feminist shows on TV.” Sundance chose the “top ten secretly feminist films” of all time (with Thelma and Louise at the mist-shrouded apex). Spy is “secretly a feminist attack on the patriarchy.” Not even academic books prove immune from such subtlety, secrecy, surprise: In a chapter on Ursula Le Guin’s invented folklore, scholar Jarold Ramsey notes that the “slyly feminist … appropriation of the mystique of ‘Old Man Coyote’ can be illustrated by the beginning of a Kesh myth about a war between bears and humans.”

July 17 2015 4:20 PM

When Should You Use Come vs. Cum? It Depends What Kind of Sex You’re Describing.

A note on reading this post: Please assume that every double entendre you encounter is intentional, unless it is not funny, in which case get your mind out of the gutter, perv.

“When it comes to the spelling of cum,” wrote sex columnist Maureen O’Connor a few days ago, “I defer to the Strunk and White of filth, the Vice style guide. Come is the verb, cum the resulting substance.”

Not so fast. Over at the Hairpin, Haley Mlotek jumped on O’Connor’s throwaway disclaimer. Her solution is to flush cum down the toilet in a Bounty square.

“It’s so … truncated,” she argued. “It’s squat. ‘Cum’ is for the men trying to get me to cyber with them on ICQ; it’s the paperbacks I found in the bookcase of a rental apartment my family once lived in. ‘Cum’ is the deliberately misspelled phrase that, to me, fails to denote any kind of sexually satisfying sleaze and instead only conjures up grease, a shortcut to pleasure, and there are no shortcuts to pleasure, or at least there shouldn’t be.”

This is both persuasive and wrong. (O’Connor’s argument-from-authority is unpersuasive and wrong, because the Vice style guide governs the use of cum/come in a highly specific context—Vice articles—and our sex lives are not Vice articles, thank you very much.) True, the spelling of the material and the act that produces it should not be part-of-speech dependent. There’s no precedent for such orthographic niceness: One juices, and one drinks juice. One surfs in the surf, and buckles one’s buckle, and sleeps by going to sleep. Likewise, one comes, and the resulting emission is come; one cums, and the stuff you get is cum. But come is qualitatively different from cum.

Rather than part-of-speech dependent, in other words, the spelling should be context dependent, because the tonal shimmer around cum is very distinct from the one around come. Come and cum are not two different words for the same act-cum-orgasm ectoplasm; they are two different words for two different acts-cum-orgasm ectoplasms. And all four meanings have their place.

For example, there you are, having sweet vanilla sex with your shy, classically handsome crush, listening to vintage Taylor Swift while the pie in the oven gets a little burnt, though it’s still definitely edible. He comes, there is come, veni, weenie, vici.

Or: The standoffish but mysteriously attractive guy from the party wants to have his way with you. His pillows are made of leather. Crazy. “I’m gonna cum,” he grunts, and you reply, “Great.” Even if you don’t believe that these two situations are substantively different enough from each other to justify using two separate words, what’s your alternative? You cannot do away with come altogether without depleting our cultural supply of sexual puns: Come on our girls. Come on our Facebook page (a favorite of Slate’s DoubleX Gabfest). Come as you are (a sentence that is enthusiastically tautological if you try to complete it with the obvious participle, coming). The harder they come, the quicker they fall asleep. Etc.  

But nor do you want come all across the board. (Gross.) Just as we need puns, we also need cum, a term whose blunt force is commensurate with the raunchy sexual substance that occasionally shoots out of a dude’s eggplant emoji. Without getting too torrid about it, sometimes a gentleman is not coming in a liquid sigh of polite satisfaction but … cumming. Like a geyser with an attitude problem. Alan Cumming would “stick with cum,” which must count for something.

So, while Lexicon Valley eschews prescriptivism as a rule, it seems wise, just this once, to spray-paint some lines on the soccer field.

Here are two scenarios in which you must use come:

  • In a piece of serious journalism, this one excluded. You shouldn’t presume to know the spirit in which the ejaculate was ejaculated and should revert to the more tasteful and less evocative option. Come, the more neutral term, preserves your subject’s privacy.
  • If you are screwing someone of delicate sensibilities, and he or she would be offended were you to describe your dalliances with the graphic U-version.

And here are some contexts in which it probably makes more sense to use come:

  • magazine article, real magazine
  • novel, literary
  • nonfiction, historical
  • letter to the editor, newspaper

Contexts in which it probably makes more sense to use cum:

  • magazine article, men’s magazine
  • novel, romance
  • nonfiction, steamy memoir
  • sexts

Contexts in which it doesn’t really matter:

  • in bed! Put down the notepad and be grateful for homophones.

July 15 2015 4:29 PM

An Oxford English Dictionary for the Millennial Set, Fo’ Shizzle

Note: All dates in parentheses are for the earliest OED citation. Bold type indicates entries that are new or newly defined in the dictionary as of June 2015.

The much-vaunted, ever-expanding Oxford English Dictionary announced its latest update last month with, for salivating word lovers like us, a press release that read like a late-night infomercial for a lexicographic breakthrough. Five hundred new words! More than 900 revised and updated entries!! But wait, there's more!!! If you act now, we'll throw in 2,400 new “senses” of existing words at no extra cost.

July 9 2015 3:01 PM

Grammar Sticklers and Illegal Parkers, Rejoice!

Grammar sticklers and illegal parkers, rejoice! An Ohio woman has successfully wiggled her way out of a summons on the peculiar grounds of absentee punctuation. While many a linguist (including John McWhorter in Slate) would argue that commas don’t matter much, to a state appeals court judge the missing mark in question was reason enough to dismiss a parking ticket.

July 9 2015 1:36 PM

Why Do Secular Scientists Keep Talking About Animal Sacrifice?  

If I had to guess the modern profession that earnestly uses the religious word sacrifice, I never would've said scientific research. According to a 2009 Pew survey, 41 percent of American scientists identify as atheists—10 times the proportion of atheists in the general public. Given this secularism, I would've expected scientists to use euthanize, put to sleep, or even terminate to refer to killing animals for research. But no—the modern American scientist talks and writes about sacrificing rats, mice, fruit flies, and even plant seeds. So just how did sacrifice enter science?

Like many scientific discoveries, the English word sacrifice came from France. According to linguist Anne Curzan at the University of Michigan, the word was borrowed from French in the Middle English period between the 11th and 15th centuries. During that time it had just one meaning, which Curzan describes as "an offering, usually an animal, to God or a deity."

But an offering implies that you're depriving yourself of something you'd rather keep. And over time, this implication subsumed sacrifice's original meaning. By the Renaissance, sacrifice meant surrendering something valuable for the sake of a greater, more pressing claim. Which is how in 1597, Shakespeare could have Lord Capulet lament that Romeo and Juliet were "Poore Sacrifices to our Enmitie."

Like science, sacrifice has kept changing since Shakespeare's time. The major meaning nowadays, says linguist John McWhorter at Columbia University, "is 'to give something up.' " But he points out that a word can retain its old meaning while drifting into a new one. Given this still-drifting meaning, it's possible to see how scientists perceive research animals as "sacrifices to the greater good of—or pressing need for—scientific discovery," Curzan says.

The Oxford English Dictionary reports that sacrifice was first used to refer to killing research animals in 1903. In a research paper for the Journal of Physiology, Liverpool scientists C.S. Sherrington and E.E. Laslett drily report, "Animal sacrificed 30 days after the 2nd lesion." Even though sacrificed appears 12 times in their paper, it's never defined—suggesting that the word was so common in science that Sherrington and Laslett didn't need to explain what they meant. More than a century after this 1903 paper, a search for "sacrificed" on the research database PubMed pulls up 35,627 results. On Google Scholar, the phrase "rats were sacrificed" brings up about 68,100 results, while "mice were sacrificed" garners 108,000. For modern scientists, sacrifice is as much a part of our vocabulary as model, control, and theory.
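
Curious readers can reproduce those counts themselves. Here is a minimal sketch, in Python, of how one might query PubMed for the number of papers matching "sacrificed," using NCBI's E-utilities through the third-party Biopython package. (The email address is a placeholder, and today's hit count will, of course, have drifted from the 35,627 reported above.)

    # Count PubMed records matching "sacrificed" via NCBI's E-utilities.
    # A sketch assuming the Biopython package (pip install biopython).
    from Bio import Entrez

    Entrez.email = "you@example.com"  # NCBI asks for a contact address

    # esearch reports the total number of matching records in "Count".
    handle = Entrez.esearch(db="pubmed", term="sacrificed")
    record = Entrez.read(handle)
    handle.close()

    print(f'PubMed records matching "sacrificed": {record["Count"]}')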

Which isn't to say it'll stay there. Words, like science and religion and every organism on this planet, are constantly changing. But now that we know where sacrifice has been, can we figure out where it's going?

One hint is sacrifice's diminutive slang: sack. Sack is an unsurprising, inevitable form of sacrifice, says McWhorter, because "part of something being used a lot is you shorten it." Some gallows humor might also be involved, given sack's slang meaning of firing someone. And this dark humor and informality might explain why Google Scholar yields no results for "rats were sacked" and only 26 for "mice were sacked."

As a secular scientist, I don't like sack. I can understand why, as McWhorter suggests, scientists might use sack to distance themselves emotionally from killing research animals. But I don't think scientists should hide from the emotions and ethics and quandaries of animal research. I plan to keep using the word sacrifice, with its religious roots and evolving implications, so long as scientists continue to reluctantly kill animals for the greater good of human understanding and medicine.

July 6 2015 2:58 PM

The Incredible Shrinking Zeitgeist: How Did This Great Word Lose Its Meaning?

Not long ago, the New York Times crowned Tyler Brûlé, a sleekly sophisticated design mogul, “Mr. Zeitgeist.” But the throne was occupied: A different NYT piece had already declared Marie Antoinette queen of the ever-shifting zeitgeist. Before her, the paper had proclaimed Rosa Parks a “zeitgeist warrior,” and ascribed zeitgeist-whispering powers to Peaches and Pixie Geldof, Bionic Woman, the phrase “bonuses are back” (BAB), and Al Gore. The feistiest recent use of the term comes courtesy of Lindsay Zoladz, who compared Amy Schumer to “a comet streaking gloriously across the Zeitgeist, leaving a tail of smudged mascara and Fireball aftertaste in her wake.”

For a wisp of language compounded of ghostliness and time, the zeitgeist is sure making its presence felt. But what exactly do people mean when they invoke it today? A prevailing opinion about kale chips? Backlash against a guacamole recipe? How did the word zeitgeist come to feel so small?

A zeitgeist used to be a formidable thing. Matthew Arnold coined the term in 1848 to capture the spirit of social unrest that suffused Victorian England. In 1933, Aldous Huxley wrote in a letter that the zeitgeist “is a most dismal animal and I wish to heaven one cd escape from its clutches.” Implored W.H. Auden: “May we worship neither the flux of chance, nor the wheel of fortune, nor the spiral of the zeitgeist.” This threatening creature was capricious in its moods and careless about tradition. It was sinister—powerful enough to convince individuals that they were not responsible for their own choices, that they were merely carried along by the romantic gust of the now. In Bismarck’s Germany, the terrifying phantom of volk nationalism absolved people of any need to resist the pull of consensus and think for themselves. But if we used to talk about the Roaring ’20s or the Flower Power ’60s, great sweeps of history distilled into luminescent symbols, now we get a 5-minute-long zeitgeist consisting of, say, TV shows about white girls in Brooklyn. Somehow, the zeitgeist—once so historical and grand—has become an anemic, trivial little sprite.

The media coverage of Girls was arguably the first harbinger of our zeitgeist-saturated zeitgeist. “So zeitgeisty it hurts,” wrote Alexandra Petri of the show in 2012, because zeitgeists, like mirrors, can be cruel. Creator Lena Dunham was hailed/slagged as the “zeitgeist queen,” “the maven of the millennial zeitgeist,” a zeitgeist devourer, a zeitgeist seizer, a “zeitgeist figurehead,” and the darling of “media outlets desperate to ride the zeitgeist.” Since then, the term has continued to spiral. In 2013, Joseph Burgo wrote that Lady Gaga’s song Born This Way speaks to the “anti-shame zeitgeist”—but the zeitgeist was also accused of dabbling in fat-shaming and slut-shaming. Kanye vowed to “pop a wheelie on the Zeitgeist” in 2014. And now we are drowning in zeitgeist: Mindy Kaling said last month of Parks and Rec, “Because it’s zeitgeisty, it would be considered a hit [on cable].” (What if zeitgeist just equals “a raft of things that would be a hit on cable”?) We are supposed to believe that the contemporary zeitgeist is full of schadenfreude. And iPhones. And cheese.

On Twitter, a hoverboard might ride the crest of the zeitgeist and you may think you are original until you realize that “you’re just another spirit in the zeitgeist’s realm.” Elle magazine has its very own “zeitgeist” rubric, which entails articles on everything from the “ultimate girl crush” to “11 times Taylor Swift looked exactly like an emoji” to “6 things women should totally stop apologizing for” (including, perhaps, looking like an emoji).

Clearly this is a time in which we are constantly in the throes of some swift, consuming moment—an age of obsession, wherein we get “hysterically excited about very good but not hugely original cultural products seemingly every other month,” as Willa Paskin put it in Slate last year. Serial. True Detective. High Maintenance. There is always some new fad to point to, as totalizing as it is transient. So zeitgeist feels like a useful word for capturing this overwhelming illusion of cultural consensus. But what we are calling zeitgeisty these days has none of the inclusive significance of a true movement. From within our media bunkers, we imagine that the entire world is as transfixed by The Latest Thing as we are, but all we’ve done is type #BroadCity into Twitter.

Either the mini-zeitgeist guts our perception of just how diverse the culture is, or it offers us little incentive to care. The modifier zeitgeisty used to mark something out as widespread, a unifying force (Sgt. Pepper was zeitgeisty)—now it serves as a password into specific echelons of cool. The word often refers to the niche-ily aspirational: Marie Kondo, or Pilates. It’s as if, since we can’t knit our fractured universe of tastes back together again, we’ve settled for paying lip service to the choicest fare.

Science-fiction writer William Gibson has claimed that the present moment is defined by “atemporality,” a “new and strange state of the world in which, courtesy of the Internet, all eras seem to exist at once.” The contemporary zeitgeist, then, has to do with not having a zeitgeist, or having an infinite number of zeitgeists, an undifferentiated Gesundheit of zeitgeists. This notion is hardly peculiar to 2015—in his 1910 tract The Spirit of Romance, Ezra Pound intoned that “all ages are contemporaneous.” If that’s true, surely the way to honor such simultaneity is not to parse it into fingernail slivers of fleeting obsession. Instead, let’s reserve this term for those startling, rare moments of clarity when an entire culture rises up as one, to support civil rights or condemn bigotry or mourn the dead. Either that, or zeitgeist should just give up the ghost.

June 29 2015 3:17 PM

Documenting the Diversity of American English

“Gas is really expensive anymore.”

“He’s in school in Boston—so don’t I.”

“I need me a salad.”

To a high school English teacher or self-styled grammarian, the above sentences are likely cringe-worthy. To most native speakers of English, in fact, they would sound either inelegant or incorrect. Why then, depending on where in North America you live, are they a part of normal, everyday speech?

June 26 2015 12:02 PM

R-E-S-P-E-C-T, Find Out What It Means to Scalia

Words may have lost all meaning to the Supreme Court, as Antonin Scalia suggested Thursday in his dissent from the King v. Burwell decision to uphold health care subsidies, but there’s one word that has a meaning quite particular to the Supreme Court: respectfully. It is a long-standing tradition that Supreme Court dissents often conclude with the gracious words “I respectfully dissent.” So it was taken as a grave sign of incomity in 2000 when Ruth Bader Ginsburg concluded her stinging minority opinion in Bush v. Gore with a bare “I dissent,” the Supreme Court’s equivalent of a glove slap to the face. On Thursday, Scalia likewise eliminated respect from his King v. Burwell dissent, concluding his torrent of outrage with “I dissent.” (Even though Scalia thinks that today's marriage equality decision was a descent into "the mystical aphorisms of the fortune cookie," that was not enough to provoke an "I dissent" from him, signaling that same-sex marriage might not upset him quite so much as health care subsidies.)

I wondered how often such disrespectful dissent occurred in the hallowed halls of the Supreme Court. Law professor Stephanie Tai helpfully pointed me to a Harvard Law Review note, “From Consensus to Collegiality: The Origins of the ‘Respectful’ Dissent,” that charts the history of dissents both respectful and less so. The convention has only been around for 50 years, but in that time it has become remarkably durable, making departures from it all the more striking.

The note, by New York attorney Chris Kulawik, takes its inspiration from ordinary language philosopher J.L. Austin’s theory of speech acts. In How to Do Things With Words, Austin discussed certain speech acts as “performative utterances,” statements that draw their meaning not just from the semantics of their words but from the social context in which the words are uttered. The classic example is “I do” in a marriage ceremony, which has a network of implications and commitments far beyond what the two words would mean in any other context. Another example would be when Scalia referred to Ginsburg as “Goldberg” the other week, a seeming slip of the tongue to which some imputed more sinister significance. Likewise with “I respectfully dissent” and “I dissent,” which in the patois of the Supreme Court take on the implications of a courteous response and a furious retort, respectively.

In the early history of the court, dissents were polite, defensive, and even apologetic, stressing the focus on consensus. “In any instance where I am so unfortunate as to differ with this Court,” Justice Bushrod Washington pleaded in U.S. v. Fisher (1805), “I cannot fail to doubt the correctness of my own opinion. But if I cannot feel convinced of the error, I owe it, in some measure, to myself, and to those who may be injured by the expense and delay to which they have been exposed, to show, at least, that the opinion was not hastily or inconsiderately given.” By the 20th century, dissent had become enough of a norm not to require such hand-wringing, though it was still comparatively rare. But by 1950, dissent was both common and far more prominent: Dissents were not just expressions of disagreement but judicial statements. With the Warren Court taking on ever more divisive social issues, the court tried to mitigate its own divisions by embracing a “norm of collegiality,” embodied by the respectful dissent.

[Chart by Chris Kulawik]

The “respectful” dissent as we know it today emerged on the Warren Court in 1957, especially in the opinions of Charles Whittaker, followed closely by other justices. As with its immediate predecessors, “The respectful dissent is the dominant speech act of the Roberts Court,” being used in 70 percent of dissents. The remainder either have no dissenting speech act whatsoever, or, more rarely, contain what Kulawik terms “assertive dissents,” which “withhold respect where convention requires it.” Of the years 2005–09 that the note covers, most justices hewed to the respect norm, topped by the studiously polite David Souter and the newest appointee Sonia Sotomayor.

[Chart by Chris Kulawik]

We recall that Ruth Bader Ginsburg performed a notorious nonrespectful dissent in Bush v. Gore. Yet a closer look reveals this to be typical behavior of the atypical Ginsburg: She never respectfully dissents. She believes the respectful dissent to be disingenuous when “you’ve shown no respect at all.” Ginsburg also disputes the intrinsic significance of her assertive dissent in Bush v. Gore. She would rather, it seems, be analyzed on substance than performance; she consciously omitted “I dissent” from her otherwise excoriating Hobby Lobby dissent. 

Ginsburg’s bluntness separates her from the other judges, whose performative dissents require more interpretive argle-bargle. Most justices reserve the assertive dissent for the most controversial and consequential of cases. John Paul Stevens, usually quite respectful, used “I emphatically dissent” in his Citizens United dissent, his sole assertive dissent. In the 2007 Parents Involved in Community Schools desegregation case, Breyer concluded, “I must dissent.” As Kulawik writes, “an assertive dissent is ultimately an act of protest, a signal from one Justice to the world at large that the majority opinion does not deserve legitimation — that the majority has acted impermissibly and produced significant costs for political society.” Under this interpretation, it is Scalia who protests the most. He ties Breyer for assertive dissents, used in his Defense of Marriage Act (U.S. v. Windsor) and Guantánamo (Boumediene v. Bush) dissents, but he is vastly stingier with his respectful dissents, marking him as the most substantively disrespectful of the justices if we disqualify Ginsburg, who thinks the whole business meaningless. (Perhaps this helps explain why Scalia and Ginsburg have somehow managed to remain friends despite agreeing more frequently over Italian opera than they ever do in English.) 
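
To make the taxonomy concrete: here is a toy sketch, in Python, of how one might bucket dissent closings into the note's three categories. (This is an illustration of the categories, not Kulawik's actual methodology, and the sample closings are hypothetical.)

    import re

    # Kulawik's three buckets: respectful, assertive, or no speech act.
    RESPECTFUL = re.compile(r"\bI\s+respectfully\s+dissent\b", re.IGNORECASE)
    # Assertive dissents withhold "respectfully": "I dissent,"
    # "I emphatically dissent," "I must dissent."
    ASSERTIVE = re.compile(r"\bI\s+(?:emphatically\s+|must\s+)?dissent\b",
                           re.IGNORECASE)

    def classify_closing(closing: str) -> str:
        """Label the closing line of a dissent by its speech act."""
        if RESPECTFUL.search(closing):
            return "respectful"
        if ASSERTIVE.search(closing):
            return "assertive"
        return "none"

    for line in ["I respectfully dissent.", "I emphatically dissent.",
                 "I dissent.", "For these reasons, I would reverse."]:
        print(f"{line!r} -> {classify_closing(line)}")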

Like the philosophies of W.V.O. Quine, Wilfrid Sellars, and the later Wittgenstein, Austin’s philosophy was a rejoinder to the logical positivist idea of meaning, which posited that meanings of statements could be specified atomically and precisely. Perhaps surprisingly, Scalia’s textualist philosophy invokes this holistic and sometimes squishy perspective: “Words, like syllables, acquire meaning not in isolation but within their context,” he wrote in K Mart v. Cartier (1988), uncannily echoing Jacques Derrida’s deconstructionist maxim, “Il n'y a pas de hors-texte” (“there is nothing outside the text”). “Context always matters,” Scalia repeated in his King v. Burwell dissent. Such linguistic indeterminacy may be correct (I happen to think it is), but it makes Scalia’s supposedly precise textualist philosophy no less vague than the philosophy of an evolving “living Constitution” to which Ginsburg subscribes. Perhaps this is what Scalia meant when he said of Ginsburg, “She's a really good textualist.” In that context, textualism, I respectfully submit, is just another act of interpretive jiggery-pokery.
