Lexicographers have always understood this. As Johnson wrote, “to enchain syllables, and to lash the wind, are equally the undertakings of pride, unwilling to measure its desires by its strength.” And so lexicographers have always decided what goes into a dictionary by paying attention to the way people use words. In resigning themselves to this role they are, as John McIntyre once suggested, acknowledging the wisdom of Thomas Carlyle’s reply to Margaret Fuller’s declaration “I accept the universe”: “Gad! She’d better.”
And yet when Macdonald was writing in The New Yorker half a century ago, he fulminated against the policy of Webster’s Third of defining words as most people use and understand them, such as listing “nauseated” as the primary sense of nauseous. (Note to readers under the age of 75: According to an old prescriptive rule, nauseous may be used only to mean “nauseating.”) Even if nine-tenths of the citizens of the United States were to use a word incorrectly, Macdonald declared, the remaining tenth would be correct—he did not say by what criterion or on whose authority—and the dictionary should back them up. But a dictionary that followed Macdonald’s advice would be as useless in practice as the Hungarian-English phrasebook in the Monty Python sketch that translated “Can you direct me to the train station?” as “Please fondle my bum.”
Although dictionaries are powerless to prevent linguistic conventions from changing, this does not mean, as dichotomists fear, that they cannot state the conventions in force at a given time. That is the rationale behind the American Heritage Dictionary’s Usage Panel—which I chair—a list of 200 authors, journalists, editors, academics, and other public figures whose writing shows that they choose their words with care. Every year they fill out questionnaires on pronunciation, meaning, and usage, and the Dictionary reports the results in Usage Notes attached to entries for problematic words, including changes in repeated ballotings over the decades. The Usage Panel is meant to represent the virtual community for whom careful writers write, and when it comes to best practices in usage, there can be no higher authority than that community.
The powerlessness of dictionaries to freeze linguistic change does not mean that they are doomed to preside over a race to the bottom. Macdonald worried that the dictionaries of 1988 would list without comment the solecisms mischievious, inviduous, and nuclear pronounced as “nucular.” We now have an additional quarter of a century to test his predictions. Look them up.
And now we come to the biggest and most bogus controversy of them all. The fact that many prescriptive rules are worth keeping does not imply that every pet peeve, bit of grammatical folklore, or dimly remembered lesson from Miss Grundy’s classroom is worth keeping. Many prescriptive rules originated for screwball reasons, impede clear and graceful prose, and have been flouted by English’s greatest writers for centuries. The most notorious is the ban on split verbs (including split infinitives), which led Chief Justice and grammatical stickler John Roberts to precipitate a governance crisis in 2009 when he unconsciously edited the oath of office and had Barack Obama “solemnly swear that I will execute the office of president to the United States faithfully” (rather than “faithfully execute,” the wording stipulated in the Constitution). Bogus rules, which proliferate like urban legends and are just as hard to eradicate, are responsible for vast amounts of ham-fisted copy editing and smarty-pants one-upmanship. Yet when language experts try to debunk the spurious rules, the dichotomizing mindset imagines that they are trying to abolish all standards of good writing. It is as if anyone who proposed repealing a stupid law, like those on miscegenation or Sunday store closings, were labeled an anarchist.
What about those who say that correct usage is really a membership card for the ruling classes? In earlier centuries there was some truth to this notion, as Hitchings documents in his engaging history, but today that would be a stretch. Define the 1 percent however you want—the upper echelons of commerce, government, culture, academia, even the British royal family—and you’d be hard-pressed to argue that they are paragons of correct usage and good style. For quite some time now the language connoisseurs have been schoolteachers, writers of letters to the editor, and ink-stained wretches on Grub Street (and their digital descendants).
* * *
Standards of usage, then, are desirable, even if all of them are arbitrary and mortal and many of them are spurious and discardable. And yet this understanding, widely shared among knowledgeable writers on language, is no match for a good dichotomy—particularly when it furnishes the narrative for an extended snark. Which brings us to the other bookend in The New Yorker’s participation in “the language wars,” in which Joan Acocella tortures quotations from several writers so she can mock them as prescriptivist toffs or descriptivist bohos.
The stereotyping begins with her treatment of Henry Watson Fowler, author of the 1926 classic A Dictionary of Modern English Usage, and his campaign against “genteelism” in writing.
Fowler defines “genteelism” as “the substituting, for the ordinary natural word that first suggests itself to the mind, of a synonym that is thought to be less soiled by the lips of the common herd, less familiar, less plebeian, less vulgar, less improper, less apt to come unhandsomely betwixt the wind & our nobility.” As is obvious here, Fowler was dealing not just with language but with its moral underpinnings, truth and falsehood. To many people, he seemed to offer an idealized view of what it meant to be English—decency, fair play, roast beef—and to recommend, even to prescribe, those things.