
The Kids Are All Right

Have we made our children into moral monsters?

A child makes reasoned judgments.


We’re in the middle of a moral panic—a panic over morals. An op-ed posted Monday in the New York Times warns that our children don’t believe in moral facts. They don’t get that things are right or wrong; they don’t accept the inner truth of values. Should a person lie or cheat or kill to get ahead? Kids today, these kids today, they couldn’t say for sure. I mean, aren’t ethics just, like, a matter of opinion? Everyone totally gets to have their own opinions, amirite?

This Village of the Damned scenario, laid out by Fort Lewis College philosopher Justin McBrayer, has clearly struck a chord. As of Thursday, it sat among the Times’ five most emailed, viewed, and shared articles of the week, and had inspired more than 1,800 comments. McBrayer’s most alarming claim—the one that pushed his essay into Facebook feeds and Twitter timelines—tells us how the nation’s youth came to be so dissolute. If millennials haven’t learned respect for ethics, it’s because we grown-ups let them down.

Where’s the evidence that we’re giving children lax and jumbled moral guidance? McBrayer starts by looking at his undergrads: “As a philosopher, I already knew that many college-aged students don’t believe in moral facts.” He may not have much data to back up this point, but it’s obvious to him—and to his colleagues—that “the overwhelming majority of college freshmen in their classrooms view moral claims as mere opinions that are not true or are true only relative to a culture.”

So that’s the social problem, the fact that needs explaining. How did college students get so wishy-washy? Have their values gotten lost in the fog of postmodernism that blankets modern universities? McBrayer doesn’t think so. His students seem to lack a moral vision from the day they get to campus, and besides, professors have been espousing moral relativism since the pre-Socratic age. No, if there’s something different about these kids today—and there’s surely something different—then it must be coming from another source. But where?

McBrayer finds his answer in a visit to a second-grade open house. He learns that his son is being taught in school to spot the difference between “facts” and “opinions.” According to the Common Core standards, students must learn that any given statement can be classified either as a factual matter, based on evidence or observation, or as a subjective belief.* That sounds OK at first, but McBrayer argues that it’s a false dichotomy, since not all facts are strictly provable, and opinions can be formed from evidence and observation. (Among philosophers, the proper boundaries of these terms have been the subject of a long debate.)

But this confusion of categories isn’t just misleading, says McBrayer; it’s dangerous. The public school curriculum instructs children that any moral claim whatsoever—from thou shalt not kill on down the line—should be taken as “opinion,” a paltry product of belief, with no special purchase on the truth. Here’s the “punchline” of this lesson plan, as he puts it: “There are no moral facts. And if there are no moral facts, then there are no moral truths.” If we teach our kids that when it comes to ethics, every view is valid, then we’ll fill their brains with “doublethink” and lock them in a labyrinth of moral relativism. “It should not be a surprise that there is rampant cheating on college campuses,” McBrayer writes. “[W]e’ve taught our students for 12 years that there is no fact of the matter as to whether cheating is wrong.”

So that’s the argument. And here’s an opinion: It’s total crap. If you subject McBrayer’s argument to the sort of claims-testing that is routinely taught to second-graders—i.e., if you assess it using evidence and observation—the essay falls apart. There’s no evidence that college students are any less morally resolute today than they were in years past. There’s no evidence that public school lesson plans are changing how we think (and besides, the Common Core hasn’t been around that long). There’s no evidence that moral relativism, McBrayer’s bugbear, causes “rampant cheating,” or indeed any other substantial harm. In fact—in fact—most of the evidence goes the other way.

The problem starts with McBrayer’s mode of inquiry. To establish his baseline claim, that modern college students embrace moral relativism more than ever, he relies on the philosopher’s favorite tools: anecdote and intuition. He noticed that his students don’t hold absolutist moral views, and he’s heard the same from friends in his department. Are there any data to support his claim? Nope, not at all, not important, he doesn’t care.

But here’s the thing: There are some data on this question, and they contradict McBrayer’s argument. James R. Beebe of SUNY–Buffalo is an “experimental” philosopher—a scholar who believes that fundamental questions about the human mind should be subject to something more than armchair theories and water-cooler conversation. That’s why he surveyed 2,500 people in the Buffalo area, from seventh-graders to senior citizens, to figure out how their moral judgments varied with their age. Beebe wasn’t looking at what we think is right or wrong, but rather how we think about those values—do we take them as objective facts or understand them as matters of opinion? (Scholars call this question one of “meta-ethics.”)

Here’s how Beebe’s study worked. Each subject got a set of statements to assess. Some were factual in nature—“New York City is further north than Los Angeles,” for example, or “The Earth is only 6,000 years old.” (The latter statement isn’t true, but it’s still a statement of fact.) Other statements were matters of taste or preference, such as “Classical music is better than rock music.” And still others were ethical claims, like “Hitting someone just because you feel like it is wrong.” Then Beebe asked his subjects to consider that someone else might disagree with them about those statements. “Is it possible for both of you to be correct,” he asked, “or must one of you be mistaken?”

In other words, Beebe wanted to know if people thought these statements were grounded in a deeper truth, or if they were a matter of opinion. His study is not yet published, but he made two important findings: First, subjects’ views about the objectivity of moral claims varied from one statement to another. When it came to statements about racial discrimination, for example, people were almost universal in their moral certitude: Don’t be racist appeared to be a bedrock value, not a matter of opinion. Subjects wavered on other statements, though. Don’t desecrate the American flag struck about half as a moral truth and the other half as being subject to debate.

This is not a brand-new finding. Experimental philosophers have been measuring meta-ethics for about 10 years now, and their work has converged on the notion that our intuitions change with context, and that any given person might be a moral relativist about one thing (the wearing of headscarves, let’s say) and objectivist about another (such as female circumcision). That may not surprise you, but it runs against an old convention in philosophy that frames these mindsets as if they’re absolute. This outdated point of view may explain McBrayer’s doctrine—undoubtedly false—that the majority of college students “view moral claims as mere opinions.” No, they’re moderates like the rest of us: They see some moral claims as mere opinions, and others not so much.

It’s possible that college students have a certain bias to see moral claims as matters of opinion; i.e., they might be somewhat less likely than the rest of us to judge any given claim as fact. Indeed, Beebe found that people ages 17 to 29 were especially inclined to rate ethical statements as being subjective, compared with younger and older groups. It’s not a huge effect—maybe a 10 percent difference—but it’s there.

In one sense, this second finding confirms McBrayer’s argument: College students really are a bunch of relativists! But they’re the special case; the younger kids in Beebe’s study—the 12-to-16-year-olds who would soon be entering college—turn out to be just as objectivist as the fuddy-duddy retirees. So we can’t really say that “kids these days” have looser morals than they used to, or that they’re being trained to think as relativists and then those views get packed off with them to college. Instead, they happen to be turning into relativists at about the same time that they get to college.

As Beebe points out, this is neither a new idea nor a new phenomenon. In the 1950s and 1960s, the psychologist William Perry conducted a study of students’ intellectual development at Harvard and concluded that they progress through nine phases of thinking, of which the middle three are steeped in moral relativism. About the same time, another Harvard psychologist, Lawrence Kohlberg, devised a theory of moral development in which people progress toward increasingly universal ideas of right and wrong. But the process hiccups about the time we choose a major, Kohlberg observed in 1969. At some point between junior year of high school and junior year of college, a significant number of the students in his sample seemed to regress into “hedonic relativism, jazzed up with some philosophic and sociopolitical jargon,” he wrote. “[T]he college sophomore turns out to be the oddest and most interesting moral fish of all.”

This will all ring true for anyone who’s ever spent a late night smoking weed and ranting to his college roommates about the nature of reality. (I’m looking at you, “Russ” from the second-floor suite!) In a follow-up study, also unpublished, Beebe found the same, short-lived burst of moral relativism among people in their late teens and early 20s who live in China, Ecuador, and Poland. That’s just more evidence that McBrayer has it wrong. Today’s crop of college students in Durango, Colorado, may be spouting off about how nothing’s real, man—but college students very likely pulled the same routine 10 years ago, and 20 years ago, and 50 years ago. They seem to do the same today in Shandong, Quito, Wroclaw, and lots of other places where the wanton Common Core standards do not apply.

McBrayer’s piece—slim on facts, morbidly obese on opinions—takes it as a given that all this campus relativism is something new, and that it reflects a cohort effect particular to the millennial generation. We can make the reasoned judgment (another category taught to public school kids) that he’s wrong on that front. But let’s not ignore his frantic, underlying message, and the moral weight behind the piece. If it were true that kids today didn’t believe in moral facts, would he be right that we should panic?

This question appears to put us in a thicket of debate between philosophers who identify as moral realists and those who take an “anti-realist” position. This, again, is meta-ethics: Are the things we consider morals natural properties? Can we glean them from the world around us in the same way that we see a certain shape and glean that it’s a square? Or are morals more derived than essential, more subjective than objective?

McBrayer appears to be a moral realist, or at least he’s defended the realist position in his published papers. He’s also an avowed Christian who says that belief in God makes people happier, healthier, longer-lived, and more faithful to their partners. (For these claims, at least, he appeals to empirical research.) He appears to think that moral realism is good, that a belief in moral facts makes us better people and the world a better place. That’s his opinion, and he’s welcome to it. Everyone is entitled to his own opinion!

We’re also entitled to the facts. As it happens, psychologists and experimental philosophers have looked at this as well, and some of the evidence supports McBrayer’s intuition. A few years ago, for example, researchers at Boston College arranged to have someone canvass for donations to a children’s charity near subway stations. The canvasser wore a vest and carried a binder, and said the same things to every passerby, except for one sentence. He asked some people, during the pitch, “Do you agree that some things are just morally right or wrong, good or bad, wherever you happen to be from in the world?” Of others, he asked, “Do you agree that our morals and values are shaped by our culture and upbringing?” Then the authors counted up donations from people in the two conditions, and discovered that members of the former group—the ones who had been primed to think of moral absolutes—were twice as likely to contribute.

For another recent study, conducted at UCLA, researchers presented students with relativistic and objectivist accounts of female genital mutilation, then asked them to roll a pair of dice and report the result in exchange for tickets to a raffle. (The higher the roll, the more chances they would have to win.) In the end, students who had been primed to think of morals as negotiable ended up reporting higher dice rolls—a clear sign that they were cheating. Other research, using standard questionnaires to assess people’s moral outlooks, has found that moral relativism correlates with willingness to switch price tags at a store, or to lie about a child’s age to get a kiddie price, or to copy software illegally.

McBrayer may be onto something: If you put relativist ideas into people’s heads, they do get a bit nihilistic, and in general, relativist worldviews are indeed associated with at least some minor misbehaviors. But there’s another batch of research that complicates the story. The psychologists Geoffrey Goodwin and John Darley have found that more objectivist moral thinkers tend to be more closed-minded. When confronted with a disagreement, they’re not inclined to understand the other person’s point of view. They’re less likely to say they’d want to have that person as a roommate, or to adjust their own opinions through discussion and debate. Goodwin and Darley even found that moral realists perform worse on certain kinds of logic puzzles that require thinking through alternative scenarios. (Here’s one—the five-block task.)

So moral relativism makes us more corrupt, but it also keeps us open-minded; moral objectivism keeps us on the straight and narrow, but it also breeds intolerance. Is one of these outcomes clearly better than the other? McBrayer seems to think so: He believes that moral facts will keep us safe; he’s more afraid of nihilism than intolerance. For a second opinion, I called my friend Joshua Knobe, an experimental philosopher at Yale who has worked on related questions. He thinks there’s no more reason to be fearful of moral relativism than there is to think that atheists are untrustworthy because nothing compels them to be good people. “I think there’s a lot of reason to be terrified,” he said, “not of the [relativistic] people who end up being miscreants, but of the people who are so full of conviction about whatever religious views they hold, and they’re absolutely convinced theirs is the only right way to think.” You know, like the types who might join ISIS.

But we needn’t choose up sides in this debate. Maybe you’re like Knobe, and intuit that a belief in moral facts does more harm than good; maybe I’m like McBrayer, and inclined to think the opposite. Are we making claims of fact, or claims of moral rightness? If these are moral claims, then must one of us be correct, or would it be OK that we hold diverging views? Wait, bro, are you saying that we could be, like, meta-relativists? Thass crazy!

It’s not that crazy. In spite of McBrayer’s warnings in the New York Times, no one learns to be a sworn adherent to a given moral view. Instead, almost all of us are somewhere in the middle, believing both in moral facts and moral opinions, depending on how old we are, and how we’re asked the question, and what some guy in a vest happened to say to us outside the subway station. “All the data show that it’s not right to say that people are relativists or objectivists,” Knobe told me. “The answer is, it’s complicated.”

That phrase—it’s complicated—appears to be a key insight of Knobe and his colleagues’ approach to philosophy. For years, scholars took it as a given that regular folks believed in moral facts, that objectivism is a natural state that can be overcome only by those with enough sophistication. In the past few years, we’ve come to understand that a “normal person” has far more subtle views about the distinction between facts and opinions.

The same lesson applies outside of meta-ethics. Consider last week’s infernal multicolored dress: Some people saw white and gold, others blue and black; the disagreement blew our minds. (It’s as if we all were color realists, until a single image made the case for color anti-realism.) Philosophers, for their part, have been color realists, too. Some have even used color as an analogy for moral truth; both are objective properties, they say, that we perceive and understand. But is that really true?

In 2010 a pair of philosophers tested people’s intuitions about color. They asked 31 undergrads to imagine that an alien comes down to Earth and gets into an argument. In one version of the story, the alien disagrees with an Earthling about whether a compact disc is round or triangular. In another, the alien disputes that a ripe tomato is red instead of green. In a third, the alien argues that a certain food tastes bad instead of good. In each of these cases, the students had to say whether the alien was right or wrong, or if there was no correct answer to the question.

The results were interesting. Students felt that taste—delicious versus disgusting—was of course a matter of opinion. (Ninety-nine percent agreed.) On the other hand, shape—circular versus triangular—seemed more like an objective question with right and wrong answers. (Sixty-nine percent agreed.) And for color—red versus green—the students were about evenly split. Half were color realists, and the other half were anti-realists.

Those data tell us, once again, that these questions don’t have simple answers. College freshmen may come off like fools, doubting every moral universal, but perhaps we ought to think of their displays as a form of practice. They’re learning how to weigh the evidence and challenge their beliefs. That’s a skill worth having, whether you’re an undergraduate or a writer for the New York Times.

*Correction, March 9, 2015: This article originally misstated that Common Core is a curriculum. It is a set of educational standards.