The Dismal Science

Can You Train Business School Students To Be Ethical?

The way we’re doing it now doesn’t work. We need a new way.


A few years ago, Israeli game theorist Ariel Rubinstein got the idea of examining how the tools of economic science affected the judgment and empathy of his undergraduate students at Tel Aviv University. He made each student the CEO of a struggling hypothetical company, and tasked them with deciding how many employees to lay off. Some students were given an algebraic equation that expressed profits as a function of the number of employees on the payroll. Others were given a table listing the number of employees in one column and corresponding profits in the other. Simply presenting the layoff/profits data in a different format had a surprisingly strong effect on students’ choices—fewer than half of the “table” students chose to fire as many workers as necessary to maximize profits, whereas three-quarters of the “equation” students chose the profit-maximizing level of pink slips. Why? The “equation” group simply “solved” the company’s problem of profit maximization, without thinking about the consequences for the employees they were firing.

Rubinstein’s classroom experiment serves as one lesson in the pitfalls of the scientific method: It often seems to distract us from considering the full implications of our calculations. The point isn’t that it’s necessarily immoral to fire an employee—Milton Friedman famously claimed that the sole purpose of a company is indeed to maximize profits—but rather that the students who were encouraged to think of the decision to fire someone as an algebra problem didn’t seem to think about the employees at all.

The experiment is indicative of the challenge faced by business schools, which devote themselves to teaching management as a science, without always acknowledging that every business decision has societal repercussions. A new generation of psychologists is now thinking about how to create ethical leaders in business and in other professions, based on the notion that good people often do bad things unconsciously. It may transform not just education in the professions, but the way we think about encouraging people to do the right thing in general.

At present, the ethics curriculum at business schools can best be described as an unsuccessful work-in-progress. It’s not that business schools are turning Mother Teresas into Jeffrey Skillings (Harvard Business School, class of ’79), despite some claims to that effect. It’s easy to come up with examples of rogue MBA graduates who have lied, cheated, and stolen their way to fortunes (recently convicted Raj Rajaratnam is a graduate of the University of Pennsylvania’s Wharton School; his partner in crime, Rajat Gupta, is a Harvard Business School alum). But a huge number of companies are run by business school grads, and for every Gupta and Rajaratnam there are scores of others who run their companies in perfectly legal anonymity. And of course, there are the many ethical missteps by non-MBA business leaders—Bernie Madoff was educated as a lawyer; Enron’s Ken Lay had a Ph.D. in economics.

In fact, the data suggest that business schools have no impact whatsoever on the likelihood that someone will cook the books or otherwise commit fraud. MBA programs are thus damned by faint praise: “We do not turn our students into criminals” would hardly make for an effective recruiting slogan.

If it’s too much to expect MBA programs to turn out Mother Teresas, is there anything that business schools can do to make tomorrow’s business leaders more likely to do the right thing? If so, it’s probably not by trying to teach them right from wrong—moral epiphanies are a scarce commodity by age 25, when most students enroll in MBA programs. Yet this is how business schools have taught ethics for most of their history, often quarantining it at the beginning or end of the MBA education. When Ray began his MBA classes at Harvard Business School in 1994, the ethics course took place before the instruction in the “science of management” in disciplines like statistics, accounting, and marketing. The idea was to provide an ethical foundation that would allow students to integrate the information and lessons from the practical courses with a broader societal perspective. Students in these classes read philosophical treatises, tackle moral dilemmas, and study moral exemplars such as Johnson & Johnson CEO James Burke, who took responsibility for and responded quickly to the deaths caused by tampered-with Tylenol capsules in the 1980s.

It’s a mistake to assume that MBA students seek only to maximize profits—there may be eye-rolling at some of the content of ethics curricula, but not at the idea that ethics has a place in business. Yet once the pre-term ethics instruction is out of the way, it is forgotten, replaced by more tangible and easier-to-grasp matters like balance sheets and factory design. Students get too distracted by the numbers to think very much about the social reverberations—and in some cases legal consequences—of employing accounting conventions to minimize tax burdens or firing workers in the process of reorganizing the factory floor.

Business schools are starting to recognize that ethics can’t be cordoned off from the rest of a business student’s education. The most promising approach, in our view, doesn’t even try to give students a deeper personal sense of mission or social purpose; it’s likely that no amount of indoctrination could have kept Jeff Skilling from blowing up Enron. Instead, it helps students to appreciate the unconscious ethical lapses that we commit every day without even realizing it, and to think about how to minimize them. If finance and marketing can be taught as a science, then perhaps so too can ethics.

These ethical failures don’t occur at random. Countless experiments in psychology and economics labs and out in the world have documented the circumstances that make us most likely to ignore moral concerns, creating what social psychologists Max Bazerman and Ann Tenbrunsel call our moral blind spots. These blind spots result from numerous biases that exacerbate the sort of distraction from ethical consequences illustrated by the Rubinstein experiment. A classic sequence of studies shows how readily they can arise in something as seemingly straightforward as flipping a fair coin to determine rewards. Imagine that you are in charge of splitting a pair of tasks between yourself and another person. One task is fun, with a potential payoff of $30; the other is tedious, with no financial reward. Presumably, you’d agree that flipping a coin is a fair way of deciding—most subjects do. However, when sent off to flip the coin in private, about 90 percent of subjects come back claiming that the flip assigned them to the fun task, rather than the 50 percent one would expect with a fair coin. Some people simply ignore the coin; more interestingly, others respond to an unfavorable first flip by treating it as “just practice” or deciding to make it two out of three. That is, they find a way of temporarily adjusting their sense of fairness to obtain a favorable outcome.

There are many such examples of what Bazerman and Tenbrunsel would argue are unintentional ethical failings. People fall prey to self-serving bias: an accountant whose future business depends on maintaining the approval of the companies he’s meant to be auditing is genuinely more likely to believe his clients’ books are in order. We discriminate unconsciously against those who aren’t like us, passing them over for promotion or low-balling them in negotiations. And even when we lie, cheat, or steal for personal gain, we often disengage, at least temporarily, from the set of values that would normally lead us to look down on those who lie, cheat, and steal.

These acts are all done, by and large, unthinkingly. In the terminology of Nobel laureate Daniel Kahneman, they’re processed automatically by our System 1 thinking—that is, the thinking that’s driven by intuition and emotion. If we could only engage the System 2 part of our brain, which reasons logically through decisions and appreciates the many biases that plague our intuitions and instincts, we might behave differently. So a first step is equipping business school students (along with future lawyers, doctors, accountants, and probably everyone else) with a basic understanding of these psychological frailties and vulnerabilities; greater self-knowledge is at least the beginning of a solution.

However, mere knowledge of our flaws isn’t necessarily enough to stave off unconscious temptation. As Kahneman notes in his best-seller, Thinking, Fast and Slow, “My intuitive thinking is just as prone to [cognitive errors] as it was before I made a study of these issues.” No one, Kahneman and Bazerman included, is immune to ethical blind spots.

Encouraging people to act ethically, then, can take some ingenuity. But often a minor change can make a big difference. Take something as simple and seemingly irrelevant as where you sign a legal document. These days, signatures verifying that you’ve provided truthful and accurate information usually come at the end of documents like tax returns and insurance claims. Yet according to a recent study, signing a pledge of honesty before filling out a form resulted in half as many misreported expenses as signing at the end. Like a courtroom oath, signing first invites a commitment to ethical principles. Other solutions involve thinking more proactively about removing temptation altogether. Consider an example from the classroom. In MBA programs, professors sometimes like to give take-home, closed-book exams, trusting their students not to peek at their texts as they take the test. But the temptation to cheat and rationalize (everyone is doing it) may simply be too great for too many when given such an easy opportunity. By confining closed-book exams to the classroom, we remove the temptation. Regulation can serve a similar function: the Glass-Steagall Act prevented commercial banks from getting into the investment banking business. With its repeal in 1999, banking CEOs were drawn into using the safe and secure resources of their commercial side for risky speculation, a decision that they no doubt rationalized as profitable in the short run but that ultimately contributed to the financial crisis.

The fundamental problem with partitioning off ethics and values from the rest of the business school curriculum is that students will think about self-serving bias, discrimination, and moral disengagement only so long as they are the focus of classroom conversation, then forget them amid discussions of marketing, finance, and accounting. In addition to greater self-awareness, we need a discussion of the kinds of structural solutions that force people to confront ethics rather than leaving them in the background. This will help our students with their own ethical lapses and help them in their roles as future business leaders.

What we need to do is equip our students to become “moral architects,” able to create environments that naturally lead people—themselves included—in the right direction. Being a moral architect can involve anything from modest organizational changes (like shifting where people sign a document) to more complex ones (like introducing an ethical checklist for all important decisions, in the way that doctors and pilots use checklists to reduce errors and save lives). It also involves knowing when it’s most valuable to remove a temptation in the first place (for example, designing organizations to minimize conflicts of interest).

The only way we’ll get our students to integrate their moral compasses with the practical tools of business we teach them is to incorporate ethics throughout the curriculum. This will require the accounting, finance, and marketing professors to grasp the ethical blind spots inherent in their respective areas, and to recognize approaches to lessening them. Professors, in other words, need to be moral architects themselves.

When you stop and ask students whether they’d like their dying words to be “I maximized profits,” a wave of laughter ripples through the class, as all but the most callous have higher aspirations for themselves. When we ask MBA students why they might want to be a CEO, the first two responses are “I want to make a difference” and “I enjoy a challenge”; “making gobs of money” always comes in third. We need to work harder to equip students to live up to those aspirations. And if we’re not going to make a good-faith effort in this endeavor, perhaps we should remove the discussion of ethics from business schools altogether. Otherwise, it serves merely as empty PR for MBA programs and a salve for the consciences of those who teach in them.