
Workplace Drug Testing Is Intrusive and Ineffective. Why Do Employers Still Use It?


Dec. 27, 2015, 8:00 PM

Why Do Employers Still Routinely Drug-Test Workers?

There’s very little evidence it’s worth the cost or hassle.


Photo illustration by Slate. Photo by Thinkstock.



I was shocked to hear, several weeks ago, that a fellow journalist, who recently started as a research editor at a national magazine, had been asked to pee into a cup. As a condition of employment, this colleague told me, she’d been asked to show up at a laboratory on two days’ notice, lock her possessions in a cabinet, and deliver a sample of clean, drug-free urine into a plastic receptacle with her social security number printed on the side. Why would a research editor need to undergo this screening? “Who knows,” she said. “I guess someone could be high and not check all the facts in an article?”

Daniel Engber

Daniel Engber is a columnist for Slate.

Let me put this out there now, before you’ve read too far: I was not myself drug-tested before publishing this story. Neither was my editor, nor the copy editor, nor any of the designers, programmers, or art editors who worked on it. For all I know, the lot of them—the entire staff of Slate—could be huddled in a conference room right this very minute, passing joints around and shooting dope and snorting PCP. But ours is not the standard workplace in America.


I’m guessing that my incredulity—drug-testing, for real?—might sound out of touch. It’s a charge I can’t refute: Only since I started reporting on this topic did I learn that my acquaintance’s urinalysis was unusual only in our tiny corner of the working world. For the most part, media professionals reside in a private Xanadu, with little effort spent monitoring our drug consumption. Or at least that’s how it seems when I gaze across the chasm at my peers in other industries: According to a recent survey of almost 70,000 working adults from across the United States, 48.2 percent said that their employers performed drug screenings of some kind. What I’d imagined was a relic of the DARE generation, as out of date as scrambled metaphors for drug-induced neuropathy, has never really gone away. On the contrary, drug testing is still widespread.

“Increasing numbers of employers are doing some sort of drug-testing,” says Barry Sample, the aptronymic director of science and technology for the Employer Solutions business unit of Quest Diagnostics. “These days it is rather uniform across many, many employment sectors. Most of the larger corporations, and most—if not all—of the Fortune 500 have some sort of drug-testing.” In all, Sample estimates that some 45 to 50 million workplace drug tests are taken annually in the U.S., making up a massive industry in biomedical HR.


Photo illustration by Holly Allen. Photo by Thinkstock.

The practice has recently begun to creep in new directions. The drug test has long been a condition of employment for a large proportion of America’s workforce; now, more and more, it’s a condition of unemployment benefits, too. In November, lawmakers in West Virginia discussed a bill to drug-test anyone applying for state-controlled welfare programs. Ohio recently held a set of hearings on the same thing. And Wisconsin started screening applicants for job training and food stamps. At least a dozen other states already have such laws in place, and at least a dozen have proposed the same in every year since 2009, according to the National Conference of State Legislatures.

This broad and retro culture of drug testing seems at odds with the growing disengagement from our long and painful War on Drugs. States are legalizing marijuana, and its use is on the rise; politicians now evince broad support for undoing policies that filled our prisons with harmless drug offenders. Yet despite this shift in strategy and realignment of our values, the drug testing of employees—performed at great expense to both the public and private sectors—remains routine.


That might make sense if testing yielded clear benefits to the companies that deploy it or to society at large. But here’s the most distressing fact about drug testing in the workplace: As was the case 30 years ago, testing has no solid base of evidence, no proof that it succeeds. We don’t know if screening workers for recent drug use makes them more productive, lowers their risk of getting into accidents, or otherwise helps maintain the social order. And what positive effects we do understand—there are indeed a few—seem almost accidental. They may not be worth the time and money and intrusion.

In other words, the drug testing of employees isn’t so much a thoughtful labor policy as a compulsive habit. It’s something that we do because we’ve always done it, and we don’t know how to stop. Testing has become a national addiction, and it may be time to taper off.

Like so many excesses of drug culture, screening got its start in the 1960s, when the Department of Defense took urine samples from veterans of Vietnam, to identify abusers and assist their rehabilitation. But the widespread programs that we have today were not conceived until the 1980s, amid the fervor that developed after two putatively drug-related transportation accidents. The details of those incidents are telling.

The first occurred one night in the spring of 1981, when a twin-engine Navy plane crashed into the deck of a nuclear supercarrier, the USS Nimitz, and caused an onboard missile to detonate. Fourteen crewmen died in the fiery disaster, and subsequent autopsies of their bodies showed that six had used marijuana at some point in the preceding 30 days.


This was not a drug-induced disaster, though—or at least THC was not the chemical at fault. An official report on the accident noted that the plane’s three-man crew tested clean for marijuana, and suggested that its pilot might have been impaired instead by cold medicine. But a panic over drugged-out, hippie seamen had already taken flight, and the Navy installed a “get-tough” policy on drugs.

In the years that followed, drug-related safety concerns spread from the military to other branches of government. Starting in 1982, the Federal Aviation Administration paid researchers to swab the mouths of 289 dead private pilots, all victims of fatal accidents, and learned that 2.1 percent of them had recently used marijuana. But the anxiety over worker drug use didn’t stop at airline pilots, or other employees for whom a momentary lapse could mean life or death. Drug testing was soon reimagined as a means of counteracting all sorts of social problems, from inefficiency to moral rot.

By 1986, surveys found that 20 to 25 percent of major American companies had drug-testing programs in place. That September, Ronald Reagan made it official U.S. policy, signing an executive order to counteract the “serious adverse effects” that drugs exert “upon a significant proportion of the national work force,” and ordering the heads of every agency to put in place formal testing programs. His vision for a “drug-free federal workplace” was important, he declared without any formal evidence, because federal employees who use illegal drugs “tend to be less productive, less reliable, and prone to greater absenteeism than their fellow employees.” He claimed they’re also prone to lapses of judgment, subject to blackmail, and a burden to their colleagues who don’t likewise indulge.

The national binge on drug testing had only just begun. In January 1987, an afternoon Amtrak train heading north from Washington, D.C., derailed outside Baltimore, killing 16 passengers and injuring 174. (“It looked like an atomic bomb going off,” said one witness, hitting another panic-button issue of the time.) An investigation found that warning signals that might have prevented the crash had been tampered with, but at least some portion of the blame fell to railroad engineer Ricky Gates, who eventually admitted that he’d been passing a joint back and forth with his brakeman, and that he’d taken “about three hits” by the time the accident took place. An acknowledged alcoholic, Gates had also been out drinking the night before, and apparently had a hangover. He says he skipped safety checks that morning so that he could get through his day a little faster. (Gates himself blames the narcotics, and after serving four years in prison, he took up work as a drug counselor.)


Photo illustration by Holly Allen. Photos by Thinkstock.


A subsequent survey of big businesses later that year found that the drug-testing rate among companies listed on the Fortune 1000 had roughly doubled, to 49 percent. Congress passed a law in 1988 that required the maintenance of a “drug-free workplace” by any company that held significant government contracts or grants, and another federal law, passed in 1991, required drug and alcohol testing of “safety-sensitive” employees in private transportation companies. By 1996, a survey from the American Management Association found that more than 80 percent of its member companies had some form of drug testing, and two-thirds tested all new hires. Over a span of less than 20 years, employee drug testing had become the norm.

While the screening programs multiplied, the scientific case for workplace testing failed to grow in kind. Early laboratory studies showed that acute drug use led to clear impairment—and thus increased the risk of accidents and poor on-the-job performance. But urinalysis—still the most common form of drug testing—doesn’t tell you whether someone is getting high in the office or behind the wheel. It tells you only that he or she may have gotten high at some point in the last few days. An employee who puffs a joint in the evening as he watches TV, but is otherwise alert and conscientious on the job, would still be singled out for discipline or denied a position.

It’s possible that workers who use drugs are less productive on the job, but that’s been hard to prove in practice. More straightforward is the claim that transportation employees who use drugs—guys like Amtrak’s Ricky Gates, perhaps—are a menace on the road. If that’s true, then drug tests might help limit traffic accidents. But according to Scott MacDonald, an addiction researcher and safety expert at the University of Victoria, the data don’t offer much support for widespread testing as it has been implemented. It’s true that people are at greater risk of getting into fatal accidents while they’re high on marijuana, but the fact of having used the drug in recent days or weeks has not been shown to carry independent risk. (Drug-testing programs screen for the latter.) Researchers have tried to link drug screening to reductions in workplace accidents at restaurants, construction companies, and metal foundries, among other industries. Some do find positive effects, but a systematic review of 23 studies, published last year in the journal Accident Analysis & Prevention, found that “the evidence base for the effectiveness of testing in improving workplace safety is at best tenuous.”

Another problem comes from the fact that drug-testing programs don’t usually reflect a rational discrimination among illicit drugs. Not all substances have the same effects, and some may pose dangers that others don’t. A long-haul trucker, for example, who is high on weed or heroin would be a much bigger threat on the road than one who’d taken uppers. The use of stimulants might even help prevent accidents, instead of causing them. “If you look at laboratory studies where people are given stimulants and have to complete endurance tasks, they actually do better,” says MacDonald.


Photo illustration by Holly Allen. Photo by Thinkstock.


Other drugs that aren’t generally tested for, like alcohol, can lead to all the workplace ills (inattention, slacking off, lapse of judgment, absenteeism) that screening aims to help eradicate. In fact, illicit drugs compose a modest portion of the nation’s substance-related problems. The National Institute on Drug Abuse reports that 17.3 million Americans are dependent on alcohol, about four times as many as are dependent on marijuana. If you lumped together all the serious potheads with all the people who are addicted to painkillers, cocaine, heroin, stimulants, tranquilizers, hallucinogens, inhalants, and sedatives—that is to say, if you stuffed all the non-alcohol-related substance abusers into one giant category—you’d still end up with a total of just 8.9 million addicts, half as many as you’d find for booze. Yet we tend to screen employees for illegal drugs only and leave them to their liquor.

Some testing critics, such as Adam Moore of the University of Washington, have argued that it would make more sense to do regular testing of an employee’s actual, on-the-job state of mind, rather than his or her recent drug use. If a worker nods off all the time, or sits there in an unproductive daze, what’s the difference if his problem is caused by tested substances such as marijuana or heroin, as opposed to other, legal ones? (Remember the pilot who crashed into the USS Nimitz and kicked off the modern testing frenzy? The one doped up on cold medicine?) Drugs may not even be the central problem: An airline pilot who suffers from chronic and debilitating insomnia, for example, could be more dangerous than one who does whippets on the weekend.

We have no idea whether drug tests reliably increase productivity, reduce absenteeism, prevent blackmail, or otherwise improve the lot of most employees who take them—be they restaurant employees, magazine fact checkers, or anyone else. (That’s not to say there isn’t any evidence at all: Such programs have been shown to reduce job turnover in the U.S. Postal Service, for example.) More certain is the fact that testing functions as a real deterrent. A pair of studies published in 1999 compared workplace drug testing to self-reported drug use, and found a clear, negative relationship: People subject to drug tests were less likely to report in surveys that they’d taken drugs. More careful study suggests that some other workplace interventions, such as drug education programs and the distribution of formal drug-use policies, also push down self-reported drug use. (Since these often go along with drug testing, the data can be hard to interpret.) But testing does independently have a significant effect on employee behavior: It encourages people not to use drugs. (Of course these findings are based on self-report, and people might be lying.)

Could that upside (if it is indeed an upside) be worth the ample cost of testing? That’s hard to say. It’s easier to get a handle on the balance of benefits and costs when it comes to public programs, such as screening applicants for unemployment perks. To that end, one simply tallies up the money spent on public screenings, and compares it to the money saved by cutting off drug-abusing applicants. An analysis of seven state testing programs by ThinkProgress suggests that governments are in the red from testing.


No one is surprised to hear that a government program might be inefficient and ill-conceived. But what about the private testing programs? It’s more telling that businesses across the country—most with no deeper ideology than maximizing profit—still spend their time and money testing new employees’ urine. The process must be degrading and dispiriting for their workers, too. Why do they persist? If the drug tests really were of little practical use, wouldn’t market forces make them go away?

It may well be that testing works, but its successes hide in reams of proprietary data. “The question of what these employer screens do is really important, but it’s a complete black box,” says Notre Dame economist Abigail Wozniak, who has looked at the effects of testing on a macro level. “You need a lot of data to know whether your screening process is improving the bottom line, and it’s not at all clear that even the firms know this for themselves.” On the other hand, she says, it’s not too hard to imagine real benefits to testing: At the very least, they might help companies to figure out whether job applicants can exert at least a little bit of self-control. When someone fails a test—when they are unwilling to abstain from drugs for several weeks around their application—perhaps it tells you something about their character. (It’s also possible that employee testing helps companies control their legal liability in case of accidents.)

Even if the protocols were wholly useless, though, we’d still be stuck in the inertia of the 1980s panic. It’s much easier to put in place a cautious screening program than it is to dismantle one—and it’s much easier to follow standard practice in your industry than it is to buck the trend. (If everyone else in your industry were screening out the junkies, would you really want to be the one that lets those people in?) In any case, drug-testing may be as much a cultural value as a business one; it’s an American invention and an American proclivity. When it comes to urinalysis, this country seems to have no equal in the world: “Workplace drug testing occurs internationally,” says Barry Sample, but “clearly we are doing more tests than anyone else.”  

It’s also clear that drug testing is unequal in its application, even here at home. Those who work in certain industries (mining, manufacturing, transportation, government) are tested more often than other employees. More disturbingly, members of certain racial and ethnic groups may bear an extra burden. A study published in 2014 found that being young, black, and male made you more likely to report having been drug-tested in the workplace, and the disparity that accrued to black employees was most pronounced among those who worked as technicians or in jobs described as being executive, administrative, managerial, or financial. According to the paper, this bias has “important public health implications deserving further study.”


Photo illustration by Holly Allen. Photo by Thinkstock.

But further research into race and drug testing turned up an unexpected finding. For a paper published earlier this year, Notre Dame’s Wozniak compared worker data from states that encourage workplace drug-testing (such as Ohio, Utah, and Alaska) to those that have put in place some laws to limit testing programs (such as Rhode Island, Vermont, and Montana). Then she measured the effects of drug-testing legislation on employment, controlling for the type of job. In places where pro-testing laws were passed, she found that rates of black employment went up by 7 to 30 percent, while their real wages grew relative to those of white employees. The effect was most pronounced among low-skilled black men—those who would seem most likely to be tested.

Why might this be happening? Wozniak thinks the drug tests work to counteract employer bias. In the absence of a screening, bosses might assume (wrongly) that blacks have a higher risk of taking drugs than do whites. The presence of objective testing, though, assuages this concern: It lets employers know that black applicants aren’t using drugs; it takes away a prime excuse for job discrimination.

That’s not to say that workplace drug testing will help to usher in a colorblind society, and one peculiar side effect may not justify a widespread folly. But the Wozniak study does reveal a central truth about our testing habit: It isn’t only doing what we think it’s doing, and what we think it’s doing might be wrong. Maybe that’s what happens when you get addicted to a policy prescription. Your thinking gets a little fuzzy. You’re always searching for that perfect fix.