Do baseball players try harder in their contract years?
Albert Pujols has been struggling at the start of his new contract with the Los Angeles Angels
In 2011, the New York Mets' most reliable slugger—career-.300-hitting David Wright—had a miserable season. At an age when most players reach their prime, Wright batted just .254 with 14 home runs. This year, he has returned to form and then some: He's second in the National League with a .370 average and leads the league with an OPS+ of 190. As a Mets fan, I'm thrilled he's playing so well. But as a Mets fan, I'm also a little pissed off.
Does Wright have a new approach at the plate, or could something else explain his recent resurgence? Maybe it's that Wright happens to be in the last season of a six-year, $55 million contract. With his financial future in limbo—the Mets must decide whether to keep him for 2013—he stands to gain more than ever from having an incredible "comeback" season. He has a financial incentive to train harder, play harder, and hit the ball harder than he did last year.
Mets fans caught a whiff of opportunism last year, too, when All-Star shortstop Jose Reyes—28 years old, in the twilight of his own contract—had his best season ever. He left the team for a payout in Miami, and his numbers promptly dropped off by 40 percent. And the Cardinals front office watched Albert Pujols, perhaps the best hitter in baseball, grab a $240 million, 10-year contract with the Angels this past off-season. As the ink of his signature dries, Pujols limps along with a .238 batting average and a slugging percentage of .406.
This tendency, real or imagined, for players to amp up their effort for a new contract and then start dogging it the year after has the distinction of being at once incredibly obvious (athletes turn it up when it counts!) and deeply counterintuitive (athletes always give 110 percent!). Of course David Wright puts in some extra effort with his livelihood on the line. Of course he plays as hard as he can every game. Which is it?
"The obvious answer," wrote Malcolm Gladwell in a 2006 discussion of the contract-year phenomenon with ESPN's Bill Simmons, "is that effort plays a much larger role in athletic performance than we care to admit." Problem is, no one knows if that's true. Once you get past the most salient anecdotes—David Wright for me, Erick Dampier for Gladwell—it becomes nearly impossible to discern whether professional athletes have a genuine tendency to turn it up in their contract years (or to goof off the year after). Sports economists and freelance statheads have gone at this question for decades, and their results—presented in peer-reviewed journals, conference talks, and plenty of undergraduate theses—haven't yet produced a reliable answer.
All these studies face the same fundamental problem: To figure out whether players tend to do better or worse in a given year of their contracts, they first need to determine how well a player should perform. In other words, they have to guess what David Wright would be expected to do in his 2012 season, given everything else we know about him, and then compare that guess to his actual performance. That's a tricky task, and one beset by confounding correlations. For one thing, an athlete's output has a lot to do with his age: A player might improve his stats through his late 20s, and then start to decline as he reaches his mid-30s. By the time David Wright gets to the end of his next contract, he'll be a creaky 35 or 36—a point in his career where most guys are barely producing at all. In that case, Wright would be overperforming just by keeping his numbers out of the gutter.
There are other confounds, too. Players who do well in their contract years tend to sign long-term deals. They also tend to play a little bit worse in the following year, as their performance regresses to the mean. Meanwhile, guys who blow it in their contract years are more likely to end up washing out of the league or signing a short-term deal. Any analysis must take all these variables into account before it can start guessing how anyone "should" perform in his contract year or the one after. And even the most thorough models can introduce subtle distortions that tilt the results.
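The regression-to-the-mean confound is easy to see in a toy simulation. The sketch below (hypothetical numbers, not real baseball data) gives every player a fixed "true talent" level and adds random season-to-season noise—no effort changes at all—yet the players who post big contract years still decline the following season, purely because we selected them for having lucky years:

```python
import random

random.seed(0)

# Toy model: each player's observed OPS is a fixed true talent
# plus independent random noise each season.
talents = [random.gauss(0.750, 0.050) for _ in range(10_000)]

contract_year = [t + random.gauss(0, 0.060) for t in talents]
following_year = [t + random.gauss(0, 0.060) for t in talents]

# Select only the players who had a big contract year.
big = [(c, f) for c, f in zip(contract_year, following_year) if c > 0.850]

avg_contract = sum(c for c, _ in big) / len(big)
avg_next = sum(f for _, f in big) / len(big)

print(f"contract year: {avg_contract:.3f}, next year: {avg_next:.3f}")
# The follow-up average is lower, even though no one "shirked."
```

Any honest analysis has to subtract out this purely statistical drop before attributing the rest to effort.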
An economist at DePaul University named Anthony Krautmann has been trying to negotiate these problems for more than 20 years. Over that time, studies have piled up on both sides of the issue, with some finding evidence for a "shirking effect"—where a player like Pujols slacks off at the start of a big contract—and others turning up zeros. Krautmann's own work has produced some contradictory answers. In his most recent papers, he's concluded that the presence or absence of an effect seems to depend upon how you've decided to measure performance. Is a baseball player's output best described by his OPS+, or by some other number on his stat line? Should basketball players be judged according to their points per game or their efficiency ratings?
In 2007, occasional Slate contributor Phil Birnbaum published the results of a study looking at all the players with contract years between 1977 and 2001 (PDF). Birnbaum compared their actual performance (in terms of runs created) with what might have been expected given the two years before and the two years after. The "free-agent effect" from his study turned out to be indistinguishable from zero. The year before, Baseball Prospectus writer Dayn Perry had used a similar approach on a sample of "212 prominent free agents" and come up with a very large effect—players performed 9.4 percent better than expected.
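The core of Birnbaum's approach—measuring a contract year against the average of the seasons around it—can be sketched in a few lines. The numbers below are illustrative, not his actual data or full method (his study also adjusted for aging and sample selection):

```python
def contract_year_effect(before, contract_year, after):
    """Percentage by which a contract-year stat (e.g., runs created)
    exceeds the average of the surrounding seasons."""
    surrounding = before + after
    baseline = sum(surrounding) / len(surrounding)
    return 100 * (contract_year - baseline) / baseline

# Hypothetical player: runs created of 90 and 95 in the two seasons
# before the contract year, 110 in the contract year, 92 and 88 after.
print(round(contract_year_effect([90, 95], 110, [92, 88]), 1))  # → 20.5
```

Averaged over hundreds of players, a number like this should wash out to zero if there's no real effect—which is exactly what Birnbaum found, and exactly what Perry didn't.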
What about other sports? You wouldn't expect to see much of an effect in the NFL, where contracts aren't guaranteed and players have an incentive to keep trying no matter what. (The best football study so far, by an undergraduate at Brown, found no evidence for the contract-year phenomenon.) There's better evidence of shirking in pro basketball, which makes some sense. The game is full of "hustle" stats—rebounding, blocks, steals—and even the casual fan can tell that players modulate their effort between the regular season and the playoffs. Why wouldn't they do the same from one year to the next? Basketball has its own issues, though: What if a player's "increased effort" translated into more selfish play—more shots per game, let's say, but a weaker overall performance? And how much of a basketballer's output is a function of his teammates?
In the end, the contract-year effect might be so small as to be undetectable. Or it might be large, but so difficult to isolate that we'll never pin it down for sure. The most we can say right now is that it seems plausible. But even that invites some more confusing questions: If there were a contract-year effect—if players did increase their effort as their contracts were set to expire—would the owners and GMs know about it? Let's say David Wright hits .370 this year. Is the Mets front office going to be fooled into thinking that he'll do it again?
Maybe some general managers are now savvy enough to ignore these blips and outliers. In the Moneyball era, smart GMs could game the system by placing more value on "rental" players going for their next big contract, like Cliff Lee in 2010. But in that case, the players might have an incentive to change their behavior, too. If GMs know you're going to regress after your contract year, why bother playing harder?
Here's one more possibility, courtesy of Nate Silver. Let's say baseball players try as hard as they can every season, short of doing anything that might jeopardize their long-term career prospects. In their contract years, though, the risk-reward calculation shifts: Now it's more important to boost their stats by any means possible. Could the elusive effect be a product of the steroids era—a bit of statistical fuzz hidden among all the other distortions of the late 1990s and early 2000s? If Silver is right, then the contract year phenomenon, never yet captured in the wild, might already be extinct.