There is No Deadwood Effect: What Moneylaw Gets Wrong About Tenure
The Moneylaw movement is decidedly anti-tenure. Part of the reason, it seems to me, is an empirical intuition that tenure destroys professors' incentives to produce. Given the reality that tenure isn’t going anywhere, scholars have suggested that we rethink much of the way we organize hiring by replacing intuitions (good grades and clerkships a good scholar makes) with data (prior publications rule).
But does tenure actually affect performance? The evidence suggests that it does not. That is, tenure doesn’t appear to cause individuals who were successful junior faculty members to become deadwood. Rather, those people who end up unproductive were likely always bad writers and teachers. Tenure doesn’t make you terrible; it just doesn’t separate all that well. Why? Probably because it is much too easy to get tenure at most institutions, and selection, to the extent it occurs, is on grounds unrelated to productivity (collegiality, political correctness, etc.).
Does this finding hold more generally? I became curious this year as I watched the performance of Pat Burrell of the Philadelphia Phillies. Pat has had an inconsistent career. He had a banner year in 2002, earning him a six-year deal with the Phillies for $50M, but has since struggled at the plate (in the field, he is as graceful as a gouty, overweight camel). This year, as his deal expires, he’s again hitting like a champ, with 25 home runs and a very good OPS. The question is whether Pat is exhibiting the “contract year effect,” whereby players hit well in the year their deal expires, and then slump immediately thereafter. Or, to put it in terms that the sports-impaired might understand, is Pat about to become deadwood?
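For the sports-impaired: OPS is just on-base percentage plus slugging percentage, both computed from a player’s counting stats. A minimal sketch of the standard formulas (the season line below is hypothetical for illustration, not Burrell’s actual stats):

```python
def on_base_pct(hits, walks, hbp, at_bats, sac_flies):
    # OBP = (H + BB + HBP) / (AB + BB + HBP + SF)
    return (hits + walks + hbp) / (at_bats + walks + hbp + sac_flies)

def slugging_pct(singles, doubles, triples, home_runs, at_bats):
    # SLG = total bases per at-bat; a single is 1 base, a homer is 4
    total_bases = singles + 2 * doubles + 3 * triples + 4 * home_runs
    return total_bases / at_bats

# Hypothetical season line (not Burrell's real numbers):
obp = on_base_pct(hits=100, walks=60, hbp=5, at_bats=400, sac_flies=5)
slg = slugging_pct(singles=55, doubles=20, triples=1, home_runs=24, at_bats=400)
ops = obp + slg
print(round(ops, 3))  # an OPS in the mid-.800s is a good year
```

An OPS around .900 or above is star-level production, which is why a spike in a contract year draws attention.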
The problem is that while some early sabermetrics work found support for the contract year effect, later, better-controlled work has been much more mixed. Joel Maxcy, Rodney Fort, and Anthony Krautmann, for example, found that while pre-contract players played more than average (gutting it out to signal that they were healthy), their performance didn’t decline at statistically significant levels post-contract. They suggest that the result may be explained by the existing structure of labor contracts, which sometimes contain incentive-based payoffs, and which are always very well monitored by fans, coaches, and the media. Brent Estes’ dissertation, to the contrary, finds that player performance does improve in the option year, and declines in the first year of his contract, controlling for other factors.
I don’t know enough about sabermetrics to really evaluate the Maxcy et al. / Estes debate. My prior, obviously, is that it would be surprising if MLB teams had not compensated for expected effort through contract terms. Regardless, Moneylaw’s hostility toward tenure, on the ground that it necessarily reduces output, isn’t fully justified by Moneyball’s practitioners.
What do you think?
(Image Credit: Ansel Adams, Grand Teton Driftwood).