The Contradictory Goals of Law School Rankings

As usual, a ton of blogospheric attention has been devoted to the US News law school rankings. Over at PrawfsBlawg, Geoffrey Rapp has found a way to get the numerical rankings of law schools in the Third and Fourth Tiers. At TaxProf, Paul Caron ranks the law schools by reputation score. At Brian Leiter’s Law School Reports, Brian Leiter offers suggestions for improving the rankings. At Law Librarian Blog, Joe Hodnicki tracks law school rankings from 1996-present. I, too, have posted about the US News Rankings.

If we step back from this year’s frenzy, I believe that there’s an important fact about law school rankings that accounts for much of the displeasure about them. Law school ranking systems have contradictory goals. Here’s why. Law schools, like many institutions, are not especially dynamic in the short term. They tend to change slowly, not dramatically. The result: We shouldn’t see much movement year to year in the rankings. Most schools should stay about where they are. A few schools might move over time, but any one year’s movement is not significant in the grand scheme of things. So to be accurate, rankings shouldn’t change all that much.

But rankings systems have a contradictory goal: They need to reflect some kind of change, or else looking at the rankings each year would be like watching glaciers move. There must be some drama in the rankings year by year. We eagerly await our rankings each year, and we don’t want rankings at five or ten year intervals. And we don’t want stable rankings — we want changes to cheer and kvetch about.

There is another value in rankings reflecting some degree of change each year beyond our enjoyment of babbling on about them. Law schools work very hard on hiring new and lateral professors, promoting their reputations, improving their schools, increasing their admissions selectivity, and so on. We want our work to be reflected in a tangible manner. We want results for a year’s worth of hard work in improving the school. We don’t want to wait a decade or longer to see results. Unfortunately, the US News rankings often don’t reflect this work very well. But they do show that something is happening. We can then complain about the disconnect between what we’re doing and our ranking: “We did all this, and our ranking hasn’t moved. Damn that US News for their flawed system!” Or, we can justify rises in our rankings: “We’ve moved up several spots in the rankings. This is, of course, due to all the wonderful improvements we’ve been making to our school.” Either way, at least we have something to talk about.

The reality is that very little we do probably has much effect on our ranking relative to other schools over time. We might improve our faculty by hiring some great laterals, but over the same period, our competitor schools will likely have done the same. True, one school might outpace another, but big shifts are the exception, not the norm.

So the rankings need to reflect a state of affairs that is largely static, with a few gradual changes over the course of a long time. They must do so in a way that keeps people interested and excited. The rankings must display glacial change in a dramatic way. To use another metaphor, the rankings must make a turtle race seem exciting.

A few years ago, Dan Filler and I created a chart of the US News rankings for the top 25 law schools from 1997 to 2006. The interesting thing about the chart is how little movement most schools demonstrated over the course of time. Let’s look at Cornell Law School. In 1997, they were 12, then their ranking went like this over the next decade: 12, 10, 10, 12, 13, 10, 12, 11, 13, 12. When they drifted from 10 to 13 over the course of a few years, there were probably cries of outrage for dropping out of the top 10. When they suddenly jumped from 13 to 10, they probably celebrated with great cheers. Headline: “Cornell dramatically rises to the top 10!” In reality, Cornell is trapped in an orbit around 11.5 (that’s their average ranking over the past decade). And they barely go much higher or much lower than that. From year to year, it appears that there is something going on — Cornell appears to be moving. But it’s just a clever illusion, created by US News to achieve the two contradictory goals of rankings.
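Cornell’s numbers make the point concrete. A quick sketch (using the decade of rankings listed above) confirms the 11.5 average and shows how narrow the band really is:

```python
# Cornell Law School's US News rankings for the decade after 1997, as listed above
rankings = [12, 10, 10, 12, 13, 10, 12, 11, 13, 12]

average = sum(rankings) / len(rankings)   # the "orbit" the school is trapped in
spread = max(rankings) - min(rankings)    # total range of movement over the decade

print(average)  # 11.5
print(spread)   # 3
```

A decade of apparent drama amounts to an average of 11.5 and a total spread of three spots.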

Paul Caron provides links to law schools responding to this year’s rankings. David Lat collects emails from law schools responding to rankings fluctuations.

At the end of the day, I believe in the following points:

1. For the average law school, its US News ranking doesn’t change that dramatically. Only a few law schools make any major advances or drops in the rankings.

2. In reality, schools don’t change that rapidly. Some schools that appear to have moved significantly in their US News ranking may have moved due to changes in methodology more than actual changes in the institution.

3. The legal world goes into a frenzy each year when the rankings come out, but changes in the rankings from one year to the next can’t possibly have any meaning. What matters is changes that occur over the course of a long period of time.

4. US News knows how to sell issues. Its rankings must change each year, or else nobody would care to buy the issue each year. It knows the two contradictory goals for rankings systems. Its solution is a rankings system that shuffles things around a little bit each year, enough to give us the drama we crave. Although most schools go up and down each year, over the course of time, they basically stay in the same place.

It is true that some schools move significantly over a period of time. So there are exceptions, but there aren’t very many.

To be more meaningful, rankings should probably be done at five-year intervals rather than one-year intervals. Information over the course of five years should be factored into the rankings, not just information for any one given year. Would such a ranking system be successful? Probably not. US News wants to sell issues each year, not every five years. Moreover, we don’t want to wait five years for a new ranking. We want something exciting to talk about each spring.

And so the game will continue on. . . .


11 Responses

  1. TJ says:

    This is very insightful, but there is one tension here that is unexplored. One of the most frequent criticisms of the US News ranks is that they are self-reinforcing. But self-reinforcement would imply that a school that drops will keep dropping (or at least not go back up). But your theory is that schools remain fairly constant and bounce back, even after dropping. The two cannot both be true.

  2. Sean M. says:

    My school put out a press release for moving to 30 from 31.

    I am embarrassed (though, I suppose, I get to say I go to a “top 30” school now…)

  3. G says:

    The US News system is no more broken than legal scholarship is.

  4. AY says:

    The rankings are self-reinforcing. The data U.S. News uses is actually about a year and a half old at press time, so what can develop is a slight yet noticeable “yo-yo” effect as each incoming class makes their choice based on a ranking that uses data from the current 2L class.

  5. Robert Arvanitis says:

    Using external rankings to sell magazines is no different than the schools’ own use of their brand image, so assiduously cultivated by their own marketing departments.

    Nor are those two processes much different from students’ cultivation of what they perceive schools to want in the admissions process.

    Everyone is grooming, and each has an ax to grind. Welcome to the marketplace.

  6. Maxwell Demon says:

    I think it would be more fun if the law schools were ranked the same way college sports teams were–specifically, the rankings should change weekly, and a school’s position should be affected by every piece of news relating to it. Of course this makes no sense, but ooh, the drama.

    Separate note–is Yale Law permanently enshrined at number 1, or could it potentially be replaced?

  7. Frank says:

    Very insightful. James English has noticed some similar developments in other “award systems” in his book The Economy of Prestige. He also doubts that new rankings systems will help matters much, because they fall into the same trap–they want to give different results than the old established ranking system, but if the results are too different, everyone will dismiss it as off-the-wall.

  8. coggieguy says:

    Why not dip into college sports for the proper ranking analogy: an annual rite of Law School March Madness. Take the top 64 schools (of course leaving about 5 schools deploring the process as unfair) and have playoffs – mock trials, etc. Go through the first two rounds in one weekend of crazed simulated jurisprudence. Then the Sweet 16, Elite 8, and Final Four. Think about the possibilities – three-point depositions hurled up as the clock runs down, a full-court press of obscure Latin terms, shooting precedents like foul shots. Cinderella teams like George Mason can come out of nowhere to dramatically rise in the rankings! Everyone would complain about Yale being a perennial number-one seed and getting home-court advantage in the East regional. At the end, the new reigning national champs would be crowned, and deans everywhere would be out on the recruiting trail to get new talent and talking about next year.

    Of course the downtrodden can have the Judicial NIT and mumble about getting experience for next year.

    A modest proposal for your consideration . . . certainly more entertaining than statistical analysis of US News rankings, which is eerily reminiscent of analyzing which Soviets stood on Lenin’s tomb for the May Day parade.

  9. Orin Kerr says:

    But rankings systems have a contradictory goal: They need to reflect some kind of change, or else looking at the rankings each year would be like watching glaciers move. There must be some drama in the rankings year by year. We eagerly await our rankings each year, and we don’t want rankings at five or ten year intervals. And we don’t want stable rankings — we want changes to cheer and kvetch about.

    Who is “we”?

  10. To Maxwell says:

    As for whether Yale is permanently enshrined as #1 — consider this. A vastly disproportionate number of law professors and deans are Yale grads. And who is surveyed in determining who is #1? Law professors and deans. If there were a large number of Cooley grads answering the survey, the results might be different.

  11. Maxwell Demon says:

    @9:24–that makes sense. Still, I’m surprised that the business and medical rankings don’t have a permanent #1; I would think that the same kind of institutional bias would apply. With law, even assuming a rigged game, I’m also surprised that Harvard hasn’t figured out a way to win occasionally.