US News Rankings: A Chart of the Past Decade

Co-authored with Dan Filler.

The US News rankings have captivated legal academia. The rankings have had a tremendous effect on student decisions about which law schools to attend. They have also had an impact on which authors receive offers from law reviews (the letterhead effect) and on the choices that authors make when faced with multiple publication offers. As one might expect, given that US News wants to sell magazines, schools move around in the rankings to some degree. Each year the rankings produce a few new “haves” and a gang of fresh “have-nots.”

We have created a chart of the trends in the US News ranking for the top 25 law schools over the past decade. Below is a small version of the chart; click on it for the full-size version. More analysis is below the chart.

As the chart demonstrates, there are some bands of stability and some areas of volatility. The same six schools have occupied the top six positions for the last decade. There has been little movement in the top 15. But below the top 15, schools dance around quite substantially.

When students choose law schools, they should remain focused on the forest and not get lost in the trees. Year-to-year changes can be misleading. For example, in 2006, Wash. U. moved up five spots from 24 to 19. But a year earlier, it had dropped from 20 to 24. What is the real Wash. U.? Over time, one can see a dramatic change: Wash. U. was in the high twenties and early thirties until it leveled out at 25 in 2002. To take another example, GW was ranked 20 in 1998. But at that time the 20 was an anomaly, as GW was 24 in 1997 and 25 in 1999. Since 2004, GW has consistently been ranked either 19 or 20. To the extent that the US News rankings have any value at all, it is evident only in long-term trends, not in yearly fluctuations.

There are other instances where the US News rankings are simply a game of musical chairs for certain groups of schools. For example, Berkeley, Virginia, and Michigan have engaged in a US News game of cat-and-mouse over the past decade. When one of them drops, its students may become crestfallen, and prospective students may shift their preferences. Over time, however, the ordering of these schools simply shuffles around, with no discernible pattern. Relying on the US News rankings to choose among these three law schools is like choosing one’s hometown based on today’s weather report.

Below is the full data set; click on the chart for a larger image.

UPDATE: We have corrected an error pointed out in the comments. The charts now reflect Virginia’s correct 2006 ranking of #8. We also learned that a commenter at xoxohth has created even more comprehensive data charts here in Microsoft Excel format.

One more point about what looking at the rankings over time tells us. For many schools, the rankings don’t change very much. And even the big changes are often simply a reflection of the fact that so many schools are tied or nearly tied; hence, a small nudge upward or downward in a school’s score will lead to a bigger fluctuation in its rank. If people look at any given year and compare it to the year before, they might assume that certain schools are making progress and others are regressing. But if they look at the big picture, there is lasting change for only a few schools.

For example, take Berkeley. From 1997 to 2006, it was ranked 7, 10, 8, 9, 7, 10, 13, 11, and 8. So it is basically where it started, but sometimes it was a “Top 10” school and sometimes it wasn’t. Of course, in the real world, Berkeley did not make a prodigal journey. If one looked at the rankings when Berkeley went from 10 to 8, she might think: “Berkeley is on the move. It’s now firmly in the top 10.” But a few years later, Berkeley would not only fall to 10 but plunge as low as 13. One might be tempted to think: “Oh my, Berkeley’s really plunging now. They must be doing something wrong.” Now Berkeley is back in the top 10. Should we think “progress”? No. There is no progress. Berkeley is basically where it always was: in the top 10, where, in my opinion, it clearly belongs. The only change is where US News places it in the rankings. Looking at the rankings over time thus suggests that one shouldn’t take the US News ranking changes in any one year very seriously.
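
To make the tied-scores point concrete, here is a minimal sketch in Python. All of the scores below are hypothetical, not actual US News data: schools whose overall scores sit within a few tenths of a point of one another get reshuffled by tiny nudges, while well-separated schools stay put.

# Minimal sketch: why tightly clustered scores make ranks volatile.
# All scores are hypothetical, not actual US News data.
import random

random.seed(0)

scores = {
    "School A": 100.0,  # well separated from the pack
    "School B": 92.0,
    "School C": 90.0,
    # Schools D through I sit within 0.8 points of one another.
    "School D": 74.0, "School E": 73.8, "School F": 73.6,
    "School G": 73.5, "School H": 73.3, "School I": 73.2,
}

def rank(s):
    """Map each school to its rank (1 = highest score)."""
    ordered = sorted(s, key=s.get, reverse=True)
    return {school: i + 1 for i, school in enumerate(ordered)}

base = rank(scores)

# Nudge every score by less than half a point and re-rank.
for trial in range(3):
    nudged = {k: v + random.uniform(-0.4, 0.4) for k, v in scores.items()}
    moves = {k: base[k] - r for k, r in rank(nudged).items() if r != base[k]}
    print(f"trial {trial}: rank moves (positive = up): {moves}")

On a typical run, only the clustered Schools D through I trade places; the well-separated schools at the top never move. That is the pattern in the chart above: stability where score gaps are wide, churn where schools are bunched together.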


13 Responses

  1. Hank says:

    The data for Virginia is incorrect. It should be at 8 instead of 10.

  2. Frank says:

    Very nice collection of data. I’m impressed by the stability more than the movement; I don’t know if it validates the rankings themselves or criticism of them. As statistics texts show, variability and bias are separate indicators of (in)validity.

    I think the stability demonstrates that the rankings are not very variable. But it doesn’t do much to show that they are unbiased.

    Of course, variability may be a good thing to the extent schools themselves change. But I have little sense of that. Have any of these schools changed a great deal over the past 10 years? If they have, then perhaps we should see more variability. But if they have not, perhaps the rankings, for all their defects, have found some useful way of assessing quality (à la Korobkin’s guarded endorsement of them in the recent Indiana symposium), or at least of reflecting pervasive assessments of quality and balancing such assessments with relatively stable quantitative data.

  3. ac says:

    Rank variability does not seem like the right metric. Even with fairly stable scores, ranks can vary wildly with small changes in score. For example, in this year’s USNWR, the gap between 1 and 2 is 8 points, while the gap between 19 and 34 is 6 points.

  4. mj fghg says:

    The 2006 rankings in your chart are actually the 2007 rankings.

  5. mj fghg — We realize that; see this post. For some reason, the US News rankings say they are for the next year; we’re going by the year in which the ranking issue came out to avoid any confusion.

  6. J. says:

    Just to note: the rep ratings are even more constant. The big difference is that Michigan is also always a T6 in the rep ratings, and usually a point or two above UVa/Boalt.

  7. nt says:

    Stanford was not number 4 in 1997 (USNews 1998); it hasn’t been below number 3 since 1991.

  8. Scott Moss says:

    The consistency of “academic reputation” was striking to me. I think that we academics believe schools’ academic prominence rises and falls substantially each year; see Leiter’s blog, for example, talking about how this or that school has substantially improved/weakened with its last three or four faculty hires or losses.

    My impression is that certain schools have become much stronger (or have stagnated) in terms of scholarship over the past however many years. But perhaps our perception “from the inside” of the degree of change is exaggerated.

    Does the consistency of academic rep mean that schools seeking to improve their US News ranking should NOT focus on scholarship, because to date NO school has improved its academic reputation substantially, even over many years?

  9. Dan Filler says:

    NT, Stanford was in fact ranked fourth in 1997. Its overall score was 0.1 behind third-place Chicago.

  10. nt says:

    I think I may have found the problem. According to this website, which has the old records on file,

    http://groups.yahoo.com/group/LawSchoolRankings/files/

    the rankings as originally published in the magazine were incorrectly calculated. The correct rankings, which USNews later issued, had Stanford at 98.5 and Chicago at 98.1.

    A trivial difference, obviously.

  11. Jane Doe says:

    Do you hear yourselves? “So-and-So University is actually 3, not 4!” Are you kidding me? They’re just rankings; the opinion of one magazine based on WHAT exactly? Still not sure. Maybe everyone should focus on important stuff, like poverty, for example, and not on whether your precious law school was ranked 2 or 3 in 1997!

  12. John Doe, J.D. says:

    It’s really amazing how much influence the media has on academia. You would think that intelligent people would refrain from such naive bickering over arbitrary rankings. Furthermore, if you actually consider how these rankings are compiled, you will see that they really have nothing to do with quality or “career value”.

    If there is one thing that has certainly become clear to me in the field of law, it’s that your success as an attorney does not depend on which school you graduated from. In the end, it’s just a piece of paper. I’ve known terrible attorneys who graduated from “top” (and I mean top in the sense of these subjective rankings…as in 1-10!) law schools.

    I am sure that if you ask any experienced attorney, they will tell you that you acquire most of your legal knowledge through experience and not during law school!

    The lesson to learn: how successful you are as an attorney (or anything in life, for that matter) is not dependent on the writing on a piece of paper!

  13. Darwin says:

    Where can you find past US News rankings for law schools not in the top 25?