Are Alternative Law School Rankings Any Better than US News?

The WSJ has an article on alternatives to the infamous US News law school rankings. According to the article: “In the last two years, at least a dozen upstart Web sites, academic papers and blogs have stepped in with surveys of their own to feed the hunger for information on everything from the quality of the faculty to what a school’s diploma might be worth to future employers.” It includes a chart of some alternative rankings of law schools:

[Chart: alternative law school rankings compared in the WSJ article]

In my opinion, all of these rankings have serious flaws.

US News — The reputation surveys are only given to deans and just one or two faculty members (a very unrepresentative sample of faculty). The reputation surveys are too easy to game. And the reputation scores of 1 through 5 are not granular enough. For example, Yale has an academic reputation score of 4.9, Harvard 4.8, and Stanford 4.7. That means that people in the surveys are rating these schools with 4s or less. Who gives less than a 5 to any of these schools on a 1-5 scale? Some of the other numbers factored into the US News equation are quite silly and can be easily cooked, with schools using accounting tricks that would make Enron officials blush.

Supreme Court Clerkship Placement — This is a ridiculous way to rank schools. Getting a Supreme Court clerkship is like winning the lottery. There are far more qualified people than positions, and getting a position certainly takes merit, but it also takes a lot of luck. Part of it depends upon the connections of a school’s professors, who can place clerks with feeder judges or may even have influence with a Supreme Court Justice. Nobody seriously goes to law school planning on getting a Supreme Court clerkship. And it’s based on the total number of clerks, so the ranking in the WSJ chart is meaningless, since some schools are much larger than others (Harvard is more than twice the size of Yale).
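
To make the size problem concrete, here is a minimal sketch in Python (with invented clerkship counts, not the WSJ’s actual data) showing how a per-capita adjustment can reorder schools:

    # Hypothetical numbers for illustration only (not real data).
    # Each entry: (school, total Supreme Court clerks over the period, class size)
    schools = [
        ("Big School", 30, 550),    # many clerks in absolute terms
        ("Small School", 25, 190),  # fewer clerks, but a much smaller class
    ]

    # Raw counts favor the big school; per-capita rates tell a different story.
    for name, clerks, class_size in schools:
        print(f"{name}: {clerks} clerks, {clerks / class_size:.3f} per graduate")

    # Ranking by per-capita rate instead of raw totals flips the order.
    ranked = sorted(schools, key=lambda s: s[1] / s[2], reverse=True)
    print([name for name, _, _ in ranked])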

Elite Law Firm Placement — This is better than Supreme Court clerkship placement, but still quite flawed. It assumes that going to an elite law firm is the premier job in the law. But what about government jobs? AUSAs? Judicial clerkships? Academia? Public interest? Ranking based on elite law firm placement will create terrible incentives for law schools to steer students into big law firms when this may not be what particular students really want to do with their lives.

Law Journal Citations — This is just silly. A school’s law journal is edited by students and publishes articles written mostly by professors at other schools, so I don’t see any connection between how often articles in a school’s journal are cited and the school’s academic reputation.

SSRN Downloads — Another problematic metric. SSRN downloads only measure a paper’s popularity with Internet communities. They don’t measure a paper’s quality. High download counts are skewed toward schools with larger faculties, and to papers by professors who blog or who write about economics or technology issues. I wish I could be more sanguine about SSRN downloads as a ranking mechanism, for GW, the law school where I teach, ranks in the top 10.

Leiter’s Rankings — In this version of his ranking system, Leiter ranks schools by per capita citations to faculty scholarship. Leiter himself recognizes some of the problems with citation counts as a metric of quality, so he understands the flaws in his system. Citations are better than SSRN downloads, but they are still deeply flawed, and even in an ideal world they would capture only a faculty’s scholarly reputation. That is a very important component of a law school’s quality, but there are other factors as well: quality of the student body, resources, teaching, job placement, etc.

So how are we to rank law schools? If all these methods are flawed, what is the ideal method? If the US News equation is silly, what factors should be considered and how much should each be weighted?

8 Responses

  1. Anthony says:

    None of the issues you raise indicate “serious flaws” with those rankings. Of course elite law firm placement says nothing about AUSA jobs, law professor placement, etc. I said that in my article, and I told the WSJ reporter the same thing (though it didn’t make the article). But I’ve never argued that my elite law firm placement rankings correlate with clerkship placement, and I have never advocated that law schools should abandon every other goal in order to maximize national firm placement. The purpose of my elite law firm placement rankings is to measure elite law firm placement, and the intended audience for my rankings is prospective law students who want to work in elite law firms.

    Now this isn’t to say that there aren’t problems with how some of these ranking systems are actually used. For instance, if U.S. News claims that its rankings measure the earning power of law school graduates, then there is clearly a disconnect between the stated purpose of the rankings (measuring earning power) and the methodology employed by the ranker (inputs like the # of volumes in the law library have nothing to do with the earning power of graduates — sticking another 10,000 books in the law library isn’t going to get graduates more jobs or higher salaries). Similarly, there would be a disconnect between purpose and audience if U.S. News marketed its rankings exclusively to law professors (who likely have little or no interest in the earning power of graduates, but are more interested in factors like academic reputation, professor salaries, faculty teaching loads, etc.).

    However, I don’t think any of the rankings listed (with the exception of U.S. News, and possibly some of Leiter’s survey-based rankings) suffer from very serious purpose/methodology or purpose/audience problems (though obviously some improvements could be made to all of them, like having the SCOTUS rankings adjust for school size). If people are using my elite firm placement study as a proxy for public defender placement, or if people are using law journal citations to measure law school academic reputation, then the problem is not with the original study but with the people who are using those rankings improperly.

  2. Bruce Boyden says:

    I also don’t think “elite law firm placement” is a bad measuring stick; the problem is coming up with the right denominator. I suspect that students who place well at elite law firms are equally able to place well at whatever else they’d like to do, whether it’s public interest, government, or what have you. It’s just that firm placements are easier to count. So, you could measure school quality by dividing the number of placements in the Am Law 200 by the number of students who tried to get such positions.

    Of course, that’s the rub. The easiest denominator is just the total number of students in the graduating class. But using that figure assumes that the proportion of students who try to get large firm jobs is constant across all schools, and that’s probably false, at least toward the top of the rankings. Also, using firm placement probably advantages schools that are local maximums — the best of the closest, that is. I think this is why Chicago comes out so well in Anthony’s rankings. Still, I don’t think anything about such a ranking says that firm jobs are better than other jobs; they’re just easier to measure.
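
    To see how much the denominator matters, here is a toy calculation in Python (all numbers invented) in which the choice of denominator flips the comparison:

        # Invented numbers: two schools with the same graduating class size.
        class_size = 400

        # School X: most students seek big-firm jobs; School Y: far fewer do.
        x_seekers, x_placed = 320, 240   # 80% seek firms; 75% of seekers placed
        y_seekers, y_placed = 180, 150   # 45% seek firms; ~83% of seekers placed

        # Denominator 1: the whole graduating class (easiest to count).
        print(x_placed / class_size, y_placed / class_size)   # 0.6 vs 0.375: X looks better

        # Denominator 2: only the students who actually sought such jobs.
        print(x_placed / x_seekers, y_placed / y_seekers)     # 0.75 vs ~0.83: Y looks better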

  3. Anonymous Person says:

    An interesting alternative to Mr. Ciolli’s study, inspired by Professor Boyden’s comment, might be to consider elite law firm placement but to analyze only placement outside the school’s home market. That is, for example, don’t consider how Columbia places in NYC, but how Columbia places in other markets. The result would show which schools are really “national” ones.

  4. Anthony says:

    Anonymous Person: That information is already in my study–see Appendices A (regional TQS) and C (regional per capita). The national rankings that appeared in the WSJ (Appendix B in the paper) are an amalgamation of the nine regional rankings.

    Bruce: Actually, local schools, including elite local schools, tend to do worse in their home regions than elite non-local schools. For instance, in my study Chicago placed better than NYU in the Mid-Atlantic region, but NYU placed better than Chicago in the Midwest. If you skim the placement rankings for the individual regions I mentioned above, you’ll see that this was a pretty consistent trend.

    Why this counterintuitive result? My theory: supply and demand. Yes, all else equal, a prestigious local school will likely have advantages over an equally prestigious out-of-state school that would translate into better job placement (e.g. strong local alumni network, easy for local firms to do on campus recruiting, etc.). But all else isn’t equal–for one thing, students from the prestigious local school are in very high supply, whereas students from the prestigious out-of-state school are relatively scarce. As a result, firms can afford to be more picky with students from the local school compared to students from the out-of-state school, explaining why Chicago students place better than NYU students in New York and NYU students place better than Chicago students in Chicago.

    Of course there are other possible explanations. For instance, one could argue that NYU students who want to work in Chicago need higher grades to get an elite firm job than NYU students who want to work in New York, and thus lots of NYU students who want to work in Chicago try to get jobs there, fail because of their grades, and then “settle” for New York, giving the impression that NYU places better in Chicago than it really does. I don’t buy this explanation because I have seen no evidence suggesting that 1) the grade distribution of out-of-state job seekers (at elite law schools at least) differs significantly from the grade distribution of in-state job seekers or that 2) most law students are highly geographically flexible (in fact anecdotal evidence suggests the opposite). Of course, the fact that the relevant data either doesn’t exist or is locked away in the registrar’s or career services office makes it impossible to conclusively prove or disprove this claim.

    As for the denominator, I agree that the number of attempts would be the ideal denominator, but that data either doesn’t exist or isn’t available. However, I don’t think the number of students in the graduating class is a good denominator (in fact, Leiter’s improper use of that denominator in his employment study is one of the reasons I decided to do a study of my own). You’re correct that law schools differ significantly in the percentage of students who pursue firm vs. non-firm careers. For instance, for the period of my study, 80% of Columbia grads pursued firm careers while only 45% of Yale grads went to firms directly after graduation. Of course, a lot of the people not going into firms were clerking, so an adjustment had to be made there (I saw a Yale career services study indicating that clerks go into law firms after their clerkships at almost the same rate as non-clerks, so at least that adjustment was relatively easy to make).

    However, regional preferences are an extremely important factor that needs to be taken into account, especially given the audience for such a study–after all, when it comes down to it, most law students aren’t thinking “Which law school will maximize my chances of getting a great law firm job anywhere in the United States?” (though I’m sure some do) but rather “Which law school will maximize my chances of getting a great law firm job in [insert name of preferred legal market]?” Regional adjustments are particularly important because legal markets differ dramatically. For instance, at the time of my study the Mid-Atlantic region had almost double the number of jobs (and a better average quality of jobs) compared to the Pacific region. Since 78% of Columbia’s class works in the Mid-Atlantic region while only 15% of Stanford’s class works in that region, using total class size as the denominator (or even total class size reduced by those who never attempted to get a firm job) without any adjustment for student geographical preferences or regional legal market conditions would produce very flawed results that a prospective student couldn’t rely upon. So, ideally, the denominator would be the total number of students who attempt to get a job at a law firm in Region A. Then, once you have data for Regions A, B, C, D, E, and so on, you can aggregate it (likely after making some more adjustments) to measure national placement as best as possible.
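
    Here is a rough sketch in Python, with invented placement data, of the kind of region-by-region calculation I have in mind (the actual adjustments in the study are more involved):

        # Invented placement data; the study's actual adjustments are more involved.
        # For each school and region: (students seeking firm jobs there, students placed).
        data = {
            "School A": {"Mid-Atlantic": (230, 190), "Pacific": (40, 30)},
            "School B": {"Mid-Atlantic": (35, 30), "Pacific": (120, 95)},
        }

        def regional_rate(school, region):
            """Placement rate within one region: placed / seekers in that region."""
            seekers, placed = data[school][region]
            return placed / seekers

        def national_rate(school):
            """Aggregate across regions, weighted by where each school's seekers go."""
            seekers = sum(s for s, _ in data[school].values())
            placed = sum(p for _, p in data[school].values())
            return placed / seekers

        for school in data:
            rates = {r: round(regional_rate(school, r), 3) for r in data[school]}
            print(school, rates, round(national_rate(school), 3))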

  5. Anthony says:

    Ugh, after rereading that I wish I had proofread before hitting the post button. Guess that’s what I get for working on a lengthy screed while doing MBE questions.

  6. I’ve long advocated ranking Law Schools by their access to a quality Major League Baseball park…coincidentally, my alma mater, Maryland Law, is right up the street from Camden Yards and I believe would be ranked first using my totally objective methodology.

    I realize my method would hurt some over-hyped schools such as Yale, UVA and Duke but, hey – prospective students and employers need to know all the facts.

  7. ADERUS MILAN says:

    People are agonizing over these rankings, wondering if they should exist at all…and the answer is a resounding YES! Then we ask what methods are best, and most have at least some methods, even those that put location on par with prestige. Rankings force businesses and other entities to up their performances and that is always a good thing for any consumer. If there’s one dysfunction we can all look at, it is the employers (mainly prestigious law firms) that put too much emphasis on rankings…and it has a nasty trickle-down (to the students and their LSAT scores and, to a lesser extent, grades) effect. Firms need to start doing the legwork to look for the John Edwardses (UNC) and Johnnie Cochrans (Loyola, CA) of the world. And they could even search more for those diamonds in the rough at third-tier schools (and there are lots of them). Law students study virtually the same material, cases and the like. No ranking can determine how a student uses his education, no matter what school he attends. There are many so-called “lower-tiered” schools I would love to attend, but I can’t, because I’m scared I won’t have a job when I graduate. And I know I’ll be a great attorney no matter where I go, but firms don’t recognize that this is the case for many…and that many great lawyers do not come from top-10 schools.

  8. ADERUS MILAN says:

    I meant to say that most rankings have at least some validity.