Category: Law School (Rankings)


US News 2009

They again seem to have leaked early at read ’em and weep. Swayed by some of the arguments Brian Leiter makes here, I’m not going to reproduce the list. (And besides, it seems like the folks who excavated the information deserve the hits, not that the equities much matter or that others will feel the same way.) After satisfying your curiosity, come back here and talk about ways to make the system better.


Improving the US News Rankings: A Wish List

A new article in the ABA Journal profiles Bob Morse, the US News & World Report “rankings czar.” I recently corresponded with Bob when he wrote to me about my parody of the rankings. He took my humor in good spirit. According to the ABA Journal article:

Since it began the rankings in 1987, the magazine has often been attacked as wielding too much power; its methodology is denounced as easily manipulated and too subjective to carry such inordinate weight.

No one understands this more than Robert Morse, the man who created the law school rankings for U.S. News. As the magazine’s data research director, Morse says he, too, feels a high level of anxiety each year when the law school rankings are revealed. . . .

He also feels the heat from those who resent their enduring influence. For a ratings czar, he is a very reluctant despot. Far from being impervious to complaint, he maintains a blog where he explains his rankings and encourages constructive criticism. He’s been known to show up unannounced at gatherings likely to denounce him.

The article goes on to note that Bob Morse is open to suggestions for improving the rankings:

Morse says he understands and agrees that the rankings are not perfect, and he would like nothing more than to discuss with law school deans ways to improve them.

“Deans are welcome to call me or come by my office in Washington,” Morse says. “I want to work with them to improve the rankings.”

For better or worse, the US News rankings are here to stay. They are tremendously influential, and despite our constant complaints, I doubt their influence will diminish. So we can continue to gripe and grumble, probably to little effect. Or we might try to work with the magazine to improve the rankings. Bob says he’s amenable to suggestions for improvement:

Bob Morse has his own blog that invites comments and criticisms. He’s shown up uninvited to university symposiums dedicated to fighting the U.S. News rankings he created. He wants to hear what the critics have to say.

So since Bob is listening, I pose the question: How ought the rankings to be improved?

The current US News methodology is here.

Here are a few things I’d recommend:

1. The reputation surveys are not sent out broadly enough. They go out only to deans and to newly-tenured professors. A broader cross-section of law school faculties should be used in the poll.

2. A more granular reputation scoring system should be used. The current 1-5 scale isn’t fine-grained enough. For starters, how do top schools like Yale and Harvard have average scores of less than 5? Who’s giving them a 4? Seems fishy to me. Suppose a dean thinks Yale is the best and that Chicago is excellent: not quite as good as Yale, but very close. Yale therefore gets a 5. Does that mean Chicago gets a 4? That’s a big drop. But giving Chicago a 5 says it is Yale’s equal, which may not be the dean’s view. The scale simply isn’t granular enough to record the difference.

3. The number of library volumes shouldn’t be a part of the scoring system. This strikes me as a silly factor in ranking law schools.
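To make the granularity point in item 2 concrete, here is a toy sketch. The quality numbers are invented for illustration (not survey data), and "Midwest State" is a made-up school; the point is only that a coarse 1-5 integer scale collapses a small perceived gap that a finer scale would preserve:

```python
# Hypothetical 0-1 "quality judgments" a dean might hold; purely illustrative.
true_quality = {"Yale": 1.00, "Chicago": 0.97, "Midwest State": 0.60}

def to_five_point(q):
    """Round a 0-1 judgment onto the 1-5 integer survey scale."""
    return round(1 + 4 * q)

def to_hundred_point(q):
    """The same judgment expressed on a finer 1-100 scale."""
    return round(1 + 99 * q)

for school, q in true_quality.items():
    print(f"{school}: {to_five_point(q)} on 1-5, {to_hundred_point(q)} on 1-100")
```

On the coarse scale, the dean’s 97%-of-Yale view of Chicago rounds up to a 5, indistinguishable from Yale itself; on the finer scale, the “very close but not equal” judgment survives.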

These are just a few ideas. What are yours? The purpose of this thread is not to gripe about the rankings, but to propose fixes and improvements, so please focus your comments on suggestions for reforming the US News rankings.


The Official Leaked US News Law School Rankings, Plus Ranking Secrets Revealed!

I’ve got the scoop of the year! An anonymous source at US News & World Report leaked to me a memo written by the magazine’s “law school ranking executive” describing how the magazine arrived at this year’s official rankings.

See below for a sneak peek at this year’s rankings as well as some amazing secrets about how US News ranks law schools.



A Market in Rankings?

Complaining about law school rankings is a cottage industry in the legal academy. (Or rather more than a cottage industry, I suppose.) Everyone, or nearly everyone, dislikes the current system. While I am less skeptical than most (it doesn’t seem unreasonable to me that students planning on shelling out $70,000+ in tuition might want some comparative measure of quality), I agree that the current system leaves something to be desired. It seems to me that we could set up a market-based solution.

A student recently suggested to me that inTrade ought to set up a prediction market in U.S. News rankings. That way students could hedge against the risk that the value of their degree may drop if their school slips in the rankings. It is not a bad idea, but the problem is that such a market, while allowing a bit of U.S. News risk arbitrage and hedging, would ultimately be about simply predicting the mysteries of the U.S. News system.

Suppose, however, that we set up contracts for something other than U.S. News status. For example, one might purchase a contract predicting that West Dakota Law School’s graduates will have an average starting salary of $100,000 or more. This would provide information of the kind that most students actually care about. Alternatively, one might create a contract that pays out if East Carolina Law School’s faculty places ten articles in top-ten law reviews this year, or on some other measure of scholarly accomplishment. Then we could compare the share prices for Harvard and Yale. Of course, we would still just be getting a market in predictions of a particular outcome, rather than actual quality, but it might not be a bad proxy, and it might capture more of the dispersed knowledge about law school quality.

For the system to work, you would need a relatively thick market in the contracts offered, and even with only three or four contracts per law school, the number of contracts available in the market would be huge. On the other hand, law profs and law students are nothing if not obsessed with status, and I suspect that there would be a sizable contingent eager to cash in on their obsession.
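As a rough sketch of how such a contract would convey information, assume a binary contract on the salary outcome described above. The payout and price figures are invented for illustration; the mechanism is just that the trading price of a binary contract can be read as a market-implied probability:

```python
# Hypothetical binary contract: pays $10 if West Dakota Law's graduates hit
# the $100,000 average-salary mark, $0 otherwise. Price is an assumed figure.
payout = 10.00   # paid if the outcome occurs, else the contract expires at $0
price = 6.50     # assumed last trade price, purely illustrative

implied_probability = price / payout
print(f"Market-implied chance of the $100k outcome: {implied_probability:.0%}")
```

A prospective student comparing two schools could then compare implied probabilities directly, rather than guessing what a U.S. News ordinal rank implies about outcomes.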

What do you say? What contracts do you think inTrade should offer?


The Green Bag Asks: Your Law School (Really) Got Game?

This year, my last at Ole Miss Law School, I was asked to chair an ad hoc faculty committee on law school rankings. Like many law schools, ours has been flustered by the seemingly arbitrary way our school has fluctuated in the U.S. News & World Report yearly rankings. And like others, we wanted not to care about such capricious things, but alas, others (including prospective students, current students, and alumni, to name a few) did care. So as an institution we (myself and four faculty committee members) set out to study the factors one by one and try to determine where we could change policies, add money, etc., to constructively move the factors we had some control over.

What struck me as the committee met on a biweekly basis last semester was that some schools perpetually labeled elite (by virtue of being in the First Tier) really did not have that many prolific or productive scholars. On the other hand, the opposite was also true: many a Third- or Fourth-Tier school (though certainly not all) was bustling with faculty activity and innovation. So what was going on? Why wasn’t any current ranking system capturing these characteristics of the law school market?

Though I have not figured out the answer to this question, Inside Higher Ed reports today that The Green Bag Journal plans to put law schools’ extravagant claims about having the best and greatest faculties in the universe to the test:

On their Web sites and in the other marketing materials that law schools distribute to raise their profiles — sometimes derided as “law porn” — virtually every law school boasts of having a faculty made up of stellar scholars, brilliant teachers and selfless public servants. “We continue to add depth to our already diverse and multifaceted faculty — excellent teachers whose high-quality research impacts leading academic and public policy issues,” reads the Web site of Northwestern University’s law school . . . .

But how are applicants — for admission and/or jobs — to know whether the schools are living up to their promises on faculty quality, that all-important indicator of the institutions’ overall quality? asks the Green Bag, which describes itself as “an entertaining journal of law.” . . . .



Ridiculously Unscientific Ranking: SSRN February Edition

The February 2008 SSRN law school rankings based on downloads are out. I thought I’d be irresponsible and compare the data to the list compiled in May 2007, when I last republished rankings based on new downloads.* The number in parentheses represents the change in rank; the “big” number is the total of new downloads. Snarky comments in brackets are mine.

1. George Washington University – Law School 80571 (+10) [They’ve got nothing to hide!]

2. Harvard University – Harvard Law School 53984 (-1)

3. Columbia University – Columbia Law School 37401 (-2)

4. University of Chicago – Law School 35080 (same)

5. University of Texas at Austin 30469 (-2) [A prediction market, anticipating Leiter’s departure?]

6. Yale University – Law School 30317 (+1)

7. University of California, Los Angeles – School of Law 29491 (-2)

8. Stanford Law School 28123 (-2)

9. Georgetown University – Law Center 26502 (same)

10. University of Illinois – College of Law 25501 (+2)

11. New York University – School of Law 23416 (+2)

12. University of Pennsylvania Law School 23165 (+5)

13. University of California, Berkeley – School of Law (Boalt Hall) 22649 (+2)

14. Vanderbilt University – School of Law 19329 (same)

15. University of Tennessee, Knoxville – College of Law 18094 (+15) [Instapundit deploys!]

16. University of Minnesota – Twin Cities – School of Law 17226 (same)

17. Duke University – School of Law 14477 (+2)

18. George Mason University – School of Law 14206 (+4)

19. University of San Diego – School of Law 14096 (+2)

20. University of Michigan at Ann Arbor – Law School 13048 (+3)

21. University of Southern California – Law School 12993 (-1)

22. Northwestern University – School of Law 12811 (-4)

23. Loyola Law School – Los Angeles 12222 (+3)

24. Fordham University – School of Law 12132 (+8)

25. Florida State University – College of Law 11518 (-1)

26. Yeshiva University – Benjamin N. Cardozo School of Law 11159 (+1)

27. Boston University – School of Law 11050 (+2)

28. Temple University 10569 (+9) [Validates entire SSRN ranking project]

29. University of Virginia – School of Law 10212 (-1)

30. Ohio State University – Michael E. Moritz College of Law 10012 (-20) [@#$@]

31. American University – Washington College of Law 9902 (+17)

32. Suffolk University Law School 8711 (Offlist)

33. Indiana University School of Law-Bloomington 8521 (-1)

34. Cornell University – School of Law 8369 (-3)

35. Brooklyn Law School 8228 (Offlist)

36. Emory University – School of Law 8217 (-28)

37. University of Louisville – Louis D. Brandeis School of Law 7116 (Offlist)

38. Chapman University – School of Law 7092 (Offlist)

39. Boston College – Law School 6703 (-14)

40. Notre Dame Law School 6613 (-1)

41. Case Western Reserve University – School of Law 6579 (-6)

42. University of Colorado Law School 6182 (-4)

43. St. John’s University – School of Law 6030 (Offlist)

44. Rutgers, The State University of New Jersey – School of Law-Camden 5927 (-2)

45. Washington University, St. Louis – School of Law (-5)

46. University of Arizona – James E. Rogers College of Law 5630 (-1)

47. University of Florida – Fredric G. Levin College of Law 5595 (Offlist)

48. Seton Hall University – School of Law 5491 (+2)

49. University of Iowa – College of Law 5466 (Offlist)

50. New York Law School 5447 (Offlist)

*You can slice these data many ways, including per capita, total downloads, total papers. In my view, all methods are similarly (un)scientific.
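For anyone curious how the (+/-), (same), and (Offlist) annotations above can be generated, here is a sketch comparing two ranking snapshots. The school lists are shortened stand-ins, not the real May 2007 or February 2008 SSRN data:

```python
# Illustrative orderings only; not the actual SSRN download rankings.
may_2007 = ["Harvard", "Columbia", "Chicago", "GWU", "Yale"]
feb_2008 = ["GWU", "Harvard", "Columbia", "Chicago", "Yale", "Temple"]

# Map each school to its rank on the earlier list (rank 1 = most downloads).
old_rank = {school: i for i, school in enumerate(may_2007, start=1)}

for new, school in enumerate(feb_2008, start=1):
    old = old_rank.get(school)
    if old is None:
        label = "(Offlist)"           # wasn't on the earlier list
    elif old == new:
        label = "(same)"
    else:
        label = f"({old - new:+d})"   # positive means the school moved up
    print(f"{new}. {school} {label}")
```

The delta is simply old rank minus new rank, which is why a climb prints as a positive number, matching the notation in the list above.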


Interdisciplinarity, Leiter and the Bluebook

bluebook.jpgGordon Smith has a nice summary post of the debate between Brian Leiter, Mary Dudziak and others on whether Brian’s faculty citation rankings accurately measure “impact in legal scholarship.”

The basic framework of the debate is

Objection: “But you didn’t measure X…”

Leiter: “True. Let a hundred flowers bloom, and do your own data collection!”

(Which strikes me as pretty persuasive.) I wanted to add a different ingredient into the pot. I think Leiter’s rankings mismeasure impact in interdisciplinary scholarship for a reason unrelated to his methodology or its merits. Simply put: the Bluebook itself undervalues interdisciplinary collaborations and thus scholarship.

I’m not nearly the first to observe that the Bluebook’s citation rules have an ideological component. See, e.g., Christine Hurt’s great piece on that very topic. But consider the interaction between Bluebook Rules 15.1 and 16 and Leiter’s study. Rule 16 states that the citation of author names in signed law review articles should follow Rule 15.1. Rule 15.1 states that when there are two or more authors, you have a choice:

Either use the first author’s name followed by “ET AL.” or list all of the authors’ names. Where saving space is desired, and in short form citations, the first method is suggested. . . . Include all authors’ names when doing so is particularly relevant.

This seems to me to express a pretty strong non-listing preference. The “problem” is that much good interdisciplinary work results from collaborations among more than two authors – it is the nature of the beast. Take, for example, my colleague Jaya Ramji-Nogales’ forthcoming triple-authored article Refugee Roulette: Disparities in Asylum Adjudication, which was front-paged by the Times back in June. Two of the article’s authors are in danger of being ET AL.’ed in many law review footnotes, and consequently ignored in subsequent Leiter citation counts (unless the citing article’s author chooses to mention them by name in the text). This seems like a trivial objection, but it will take on increasing weight over the next ten years as empirical legal studies really comes online in the major law reviews. (Obviously, I’m writing in part because I’ve two articles in the pipeline where I’m a part of three-author teams, and the “et al.” problem is somewhat salient.)
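The undercounting mechanism is easy to see in a toy citation count. The author names below come from the Refugee Roulette example; the footnote strings themselves are invented, and the point is simply that any counter crediting only the names literally printed in a footnote behaves like this:

```python
from collections import Counter

# Invented footnote strings: two of three citing articles use "et al."
footnotes = [
    "Ramji-Nogales et al., Refugee Roulette",
    "Ramji-Nogales, Schoenholtz & Schrag, Refugee Roulette",
    "Ramji-Nogales et al., Refugee Roulette",
]
authors = ["Ramji-Nogales", "Schoenholtz", "Schrag"]

# Credit an author only when her name actually appears in the footnote text.
counts = Counter({a: sum(a in note for note in footnotes) for a in authors})
print(counts)  # first author credited 3 times; "et al."-ed co-authors once
```

Three articles cite the paper, but two-thirds of the co-authors’ contributions vanish from the tally, which is exactly the distortion that compounds in citation-based rankings.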

Bluebook editors: I know you are lurking here! Can you fix this silly problem in the 19th edition?


More on Law School Rankings vs. Parent University Rankings

Earlier this week, I blogged about Paul Caron’s chart of law schools that were ranked more highly than their parent universities. Some commenters pointed out that because not all universities have law schools, there is a greater chance that universities might be ranked lower (because there are more of them).

Paul now has a new chart. He writes: “[S]ince U.S News ranks more national universities (262) than law schools (184), a more meaningful measure would look at the disparity by percentiles rather than in absolute rank.”
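Caron’s percentile point is simple arithmetic: with 184 ranked law schools and 262 national universities (the counts he cites), the same absolute rank sits at a different depth in each pool. A quick sketch, where the rank of 50 is a hypothetical example rather than any particular school:

```python
n_law_schools, n_universities = 184, 262  # pool sizes from Caron's post

def percentile_outranked(rank, pool_size):
    """Percent of the pool a school at the given rank outranks (rank 1 = best)."""
    return 100 * (pool_size - rank) / pool_size

# A hypothetical institution ranked 50th in each list:
law = percentile_outranked(50, n_law_schools)
univ = percentile_outranked(50, n_universities)
print(f"Rank 50 outranks {law:.1f}% of law schools")
print(f"Rank 50 outranks {univ:.1f}% of national universities")
```

The same rank is a stronger relative showing in the larger university pool, which is the distortion the percentile comparison corrects.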


Law School Rankings vs. Parent University Rankings

Over at TaxProf, Professor Paul Caron has a chart of law schools that outrank their parent universities in the US News rankings. Some law schools far outrank their universities. I often wonder what effect the standing of the main university has on a law school. Paul’s chart demonstrates that law schools can thrive in the rankings even when their parents are not highly ranked. Does this mean that law schools can establish a reputation that is by and large unaffected by the ranking of their parents? Or perhaps they would be ranked even higher but for their parents. Is the ranking differential due to greater reputation scores, or do other US News factors account for it?