We Hate Rankings, But We Love Them Too


In an earlier post here, Dave Hoffman adds another quibble about Brian Leiter’s citation rankings of law professors. Several others have voiced criticisms about the rankings, including Mary Dudziak and Brian Tamanaha.

In the comments to Dave’s post, Marty Lederman and Brian Leiter get into a debate about the rankings, with Marty saying that the rankings don’t produce much in the way of surprises. In other words, the rankings tell us what we already know. Brian responds that the rankings do reveal a few surprises, but he agrees that they aren’t giving us any shocking news.

I’ve always found the way we professors react to rankings quite interesting sociologically. We hate them and love them at the same time. We recognize their faults, yet we still crave them. What we really want is to rank whose work has the most impact. Probably the best way to do so would be to have a large number of scholars rate each professor on a scale of 1 to 10, then rank everyone by average score. Of course, that might involve some hurt feelings, so we turn to other, less helpful metrics, such as citations. True, citations have many flaws, but we love to rank other academics and see where each of us stands, so we need something we can use to do the ranking. Citations are easy to count, and the lists they produce aren’t all that bad, despite the fact that citations aren’t really the best measure of what we want. The best measure, as I’ve said, is a system where we rank each other, but absent that system, we look for the second best.

Now, we could try to take the high road and say that we’re above ranking and don’t care . . . but many folks do care. We want rankings; we just don’t want to admit how much we like this guilty pleasure.

Regarding Leiter’s rankings, I’m surprised nobody has quibbled about how he has defined the fields. His categories are:


Business Law

Civil Procedure

Constitutional & Public Law

Criminal Law & Procedure

Critical Theories

Environmental Law

Evidence

Intellectual Property/Cyberlaw

International Law

Labor & Employment Law

Law & Economics

Law & Philosophy

Law & Social Science

Legal Ethics/Legal Profession

Legal History

Tax

Torts & Products Liability

Wills, Trusts & Estates

Constitutional/public law is an immensely broad field; why not break it into subfields? Other fields, such as tax, are much more narrowly circumscribed. Why are intellectual property and cyberlaw combined? Why are criminal law and criminal procedure combined? They often have very little to do with each other. Other fields are missing. Where’s property? Those poor folks who write in multiple fields (like me) get listed nowhere. We don’t exist; we’re ghosts who roam the earth with our satchel of citations but nowhere to put them down. I do have a field, privacy law, but it doesn’t make the cut. Nor does law and humanities.

The response to this, as Dave Hoffman says, is: then do your own rankings! That I’m not willing to do, and since nobody else wants to devise a ranking system other than citations, Brian rules the roost.

It’s the same phenomenon as US News. We all decry a magazine for ranking schools, but complaining and griping won’t do much good. US News wants to sell magazines. It doesn’t care about getting the rankings perfect, just good enough to be plausible . . . good enough that people buy the issue. Brian has created an alternative ranking system, which seems to me the best way to counter US News. If you don’t like it, do it yourself!

We love to take pot shots at rankings, especially by claiming that they are valued too much, just as we love to attack student-edited law reviews by claiming that article placements are too random. But when it comes down to it, we still rely on them. Gripe as we might, the status quo exists because we accept it and don’t make the effort to change it.

Marty’s comments basically ask: “Why should professors take citation rankings seriously?” Marty faults them for telling us what we already know, but would they be better if they resulted in a list of esoteric scholars we’d never heard of? Brian’s citation rankings work, despite their tremendous flaws, because they come close to approximating our rough sense of things. That’s why the US News rankings work. Despite using immensely silly criteria, US News creates rankings that come close to our sense of things (albeit with some anomalies). If they were too far off, they’d be laughed at and discredited.

Like it or not, we take rankings seriously. Too seriously. Rankings are law professor crack, and whine all we want, we’re addicted.


2 Responses

  1. Michael Yelnosky says:

    I decided to “do my own rankings,” and I think the results have lots of surprises.

    http://law.rwu.edu/facultyproductivity/

    Nobody has ever ranked the scholarly productivity of “non-elite” law schools before.

    And next year I plan to apply the same methodology to compare the productivity of the top 10 schools in my study (Hofstra, Roger Williams, Michigan State, New York Law, Wayne State, Capital, Mississippi, Chapman, Widener, and Willamette) to the schools ranked from 51-100 by U.S. News. I expect more surprises.
