Category: Law School (Rankings)


A Law Porn Blog

It’s known as “law porn” — those glossy brochures that arrive in torrents in every professor’s mailbox touting the wonderful accomplishments of law schools. There’s been a swell of posting in the legal blogosphere lately about law porn — from tips by David Bernstein about how best to produce the porn to Jeff Harrison’s plea for a Do Not Mail list.

Why does law porn exist? To raise a law school’s U.S. News ranking. Law schools like to tout their accomplishments. Without law porn, how would we know that Professor X published a new book? Or that Professor Y spoke at the school? Or that the school put on a symposium? Or that Professor Z got an honorary degree from the University of Antarctica Law School?

Brian Leiter has devoted a significant amount of time to mocking the obscene claims made within some law school promotional materials.

What should be done? How do we stamp out law porn?

The answer, I believe, is to give the law schools a different outlet for releasing all this information. After all, we want to encourage law schools to do the kinds of things depicted in law porn — publish articles and books, hold conferences, have faculty workshops, and otherwise create a vibrant intellectual community. We want this healthy activity to be reflected in a law school’s ranking. We just don’t like it placed in our mailboxes.

My solution is for law schools to create a law porn blog. A representative from each school can post about the various news, conferences, and publications at the school. Of course, it need not be called “law porn blog,” although with a moniker like that, I’m sure it would enhance the visitor traffic. But a blog can serve as a centralized resource for announcing law school news, and it can save a great deal of money and countless trees.

So we don’t need to end law porn — just steer it to a new venue, an online red-light district for the legal blogosphere. It’s time for the law schools to join together to create a law porn blog.


In Praise of Market Imperfections

You would expect to go out of business if you hired people without knowing whether they could do the job. The same would be true if you had no reliable way of measuring whether they actually were doing the job once hired. Law schools do both of these. They would prefer to hire second-tier students from elite law schools rather than top students from non-elite schools. Yet the empirical evidence I know of shows that the scholarly production of the non-elites, once hired, is no lower than that of the elites. In fact, since law reviews use credentials as a basis for article selection, non-elites may actually be outperforming elites. Do we have any reliable way to evaluate what the new hires do? Give me a break. We have faculty classroom visits announced ahead of time that result in evaluations that could have been written ahead of time – all positive, given the propensity of law professors to shirk institutional responsibilities. And we have student evaluations that largely reflect expected grades. On scholarship, we send the articles to a list of reviewers influenced by the candidate, or just to the regular suppliers of positive letters. Be grateful for market imperfections!


Law School Capture

My blogging shtick is grousing about legal education. I do this mainly on moneylaw and classbias, and serve as a technical advisor to privilegelaw – a blog best read from its earliest posts forward. In many respects I think legal education has been captured by, and run for the convenience of, faculty who are far more often than not the children of privilege. (If you are already preparing to comment, I ask that you skip it if the comment is about a law professor who is not a child of privilege.) As I blog along this month these themes will become more developed. First, here is a test to examine your own school for its level of capture.

Before taking the test, some clarifications. There is good and bad capture. I can imagine a law school captured by the faculty and, with or without help from the administration, run for the benefit of stakeholders. This would be a faculty that is constantly asking, “What should we be doing?” and matching the answer against what it is doing. On the other hand, capture can mean that a faculty runs the law school for its own convenience, with only modest limitations imposed by others – and even here, observing those limits is part of a pattern of self-interested behavior.

Second, from time to time I get an email that carries the assumption that all my grousing is about my own school. Wrong! The examples are not all taken from my school, and if you really called my bluff I would not bet that my school is any different from the average. So how does your school stack up on the capture quiz? (You can give your school partial points.)

1. Are classes scheduled midweek and midday even though it means conflicts that limit student choices? (1 point for a yes.)

2. Has your school seriously reviewed any of its foreign programs, centers, institutes, or degree programs in the past two years? (1 point for a no.)

3. Does your school depend on adjuncts to teach mainline courses while offering small-enrollment specialized courses taught by full-time profs? (1 point for a yes.)

4. Does your school have a high curve that is sometimes defended by not wanting to hurt students’ feelings, or by other justifications that amount to “I do not want to actually have to evaluate someone”? (1 point for a yes.)

5. Do colleagues propose programs that are needed even though they will not personally be teaching in, traveling for, or receiving a reduced teaching load from the program if it is adopted? (1 point for a no.)

6. Can students graduate having taken half or more of their classes on a pass/fail basis? (See question 4.) (1 point for a yes.)

7. Does your school encourage massive, barely supervised externships that generate tuition dollars, provide free labor, and, by the way, mean less teaching? (1 point for a yes.)

8. Does your administration mass-mail glossy reports listing every conceivable thing faculty submit as reportable? (1 point for a yes.)

9. Does your dean appear to be afraid to suggest that the school should do better and then hold people accountable? (1 point for a yes.)

10. Is the norm that just about everyone is gone by 11 AM on Friday? (1 point for a yes.)

If you scored a 10, it’s best to go into receivership and start from scratch.

If you are in the 7-9 range you will probably be a 10 soon.

If you are in the 4-6 range, I think you are average, and a few hires could move you either way.

If you are 3 or less, congratulations.


Is Sorting Law School’s Only Function?

Bainbridge and others are abuzz over Rush and Matsuo’s paper, Does Law School Curriculum Affect Bar Examination Passage? An Empirical Analysis of the Factors Which Were Related to Bar Examination Passage between 2001 and 2006 at a Midwestern Law School. The paper reports that simply taking “bar courses” generally does not improve performance on the Bar Exam.

The paper is clearly written but not (for me) surprising: it fits unpublished research I’ve seen, and common sense. I’d bet that a large minority of all law professors, and a majority of law professors hired since 1990, haven’t sat for the Bar in the jurisdiction hosting their law school. It would be surprising if teaching behind this veil of ignorance could significantly improve test scores for marginal students. You can’t teach to a test you haven’t seen.

But if that’s true, two questions come to mind. The first has been addressed by some commentators already, and boils down to: if not bar courses, what courses should law students take? Josh Wright responds: antitrust! Sam Kamin disagrees: professors you like! As for me, I offered the following comments in a package of diverse suggestions on this topic from my colleagues distributed to our first year students at the end of the Spring term:

I recommend that you select courses that are challenging and intrinsically interesting. This means tailoring course selection to your abilities (take a tax course, especially if you are afraid of math); and interests (recall what made you excited about the Law before coming here). The data I have seen do not correlate Bar passage with any particular package of courses, but rather with your overall performance and work ethic. Certain employers may expect to see foundational courses like corporations and evidence on your transcript, but I believe those expectations are the exception rather than the rule. The bottom line: take classes that will make you want to come to school in the morning.

Maybe such advice is helpful, maybe not. But regardless, it doesn’t answer the big (second) question, which is this: is there a point to law school beyond sorting students?

Read More

Law School Ranking: Measurement vs. Characterization

I have long been concerned about negative externalities from ranking systems. Perhaps people and institutions are always prone to try to distinguish themselves. If so, Brian Leiter’s expert consultation for the MacLean’s rankings of Canadian law schools may be a good thing, since, as he states, it resulted in “a ranking system that can not be gamed, that does not depend on self-reported data, and is not an indecipherable stew of a dozen different ingredients.”

However, Benjamin Alarie at Toronto has critiqued Leiter’s efforts. Here are a few issues:

One of the central problems with how Faculty quality is measured is that it doesn’t assess influence in publications aside from 33 Canadian law journals. As an initial matter, I think it is fair to say that academics seek to publish in places with the most active audiences for particular types of research—for example, the best journals to publish law and economics research in are likely to be American peer reviewed journals such as the Journal of Legal Studies, or the American Law and Economics Review, or even professional economics journals.

[I]t is unlikely that frequency of citation is a perfect proxy for quality; for example, overly provocative papers are sometimes cited for being so provocative.

It is unclear what the threshold used was for including the firms as among the “elite firms” used by Maclean’s.

[A national reach] measure . . . misses “international reach” of the law schools that regularly place students in the excluded top New York and Boston firms, in international NGOs, and in various other attractive positions.

[By the way, I put the links into those block quotes above.]

I think these are all valid points, but the problem is even larger. The consumers of these rankings are, by and large, students looking for a good education and firms looking for well-trained lawyers. Why so much focus on whether the law schools are feeders for “top” firms? Perhaps the best law teaching is that which manages to train people for diverse careers in law.

Finally, a more philosophical point.

Read More


Law (Professor) Blog Ranking

[UPDATES IN RED] With the assistance of our intern, Sam Yospe, I decided to update the law blog ranking project first completed by Roger Alford at Opinio Juris. The following list ranks 41 law professor blogs according to traffic (as calculated by The Truth Laid Bear). To minimize distortion, we applied average monthly data, and ran the measurements about two weeks ago. This list only includes blogs that have at least one law professor as a regular blogger, and we exclude blogs that focus entirely on politics or current events, and blogs that are not tracked by Truth Laid Bear. Some blogs, like Patently-O, appear to be tracked only inconsistently by TLB and are not included in this list for the time being.

While this list ranks blogs by traffic, we have also included Truth Laid Bear’s own weighted rankings. TLB ranks blogs using an algorithm that accounts for a “link score,” a measure of how often blogs are linked to by other blogs. While the ranking by traffic that appears below and TLB’s ranking are related, the correlation appears to be statistically insignificant. For example, Bainbridge’s blog is ranked second by TLB amongst legal blogs. Yet by traffic it ranks ninth. Conversely, Sentencing Law and Policy is ranked third amongst all legal blogs in traffic, yet it is ranked 2,164 by TLB, a lower ranking than some legal blogs that receive less traffic.
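For readers curious how the agreement between two rankings of the same blogs can be quantified, here is a minimal sketch using Spearman’s rank correlation. The ranks below are invented for illustration only; they are not the actual traffic or TLB data.

```python
def spearman_rho(rank_a, rank_b):
    """Spearman's rho for two rankings of the same n items (assumes no ties).

    Returns 1.0 for identical rankings, -1.0 for exactly reversed ones,
    and values near 0 when the rankings are essentially unrelated.
    """
    n = len(rank_a)
    d_squared = sum((a - b) ** 2 for a, b in zip(rank_a, rank_b))
    return 1 - (6 * d_squared) / (n * (n ** 2 - 1))

# Five hypothetical blogs, ranked by traffic and then by link score:
traffic_rank = [1, 2, 3, 4, 5]
link_rank = [3, 1, 5, 2, 4]

rho = spearman_rho(traffic_rank, link_rank)
# A low rho (here about 0.3) would support the post's point: a blog can
# rank high on one measure and low on the other.
```

A rho this far from 1 on real data would be consistent with the post’s observation that traffic and link-score rankings diverge sharply for some blogs.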

These data suggest that there is significant heterogeneity in the audience of legal blogs, as some blogs seem to have wide audiences of readers not shared by others, and (indeed) exist in entirely different communal spaces. This fractured audience finding challenges my flat traffic thesis. Importantly, this post does not intend to suggest a thing about the relative quality of the blogs ranked, nor those that are not mentioned. This isn’t even a popularity contest.

Read More

Can Lawyers Afford Not to Play the Rankings Game?

In an article in National Jurist, rankings expert Brian Leiter was quoted as saying that “The more info and the more competing measures there are out there, the less concerned law schools will be about pleasing their U.S. News master.” In a different setting, I too have been enamored of a diversity of rankings. I’ve also hoped that law schools would more formally recognize, say, their top 10% of brief-writers, researchers, or oral advocates, elevating the visibility of those with exceptional skills in areas outside of exam-taking.

However, Leigh Jones reports that there are some costs associated with a diversity of rankings:

By some estimates, law firms have about 200 chances each year to participate in rankings, awards programs or so-called “league table” publications that they hope will distinguish them from the competition. Not only are firms finding their marketing resources stretched thin by the onslaught, but they also say it is getting tougher to wade through the rubbish. “Not a day goes by that I don’t come across another one from someone I’ve just never heard of,” said Lloyd Pearson at White & Case.

Pearson is the “communications manager at the 1,907-attorney firm,” and “was brought aboard last year to handle the flood of surveys, questionnaires, phone calls and research related to awards and rankings that the firm pursues each year.” What happens to firms who can’t hire someone to manage the information overload?

Unfortunately, avoiding the rat race may not be much of an option. As law schools learned to their chagrin, an “echo chamber” effect can cause early ratings to become self-reinforcing. This dynamic sheds new light on lawsuits against websites that purport to rank or score lawyers. Plaintiffs may rightly worry that a low initial rating will become a self-fulfilling prophecy, handicapping their chances at getting good cases and thereby pushing them further down the pecking order.

Hat Tip: Eric Goldman.


Are Alternative Law School Rankings Any Better than US News?

The WSJ has an article on alternatives to the infamous US News law school rankings. According to the article: “In the last two years, at least a dozen upstart Web sites, academic papers and blogs have stepped in with surveys of their own to feed the hunger for information on everything from the quality of the faculty to what a school’s diploma might be worth to future employers.” It has this chart of some alternative rankings of law schools:


In my opinion, all of these rankings have serious flaws.

US News — The reputation surveys are only given to deans and just one or two faculty members (a very unrepresentative sample of faculty). The reputation surveys are too easy to game. And the reputation scores of 1 through 5 are not granular enough. For example, Yale has an academic reputation score of 4.9, Harvard 4.8, and Stanford 4.7. That means that people in the surveys are rating these schools with 4s or less. Who gives less than a 5 to any of these schools on a 1-5 scale? Some of the other numbers factored into the US News equation are quite silly and can be easily cooked, with schools using accounting tricks that would make Enron officials blush.
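The point about the 1-5 scale can be made precise with a little arithmetic. An average of 4.9 pins down how many respondents must have rated a school below 5. The sketch below uses a hypothetical respondent count of 100, since US News does not publish one:

```python
def non_top_rater_bounds(avg, n_raters, top=5, bottom=1):
    """Bounds on how many raters gave less than the top score, assuming
    integer scores between `bottom` and `top`.

    The total shortfall from a perfect average is (top - avg) * n_raters
    points. Each non-top rater contributes between 1 point (a score of
    top - 1) and (top - bottom) points (the bottom score) to that shortfall.
    """
    deficit = (top - avg) * n_raters
    min_count = deficit / (top - bottom)  # every non-top rater gave the bottom score
    max_count = deficit                   # every non-top rater gave top - 1
    return min_count, max_count

# Yale's 4.9 average over a hypothetical 100 respondents:
lo, hi = non_top_rater_bounds(4.9, 100)
# Roughly 2.5 to 10 of the 100 respondents must have rated Yale a 4 or lower.
```

So even a 4.9 average implies that somewhere between a handful and a tenth of respondents marked the school down — which is exactly the puzzle the post raises.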

Supreme Court Clerkship Placement — This is a ridiculous way to rank schools. Getting a Supreme Court clerkship is like winning the lottery. There are far more qualified people than positions, and getting a position certainly takes merit, but it also takes a lot of luck. Part of it depends upon the connections of a school’s professors, who can place clerks with feeder judges or may even have influence with a Supreme Court Justice. Nobody seriously goes to law school planning on getting a Supreme Court clerkship. And it’s based on total number of clerks, so the ranking in the WSJ column is meaningless since some schools are much larger than others (Harvard is more than twice the size of Yale).

Read More


Is This The Beginning of the End for U.S. News Undergrad Rankings, and Will Law School Rankings Survive the Collapse?

The New York Times reports today that the presidents of dozens of liberal arts colleges have agreed to stop participating in U.S. News’ college rankings survey. According to the report, the Annapolis Group, an association of liberal arts colleges, released a statement that a majority of the 80 college presidents attending its annual meeting had declared their intent not to participate in the U.S. News rankings. The move follows on the heels of similar efforts by college presidents earlier this year, and of a widely-publicized critique of the rankings system last month in the Chronicle of Higher Education.

Has the liberal arts world finally decided that enough is enough? The Times quotes Judith Shapiro, president of Barnard College: “Frankly, it had bubbled up to the point of, why should we do this work for them? … [T]his is not our project.” Of course, the jury is still out on whether the liberal arts colleges’ nascent rebellion will have legs. Not surprisingly, some schools at the top of the food chain – e.g., #2 Amherst – plan to continue to cooperate with U.S. News, and want further “discussion” of the issue. Still, this latest move by liberal arts colleges seems to be more than mere window dressing.

All of this has me wondering: If U.S. News loses its undergrad rankings cash cow, will the law school rankings be far behind? Or might the law school rankings survive, even if the undergrad rankings collapse? Put differently, are there reasons why the law school world will (and perhaps should) continue to “do U.S. News’ work for them”?

I can think of a couple of reasons why law school rankings might survive, despite the collapse of undergrad rankings.

Read More


May SSRN Download Counts

From the Department of Possibly Misleading Information comes the Law School SSRN Rankings for May. See previous installments here and here. I originally highlighted in blue schools that significantly outperformed my impression of their popularly conceived rank; and in red those that underperformed. But then I reconsidered and concluded that this was an unproductive exercise. So I am presenting these data without further interpretation.

By Total Downloads

1 Harvard University – Harvard Law School 209695

2 University of Chicago – Law School 188088

3 Columbia University – Columbia Law School 149467

4 Stanford Law School 136852

5 University of Texas at Austin – School of Law 124802

6 University of California, Los Angeles – School of Law 114838

7 Yale University – Law School 111644

8 Georgetown University – Law Center 103398

9 George Washington University – Law School 91441

10 University of California, Berkeley – School of Law (Boalt Hall) 81539

11 University of Southern California – Law School 80660

12 University of Illinois – College of Law 79954

13 Vanderbilt University – School of Law 79510

14 New York University – School of Law 74137

15 University of Minnesota – Twin Cities – School of Law 70470

16 University of Pennsylvania Law School 62323

17 Duke University – School of Law 50452

18 University of Michigan at Ann Arbor – Law School 49161

19 Emory University – School of Law 48911

20 George Mason University – School of Law 46048

21 University of San Diego – School of Law 45385

22 University of Virginia – School of Law 42921

23 Boston University – School of Law 35095

24 Ohio State University – Michael E. Moritz College of Law 34651

25 Northwestern University – School of Law 34541

26 Boston College – Law School 33519

27 Florida State University – College of Law 32810

28 Yeshiva University – Benjamin N. Cardozo School of Law 30092

29 Cornell University – School of Law 29511

30 Fordham University – School of Law 28711

31 Loyola Law School – Los Angeles 25838

32 Michigan State University – College of Law 25441

33 Temple University – James E. Beasley School of Law 17913

34 Washington University, St. Louis – School of Law 17536

35 New York Law School 17094

36 Case Western Reserve University – School of Law 16896

37 Indiana University School of Law – Bloomington 16583

38 Rutgers, The State University of New Jersey – School of Law-Camden 14568

39 University of North Carolina at Chapel Hill – School of Law 14084

40 Washington and Lee University – School of Law 13188

41 University of Maryland – School of Law 12471

42 University of Colorado Law School 12403

43 Notre Dame Law School 12216

44 Brooklyn Law School 12096

45 University of Tennessee, Knoxville – College of Law 11952

46 University of Cincinnati – College of Law 11671

47 University of Iowa – College of Law 11637

48 University of California, Davis – School of Law 11310

49 University of Arizona – James E. Rogers College of Law 11202

50 University of Wisconsin – Law School 11106

Read More