ExpressO Presents Yet Another Law Review Ranking

At the AALS conference this past weekend, I picked up the new glossy promoting ExpressO, the law review article submission service from Berkeley Electronic Press. ExpressO’s 2007 “Law Review Submissions Guide” offers an up-to-date ranking of the top 100 law reviews based on the number of manuscripts received via ExpressO. The new data is interesting both as a snapshot of past author behavior and perhaps as guidance about where the pack may be headed next. The study itself is not new – the 2006 rankings are here.

Here’s the latest top 20:

1. NYU

2. Wisconsin

3. UCLA

4. Arizona

5. Virginia

6. Hastings

7. Northwestern

8. Notre Dame

9. Maryland

10. BC

11. Chicago

12. Iowa

13. BU

14. Illinois

15. San Diego

16. Georgetown

17. W&L

18. Southern Cal

19. Wake Forest

20. Texas

Cal was at 23, Michigan at 27, and Columbia at 80. We also learn that this ranking system offers that most precious of commodities: upward and downward mobility. Yale is #74 (down from #60), Harvard is #85 (down from #57), and the Alabama Law Review is a remarkably popular #22 (up from #64).

I suppose there are several things one could take away from this data. First, the degree of movement seems a bit surprising, since the number of submissions is probably fairly large and the reputation of schools is pretty static. I wonder if this means that people read the prior year’s data and specifically target under-submitted journals. This might explain a couple of other big leaps: San Diego from #85 to (gulp!) #15; Arizona from #29 to #4; California from #10 to #23. Perhaps minor fluctuations in US News rankings send schools skittering in and out of the ExpressO top 50, as authors adjust their targets. It does seem clear that “long shot” journals receive fewer ExpressO submissions, but does that mean people are saving their money…or that they’d rather submit to the HLR and YLJ directly rather than through some funky electronic service? And if so, what to make of the popularity of NYU, Virginia, and Chicago?

Of course, the explanation might be quite mundane. Maybe the big upward movers recently started accepting electronic copies. Maybe the raw number difference between the schools was tiny – since it costs relatively little to submit widely, there might be only a 5% variance between #10 and #80. Or maybe something on particular journal websites (think: HLR and YLJ) suggested that – notwithstanding ExpressO’s willingness to submit articles to these reviews – authors were better served by direct submission.

It is clear that BePress thinks it’s useful to provide some ordering of journals, if only to generate posts like this. Nonetheless, one senses that folks in the People’s Republic are at least a bit embarrassed about the rankings. According to the booklet, they “are intended to complement, not replace, other rankings mechanisms such as number of citations and law school ranking.” Uh huh.

I will leave it to others to spend countless hours doing the statistical legwork analyzing this new dataset. The list leaves me wondering about the reasons for these preferences. And I wonder whether this data might be as good a proxy for perceptions of law school quality among productive junior faculty as the US News faculty reputation surveys. (I recognize that among top law schools, this might not be a particularly good indicator.)

The good news for authors is that last year, over 90% of all articles received at least one offer. And 45% got that first offer within a week. The good news for law school gossip hounds is that BePress has produced something new to assist in that most essential of all professorial pursuits: procrastination.

UPDATE: Over at Moneylaw, my old colleague Al Brophy takes the NYU Law Review to task for requiring those submitting electronically to use ExpressO. A fair critique, and a good explanation for NYU’s first place finish.


8 Responses

  1. Tim S says:

    I can tell you that ExpressO’s willingness to submit to a journal bears no relationship to whether ExpressO’s electronic format is compliant with the journal’s own electronic submission format. My journal does not accept electronic submissions via ExpressO for this reason, but I don’t believe ExpressO so informs its authors. To satisfy its obligation to the authors, ExpressO prints out a hard copy and mails it to us. Perhaps professors are aware of the difference and choose to submit to such journals differently. If the variance between journals is narrow, this could have an enormous relative effect.

  2. Note that some journals have their own online submission systems; others do not. The journals that do not will tend to have lower ExpressO rankings, because “direct” submission is a more attractive prospect when it doesn’t involve a large mailing envelope and twenty-seven dollars in postage. Is this difference correlated with anything meaningful in law review ranking? Probably not.

  3. Another thing to keep in mind is that ExpressO is used by student authors as well as members of the legal profession and academy. In fact, when I was reviewing manuscripts, it was my estimate that nine out of ten ExpressO submissions we received were penned by second or third year law students. Because student-written articles net a journal far fewer citations, many journals choose to publish only a few such submissions each year. During my year as Managing Editor I think we published only two or three articles that were submitted through ExpressO, because it’s so tedious to separate the wheat from the chaff.

    The fact that ExpressO is popular with law students leads me to believe that these rankings reflect that a substantial portion of ExpressO authors are gaming the system. As you suggest when you point out the wild swings in rankings, I think the students who use the system, and know that their chances of publication in Harvard Law Review are an order of magnitude lower than being struck by lightning, consciously select journals with lower rankings with the notion that publication there is more attainable.

    I mean no disparagement towards student authors, but the pretentious nature of the legal academy simply results in fewer citations to student authors’ work, and a key metric by which journals are measured is the quantity of citations their articles receive.

  4. Anthony says:

    “I mean no disparagement towards student authors, but the pretentious nature of the legal academy simply results in fewer citations to student authors’ work, and a key metric by which journals are measured is the quantity of citations their articles receive.”

    Do you have any empirical evidence to back this up? Because the data I’ve been gathering shows that this isn’t really the case — for instance, student notes in the Columbia Law Review appear to be cited on average a lot more than the average non-student article in the Florida State Law Review. Perhaps even more significantly, the Columbia Journal of Law and Social Problems, which *only* publishes articles by Columbia law students who happen to be members of that journal, is ranked #21 out of 84 public policy journals according to the Washington & Lee Law Library site.

    Perhaps student work would be cited more often if journal editors didn’t stigmatize them with the “notes” “comments” “student article” etc. labels and instead treated student papers the exact same way as papers written by everyone else?

    Interestingly enough, my data also shows that many (perhaps even most) citations to student notes do not follow the Bluebook rule that requires identifying the work as a student-written piece… perhaps that should send a signal to law reviews not to engage in these sorts of ridiculous practices.

  5. A number of law reviews now announce, around November, that their issues are closed and that they will not review any submissions until the new board is elected in February or March. How dumb can these law student elites get? Important manuscripts that are finished after the Christmas break are not going to be submitted to law reviews that say “No submissions please for the next few months.” It reminds me of those stores you see where all the entrances from the street are blocked except one. Why make it easy for customers to get access? After all, the more customers, the more work for the employees. Same with these law reviews. One manuscript is just as good as another, right? Who needs the extra work of reviewing them?

  6. Steve says:

    I’d be interested in how many law profs use SSRN for submission. I’m hesitant to use it since it only sends a link to the paper. There are so many reasons to be dissatisfied with the whole law review process; why do law profs put up with it? Is it because the system protects the status quo, or simply because lawyers hate change?

  7. “Do you have any empirical evidence to back this up?”

    No, unfortunately my evidence is entirely anecdotal, based upon the articles and ranking of the journal for which I worked, and conversations with editors from several other journals in that same subject area. I’ll be the first to say that my evidence lacks the weight of a long-term statistical study. In sum, I have observed a correlation — but not proof of causation — between the ratio of student to non-student content and W&L ranking, at least for a handful of technology law journals.

    “Perhaps student work would be cited more often if journal editors didn’t stigmatize them with the “notes” “comments” “student article” etc. labels and instead treated student papers the exact same way as papers written by everyone else?”

    I decry this practice as well, and in 2004 the journal for which I served adopted formatting and page-layout guidelines that render student-written works virtually indistinguishable from the rest of the book, although truly blind conditions will never exist because the first footnote in each article tells you something about the author’s credentials.

    “Interestingly enough, my data also shows that many (perhaps even most) citations to student notes do not follow the Bluebook rule that requires identifying the work as a student-written piece… perhaps that should send a signal to law reviews not to engage in these sorts of ridiculous practices.”

    Although I agree with your implicit argument that the Bluebook should be revised to eliminate the disparity in citation style between student and non-student authored work, the fact that the citations you’ve found already fail to observe this rule doesn’t necessarily tell us that the legal community doesn’t care about an author’s credentials. This may easily be explained by poor technical editing, which is more common than many technical editors would like to admit. 🙂

    I applaud your endeavors to gather data on this matter, and I look forward to any long-term analysis of the cite-worthiness of student work versus practitioner/academician work in the future.

  8. Archana says:

    This is really interesting stuff. Do you have any advice on submitting to specialty journals? I am trying to publish a piece about Guatemalan law and just submitted this past week to about 20 journals via ExpressO.