Nominally Empirical Evidence of Unraveling in the Law Review Market

In a previous post, I observed that “the time for submitting law review articles is creeping backwards.” I then hypothesized that “we are experiencing what Alvin Roth called the ‘unraveling’ of a sorting market.” This is bad news:

Authors may not be able to get any sense at all of the “market value” of their article (loosely reflected, the myth goes, by multiple offers at a variety of journals). Conversely, journals feeling pressure to move quickly will increasingly resort to proxies for quality like letterhead, prior publication, and the eminences listed in the article’s first footnote (which tell you who an author’s friends and professional contacts are).

At the end of that post, I promised to “explore empirical evidence that this is in fact an unraveling market problem (as opposed to anecdote, to the extent possible).” As it turns out, this was a hard promise to deliver on. There simply isn’t data out there (at least none that I’ve been able to find) that collects historical information about the submission process to law reviews. This is somewhat surprising. Law professors are insular, interested in navel gazing, and well motivated to do anything other than grading. Moreover, the submission process is an economically consequential activity. But only recently, in two works-in-progress, has there been any attempt to get at this problem systematically. See here, and here.

I thought I’d make a modest contribution to the field by contributing some data from Temple in this recent submission season, and by asking our readers to contribute their experience as well. The sample size is tiny; the respondents are self-selecting. This is, therefore, Co-Op’s second “very non-scientific survey” this week. It’s a trend! The data is not meant to suggest any definite conclusions, but rather to help researchers with hypothesis formation. But I’ll offer some grand thoughts at the end of this post anyway.

I emailed ten law professors at Temple who submitted an article to a mainstream law review during the spring season, and have received five responses to date. To avoid certain problems, I’ve stripped identifying information – like where the articles placed – from the responses and collated them.

  • All authors used ExpressO.
  • All five authors sent out their articles before March 3. The earliest mailing was on February 22. This contrasts with anecdotal accounts from past years. One colleague recalled submitting in 2001 at “the end of April” and receiving an acceptance on “July 10.” In 2005, another colleague related submitting an article on April 5 and receiving an offer on April 18.
  • Average paper length was 24,599 words (min: 17,668; max: 29,846).
  • Three of five authors used a staged process of submission (i.e., sending out articles in waves). The average number of journals submitted to was 71 (min: 65; max: 199; median: 90).
  • The average number of acceptances per article was 8.8, or roughly one acceptance per eight submissions (min: 2; max: 14; median: 8).
  • The average time to the first offer was 13.6 days (min: 2; max: 33; median: 6).
  • The journal that ultimately took the piece gave the author an average of 8.2 days to respond to its offer (min: 2 days; max: 19; median: 9).
  • The total days, from start to finish of the process, averaged 33 days (min: 6; max: 52; median: 32).
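As a quick sanity check on the arithmetic, here is a minimal Python sketch that recomputes the time-to-first-offer summary statistics. The five individual values are hypothetical (the post reports only aggregates, not the underlying responses), but they are chosen so that they reproduce the mean, median, min, and max reported above.

```python
from statistics import mean, median

# Hypothetical per-article days to first offer, consistent with the
# reported aggregates (mean 13.6; median 6; min 2; max 33).
days_to_first_offer = [2, 4, 6, 23, 33]

summary = {
    "mean": round(mean(days_to_first_offer), 1),
    "median": median(days_to_first_offer),
    "min": min(days_to_first_offer),
    "max": max(days_to_first_offer),
}
print(summary)  # {'mean': 13.6, 'median': 6, 'min': 2, 'max': 33}
```

The same recomputation applied to the journals-submitted-to figures would flag any internal inconsistency in the reported numbers, which is one reason raw response data would be more useful to researchers than aggregates alone.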

Grand Thoughts: The median time to first offer (6 days), the median total days in process (32), and the starting date (by early March) all support the hypothesis that article selection is abbreviated. If we had more data, including historical reference points, we’d be able to say more. And if we had enough data, we’d be able to do what I’d like to do: call for a moratorium on sending out new articles before March 1 of each year. Among other pro-social effects, this would allow authors more time during the school year to workshop their pieces before outside audiences.


5 Responses

  1. Al Bophy says:

    Very interesting data. Sounds like ExpressO is gaining in popularity. I’d like to hear more of your thoughts on that. Is this something we should be happy about? But why a moratorium on submissions before March 1?

  2. Christine says:

    Dave, interesting stuff. One thing: the number of offers seems to relate more to the actions of the submitter than to the reviews. If you are an avid expediter and start with journal #78, then expedite up through every journal from #77 on, you’ll end up with more offers. If you send out to 60 journals, get one offer, and stop, you won’t. You can just gin up offers.

  3. Dave Hoffman says:

    Al, well, my feeling is that there is no reason to expect the current backward creep in submission dates to stop: submissions in ’07 might start in early to mid-February; in ’08, late January; and so on. That would be (in my opinion) a bad effect, because it would make it more difficult to workshop pieces during the school year. On the other side, journals are under increasing pressure to take pieces quickly, without significant review time. This means the system is likely selecting for flash over substance more of the time.

    Christine: You are right, of course, that expedites produce offers (some of the time). But that isn’t totally within the submitter’s control. ExpressO and electronic submission generally lead to too many articles being submitted to “top” journals, with the result that those journals rely on proxies like letterhead and prior offers when deciding whether to read articles in the first place.

  4. Ann Bartow says:

    Hey Dave,

    This is sort of ORTHOGONAL to your post (see how subtly I integrate jargon?) but I noticed something interesting. Leiter listed the “‘Top Ten’ Corporate & Securities Articles for 2005” here:

    Look how many were published by the authors’ home law reviews! I don’t know what if anything that means, but it surprised me.

  5. Seth R. says:

    Honestly, I’d like to see much more of a move toward a peer-review system.

    But really, the biggest opponents of that kind of thing are the school faculty themselves, who don’t want another time commitment.