Law Review’s Thin Filter and Law’s Low Eigenfactor

What’s your Eigenfactor? Scholars can find out now by looking at their scholarship page on the Social Science Research Network. Since its inception a decade ago, SSRN has ranked scholars by downloads; in the past few years, it refined that coarse measure with a separate list of citations, though only citations from other papers on SSRN. Now comes the Eigenfactor, an integrated metric of scholarly influence.

This will be an interesting addition to the dashboard data used in studies of scholarly influence. All these figures, old and new, are endlessly contestable. The new figure adds a ranking column which, naturally, differs from the downloads and citations columns.

Often, the difference is of limited significance: scholars with high downloads often have high citations and now have high Eigenfactors. But sometimes the differences are wild: some people rank at the top in downloads yet have hardly any citations at all. A few of those still have an impressive Eigenfactor rank, but most tumble far down the ladder.

More striking is that the rank differences among these columns are less pronounced for economists and finance professors, as a cohort, than for law professors, as a group. Based on an impressionistic skimming of the first few hundred entries in each column, there is greater stickiness among non-law professors than among law professors.

Economists with high downloads still tend to have high Eigenfactors, and vice versa. For law professors, though, other than the download leader, Lucian Bebchuk, even those with the highest downloads (ranked in the single or double digits) bounce far down the Eigenfactor rankings, into the 200s, 1000s, 5000s, or deeper.

Many theories suggest themselves. Mine attributes the difference to the student-gatekeeping function at law reviews. Nearly every piece of legal scholarship gets posted and published because the selection process is modest; the filter comes later, in citation practice. In the other social sciences, the filter is extremely tough before publishing and posting, and the practice of citation isn’t much of a filter at all.

There may be other reasons for this impressionistic difference too.  Perhaps legal scholarship just isn’t as hot out there in the networks that Eigenfactor captures, compared to economic and financial scholarship.  But, no, that doesn’t seem right, does it?

11 Responses

  1. Frank Pasquale says:

    I like Einstein’s observation that, “Not everything that can be counted counts, and not everything that counts can be counted.” Brian Tamanaha has offered an interesting practical example of that insight.

    As those who’ve critiqued Google’s linking methods have noted, “citation” can mean many different things: a paper may cite a source to praise it, mock it, use a random statistic within it, etc. Any of these uses may arise out of what Robert K. Merton calls “The Matthew Effect in Science.”

    I have not studied eigenfactor methodology, but let me float one hypothesis on the robustness of the “economists’ and finance professors'” works. If the methodology values mere citations, it’s good to be in a field where authors a) produce many articles and b) cite to each other. Perhaps one key to producing many articles is to concentrate on pure theory—recall Larry Summers’s famous dismissal of finance experts as “ketchup economists.” Most cynically, one could say that a field that used citations as a sort of “currency” might generate a very high number of them in a shorter time than one where they were less valued. Such a “currency” understanding could encourage gaming. A field dominated by people who pursued that sort of gaming (or used citations to curry favor with one another) would do much better than a field that failed to shape its scholarly discourse in order to maximize citations.

    A field that has high citation levels may just be a field whose members are more interested in talking to/citing each other, than in engaging with people outside the field. A brilliant historian could write the history of slavery in South Carolina in the 1820s to 1840s, and never be cited by a historian of labor organizing in New York in the 1940s. If we let citation counts drive academic practice, historians would be much better off all studying one period and ceaselessly citing each other.

    Finally, if the eigenfactor is being used to demonstrate the superior scholarly “impact” of a certain group of social scientists (or legal scholars), we might want to pause to consider if that impact is positive or negative. Consider this perspective from the filmmaker behind the documentary “Inside Job:”

    “Over the past 30 years, the economics profession—in economics departments, and in business, public policy, and law schools—has become so compromised by conflicts of interest that it now functions almost as a support group for financial services and other industries whose profits depend heavily on government policy. The route to the 2008 financial crisis, and the economic problems that still plague us, runs straight through the economics discipline. And it’s due not just to ideology; it’s also about straightforward, old-fashioned money.”

    If citations are sometimes used to curry favor with those who are gatekeepers to lucrative consulting gigs or other perquisites, perhaps they are less a sign of rigor than we ordinarily assume.

  2. Orin Kerr says:


    I’m a little lost: Where is this information on the SSRN page, and what does it measure?

  3. Lawrence Cunningham says:

    Orin: SSRN listed the Eigenfactor all day yesterday next to “SSRN Author Rank: xxx by Downloads” in bold, right under each author’s name on their page. Apparently, yesterday’s roll-out was a preliminary test, taken down at day’s end, with the final soon to come.

    Eigenfactor is a trademarked term referring to statistical expressions of scientific knowledge or scholarly influence based on network algorithms. The one SSRN will use was specially developed for it by two pioneers in the field, Carl Bergstrom and Jevin West of the University of Washington.

    SSRN’s exact protocol hasn’t been published, but in general such calculations attempt a better representation of the influence of an article, book, journal, or author by steps like adjusting raw citation counts according to the influence of the articles, books, journals, or authors citing them.
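    The general idea (this is only an illustrative sketch, not SSRN’s actual, unpublished protocol) resembles PageRank-style power iteration over a citation graph: each paper’s score depends on the scores of the papers citing it, so a citation from an influential paper counts for more than one from an obscure paper. The graph below is hypothetical.

```python
def influence_scores(citations, damping=0.85, iters=100):
    """Toy eigenvector-style influence scores over a citation graph.

    citations: dict mapping each paper to the list of papers it cites.
    Each paper's current score flows, damped, to the papers it cites;
    repeated iteration converges to a stable ranking.
    """
    papers = list(citations)
    n = len(papers)
    score = {p: 1.0 / n for p in papers}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in papers}
        for citer, cited in citations.items():
            if cited:
                share = damping * score[citer] / len(cited)
                for target in cited:
                    new[target] += share
            else:
                # a paper that cites nothing spreads its weight evenly
                for p in papers:
                    new[p] += damping * score[citer] / n
        score = new
    return score

# Hypothetical graph: A cites B and C; B cites C; C cites nothing.
graph = {"A": ["B", "C"], "B": ["C"], "C": []}
scores = influence_scores(graph)
```

    Here C ends up ranked above B not merely because it has one more raw citation, but because one of its citations comes from B, which is itself cited.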

    For SSRN, this has been a project long in development–since at least late 2008, when the company announced it in its year-end letter.

  4. dave hoffman says:

    Isn’t the (serious) issue with this that SSRN ordinarily counts citations only when they appear in a separate bibliography, where they are easy to catch, so that most citations in most law reviews are simply missed?

  5. A.J. Sutter says:

    Dave, if the result is that authors pressure law reviews and academic publishers always to include lists of references, rather than leaving readers to slog seriatim through all the footnotes, Eigenfactors will have performed a valuable service. As Frank Pasquale will appreciate, this very problem of invisible citation was actually anticipated by the fictional law professor protagonist of the voluminous mid-20th Century novel Der Mann ohne Eigenfactor, which unfortunately had somewhat lower “impact” than its near-namesake.

  6. anon says:

    I wonder if the variable law prof rankings correspond to particular specialties. I can imagine that writings in sexy areas like free speech and privacy get a lot of downloads from professors and others across many fields who are interested in the topics but have no intent of ever citing them, while an article on an unsexy topic, such as a tax or administrative law problem, will only be downloaded by those actually working in the area.

    Furthermore, closed scholarship (that which solves a problem such as how to resolve a specific tax problem) is likely to be reviewed by those in the area, but will rarely be cited if it truly resolves the problem and ends the debate. On the other hand, open scholarship (that which offers a new idea to stir debate) may get cited more as the idea is digested. The more practical the area, the more likely professors in the field will write closed pieces that will be widely read and used, but are less frequently cited in future scholarship.

    Anyway, just some thoughts off the top of my head.

  7. Orin Kerr says:

    Thanks, Larry, got it. Although if Dave is right, then that does raise the question of how valuable the information is, I suppose.

  8. anonprof says:

    Larry, thanks for bringing this to our attention, but to me this Eigenfactor thing looks very thin. What stuck out for me after a cursory search was that many good law reviews were simply missing. A few examples: where are B.C. L. Rev., U.C. Davis L. Rev., Cardozo L. Rev., Conn. L. Rev., Brooklyn L. Rev., Washington and Lee L. Rev., SMU, BYU, Maryland, and Constitutional Commentary? As far as I can tell, they weren’t factored into the equation.

    Further, SSRN download counting is a tremendously faulty way of measuring article impact. I’ve spoken to several people, who will remain nameless, who have told me that they manipulate their download counts by asking all their friends to download their papers or by telling a large first-year class of students to download materials that are actually irrelevant or only tangentially relevant to the course.

  9. Thank you for the blog post and comments.

    A few notes:

    1) As Larry noted, this was a preliminary test of the ranking system that we have been developing together with SSRN. While not available at present, these rankings will be released on the SSRN site shortly.

    2) SSRN is in the process of accounting for the citations in the footnotes of law articles. These will be incorporated in the SSRN Eigenfactor rankings, and will have a significant impact on the overall scores.

    3) Our methods will be presented in detail in a paper to be released on SSRN at the same time as the rankings are released.

    4) The law journal rankings (referenced in this post) are derived not from the SSRN data but rather from the Thomson-Reuters Journal Citation Reports data. There, we rank only those journals included in the Thomson-Reuters Journal Citation Reports.

    Jevin West,
    Eigenfactor Project

  10. Lawrence Cunningham says:

    Jevin–Thanks for this. Looking forward to your article and the results.

  11. dave hoffman says:

    So until #2 happens in a transparent way (a problem that SSRN has been working on for at least two years), I’d say that the Eigenfactor “rankings” provide very, very little information to the legal academy.