The Numbers are REALLY In–Plus Two Modest Proposals

For those of you who had any doubts, our friends at Kaplan have just confirmed it:  Aspiring law students care more about law school rankings than anything else, including the prospects of getting a job, quality of program, or geography.

Sayeth Kaplan:

1,383 aspiring lawyers who took the October LSAT . . . [were] asked “What is most important to you when picking a law school to apply to?” According to the results, 30% say that a law school’s ranking is the most critical factor, followed by geographic location at 24%; academic programming at 19%; and affordability at 12%. Only 8% of respondents consider a law school’s job placement statistics to be the most important factor. In a related question asking, “How important a factor is a law school’s ranking in determining where you will apply?” 86% say ranking is “very important” or “somewhat important” in their application decision-making.

Mystal at ATL expresses shock–shock!–that potential law students could be so naive. Surely, he fairly observes, they should care most about job prospects.

Yes, that would be true if they were rational. Yet we all know from the behavioral literature that we apply a heavy discount rate to distant prospects. How much can I or should I care today about what may happen 3 (or 4) years from today?

If you think about it from the perspective of any law school applicant today, the one concrete thing they can lock onto that has present value is the school’s ranking: It is simple, quantified, and–perhaps most important–tauntable. No one’s face burns with shame because their enemy (or friend) got into a law school with a better job placement rate. Jealousy and envy–the daily diet of anxious first-years–are driven by much simpler signals: Is mine bigger (higher) than yours?

This is not to defend the students who place so much faith in numbers that have repeatedly been shown to be incredibly stupid.  It just means that Kaplan’s survey (and I have not seen the instrument or data) makes intuitive sense.

Which leads me to offer two modest (and probably unoriginal) proposals:

1.  Perhaps the  LSAT itself should include some behavioral assessment portion that mirrors the Kaplan survey.  After you’re done solving logic puzzles such as whether Aunt Jean wore a blond wig to dinner based on Wisconsin’s BCS ranking, you should be asked what sorts of things matter to you in a law school.   As the Kaplan survey shows, the answers give insight not only into market preferences (for better or worse) but, imho, the judgment of test takers: Demerits for poor judgment if you answered as most of these students apparently did.  Even if you couldn’t quantify and rank the responses (and, really, how could you?), it would tell schools something about the kind of students they’re getting.

We know that behavioral testing of all sorts is important in a variety of fields.  Having taught at least one student who went postal (not my fault, I was assured), I think law schools would do well to have some insight into the psychological fitness of their students. To my knowledge, that’s not something the analytic grind of the LSAT measures.

2.  Why isn’t there competition among ratings? We know that in a world of J.D. Power, Kiplinger’s, and Consumer Reports–to say nothing of at least three credit rating agencies–it is certainly possible to come up with multiple, credible ways to evaluate any given product. I know that valiant (and not-so-valiant) efforts have been made to compete with USNWR, but so far nothing seems to have stuck. Why hasn’t some media powerhouse come up with its own system? It’s clearly been lucrative for U.S. News–I suspect it is the only thing keeping the brand alive.

But perhaps I answered my own question, above:  No one really wants more than one ranking  precisely because of the clear signal sent by a single, if flawed, metric.

Still, if they’re smart enough to become lawyers, you’d think they’d be smart enough to assess more than one set of measures.  Isn’t that what judging is all about?

10 Responses

  1. Bruce Boyden says:

    “Only 8% of respondents consider a law school’s job placement statistics to be the most important factor.” *Statistics*, not prospects. Given Duke’s numbers this year, I’m not sure I find that unreasonable.

  2. Bryan Gividen says:

    I read ATL’s coverage on this and I don’t think its or Mr. Lipson’s analysis (or Kaplan’s, for that matter) really interprets the results correctly. Applicants can only select one of these factors as their “most important” factor. In reality, I think most applicants view “School Ranking” as a proxy for academic programming, affordability, and job placement. In most applicants’ minds, a higher rank translates to a better faculty to lead courses; a better return on investment compared to other, lower-ranked private schools; and a better shot at finding employment in the area they wish to practice. The thought is: if the student goes to Yale, Harvard, or Stanford, the student can learn from the best, find a job that pays six figures easily, and lock down a job no matter what. Conversely, if they go to a low-ranked school, they’ll have to deal with whatever faculty is there, struggle to pay off the debt they take on, and have only marginal prospects of getting a job in that region. I am not saying these perceptions are accurate; I am saying it’s what I think most applicants believe and, consequently, why rank is the most important factor. So I don’t think this has as much to do with naive applicants or even hyperbolic discounting – though certainly in some cases it does – as it does with how applicants interpret the question.

    As for Mr. Lipson’s two suggestions-
    1. I don’t think this would have any actual effect on placement, and if it did, it would become useless quickly. Law schools have shown they are primarily concerned with academic prestige, which translates into trying to get the best US News ranking possible. (They have press releases when they move up a spot, for heaven’s sake.) Unless US News started to capture this behavioral assessment in its rankings, it would be as useless as the writing portion of the LSAT, which I think most people forget exists. (Including admissions panels.) If US News did start capturing it and schools started worrying about it, LSAT prep courses would begin teaching students how to answer the behavioral questions “correctly” so that they would receive a favorable rating.
    2. I think your conclusion on 2 is correct. There have been attempts to compete with US News, and they generally fail. Another problem is that any rankings system that departs from a few norms would be considered inferior to US News. For instance, if the rankings don’t have Yale, Harvard, and Stanford in the top three, there will be suspicions that it isn’t accurately capturing information. Then again, if it does have them in the top three, it begins to look like US News. That could continue on down the line with groupings of schools. NYU, Columbia, and Chicago are considered the next tier of schools. The big public institutions (UC Berkeley, UVa, UMich, UCLA, UT) and the other top private schools (Penn, Northwestern, Duke, Cornell, Georgetown) would all have to come before any other institutions. Once again, it starts to look like US News with a few nominal changes. In short, US News has established itself as being right, so anything that differs from its assessment will be treated with skepticism. That’s not to say a new rankings system couldn’t change that perception over time, but simply that there’s a giant hurdle to clear for anyone trying to gain recognition by being different.

  3. anon says:

    Few prospective law students can imagine that they might not get a job at all. They care about ranking because they are thinking about whether they will get the job that they want.

  4. Elie Mystal says:

    I’ve thought a lot about proposal 2. To me, there seem to be two main obstacles to effectively attacking U.S. News’s dominance.

    1) Where’s the info coming from? Most of U.S. News’s info is self-reported by schools. The ABA does a terrible job of forcing schools to release accurate public info. If you are starting a new ranking and you ask the schools something USNEWS *doesn’t* already ask, they just won’t tell you.

    So a lot of new rankings are really just recalibrating the US News info in a different way.

    2) The Smell Test. If you do a new ranking, and Yale, Harvard, Stanford, Columbia/NYU, Chicago, and maybe Berkeley aren’t very near the top, then your ranking is “wrong.” I mean, reasonable people can disagree. But I saw one which said BC was a better law school for your career prospects than Yale, and that’s just not true and we all know it.

    Good science says you can’t design your methodology with a particular outcome in mind, but good marketing tells us that if you’ve got (ahem) Cooley ranked at #12… well there was really no point to putting together a ranking.

    My solution is that we need some kind of student outcome oriented ranking, instead of a student input oriented ranking. Instead of going to law schools, go to employers and find out where they hire. Find out how much their hires get paid (and/or other measures of career “success”). There’s a lot of stuff you could look at that would give you a sense of how graduates do, and we could base a law school ranking on that post-grad info.

    There’s too much stuff, actually. And people wildly disagree on which stuff is important. And even the objective answers change state-to-state or across socioeconomic classes.

    Trust me, if there was a simple way to do this I would have done it already.

  5. Elie Mystal's Pimpdaddy says:

    There is only one way to make the whole thing work.
    During the LSAT examination, prospective law students will have flashlight-sized electrodes inserted into their rectums. At random intervals during each section of the test, electrical current will be administered in voltages ranging from seizure-inducing to borderline fatal. This will weed out the slackers looking for mommy and daddy to pay for 17th through 19th grade, and those who pass will have demonstrated the ability to perform under conditions typical of the 21st century law firm.

  6. Lawlspeak says:

    “My solution is that we need some kind of student outcome oriented ranking, instead of a student input oriented ranking.”

    I’ve thought about this for quite some time, and I’m not yet convinced that an output- rather than input-focused rankings model would be more meaningful for job placement. Perhaps employers might have rational preferences based upon which school seems to be producing “very expensive clay” of a variety that is best-suited to being molded into new biglaw attorneys. However, my hunch is that there are three reasons why they prefer high-ranked schools:

    1) Like currency, T14 grads are (were) a scarce resource. They become valuable just like the ultra-rare neon pink variant elf-boots in World of Warcraft: a subset which is both limited in size and deemed valuable by consensus. You want more T14 grads because it’ll prove you’re better than the firms with fewer.
    2) Related to but distinct from (1), you can impress clients. “People from (T14 schools) are rare, and your entire team is made up of them!”
    3) Part of the pitch to clients, and also part of why they might seem especially desirable clay: students with top-notch admissions credentials are perhaps more likely to be some combination of smart and hard-working that is useful for BigLaw attorneys.

    Output-based rankings might be of questionable value inasmuch as there’s a pretty well-entrenched belief that law grads “don’t know anything about the real practice of law”, no matter how many practica or clinics they’ve done. Further, the three factors above are probably pretty strong motivators.

    I would be happy to see all this insane rankings obsession disappear, even though I personally benefit from it in a placement sense. Let’s just get good employment stats and (hopefully) stop people from flocking to schools nobody hires from (for good or bad reasons).

  7. Lawlspeak says:

    …sorry, I was tired and completely misread our dear Elie’s post. Indeed. Outcomes-based. I was talking instead about the possibility of ranking schools based on the caliber of attorneys they produce. (I’m a little dismayed that this is considered an absurd idea, but as noted above, I think it would indeed be bizarre given the way the legal world works.)

  8. PubliusFL says:

    This post seems to suggest that it shows poor judgment for prospective law students to place more weight on law school rankings than on “the prospects of getting a job,” which is represented by a law school’s job placement statistics. This is naive, apparently, because law school rankings are nothing but “numbers that have repeatedly been shown to be incredibly stupid.” But the rankings are “stupid” because they are based in significant part on factors that are highly manipulable by the law schools, and job placement statistics are among the most manipulable components of law school rankings. So why is it so irrational to place more weight on the latter than the former?

  9. Joseph Franklin says:

    Professor Lipson: When I attended law school a couple of decades ago, I was disappointed to find that, generally, members of the faculty regarded students with an attitude of cynicism. I’m again disappointed to find that the same attitude might still prevail.

  10. When I went to law school cynicism and faculty went hand in hand. I do remember thinking that the actual rank of the school was based on the preparedness of the students.