The Numbers Are REALLY In, Plus Two Modest Proposals
For those of you who had any doubts, our friends at Kaplan have just confirmed it: Aspiring law students care more about law school rankings than anything else, including job prospects, program quality, or geography.
1,383 aspiring lawyers who took the October LSAT . . . [were] asked “What is most important to you when picking a law school to apply to?” According to the results, 30% say that a law school’s ranking is the most critical factor, followed by geographic location at 24%; academic programming at 19%; and affordability at 12%. Only 8% of respondents consider a law school’s job placement statistics to be the most important factor. In a related question asking, “How important a factor is a law school’s ranking in determining where you will apply?” 86% say ranking is “very important” or “somewhat important” in their application decision-making.
Mystal at ATL expresses shock (shock!) that potential law students could be so naive. Surely, he fairly observes, they should care most about job prospects.
Yes, that would be true if they were rational. But we know from the behavioral literature that people apply a heavy discount rate to distant prospects. How much can I, or should I, care today about what may happen three (or four) years from now?
If you think about it from the perspective of any law school applicant today, the one concrete thing they can lock onto that has present value is the school's ranking: It is simple, quantified, and (perhaps most important) tauntable. No one's face burns with shame because their enemy (or friend) got into a law school with a better job placement rate. Jealousy and envy, the daily diet of anxious first-years, are driven by much simpler signals: Is mine bigger (higher) than yours?
This is not to defend the students who place so much faith in numbers that have repeatedly been shown to be deeply flawed. It just means that Kaplan's survey (and I have not seen the instrument or the data) makes intuitive sense.
Which leads me to offer two modest (and probably unoriginal) proposals:
1. Perhaps the LSAT itself should include a behavioral assessment section that mirrors the Kaplan survey. After you're done solving logic puzzles, such as whether Aunt Jean wore a blond wig to dinner based on Wisconsin's BCS ranking, you should be asked what sorts of things matter to you in a law school. As the Kaplan survey shows, the answers give insight not only into market preferences (for better or worse) but also, imho, into the judgment of test takers: Demerits for poor judgment if you answered as most of these students apparently did. Even if you couldn't quantify and rank the responses (and, really, how could you?), they would tell schools something about the kind of students they're getting.
We know that behavioral testing of all sorts is important in a variety of fields. Having taught at least one student who went postal (not my fault, I was assured), I think law schools would do well to have some insight into the psychological fitness of their students. To my knowledge, that’s not something the analytic grind of the LSAT measures.
2. Why isn't there competition among rankings? We know that in a world of J.D. Power, Kiplinger's, and Consumer Reports, to say nothing of at least three credit rating agencies, it is certainly possible to come up with multiple, credible ways to evaluate any given product. I know that valiant (and not-so-valiant) efforts have been made to compete with USNWR, but so far nothing seems to have stuck. Why hasn't some media powerhouse come up with its own system? It's clearly been lucrative for U.S. News, and I suspect it is the only thing keeping the brand alive.
But perhaps I answered my own question, above: No one really wants more than one ranking precisely because of the clear signal sent by a single, if flawed, metric.
Still, if they're smart enough to become lawyers, you'd think they'd be smart enough to weigh more than one set of measures. Isn't that what judging is all about?