The Fundamental Problem with the US News School Rankings

Last week, all the law schools in America held their collective breath for the latest pronouncement by US News about how their school ranked. For law schools, as for other graduate schools and universities, the US News rankings play an enormously influential role. The rankings affect the number and quality of applicants. Employers use the rankings, so the rankings also affect job opportunities. The careers of law school deans can rise and fall on them. Key decisions about legal education are made based on the potential effect on ranking, as are admissions and financial aid decisions.

In the law school world, grumbling about the US News rankings never ceases. The rankings use a formula that takes into account a host of factors that are often not very relevant, that can easily be misreported, skewed, or gamed, and that ultimately say little of value about the quality or reputation of a school. Each year, I read fervent outcries to US News to improve their formula. These cries are deftly answered with a response that is typically a variant of the following: “We’ll look into this. We are always looking to improve our ranking formula.” Not much changes, though. The formula is tweaked a little bit, but the changes are never dramatic.

And yet each year, we keep grumbling, keep hoping that someday Godot will arrive and US News will create a truly rigorous ranking.

We should stop hoping.

It isn’t going to happen. This is because there is a fundamental problem at the heart of the US News rankings — producing a more rigorous and accurate ranking is at odds with the economic interest of US News, which is to make money by selling its rankings to eager buyers each year and getting people to visit its site.

US News certainly has an incentive to create rankings that are considered highly plausible. Indeed, that is why many alternative rankings fail: they often produce weird results at odds with what most people think. When a ranking fails to list Harvard and Yale at the top, it is immediately suspect. The US News rankings can be off for some schools by a fair degree, but overall, they are plausible enough that people generally accept them.

Why doesn’t US News strive to be more precise? Because it isn’t worth it to them. Their formula is easy to administer. It is costlier and more difficult to be better. For the schools, their ranking matters tremendously, but for US News, getting it more accurate isn’t the top priority. Selling the rankings is the priority.

In reality, rankings don’t change very much year to year. The reputation and quality of law schools vis-a-vis each other moves at a glacial pace. But glacial movement doesn’t sell magazines! Imagine if each year, US News were to publish the rankings and say: “They are exactly the same as last year” or “Out of all the schools, one school went up one spot and another school went down one spot.” Who would buy the rankings?

People want movement and action. So US News must shift things around each year so that there’s excitement and interest. They need to make the glacier appear more dynamic. And they do a great job of creating this illusion, keeping the rankings quite plausible yet throwing in some big shifts in rankings of some schools to generate some headlines and make things interesting.

But rankings really don’t change quickly despite the constant illusion of shuffling that US News creates.

I don’t blame US News. They have to do this. So we who work at law schools can keep hoping that US News will somehow create a better system, but in the end, the incentives of US News are not aligned with our incentives.

This problem is also manifested in the part of the US News ranking formula that most deans and professors find to be the most useful — the academic peer ranking and the lawyer/judge ranking. These rankings are done by surveys in which participants are asked to rate each law school from 1 to 5, with 5 being the highest score. The academic peer ratings are done primarily by deans, as well as chairs of faculty appointments committees and the most recently tenured faculty members. About 66% respond — not a terrible response rate. The lawyer/judge survey, however, has only a 32% response rate.

These surveys are far from rigorous. One key problem is that the 1 to 5 scale is woefully insufficient. This year, US News ranked 194 law schools. A few years ago, I wrote a post that poked fun at this problem. I wrote from the stream-of-consciousness of a fictitious dean filling out a US News ranking survey. Here’s an excerpt:

Okay, I think Yale is the top law school, so I’ll give it a 5.

What about Michigan? Great school, but not quite as high as Yale. I’ll give it a 4.

Cornell is an excellent school too, one of the best. But it’s not Yale or Harvard, so I can’t give it a 5. It’s not as good as Michigan in my view, so I can’t give it a 4. I gave Penn and Berkeley 4s too, and I think Cornell isn’t quite at the same level. So it’s a 3.

What about USC? Another excellent school, but it’s not as high as Cornell. So it’s a 2.

Ruh-roh! I’m not even out of the top 20, and I have 160+ law schools to assign scores to, and I only have one number left. But I must go on!

US News will likely retort that averaging the scores across all the surveys will yield a ranking, but is that ranking really valid? Let’s look at the academic reputation scores. Yale gets a 4.8 and Harvard gets a 4.7. That means that some people are rating them a 4 out of 5. Really? Seriously? And great schools like Stanford, Columbia, Chicago, NYU, and Berkeley have scores ranging from 4.7 down to 4.4. So who is giving out all these 4s, or perhaps even 3s, to these schools? Essentially, it is those trying to game the system, or a few deans or professors whose views are at great variance with the norm. Suppose one were to fill out the form by applying a curve, with just the top 10% receiving the highest score of 5. That means that on a scorecard covering roughly 200 schools, 20 schools should receive a 5. Should the scorecard of anyone who doesn’t rate Harvard, Yale, Stanford, Columbia, Chicago, NYU, or Berkeley at that level be taken seriously at all?
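As a back-of-the-envelope check, the published averages do pin down how many sub-5 votes there must be. A minimal sketch, assuming a hypothetical pool of 100 respondents who each give only a 4 or a 5 (the 4.8 figure is Yale's published peer score; the respondent count and the two-value assumption are mine, for illustration):

```python
def split_for_average(avg, n=100, hi=5, lo=4):
    """Split n integer votes between hi and lo so they average to avg,
    assuming every respondent gives either hi or lo."""
    lo_votes = round(n * (hi - avg) / (hi - lo))
    return n - lo_votes, lo_votes

# Yale's peer reputation average of 4.8 on a 1-to-5 scale:
fives, fours = split_for_average(4.8)
print(fives, fours)  # 80 fives, 20 fours
```

Under that assumption, fully one in five respondents scored Yale a 4 — and if any respondents gave 3s or lower, fewer sub-5 votes are needed to drag the average down to 4.8, so a handful of outlier or gamed scorecards can move the number.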

And what are these rankings really based on? How many of us really know much about other law schools except that we might know a few folks on the faculty, or have read something in the news about them?

This survey could be done a lot more thoughtfully and rigorously, but US News’s incentives are to get a plausible ranking without spending a fortune. US News needs to keep costs down because they rank so many different types of schools.

So at the end of the day, each year we wait with bated breath to see how we rank. We cry out to the heavens, hoping that the rankings will be better, but ultimately, not much will change. We are slaves to a magazine.

There are calls to ignore the US News rankings, to turn our backs on them. But we can’t because we haven’t embraced a plausible alternative, and people crave rankings. I have come to the conclusion that school rankings are an essential human need, right up there with food, water, shelter, love, chocolate, and smart phones. Unless we can come up with something better to satiate the hunger, it is pointless to hope that US News will suddenly act against its incentives or that people will just go hungry.

And so this little saga plays on each year, and will likely continue on throughout the ages . . .

Cross-posted at LinkedIn
