What Would Happen if USNews Didn’t Weigh Money?
Recently the ABA announced that it will no longer collect expenditures data from law schools: Leiter and Merritt offer thoughts on how that decision will influence the USNWR rankings. Both posts are interesting, though somewhat impressionistic. Leiter thinks that state schools will benefit and Yale will lose its #1 spot; Merritt believes that USNWR should reconfigure its method. [Update: Bodie adds his two cents.]
It’s well known that the influence of particular categories of data on the ranking can’t be determined simply by reading the charts that the magazine provides. Paul Caron notes that the rankings depend on inputs that aren’t displayed (like expenditures). But it gets worse: (1) the point accumulation of each school influences that of every other school; (2) USNWR changes the raw data through manipulations that are not well explained (placement discounts for law-school-funded positions) or are simply obscure (cost-of-living adjustments for expenditures); (3) many schools don’t report information, and USNWR doesn’t advertise its missing-data imputation method; and so on. Bottom line: the rankings are very, very fragile. (Many would say they are meaningless except at 10,000 feet.) Luckily, Ted Seto’s work enables everyone to give their best shot at approximating each year’s ranking. Seto argues that variance within a category turns out to influence the final scores as much as the purported weight that USNWR assigns to it.
As a thought experiment, I decided to estimate what would happen if each school’s expenditure data were set to the average school’s expenditure. I then used Seto’s method on 2011-2012 historic data to estimate the rankings in the absence of expenditure variance. This basically eliminates the influence of expenditure as a category. (A perhaps better, but more time-consuming, approach would be to eliminate the expenditure categories altogether and re-jigger the equation accordingly.) My back-of-the-napkin approach produces some wacky results, particularly at the lower end of the ranking scale. To keep it simple, after the jump I’ll focus on the top ten winners and losers from the elimination of expenditure variance in the 2013 t100 and then offer some thoughts.
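The mechanics of the thought experiment can be sketched in a few lines of code. Everything here is illustrative: the school names, category weights, and raw values are invented, and the scoring function is a generic weighted z-score model, not USNWR's actual formula or Seto's full reconstruction. The point it demonstrates is the one above: once every school carries the same expenditure figure, the category's variance is zero and it contributes nothing to the score spread.

```python
from statistics import mean, pstdev

# Hypothetical data for three made-up schools (not real figures).
schools = {
    "A": {"reputation": 4.5, "employment": 0.92, "expenditures": 120_000},
    "B": {"reputation": 3.8, "employment": 0.95, "expenditures": 60_000},
    "C": {"reputation": 3.2, "employment": 0.88, "expenditures": 45_000},
}
# Hypothetical weights; USNWR's real weighting scheme differs.
weights = {"reputation": 0.40, "employment": 0.45, "expenditures": 0.15}

def scores(data):
    """Weighted sum of per-category z-scores for each school."""
    stats = {c: (mean(d[c] for d in data.values()),
                 pstdev(d[c] for d in data.values()))
             for c in weights}
    out = {}
    for name, d in data.items():
        total = 0.0
        for c, w in weights.items():
            mu, sigma = stats[c]
            # A zero-variance category drops out of the score entirely.
            z = 0.0 if sigma == 0 else (d[c] - mu) / sigma
            total += w * z
        out[name] = total
    return out

baseline = scores(schools)

# The thought experiment: set every school's expenditures to the average,
# eliminating that category's variance (and hence its influence).
avg_exp = mean(d["expenditures"] for d in schools.values())
flattened = {n: {**d, "expenditures": avg_exp} for n, d in schools.items()}
adjusted = scores(flattened)
```

In this toy run, the low-spending school with strong employment outcomes gains ground on the high spender once expenditures are flattened, which is the pattern the real estimates below show.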
Current T100 Schools with the largest estimated gains from the absence of expenditure variance:
1. University of Kentucky
2. Brigham Young University
3. Washington and Lee
4. University of Alabama
5. University of Louisville
7. University of Arizona
8. University of Utah
9. College of William and Mary
10. University of Arkansas
Current T100 Schools with the largest estimated losses from the absence of expenditure variance:
4. Seton Hall
5. Santa Clara
7. Penn State
And...Leiter is basically right. Some state schools with relatively good employment outcomes and relatively low expenditures per student would do better in the absence of any expenditure data; private and stand-alone schools (which were benefiting from the absence of indirect expenditures) would lose out. Notably, none of the effect sizes I saw was huge: the largest would produce a 5-10 point swing in the ultimate rank of the school.