I’m glad so many of you have enjoyed viewing the new 2011 US News Law School Rankings that were posted on Tuesday, some 30 hours before they showed up on the US News site. Yet again, someone in NYC managed to purchase a hard copy of the magazine well before its official release date, although the time between leak and confirmation was a bit less than last year.
It’s easy to see how much these rankings are like crack when I look at the number of page views this blog has received in the last few days. We’re a nation that lives on numbers, whether they come in the form of rankings, public opinion polls, restaurant reviews, or fantasy sports statistics. We find soccer boring because games are low scoring and the most cited stat is time of possession. We need to have winners and losers, and we need to know how good something is compared to its counterparts. To many, these law school rankings mean everything. It’s so easy to get caught up in the rankings game, because the rankings come out every year and matter so much to so many. Schools whose ranking has gone up are quick to trumpet the fact on their websites. Schools whose ranking has gone down dismiss the system as flawed. No school can ignore the fact that its reputation depends significantly on the yearly ranking it receives, as many students base their choice of school almost solely on this number.
Fair or unfair, it’s the way it is. Many have attempted to challenge US News by coming up with other ways of ranking the schools, and while these methods may, in fact, be better, the US News methodology maintains its monopoly, and will be the big dog for years to come. I hadn’t really examined the main arguments against the US News system before, so I took a little time to educate myself. I started over at the TaxProf Blog, where Paul Caron, as he does every year, ranked the law schools solely on the basis of their academic peer reputation scores. These scores make up the largest component of the total, and are good because they aren’t manipulable by the individual schools. However, these scores surely depend to a significant degree on the overall US News rankings themselves, and therefore they don’t fluctuate very much, even when there are improvements in a school’s faculty and/or student quality. I was then curious about the factors that are manipulable by the individual schools, and was surprised to read this piece by Brian Leiter.
Even putting aside the fact that this formula, with its various weightings, is impossible to rationalize in any principled way, the really striking fact about the U.S. News methodology is surely the following: More than half the criteria (over 54%) that go into the final score can be manipulated by the schools themselves, either through outright (and undetectable) deceit, or other devices (giving fee waivers to hopeless applicants, employing graduates in temp jobs to boost employment stats, etc.).
This year, for example, everyone seems to be talking about Duke’s 100% employment figure at graduation. That’s right…every single one of Duke’s ’08 grads had a job when they graduated (and in an economic downturn, no less!). I guess we’ll have to take them at their word, because US News doesn’t check these self-reported figures. It’s also interesting that Chapman University entered the Top 100 this year (for, I think, the first time ever), and this could be why: “Chapman University reported 91.1% of its graduates employed at graduation, more than any school ranked between 47 and 100 in U.S. News.” Even the seemingly non-manipulable figures, like academic peer reputation, can have serious issues:
Some readers may recall that Loyola LA took a plunge last year, when their academic reputation score dropped from 2.6 to 2.3, something which almost never happens. It turned out the explanation was simple: U.S. News stopped listing the school by the name everyone in the academy knows it by (Loyola Law School, Los Angeles) and simply listed Loyola Marymount University. After last year’s fiasco came to light, U.S. News agreed to list the school for purposes of this year’s survey as Loyola Law School again and, lo and behold, its reputation score was 2.6 this year. If such apparently trivial alterations can affect results so significantly, how much confidence should one have in the reputational results?
For more fascinating tidbits on the US News rankings, I’d highly recommend all of the other posts over at Leiter’s Law School Reports. While I now have a better understanding of the numerous problems associated with the rankings, I still won’t complain if and when my school continues to move up! A little data manipulation, and it’s sure to happen!