If you've been following the scamblog movement, you've likely figured out that one of the biggest issues is that there are just too many damn issues, all balled up into one huge mass of complaints. There are people who went to law school years ago, graduated, found 'good' jobs, and who are unhappy because it turns out that lawyering sucks, and their education and experience haven't opened up any better options for them. Then, there are people who went to really good schools, got good grades, had offers from big firms at the end of their second summer, and ended up laid off or with offers revoked as part of the 'lost generation.' There are the kids who were duped by overly optimistic employment figures and ended up taking out over $100,000 in loans to attend a third tier toilet school. And, there's the newest batch of kids who, with even a minimal amount of research, should be able to learn that law school is an increasingly losing game, but go to law school anyway.
When trying to figure out if law schools are doing their job, or if law school is 'worth it,' the first question has to be what metric is being used.
One possibility is to look at the average outcome. For instance, you can add up all the starting salaries, and just see what the average is. This approach used to have a lot of appeal, but recently people have become more aware of the bimodal distribution of lawyer salaries. It's not a good method for the industry overall, but it's still useful for individual schools, especially at the very top or very bottom of the rankings. Columbia doesn't have a bimodal distribution, and neither does Cooley. In fact, it is probably going to be very rare for any individual school to have a bimodal distribution. You would have to be good enough to send a large percentage into Big Law, but also crappy enough that most of your students end up scrubbing toilets.
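To see why a bimodal distribution makes the average misleading, here's a quick sketch with entirely hypothetical numbers: one cluster of graduates at a Big Law salary, a larger cluster at small-firm pay. The mean lands in a dead zone where almost nobody actually earns.

```python
# Hypothetical bimodal class: 30 grads with Big Law offers at $160,000,
# 90 grads clustered around small-firm pay. Numbers are made up for
# illustration, not drawn from any school's actual data.
big_law = [160_000] * 30
small_firm = [45_000, 50_000, 55_000] * 30  # 90 graduates

salaries = sorted(big_law + small_firm)
mean = sum(salaries) / len(salaries)
median = salaries[len(salaries) // 2]

print(f"mean:   ${mean:,.0f}")    # $77,500 -- a salary almost no one earns
print(f"median: ${median:,.0f}")  # $55,000 -- closer to the typical outcome
```

The mean comes out around $77,500 even though every graduate earns either roughly $50,000 or $160,000 — which is exactly why a single average is a poor gauge for the industry as a whole.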
So, what's wrong with just measuring the performance of a school by the average, or maybe median, outcome? Maybe nothing. It may truly be a good gauge of the overall performance of the school. After all, it measures the total utility the school produces, and it's hard to argue that isn't important. Where it fails is its usefulness to prospective students. It is a single data point, and students are asked to make a large investment based on it. During the recession, the focus has turned to what happens if law school doesn't work out for you. Knowing the average may be good, but you also want to know what happens if you lose.
A second possibility, and one that will likely have a lot of traction with people who think law schools are falling down on the job, is the Rawlsian maximin approach, focusing on the outcome for the worst off graduate. Naturally, every school will have some bad outcomes, and for any number of reasons. Some will be the fault of slacker students, some classes graduate into a bum economy, and sometimes it's simply the law of large numbers. Rather than focusing on the one person at the bottom of the heap, it makes more sense to consider the group of students at the bottom, perhaps the bottom 10% or bottom 25%.
At SUNY Buffalo, the 25th percentile for people working full time in the private sector was $47,500. 74% of the class went into the private sector (59% into firms, 15% elsewhere). 21% went into government, clerkships, or public interest, with a median salary of $53,500. 1.6% of graduates were unemployed and seeking work.
There are some winners at SUNY, which is to be expected from a school in New York. The 75th private sector percentile is earning $100,000. But, using the maximin approach, we don't really care about the winners. Instead, we look at the fact that at least a third of the school is earning less than $50,000. With tuition of almost $30,000 a year, that's a whole lot of pretty meager outcomes.
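The maximin comparison boils down to reading off a low percentile instead of the middle of the distribution. A minimal sketch, using a made-up salary list (not SUNY's actual figures) and a nearest-rank percentile:

```python
import math

def percentile(sorted_salaries, pct):
    """Nearest-rank percentile of an already-sorted salary list."""
    idx = max(0, math.ceil(len(sorted_salaries) * pct / 100) - 1)
    return sorted_salaries[idx]

# Hypothetical class of ten graduates, for illustration only.
class_salaries = sorted([38_000, 42_000, 47_500, 52_000, 60_000,
                         75_000, 90_000, 100_000, 125_000, 160_000])

# Maximin-style view: ignore the winners, look at the bottom quartile.
print(f"25th percentile: ${percentile(class_salaries, 25):,}")
print(f"75th percentile: ${percentile(class_salaries, 75):,}")
```

Same data, two verdicts: a prospective student applying maximin looks only at the first number, while the school's brochure will quote the second.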
Then, there is the rather callous approach of maximax, maximizing the outcomes for the students at the top of the curve, while not paying much attention to the people at the bottom. Anyone who has ever used the phrase 'social justice' with a straight face should find this approach appalling. And, even many free-market dog-eat-dog types won't be too comfortable with it. But it is the point of view that schools themselves often take. Just look at their recruiting materials and the news they publish about graduates. Who do they focus on? Not the average student, but the ones at the very top. The student who just landed a SCOTUS clerkship, or the alum who is now a federal judge or a member of Congress.
As douchey as it may seem, there is an argument to be made in favor of schools using a maximax approach to judge how well they're doing. Not all schools, mind you, but the ones at the very top. If you're Harvard, you don't have to be too concerned with the outcomes your worst students get. Odds are those outcomes are still going to be pretty darn good, and if they're not, it's going to be for reasons beyond your control, such as an asshole student or a real estate bubble bursting.
At a top school, looking at the average or median outcome isn't particularly useful. All of the top 10 boast a median of $160,000 starting salary. What matters at the most elite schools are outcomes that are better than Big Law, such as prestigious government jobs, clerkships with Supreme Court feeder judges, and tenure track professorships. If you're a Harvard student, you don't care if the worst outcome is a six-figure Big Law job, you care about having a shot at the even better jobs. In fact, you're probably willing to have the bottom 5% of the class do significantly worse if it greatly improves the outcomes for the top 5%.
So, what do we make of all this? Diff'rent strokes for diff'rent folks?
Yeah, pretty much. Evaluating law schools is a complex issue with numerous sub-issues. There are 200 schools, and it's stupid to think that Columbia and Cooley should be measured with the same ruler. It's like comparing apples and rotten apples.