Sometimes I feel like it's Bill Henderson's world, and I'm just living in it and trying to help connect the dots. So when John McCain talks about "spreading the wealth," I start thinking about distributive justice, specifically who gets what in the law schools that employ some of us, collect tuition from others, and ask still others for money every once in a while.
The current lack of competition on educational quality among law schools -- which I'm trying to help address with the Race to the Top project (download the Voters' Guide to the U.S. News survey here) -- has serious consequences for distributive justice in law schools.
The first is how we allocate the scarce resources of admission slots and financial aid. I talked a bit about this yesterday, but the basic answer is LSAT scores, as Henderson recently demonstrated. Financial aid and fundraising priorities go to buying LSAT scores to move up in the rankings, when they could be going to expanding loan repayment programs for public interest or government jobs, or any number of other priorities.
Now, competing for the best students through merit-based aid doesn't sound so bad -- a bit of a waste of money from a public-good perspective, but not terrible. Until you think about how merit is defined: test-taking speed, which is what the LSAT is about in large part.
And, as Bill Henderson has demonstrated in what has to be one of the most important law review articles of the past twenty years, the only reason the LSAT is a good predictor of law school grades is that most law school grades are determined by time-pressured exams -- which have little to do with analytic ability or other skills relevant to quality lawyering, and everything to do with speed.
Which brings us to distributive justice problem #2: the next scarce resource we allocate is access to top jobs -- at most law schools, those jobs are accessible only to the top of the class.
And what Henderson and others have demonstrated is that who falls where on the curve differs depending on the assessment method professors choose. That is, if you use a few short memo assignments instead of a time-pressured final exam to determine the grades, different people will get "As" and access to the top jobs. Memo assignments -- or final exams with word limits and no heavy time pressure (take-home, 6-8 hours, etc.) -- do a much better job of sorting people by analytic ability and work ethic than the time-pressured final exam. Mike Madison (Pittsburgh) and others have ably discussed the virtues of memo assignments -- one benefit for professors is that doing most of the grading during the semester frees up valuable time for uninterrupted writing at the end. I've found this to be a huge benefit, and on take-home exams or memo assignments, I've never had a problem doing a curve.
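(An aside for the skeptics on curving: mechanically, curving memo assignments or take-home exams is the same exercise as curving a three-hour exam -- rank the raw scores and map percentile bands onto whatever grade distribution your school mandates. Here's a minimal sketch in Python; the mandatory distribution and student names are hypothetical, one way to do it rather than a description of any school's actual policy.)

# Hypothetical mandatory curve: fraction of the class receiving each grade.
CURVE = [("A", 0.15), ("B+", 0.25), ("B", 0.40), ("C+", 0.20)]

def apply_curve(raw_scores):
    """Map raw scores (exam points, memo averages, etc.) to grades by rank."""
    ranked = sorted(raw_scores.items(), key=lambda kv: kv[1], reverse=True)
    grades, i = {}, 0
    for grade, fraction in CURVE:
        n = round(fraction * len(ranked))
        for student, _ in ranked[i:i + n]:
            grades[student] = grade
        i += n
    for student, _ in ranked[i:]:  # anyone left over from rounding
        grades[student] = CURVE[-1][0]
    return grades

print(apply_curve({"Ames": 88, "Baker": 92, "Choi": 75, "Diaz": 81}))
# {'Baker': 'A', 'Ames': 'B+', 'Diaz': 'B', 'Choi': 'B'}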
So the question for those law professors (until this year, myself included) who continue to use time-pressured exams to determine most of the grades is: why are you choosing speedocracy over meritocracy?
In the status quo, the speedy, high-scoring LSAT folks get the best grades on the time-pressured first-year exams, which gives them access to the top jobs that generally pay the most -- making them the least in need of significant financial aid, even as they continue to receive the most of it.
I'm not the first to criticize legal education on such grounds, I realize, but does anyone else think it's time for a change? Law professors can act locally, of course, but competition on quality, through the U.S. News rankings, is the easiest way to do this globally and bring about the change we need. In the Voters' Guide we published earlier this week, we highlighted a set of schools that use "best practices" in legal education, such as multiple assessments, feedback during the semester, and less reliance on time-pressured exams. If U.S. News voters award these schools high marks, we could have a race to the top that would help students learn more and better, and make law schools more meritocratic.
You can contact us at rtttlaw@gmail.com if you want to help; we could use it. In the meantime, I'll shut up for a while until after this other election. Thanks for listening.
Law Schools Competing On Quality
Why should anyone care about the stupid U.S. News survey anyway? According to a commonly held view, the rankings are silly, and the thing to do is ignore them. But I think this view is quite misguided.
It turns out -- and this is the basic premise of the Race to the Top project that I helped start recently -- that a major obstacle to the improvement of legal education generally is the lack of competition on quality among peer institutions, and that this lack of competition also leads to other bad consequences for law schools, like spending lots of money on buying LSAT scores and shifting full-time students into "part-time" programs. And the easiest way to address both sets of problems is by taking the U.S. News rankings more seriously, not less, and focusing on this survey.
What would such competition look like? In the Voters' Guide we sent out earlier this week to U.S. News voters, we said: "For example, take Penn and Northwestern, two national schools that compete for students and are close in the overall rankings. Both have very high student satisfaction and bar passage rates. But consider the curricular differences in areas particularly important in preparing students for practice: Northwestern has top-10 (or close) legal writing, clinical, dispute resolution and trial advocacy programs in last year's U.S. News surveys of faculty in these fields. Penn is not ranked in any of these areas, and is one of the few remaining law schools that uses third-year law students to teach 1Ls legal research and writing. Northwestern is also moving towards an increasingly innovative, practice-oriented curriculum, all of which suggests that Northwestern has a higher-quality J.D. program than Penn."
This kind of head-to-head comparison is completely lacking -- there's been no information out there on the relative quality of the education provided at different schools -- and as a result, U.S. News voters simply replicate the previous year's overall U.S. News rankings when filling out the surveys. Glossy brochures notwithstanding, these quality assessment ratings rarely change from year to year, and when they do change over time, it is in response to a shift in a school's overall ranking (driven by higher LSAT scores, for example), not any underlying shift -- of reality or perception -- in the quality of the JD program. By the way, if you don't like the criteria used above to compare schools, I'd love to hear what existing data you would look to instead in assessing the relative quality of a school's JD program.
To understand why the lack of competition on quality has other bad consequences, recall there are four basic components of the U.S. News formula:
40%: Quality Assessment -- surveys of law professors (25%) and lawyers/judges (15%)
25%: Student Selectivity -- LSAT scores (12.5%), undergraduate GPAs (10%), and acceptance rate (2.5%)
20%: Placement Success -- employment rates at graduation (4%) and nine months out (14%), and bar passage (2%)
15%: Faculty Resources -- expenditures per student (11.25%), student-faculty ratio (3%), and volumes in library (0.75%)
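To make the arithmetic of those weights concrete, here's a minimal sketch in Python of a U.S. News-style composite score. The factor names and inputs are hypothetical, and the real methodology standardizes each factor before weighting, so this shows only how the weights combine:

WEIGHTS = {
    "peer_assessment": 0.25,             # surveys of law professors
    "lawyer_judge_assessment": 0.15,
    "lsat": 0.125,
    "ugpa": 0.10,
    "acceptance_rate": 0.025,
    "employed_at_graduation": 0.04,
    "employed_9_months_out": 0.14,
    "bar_passage": 0.02,
    "expenditures_per_student": 0.1125,
    "student_faculty_ratio": 0.03,
    "library_volumes": 0.0075,
}
assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # the weights sum to 100%

def composite(scores):
    """Weighted sum of factor scores, each already normalized to 0-1."""
    return sum(WEIGHTS[factor] * scores[factor] for factor in WEIGHTS)

# Two hypothetical schools, identical except on the two quality surveys:
school_a = dict.fromkeys(WEIGHTS, 0.5)
school_b = {**school_a, "peer_assessment": 0.7, "lawyer_judge_assessment": 0.7}
print(composite(school_a), composite(school_b))  # roughly 0.5 vs. 0.58

The point is visible in the output: the 40% survey category is the single biggest lever in the formula.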
Since schools can't move up on the quality factor (40%) in the rankings, what do they do? They start competing on the next biggest category in the U.S. News formula -- LSAT scores and undergraduate GPAs -- by emphasizing these things more in admissions, and throwing money at (buying) higher credentials. Bill Henderson provides evidence of this trend here. How much money is your school spending on "merit-based" financial aid, and how is merit determined? I'm guessing it's not based on valuable graduate training in another discipline, interesting work experience that indicates potential excellence as a lawyer, or being the first in the family to go to a professional school.
Are we really any better than Baylor, which literally paid people to retake the SAT? I'm not so sure. Here's our deal: take that Kaplan course if you can afford it, work really hard studying for the LSAT, and if you're speedy enough, we'll give you a full ride. Sounds like paying for LSAT scores to me; we're only a tad more subtle.
The good news is we can fix this if we want to. It's not actually this pesky magazine controlling our priorities -- we (law professors and lawyers) control 40% of the U.S. News formula, the largest category by far. If we have real competition on quality, there will be less need for schools to compete on other things. We just need to get enough information flowing to make competition on quality possible, and then start filling out the survey accordingly. I hope those voting this month and next will start now.
Penn's Rankings Problem
As U.S. News voters figure out what rating to give each school, and start focusing more on educational quality, Penn Law seems to be quite well-positioned: sky-high student satisfaction ("academic experience" rating of 96 in Princeton Review) and great bar passage rates. On curriculum, we'll have to see what they submit for their "Best Practices" survey today (thanks to all who have submitted so far!).
But Penn faces a real ceiling on these "quality assessment" surveys: its legal writing program. Like Yale's, it's taught by 3Ls. This ceiling prevents Penn from having an "outstanding" JD program ("5"); instead, I'm inclined to think they should get a "4" ("strong"). Penn needs to fix this soon, particularly when one of its chief competitors, Northwestern, has top-10 legal writing, clinical, dispute-resolution and trial advocacy programs (Penn is nowhere on any of these lists, from last year's U.S. News surveys) and an increasingly innovative, practice-oriented curriculum -- all of which point to a "5" in the survey.
Here's what the recently released Princeton Review "Best 174 Law Schools" says: "The only gripe that many Penn students express is with the first-year legal writing program. While some report positive experiences, many complain that the program is of poor quality and 'instructed by third-year law students that often don't have a lot of real-world experience outside of the summer clerking opportunities.'"
Remember, the question in the U.S. News survey is to rate, 1-5, the quality of the school's J.D. program, and so some relevant indicators include: bar passage rates relative to entering credentials; levels of student engagement and satisfaction from the recently released Princeton Review law school rankings (http://www.princetonreview.com/law-school-rankings.aspx) and implementation of findings from the Law School Survey of Student Engagement; and the strength of the curriculum, particularly in critical areas like legal writing and clinical offerings.
Why give so much weight to legal writing? Three reasons: (1) it's one of the most important skills for lawyering; (2) it's arguably the most important class in law school (I think so); and (3) lawyers frequently complain about new graduates' inability to communicate effectively in various forms.
So Penn, it's time to spend some money on real legal writing professors. The people who head Penn's and Yale's programs may be terrific, but there's only so much one person can do. The law student instructors may be doing a good job given what they know, but... they're law students. Georgetown has moved away from this model in the last few years -- are there any other schools out there that still do this? My colleague Hillel Levin's excellent and ongoing series of posts on legal research and writing didn't even mention this 3L model -- I assume because it's so rare these days.
Last year, Penn's quality assessment scores were 4.4 from lawyers/judges and 4.3 from law professors. I would otherwise be inclined to recommend giving Penn a "5" -- and I still want to see what they submit on Best Practices, of course -- but until Penn beefs up legal writing, I'm inclined to give Penn a "4" and hope you do the same.
Criticisms of this approach to the survey are always welcome, but you need an alternative. Right now, hundreds of professors -- and next month, lawyers -- are doing the survey, mostly based on no information at all, and in the process profoundly shaping the institutional incentives facing law schools.
Bar Passage: A Key Factor to Look To in USN Voting
For those filling out the U.S. News survey rating the academic quality of JD programs across the country, one logical question is what kind of information one ought to look at to make such determinations. Here's one key piece of data: bar passage rates relative to entering credentials.
So if we see a school whose students have not-great entering credentials but high bar passage rates in recent years, that's a good signal that the quality of its JD program is relatively strong.
Two possible objections (others welcome) to this metric: first, it encourages and rewards "teaching to the bar." My response: the school that has pulled off one of the biggest bar-passage miracles of recent times, New York Law School, raised its bar passage rate from 57% to 90% primarily by teaching struggling students analytic skills. See Dean Matasar's description of how they did it here (p. 3 of the pdf). Intensive training in analytic skills for struggling students? Sounds good to me.
The second objection: bar passage is already included in the U.S. News formula -- why double count it? The response: bar passage counts for next to nothing (2%) in the U.S. News formula, and it's considered on an absolute, not relative, basis. So Yale gets essentially the same credit for achieving a 90% bar passage rate in New York as New York Law School does, working with students with far lower entering credentials.
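If you want to see what "relative to entering credentials" could mean as arithmetic, here's a minimal sketch in Python: fit a simple least-squares line predicting bar passage from median LSAT across schools, then rank schools by their residual (actual minus predicted). The schools and numbers are invented for illustration, and this is one plausible way to operationalize the metric, not the project's official formula:

# Hypothetical data: (school, median LSAT, bar passage rate).
schools = [
    ("School A", 170, 0.90),
    ("School B", 155, 0.88),
    ("School C", 160, 0.75),
    ("School D", 150, 0.70),
]

# Ordinary least squares for: passage = a + b * LSAT.
xs = [lsat for _, lsat, _ in schools]
ys = [rate for _, _, rate in schools]
x_bar, y_bar = sum(xs) / len(xs), sum(ys) / len(ys)
b = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
     / sum((x - x_bar) ** 2 for x in xs))
a = y_bar - b * x_bar

# "Value added" = actual passage minus what credentials alone would predict.
for name, lsat, actual in sorted(
        schools, key=lambda s: s[2] - (a + b * s[1]), reverse=True):
    print(f"{name}: residual {actual - (a + b * lsat):+.3f}")

On these invented numbers, "School B" -- modest credentials, high passage rate -- comes out on top, which is exactly the signal described above.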
Below is the list we have so far, and thanks to Bill Henderson for pointing us in the direction of some of these schools. I'm quite sure we're missing some, and we're working on finalizing the list for that Voters' Guide out next week -- so please let us know other schools that might be considered to be in this category.
Schools that Achieve High Bar Passage Rates Relative to Entering Credentials:
Campbell (NC)
Cardozo
Duquesne (PA)
Florida Coastal
Florida International
Mercer (GA)
New York Law School
North Carolina Central
Northeastern
Texas Tech
University of Memphis
University of San Francisco
And the Winner of the "Best Law Porn" Award is...
UCLA! We can judge "law porn" -- the glossy brochures that arrive in the mailboxes of law professors, lawyers and judges this time of year -- on any number of dimensions: aesthetics, weight, ability to convey excitement, number of articles in top journals per square inch, etc.
My key metric is relevance, and that's where UCLA's submission this cycle stands out. After all, these mailings are not just designed to create warm and fuzzy feelings toward the school, though they are that. They are designed to get recipients to answer a particular question asked by U.S. News -- rate the "academic quality of their J.D. program" on a scale of 1-5 -- higher than they otherwise would.
And to answer that question, the cover story of UCLA's law-porn magazine, "How UCLA Law Trains Lawyers", available here (see p. 34 of the pdf), provides highly relevant information on things like curriculum and the use of pedagogic techniques backed by research on learning theory. I read about how UCLA offers skills-oriented courses for transactional practice, which more law schools need and students want, and I'm turning my internal U.S. News dial upward.
In contrast, the glossy lists of articles provided by most schools -- and I actually like the glance at who's writing what during the 10-second stroll from my mailbox to my office -- provide next to no information to help answer the question U.S. News asks, and the one consumers want answered: the quality of one law school's J.D. program versus its competitors'.
After all, there's no point in comparing Harvard to Baylor: in filling out the survey, you want to speak directly to the U.S. News consumers -- prospective students who have particular LSATs and GPAs and a limited set of choices, and prospective employers who have a particular place in the market and will choose from a certain set of schools when hiring. The role of law professors and lawyers in the U.S. News formula is to assess the quality of one school's legal education versus another's -- the "value added" to a particular student who enters law school with certain analytic and other skills, and will emerge with some additional training of relevance to being a lawyer. UCLA or USC? NYU or Columbia? Baylor or Texas Tech?
Of course, we can't just listen to the schools' own propaganda: we can look to other indicators like bar passage rates relative to entering credentials; levels of student engagement and satisfaction from the recently released Princeton Review law school rankings and implementation of findings from the Law School Survey of Student Engagement; and highly rated programs in critical areas like legal writing and clinics. To get more information relevant to the quality of schools' JD programs, the project I helped start a few months ago, Race to the Top, has a survey out to all law schools, available here, on the degree to which they use "best practices" in legal education -- it's due this Friday, October 17.
Early next week, we'll deliver some information directly to U.S. News voters in law schools (you can sign up for the "Voters' Guide" here) about how certain schools do in these categories, and again next month when lawyers and judges receive their survey -- as far as we know, this is the only information they'll get from a source other than the schools themselves to help fill out the survey. This will just be a first cut, and will simply highlight schools that are strong in at least one of these categories.
There will be other opportunities to be highlighted in the months ahead, and we don't quite have enough data yet to do a more definitive list of top "value added" law schools that score highly on a range of these indicators. But this time of year -- U.S. News voting time -- is a critical time in the life of law schools and those who inhabit them. Let's take advantage, and create a race to the top in legal education.
For now, I'm thinking UCLA could be in the "outstanding" ("5") category, but I'd welcome thoughts.
U.S. News Survey: Vote Quality, Not Reputation
The U.S. News surveys -- the primary determinant of the overall rankings -- are now in the boxes of hundreds of law professors around the country, due in a few weeks. Next month, it's the lawyers' turn. Discussions in the blogosphere and elsewhere have referred to these as "reputation" surveys, which is misleading -- so let's stop doing so. Respondents are supposed to be actually assessing the quality of each school's JD program. U.S. News used to call them "reputation" surveys, but has not since 2002. The label, nonetheless, persists. This may seem like a small point, but I think it's quite important.
Here's what U.S. News asks law professors: "Identify the law schools you are familiar with, and then rate the academic quality of their J.D. program at each of these schools. Consider all factors that contribute to or give evidence of the excellence of the school's J.D. program, for example, curriculum, record of scholarship, quality of faculty and graduates."
U.S. News calls these its "quality assessment" surveys. They're asking law professors as experts on legal education, not as experts on public opinion ("reputation"). And lawyers and judges are asked the same thing as experts in lawyering (to assess "academic quality"), except they are asked to particularly consider the degree to which schools prepare students for practice.
So if you were considering Yale, for example, and thought the question was what is the school's reputation on a scale of 1-5, of course the answer is "outstanding" ("5") or at least "strong" ("4") -- after all, it's the #1 law school in the country, according to the dominant rankings system! But if you were actually assessing the quality of its J.D. program, you might take into account the low student satisfaction ratings relative to its peers; the bar passage rate in New York, where most of its graduates take the bar, which last year was lower than Cornell's and Cardozo's, among others, despite its students having the highest entering credentials; and the fact that first-year students get most of their feedback in Legal Writing from upper-level law students, leading to legions of complaints from lawyers and judges about the work product of Yale summer associates and entry-level lawyers. And so you might say that the academic quality of the school's J.D. program was more like "adequate" ("2") or "good" ("3"), and rate the school as such.
By sticking with "reputation," we're not answering the question asked (something we often scold our students for), and we're also saying it's OK to answer the survey year after year according to last year's U.S. News rankings (which, after all, determine reputation), in the absence of any real information on relative educational quality. The result is no competition on the quality of the service provided (legal education), and instead various attempts to "game" the rankings by buying LSAT scores (how much does your school spend on this practice?), shifting students into a part-time program, reducing the size of the first-year class, and other devices.
Time for a change. For a different approach that actually focuses on assessing quality, see here, and I'll have some more thoughts on available indicators to look to in the days ahead. And no, faculty scholarship, which has little to do with the quality of a school's JD program and is a poor proxy, won't be one of them.
Want Your School to Rise in the Rankings? A Best Practices Survey For Law Schools
You might not want your school to rise, I don't know. But if you do, you might encourage your dean or associate dean to fill out this survey on your school's use of best practices in legal education, which along with information on bar passage rates relative to entering credentials, and student satisfaction, will be used to compile a list of law schools that provide exceptional "value added" for students.
Who's doing this? A couple of Moneyball-oriented law professors -- myself and Dave Fagundes of Southwestern Law -- with a dream: of law schools competing on educational quality, and a Race To The Top that improves legal education across the board. We're joined by a terrific Advisory Board, still in formation, that includes former deans like Daniel Rodriguez (former San Diego dean, now at Texas) and fellow MoneyLaw blogger Nancy Rapoport (former dean at Houston and Nebraska, now at UNLV), as well as leading scholars on legal education and other topics like Susan Sturm (Columbia) and Bill Henderson (Indiana).
After compiling the list of "value-added" schools, we're going to deliver the information to U.S. News survey respondents and encourage them to use it in filling out the survey in November. One possibility is that the value-added data will show that certain schools that have not historically received high ratings ought to receive a "4" or "5" from both law professors and lawyers. Given the current lack of information about the relative quality of law schools, and the weight given to the survey responses in U.S. News's methodology (40%), we believe that this additional information will have a significant — and positive — effect on the U.S. News rankings for those schools we highlight.
Are you a dean, associate dean for academic affairs, chair of the hiring committee, most recently tenured professor, law firm hiring partner, state AG, or federal or state judge? That's who gets the U.S. News survey, and you can sign up for our Voters' Guide in a second at our new website, designed by UGA 2L Jerad Davis. We'll email it to you in November when the USN survey comes out, and if you have other ideas as to how we can help you do the survey, please let us know. For example, we may put on our website a spreadsheet where you can sort the schools by region to compare within the relevant markets.
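On that spreadsheet: here's a minimal sketch in Python of what sorting schools by region for within-market comparison might look like. The file name and column names are hypothetical, just to show the shape of the thing:

import csv
from collections import defaultdict

# Hypothetical CSV with columns: school, region, rating.
by_region = defaultdict(list)
with open("schools.csv", newline="") as f:
    for row in csv.DictReader(f):
        by_region[row["region"]].append(row)

# Print each region's schools together, so voters can compare within markets.
for region in sorted(by_region):
    print(region)
    for row in sorted(by_region[region], key=lambda r: r["school"]):
        print(f'  {row["school"]}: {row["rating"]}')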
Even if you're not one of the USN voters listed above, maybe you know one, and suspect they may not be a regular MoneyLaw reader — please, forward them this link to our site, and encourage them to sign up! Future law students will thank you, and so will we.
MoneyLaw readers might recall my blogging over the summer on this idea of value-added assessment, and now we're trying to make it a reality. It's a distinctively Moneyball concept — using performance, not pedigree, in assessing law schools — and we owe some serious thanks to Jim Chen for launching this blog in the first place, and then for hosting us here. Other godparents of the project include, of course, Paul Caron and Rafael Gely, whose classic article applying Moneyball principles to legal education got many of us thinking in this direction.
We hope you'll join the continued discussion about how best to assess relative educational quality, and specifically which schools ought to be rated particularly high or low — and that you'll encourage law professors and lawyers to use this kind of approach in doing the survey. We welcome your ideas.
Cross-posted at Prawfsblawg and Race to the Top