Dissension in the Rankings

Sept. 7, 1999, 9:30 PM

U.S. News responds to Slate's "best colleges" story.

An allegation has been made, and it must perforce be answered. The charge? Fiddling. No, not the fiddling of Nero or Nashville. The matter is more serious.

For those not in the academic racket, or with kids long out of college or not long out of diapers, it might seem a trifling matter. But to anyone with an abiding interest in higher education, the stakes don't get much higher. Because the fiddling charge arises in the context of college rankings. In the hushed groves of academia, few things cause more consternation than an outsider using numerical measurements to gauge academic performance--even though colleges and universities rely on similar measurements to rate their applicants.

Here's the deal. Many educators say it's absurd to think that the intangibles of a college education can be reduced to mere numbers, and they're right. But for more than a decade now, U.S. News & World Report has been providing kids and their parents a way to assess the most important factor in choosing a college: academic excellence. Obviously, that's not the only thing to think about when selecting a school. But millions of people find the magazine's assessments useful. And it's a measure of the seriousness with which they're taken that deans and admissions officers compete fiercely to better their schools' rankings from year to year.

Comes now the fiddling business. Writing in the pages of Slate, Bruce Gottlieb is admirably forthright in his condemnation. "[T]he editors of U.S. News," he writes, "fiddled with the rules" in preparing this year's college rankings. The provocation for the charge? This year the magazine ranked the California Institute of Technology first among national universities, up from the No. 9 position just a year ago. "This was dramatic," Mr. Gottlieb writes, "since Caltech, while highly regarded, is not normally thought of as No. 1."

Fair enough. We welcome challenges to our methodology and use them to refine and improve our rankings. To Mr. Gottlieb's gimlet eye, however, there is mischief afoot. In awarding the No. 1 slot to Caltech, he writes, the magazine's editors generated a sense of "surprise" by toppling last year's "uninteresting three-way tie among Harvard, Yale, and Princeton" for first place. "Nobody's going to pay much attention" to the magazine's rankings, Mr. Gottlieb writes, "if it's Harvard, Yale, and Princeton again and again, year after year." Ergo, the magazine "fiddled" the thing to generate a bit of buzz.

The charge bears examination. Never mind that Mr. Gottlieb, a former Slate staff writer, is currently enrolled at Harvard Law School. (One's attorney and one's mother abjure questions of motive.) But Mr. Gottlieb is a self-described student of econometrics, which our Webster's defines as "the use of mathematical and statistical methods in the field of economics to verify and develop economic theories." Put aside for a moment that the U.S. News rankings have virtually nothing to do with economic theory. One may posit that a mind used to grappling with the kudzu of econometrics is more than up to the task of dissecting something as relatively straightforward as college rankings.

How is it, then, that Mr. Gottlieb falls so short of the mark? The magazine's methodology for determining the rankings is based on a weighted sum of 16 numerical factors. Mr. Gottlieb the econometrician somehow manages to misapprehend even the most basic of these. The magazine, he says, rates schools on "average class size." Wrong. It's the percentage of classes with fewer than 20 students and the percentage of classes with 50 students or more. U.S. News, says Mr. Gottlieb, also rates schools on the "amount of alumni giving." Sadly, the econometrician gets it wrong once again. The magazine ranks schools on the rate of alumni giving--the percentage of alumni who donate money to their school.
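For readers who want a concrete picture of what "a weighted sum of 16 numerical factors" means, here is a minimal sketch. The factor names, weights, and figures below are hypothetical stand-ins for illustration only; U.S. News publishes no code, and its actual factors and weights are not reproduced here.

```python
# Illustrative sketch only: these two factor names and their weights are
# invented placeholders, not the magazine's actual 16 factors or weights.

def rank_schools(schools, weights):
    """Score each school as a weighted sum of its factors, sorted best-first."""
    def score(factors):
        return sum(weights[name] * factors[name] for name in weights)
    return sorted(schools, key=lambda s: score(s["factors"]), reverse=True)

schools = [
    {"name": "School A",
     "factors": {"pct_classes_under_20": 0.70,   # share of classes with < 20 students
                 "alumni_giving_rate": 0.45}},   # share of alumni who donate
    {"name": "School B",
     "factors": {"pct_classes_under_20": 0.60,
                 "alumni_giving_rate": 0.55}},
]
weights = {"pct_classes_under_20": 0.6, "alumni_giving_rate": 0.4}

ranked = rank_schools(schools, weights)
print([s["name"] for s in ranked])  # School A scores 0.60, School B 0.58
```

Note that both factors here are rates (percentages), not raw averages or totals--exactly the distinction Mr. Gottlieb misses.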

But that is to cavil. It is not until he is well launched on his wrongheaded bill of particulars that Mr. Gottlieb makes an interesting concession. "I can't prove that U.S. News keeps changing the rules simply in order to change the results," he writes. No matter. The charge is leveled, and like a parched man finally led to water, Mr. Gottlieb keeps drinking and drinking.

Summing up, at long last, Mr. Gottlieb concludes that the success of the magazine's rankings "actually depends on confounding most people's intuition" about which colleges and universities are the best. Had he bothered conducting even the most rudimentary research, Mr. Gottlieb would have seen that the charge is without merit. Over the past 10 years (1991-2000), the top 15 national universities in the U.S. News rankings have remained remarkably consistent. Eleven schools have been in the top 15 every single year for a decade. Every year the top 15 have varied from the previous year's top 15 by at most one school. In the past five years, the top 15 have been exactly the same top 15. Yes, the "uninteresting" triumvirate of Harvard, Yale, and Princeton has been there all along. So have schools like the Massachusetts Institute of Technology and others that virtually any expert would number among the nation's best providers of higher education. And, yes, Mr. Gottlieb, so has Caltech.
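The consistency claim above is easy to check mechanically: for each year, count how many schools in that year's top-15 set were absent from the previous year's. A short sketch, using invented placeholder lists rather than the actual U.S. News rosters:

```python
# Sketch of the year-over-year turnover check described above.
# The yearly lists here are abbreviated, invented examples.

def year_over_year_turnover(rankings):
    """Given successive yearly top-N lists, return how many schools
    in each year's list were absent from the previous year's."""
    return [len(set(curr) - set(prev))
            for prev, curr in zip(rankings, rankings[1:])]

rankings = [
    ["Harvard", "Yale", "Princeton", "MIT", "Caltech"],
    ["Harvard", "Yale", "Princeton", "MIT", "Caltech"],      # unchanged
    ["Harvard", "Yale", "Princeton", "Caltech", "Stanford"], # one newcomer
]
print(year_over_year_turnover(rankings))  # prints [0, 1]
```

A result of all zeros and ones is what "varied by at most one school" means in set terms.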

These are data even an econometrician should be able to understand.