Chatterbox

These Aren’t America’s Best High Schools

Editor’s note: Jay Mathews, the author of the Newsweek article described below, has written a response in “The Fray”; a follow-up by James Fallows appears there as well.

Since today’s Guest Chatterbox (G.C.) had to defend the U.S. News & World Report college rankings during the two years he was the editor there, he might not seem the ideal person to cast stones at Newsweek for its latest 1-through-472 ranked list of the “best” high schools in America. (Newsweek offers the list as an “exclusive” for its Web site.) To make things worse, several months ago in Slate, G.C. produced a pseudo-scientific ranking of “best value” beers, which closely resembled the Newsweek survey in its Professor Irwin Corey-style spurious precision. But it is in fact his experience as a rankings veteran that has put G.C. on the warpath about Newsweek. What the magazine has served up is an embarrassment even for the hard-to-shame industry of journalistic rankings. It is an incidental embarrassment for G.C. that Newsweek’s list appears under the byline of Jay Mathews, a friend and admirable journalist, who has a sensible article on high-school curricula in the same issue. G.C. finds it all too easy to imagine the internal newsmagazine chaos that might have led to the same person’s name appearing on a good article and a ridiculous chart.

How did Newsweek figure out the best schools? By counting how many Advanced Placement (AP) or International Baccalaureate (IB) tests the students from each school took, and then dividing that total by the number of seniors in the graduating class. In the United States, the IB is a comparatively rare and recent program, which you can read more about at www.ibo.org but which we’ll leave out of the discussion from this point on. Only about a tenth of the schools on Newsweek’s list are IB schools, including the alleged top three. (These are: No. 1, Stanton College Prep, Jacksonville, Fla.; No. 2, George Mason, Falls Church, Va.; and No. 3, Eastside, Gainesville, Fla.) The rest rise or fall on the basis of AP tests.

So what’s wrong with a formula that calls a school best depending on how many AP (plus IB) tests the average graduate takes? Since this is a formula, let’s examine the numerator, the denominator, and the result.
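To make the arithmetic concrete, here is the whole formula rendered as a few lines of Python. This is a sketch of the method as Newsweek describes it, not the magazine’s actual code; the function name and the sample figures are invented for illustration.

```python
def newsweek_index(ap_tests, ib_tests, graduating_seniors):
    # Newsweek's entire "best high school" score, as described above:
    # total AP plus IB tests taken, divided by the size of the senior class.
    # Test scores, family income, per-pupil spending: none of it enters.
    return (ap_tests + ib_tests) / graduating_seniors

# Invented numbers: a school whose 250 seniors sat for 600 AP tests.
print(newsweek_index(600, 0, 250))  # 2.4 tests per graduating senior
```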

The problem with the numerator–the “good” number in the ranking formula, like the number of hits in calculating a batting average–is that counting AP tests taken is a weirdly oblique way of getting at academic quality. AP courses are naturally meant to be more rigorous and substantive than ordinary classes. So counting a school’s total AP course offering, or its proportional enrollment in such courses, would make some limited sense. Or counting the scores a school’s students received on the AP test might be meaningful, as a sign of how much they had actually learned.

But counting how many tests the average student takes makes no sense. Once a student has finished an AP course, going on to take the test is hardly an automatic decision. The tests cost $76 a crack, straight to the beloved Educational Testing Service. As Newsweek points out, the fees can be reduced or subsidized for some poor students. Still, paying the money, and spending those long hours on the test, is worth doing for students only if … it’s worth doing. It does nothing for the students’ grade averages: They’ve already been graded in their courses. The AP tests the students take in senior year can’t help in college admissions, since the students have already been accepted or turned down by the time they take the tests in May. Two different students from the same high school might both have taken five or more AP courses in their career. One might decide to take a single AP test, to pass out of a college math requirement, and the other might decide to take five, to go for sophomore standing. The difference would say nothing about the quality of the high school but would make an enormous difference to Newsweek. Indeed, the difference in these two students’ behaviors is greater than the total spread from top to bottom on the Newsweek survey. The highest-ranked non-IB school in its survey averaged less than 3.5 tests per student; the schools at the very bottom averaged 1.
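A toy calculation makes the point. The class size below is invented, but the 3.5 and 1 figures are Newsweek’s own, and the arithmetic is the per-senior formula sketched earlier.

```python
# Hypothetical: two schools whose seniors took identical AP courses and
# differ only in how many of the exams they chose to pay for afterward.
seniors = 200
school_a = (seniors * 1) / seniors  # each senior sits for one test  -> 1.0
school_b = (seniors * 5) / seniors  # each senior sits for all five  -> 5.0

# A 4-point gap between pedagogically identical schools, against the
# roughly 2.5-point spread (about 3.5 down to 1) that covers Newsweek's
# entire list of non-IB schools.
print(school_a, school_b)
```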

As for the denominator in Newsweek’s ranking–the counterpart to “at bats” in a batting average–it’s not any factor that might indicate the effort that the students or the school system put into their AP achievement. That is, the test-taking rate is not adjusted for the students’ family income (which is very important but hard to measure exactly), or the district’s per capita spending on the school (also important, and easy to measure), or rural-urban differences, or racial makeup, or anything else. It’s just a head count: AP tests per graduating senior. And this means that the more homogeneously prosperous the student body, the better it’s likely to do.

At U.S. News, large public universities from Arizona State to Virginia complained that almost any per capita calculation penalized them relative to private schools. Since part of their mission was taking in lots of students, the better they did that job the more they hurt themselves in any per capita equation, by producing a bigger denominator. Therefore we set up separate ranking tables for public institutions. The Newsweek rankings make a stab at correcting for this, by excluding schools that choose most of their students on the basis of qualifying exams or other academic tests. This can be the only explanation for the otherwise bizarre total absence of the Bronx High School of Science and Thomas Jefferson High School for Science and Technology, in Fairfax County, Va., from the list of “best” public schools. But the formula does nothing to correct its natural bias in favor of homogeneous, wealthy suburban schools–that is, districts that “select” their students on the basis of whose parents can afford to live in the district rather than with Bronx High School-type admissions tests.

And then we have the results. Any publicized ranking system produces more of the behavior it rewards. Newspapers crank out big, “worthy” investigative series in hopes of winning the Pulitzer and other prizes. Universities have clearly responded to the U.S. News and other rankings, often in a perverse way. The worst example is the “early decision” racket. U.S. News and other rankers reward colleges for high “yield” rates–the percentage of students admitted to the college who actually decide to enroll. Colleges figured out one way to play this system: offering students an earlier decision about whether they’d be accepted, and in many cases slightly lowering the standards for admission, if the student would promise to enroll if let in. These “early decision” plans were good for the college, because the yield rate on those admitted was 100 percent. They had another advantage for colleges, which was to make it harder for students to bargain and shop for the best financial-aid deal. You couldn’t say, “Well, I’m going to Northwestern, because they’re offering a better scholarship,” when early decision had already committed you somewhere else. They also forced students to decide where they really wanted to go to college by November of their senior year in high school, not May–they could apply to only one place early-decision, and if let in, they had to go. When Stanford, Princeton, and Yale joined the early-decision pack in the 1990s, the most visible holdouts among competitive universities became Harvard, which has a high enough yield rate not to worry about it; plus Brown, Georgetown, and several others that admirably resist the trend on principle.
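The arithmetic of that incentive is easy to show. The admissions numbers below are invented, not any college’s actual figures, but the mechanism is the one just described: every binding early-decision admit enrolls by definition, so each one pads the yield rate.

```python
# Hypothetical admissions numbers: how binding early decision flatters
# "yield," the fraction of admitted students who enroll.
regular_admits, regular_enrolls = 2000, 800  # 40% yield on regular admits
early_admits = 500                           # enrolling is a condition of early admission

without_early = regular_enrolls / regular_admits
with_early = (regular_enrolls + early_admits) / (regular_admits + early_admits)
print(f"{without_early:.0%} vs. {with_early:.0%}")  # 40% vs. 52%
```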

If this Newsweek system caught on, the explicit result might be schools’ encouraging more students to take the tests–even when there is nothing in it for the student. Not to belabor the point, but: The real payoff in these courses is the course itself, plus the limited practical usefulness of certain AP scores in getting advanced placement. Mathews points out in his article that Fairfax County, Va., has actually required students in AP courses to take the tests. This may make Fairfax parents feel better about their schools’ positions in the Newsweek rankings, while giving students an extra reason to feel oppressed.

You can imagine cases where the implicit result of test counting would be positive–schools in poor districts accepting a Stand and Deliver-type challenge to move all their students through the AP route. G.C. knows Jay Mathews to be a democratic idealist and is sure that this is the behavior he hopes to evoke. But the behavior G.C. actually expects to see is schools that are already homogeneous and wealthy viewing these rankings as yet another reason to keep themselves that way.