Frustrated Slatesters wanted to know: Where to go for weather on the Web? Web forecasts seem "wildly divergent," huffed a New York-based editor, making it "tough to pack for trips." A Seattle scribe claimed MSNBC's predictions were more "upbeat" than Yahoo!'s. (Try this forecast, Yahoo!: High 205, Low 13.5.) Bitterness mounted as we sought to define "Partly Sunny." An investigation was in order.
I chose two cities—Boston (where I live) and Seattle (Slate's home base)—and set out to test a few online forecasters: Weather.com, AccuWeather.com, MSNBC, CNN, Yahoo!, the National Weather Service, and, as a local entry, Boston's WCVB-TV home page. I logged on each night to check next-day predictions for highs, lows, and conditions, then compared these guesses to the actual weather.
Where Do Forecasts Come From?
After just two days, I reached my first conclusion: AccuWeather, MSNBC, and CNN were posting identical forecasts. Close reading revealed this fine print on the MSNBC page: "MSNBC Weather is provided by AccuWeather." Aha! And CNN? AccuWeather Senior Vice President Michael Steinberg explains that his massive meteorological hive, located in Pennsylvania, indeed provides forecasts for CNN.com, hundreds of other sites, and about 750 newspapers (including the Washington Post). Sometimes AccuWeather gets credited on the sites, other times it doesn't. Steinberg was quick to point out that while AccuWeather posts its forecasts on other sites, Weather.com (from the Weather Channel—AccuWeather's arch-nemesis) doesn't license out its forecasts—you must visit its site. This explains, says Steinberg, why Weather.com is No. 1 in weather site traffic, while Accu is only No. 2.
Steinberg said any forecast discrepancies between AccuWeather.com and the sites it services arise only from the fact that AccuWeather updates more frequently. But a deeper issue had been raised: Where do forecasts come from? Do they all spring forth from a tiny weather cabal? Sort of.
According to both Steinberg and Jody Fennell, a VP at Weather.com, basically all raw weather data comes from the federal National Weather Service. A few local TV stations own their own radars (Doppler 5000!), but to put together a radar mosaic, or certainly for satellite data, everyone counts on the NWS. It's how you interpret the raw data (from radars, satellites, weather station observers, and so on) that determines your forecast. In fact, most NWS data is available to the public for free. (With a bit of research, a green screen, and a mustache, you could be your own weatherman.) But the major forecasting groups use their own proprietary software, with specially chosen algorithms, to form predictions. That's the system for Weather.com, AccuWeather, your local station, and pretty much everybody else: NWS data is fed through various computer models. Fennell told me Weather.com's local temperature forecasts go straight from the software to the site, with no human intervention at all. The NWS makes forecasts from its data using software, too. Yahoo! says its forecasts come from an international company called Weathernews Inc., which gets its data from, yes, the NWS.
Who's Got the Best Forecasts?
OK, the same data gets fed through different software models. But whose forecasts are best? According to AccuWeather's Steinberg, there is no industry watchdog. Occasionally, an analysis will show up in the press or in scientific literature, but most outlets review their own forecasts internally. For more than 13 years, AccuWeather has compared its forecasts for the D.C. area to the forecasts from the NWS and claims it's beaten the NWS in 156 out of 158 months. What does that mean? AccuWeather looked at the average temperature for each day, then determined whose forecast was farther off and by how many degrees.
I tried a similar test, but used high temperatures instead of average. Over eight nights, I looked at forecasts for the next day's high temp in Boston and Seattle, then compared these forecasts to the actual temperatures recorded by the government.
First I measured absolute values: the number of degrees by which each guess was wrong. If Yahoo! guessed 2 degrees too high on one day, and 4 degrees too low on the next, the absolute value of degrees-off was 6. I added this up for all eight days, then divided to find a per-day average, hereafter known as the "Total Degrees off Average."
Then I looked at trends, to see if sites guessed consistently too high or too low. In the above example, the "Directional Degrees Off" would be the sum of the two figures (2 + -4), or negative two. I just added these up for all eight days to see if it indicated any tendencies. (The fault with this stat: If you guess nine degrees too high one day, then nine degrees too low the next, you look perfect.) Got all that? Here are the stats for high temp predictions:
For Boston, the best predictions came from Yahoo!, which averaged 2.5 degrees off per day. AccuWeather was worst, averaging 3.63 degrees off per day. Every site tended to guess too low. Meanwhile, in Seattle, AccuWeather averaged an impressive 1.75 degrees off per day, while Yahoo! came in last. Every site tended to guess too high.
For night owls, I ran the same calculations on low-temp predictions, which hint at evening weather. Yahoo! won handily in Boston again, and this time it won in Seattle, too.
The differences in these per-day averages seemed a bit small—so I cooked up another stat. This one measures a site's propensity to totally screw you over, guessing wrong by a startling margin. We'll call these "Boff the Pooch" forecasts. A site was credited one BTP for each prediction, high temp or low, that missed by 6 degrees or more.
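The three stats above are simple enough to spell out in a few lines. This is a minimal sketch with made-up forecast and actual highs for eight days (not the real numbers from this test), just to show how each figure gets computed:

```python
# Made-up example data: one site's predicted highs vs. the actual
# recorded highs over eight days. These are illustrative, not the
# real figures from the Boston/Seattle test.
forecast = [38, 41, 35, 44, 40, 37, 42, 39]
actual   = [40, 39, 38, 45, 33, 40, 41, 42]

# Signed error per day: positive means the site guessed too high.
errors = [f - a for f, a in zip(forecast, actual)]

# "Total Degrees off Average": mean of the absolute errors.
total_degrees_off_avg = sum(abs(e) for e in errors) / len(errors)

# "Directional Degrees Off": signed sum, revealing a tendency to
# guess high (positive) or low (negative). As noted above, opposite
# misses cancel out here.
directional_degrees_off = sum(errors)

# "Boff the Pooch" count: predictions that missed by 6 degrees or more.
btp = sum(1 for e in errors if abs(e) >= 6)

print(total_degrees_off_avg, directional_degrees_off, btp)
# → 2.75 -2 1
```

With these numbers the site averages 2.75 degrees off per day, leans slightly low overall, and boffs the pooch once (the day it guessed 40 against an actual 33).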
Conclusion so far: Compared to Seattle, Boston weathuh's wicked hahd to predict.
Next came qualitative analysis. Along with highs and lows, sites predict the conditions. Rain, sleet, partly cloudy, mostly sunny, etc. There exist actual definitions for some of these, sort of. Everyone seems to disagree, but according to the NWS (at least, the NWS outpost in Louisville, Ky., which posts a semiofficial-looking chart), they are:
Cloudy: 9/10 cloud cover to total cloud cover
Mostly Cloudy or Considerable Cloudiness: 7/10 to 8/10
Partly Cloudy or Partly Sunny: 3/10 to 6/10
Mostly Clear or Mostly Sunny: 1/10 to 3/10
Clear or Sunny: 1/10 or less
Fair (used mostly for nighttime periods): Less than 4/10 opaque clouds, no precipitation, no extremes of visibility, temperature or winds.
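For the daytime terms, the chart boils down to a lookup on tenths of cloud cover. Here's a rough translation; note that the chart's ranges overlap at the edges (3/10 shows up in two rows, as does 1/10), so this sketch simply checks from cloudiest to clearest and takes the first match:

```python
def sky_condition(tenths_cloud_cover):
    """Map tenths of cloud cover (0-10) to the Louisville NWS chart's
    daytime terms. Boundary values that appear in two rows of the
    chart are resolved by checking cloudiest-first."""
    if tenths_cloud_cover >= 9:
        return "Cloudy"
    if tenths_cloud_cover >= 7:
        return "Mostly Cloudy"    # or "Considerable Cloudiness"
    if tenths_cloud_cover >= 3:
        return "Partly Cloudy"    # or "Partly Sunny"
    if tenths_cloud_cover >= 1:
        return "Mostly Clear"     # or "Mostly Sunny"
    return "Clear"                # or "Sunny"

print(sky_condition(6), "/", sky_condition(7))
# → Partly Cloudy / Mostly Cloudy
```

So the 6/10-versus-7/10 distinction my great aunt would scoff at is, per the chart, the entire difference between "Partly Cloudy" and "Mostly Cloudy."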
And now you know. Most forecasts I looked at referred only to levels of cloudiness, which we could argue over all day. Can you really distinguish between 6/10ths and 7/10ths cloud cover? (My great aunt would simply say, "Enough blue for a Dutchman's britches.") A few days, however, featured genuine forecast disagreements. For instance, on March 1, Yahoo! and Weather.com called for rain in Seattle, while the other sites did not. My weatherspy out West reported that rain did indeed fall on her. Still, these obvious discrepancies were rare, and not much could be gleaned from them.
Where does this leave us? I'd say it's clear (or is it mostly sunny?) that Bostonians should take their weather tips from Yahoo!: It had the best average forecasts, and zero Boff the Pooch forecasts. Also interesting (to me, anyway) was that the local TV guys couldn't beat the national sites.
Out in Seattle, looks like you should go with AccuWeather, though it also seems you can't go too far wrong. Seattleites might ignore smallish statistical differences and choose a site based on ease of navigation, or coolest looking radar graphics, or whatever it is Seattleites go for these days. AccuWeather stressed its site's use of "Real Feel" temperatures (patent pending), which use various factors (not wind chill, which is inaccurate!) to tell you what the weather "really" feels like. Maybe this alone is enough to sway you. I say you should still carry an umbrella.