On the morning of Feb. 26, the investment firm Bear Stearns sent out an alert (PDF) containing some unwelcome news for Google. According to comScore, a leading Web-analytics company, Google's domestic paid clicks—that is, the number of times people in the United States clicked on a Google ad—were down 0.3 percent from a year earlier and down 12 percent since October. By 7:16 a.m., former tech-securities analyst (and Slate contributor) Henry Blodget had reported the news on Silicon Alley Insider under the headline "Google Disaster." As word of the comScore report circulated, Google got killed on Wall Street: The stock opened the day down $25 a share and kept falling, sinking to an 11-month low of $464.19 before staging a modest comeback.
Wall Street's anti-Google stampede came despite some good news. The company's advertising numbers from the previous quarter were strong, particularly outside the United States, and Bear Stearns also reported that Google has "healthy growth prospects that should lead to market share gains [and] a strong balance sheet." Nevertheless, investors were spooked by the idea that Web surfers had stopped clicking on text ads—perhaps a sign that even mighty Google wasn't immune to an economic slowdown. Wall Street, however, shouldn't have made such a leap. ComScore's click numbers, like so many stats about user behavior on the Web, are unreliable and opaque. Instead of using comScore reports to predict a tech company's future performance, an investor would be better off ignoring them.
ComScore is one of several firms in the United States that peddle statistics on Web traffic. It seems like it should be easy to get an exact count of how many people visit a Web site, click on an ad, and so forth. But as Slate's Paul Boutin has pointed out, these stats are a moving target. Analytics firms like Nielsen and comScore don't count every time a Web page gets accessed; rather, they extrapolate the numbers from data supplied by panelists who install the companies' tracking software. ComScore claims its panel includes more than 2 million people who are recruited either directly or through third-party software packages that offer services like virus protection and performance optimization. (The company terms this "researchware." Less charitable types call it "spyware.") The company takes the data it gets from these users and weights it according to demographics to draw a statistical portrait of traffic to individual sites. ComScore is, essentially, making an educated guess. Nobody except Google is keeping a tally of each individual click on the company's text ads.
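To see what that kind of educated guess looks like in practice, here is a minimal sketch of panel-based extrapolation with demographic weighting. Every number in it is hypothetical—these are not comScore's actual figures, panel sizes, or demographic categories—but the mechanics are the same: measure a click rate within each panel group, then scale it up to that group's share of the overall population.

```python
# Illustrative sketch of panel-based traffic estimation with demographic
# weighting. All figures below are made up for the example; they are not
# comScore's real panel data or methodology.

# Hypothetical panel: panelists recruited and clicks observed, per group
panel = {
    "18-34": {"panelists": 900_000, "clicks": 45_000},
    "35-54": {"panelists": 800_000, "clicks": 32_000},
    "55+":   {"panelists": 300_000, "clicks":  6_000},
}

# Hypothetical size of each group in the overall U.S. online population
population = {"18-34": 60_000_000, "35-54": 70_000_000, "55+": 50_000_000}

def estimate_total_clicks(panel, population):
    """Scale each group's observed click rate up to the full population."""
    total = 0.0
    for group, data in panel.items():
        rate = data["clicks"] / data["panelists"]  # clicks per panelist
        total += rate * population[group]          # extrapolate to the group
    return total

print(round(estimate_total_clicks(panel, population)))  # → 6800000
```

The fragility is easy to spot: the estimate is only as good as the assumption that panelists click the way everyone else in their demographic group does, and a shift in who happens to be on the panel moves the headline number even if real behavior hasn't changed.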
Even though comScore's numbers are an estimate, they've been repeated as gospel with little discussion of margins of error—this despite the large psychological difference between a 0.3 percent decline and a small gain (or a bigger loss). Why did Wall Street respond so emphatically to comScore's numbers, ignoring the big-picture reassurances in Bear Stearns' report? One can certainly blame a jittery market on the lookout for bad news, as economic indicators everywhere are looking ugly. It's also probably fair to guess that crafty investors—betting that less savvy investors will panic—would sell early in an attempt to make money off this skittishness. But it's impossible to avoid the conclusion that Wall Street types put way too much stock in the reliability of Web traffic stats, numbers that should not be used for day-to-day management of a portfolio.
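A back-of-the-envelope calculation shows why the missing margin of error matters. The sketch below uses a standard textbook formula—the sampling error of an estimated proportion—with hypothetical numbers (an assumed click rate and sample size, not anything comScore has disclosed) to show how easily a reported 0.3 percent change can sit well inside the statistical noise.

```python
import math

# Hypothetical inputs: an observed click-through rate and the number of
# sampled ad impressions it was measured from. These are illustrative
# assumptions, not comScore's actual figures.
p = 0.02        # assumed observed click-through rate (2 percent)
n = 50_000      # assumed number of sampled ad impressions

se = math.sqrt(p * (1 - p) / n)  # standard error of a proportion
half_width = 1.96 * se           # 95 percent confidence half-width
relative = half_width / p        # uncertainty as a fraction of the estimate

print(f"±{relative:.1%} relative uncertainty")  # → ±6.1% relative uncertainty
```

Under these assumptions, the estimate carries roughly ±6 percent of relative uncertainty—twenty times larger than the 0.3 percent decline that sent Google's stock tumbling. The real calculation for a weighted panel is more involved, but the lesson is the same: without a published margin of error, a fraction-of-a-percent change is indistinguishable from noise.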
After the public hubbub over its Google numbers, comScore released an analysis of the data on its own blog. The post lists many caveats, including the possibility that the recent decline in clicks might have been the result of Google getting better at reducing "bad clicks"—accidental clicks by people who have no interest in the product being advertised. Many in the tech-blog community saw this response as comScore getting spooked by the fallout from its report or bending to pressure from Google. (A comScore spokesman told me there was no contact between Google and comScore executives between the time of the initial report and comScore's elaborations.) More likely, comScore was simply being realistic about the reliability—or maybe the unreliability—of its own data.
ComScore's numbers are particularly prone to error when making long-term comparisons, like the year-over-year comparison of Google's paid clicks. For one thing, the group of panelists that provided comScore's data in January 2007 isn't the same as the group from January 2008. We don't know how different the groups were because comScore doesn't release that data.
Like most companies that deal in Web statistics, comScore gives few specifics about its methodology. In order for investors and tech buffs to get a better sense of the accuracy of this data, firms like Nielsen and comScore have to become more transparent—something the Interactive Advertising Bureau, an umbrella organization for 300 companies involved in online advertising, has called on them to do. (For a great side-by-side comparison of how different Web analytics companies work—so far as we know—see this primer from the Web marketing firm Antezeta.)
Until Nielsen, comScore, and other analytics companies commit to sharing their data and methodologies, personal fortunes and the fates of tech companies will depend on data that might not be anywhere close to accurate. Wall Street, at least, shouldn't be so willing to act on this kind of report.
Before public demand for better methodology is likely to mount, however, those whose personal fortunes rest on this data will have to understand that it is a methodology in the first place, not some universal registry of Web use data with a margin of error of zero. Next time you see a press release that says clicks are going up or down, take it for what it is: a guess—as far as we know.