Google researchers try search ranking based on factual accuracy instead of links.

Google Is Thinking About Ranking Search Results Based on Facts Instead of Links

Future Tense
The Citizen's Guide to the Future
March 2, 2015, 12:46 PM

Using algorithms to evaluate accuracy is a tough problem.

New hoaxes and misinformation crop up on the Internet every day, and it can be hard to tell fact from fiction. This is especially true on the most popular platforms, but help may be on the way. Facebook, for one, has been working on reducing the spread of phony news stories in its newsfeed. And now Google is researching methods to improve how it ranks its search results.

Lily Hay Newman

Lily Hay Newman is a staff writer and the lead blogger for Future Tense.

In a February arXiv paper, Google researchers outlined progress on a new approach that would allow factual accuracy to contribute more heavily to a page's search ranking. Currently the biggest factor is how many other pages link to the page in question, but that isn't always a good measure of quality. Viral hoaxes are often linked to thousands of times simply because they're being talked about, not because they're correct.

The Google research team wants to revise the current system to look for inaccuracies instead of links. The strategy isn't being implemented yet, but the paper presents a method for adapting algorithms so that they generate a "Knowledge-Based Trust" score for every page. To do this, the algorithm would pick out a page's factual statements and compare them with Google's Knowledge Vault, a database of facts. It would also attempt to assess the trustworthiness of the source—for example, a reputable news site versus a newly created WordPress blog. Another component of the strategy involves looking at "topic relevance": the algorithm scans the name of the site and its "about" section for information on its goals.
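In rough outline, the idea can be sketched in a few lines of code. This is a minimal toy illustration only, not the paper's actual model: it assumes facts are stored as (subject, predicate, object) triples and scores a page by the fraction of its checkable statements that match a reference database. The `knowledge_based_trust` function and the `FACTS` set are hypothetical stand-ins for the far more sophisticated probabilistic machinery behind the real Knowledge Vault.

```python
# Toy fact database standing in for Google's Knowledge Vault.
# Facts are (subject, predicate, object) triples.
FACTS = {
    ("Paris", "capital_of", "France"),
    ("Barack Obama", "born_in", "Honolulu"),
}

def knowledge_based_trust(page_triples, fact_db):
    """Naive trust score: the fraction of a page's extracted
    statements that agree with the reference fact database."""
    if not page_triples:
        return 0.0  # no checkable statements, no evidence either way
    matches = sum(1 for triple in page_triples if triple in fact_db)
    return matches / len(page_triples)

# A hypothetical page making one correct claim and one hoax claim.
page = [
    ("Paris", "capital_of", "France"),     # matches the fact database
    ("Barack Obama", "born_in", "Kenya"),  # a well-known false claim
]

print(knowledge_based_trust(page, FACTS))  # 0.5
```

A real system would also have to extract those triples from free text and weigh its own confidence in each fact, which is where most of the hard research problems lie.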

The aim is to create a measure of web-source quality: "knowledge-based trust."

As New Scientist points out, there are already some services available that try to highlight misinformation, like the LazyTruth browser extension, which claims to surface “quality information when you receive an email forward full of political myths, urban legends, or security threats.” But adding even the most basic fact-checking capabilities to Google has the potential to produce broad-reaching effects, since so many people refer to the search engine multiple times per day.

On the other hand, algorithm tweaks that affect information surfacing can have unintended consequences, so implementing them is always a process. The Google method is still in development, but the researchers say it shows “promise” and “improvement.”

Future Tense is a partnership of Slate, New America, and Arizona State University.