
May 05, 2015

If Google Ranks Websites Based On Facts, Not Links, Would You Be Prepared?


What if all the information online had to be accurate? What if anything that wasn’t factually correct was exiled dozens of pages deep? Well, that may soon be entirely possible.

Google the phrase “Cure For Cancer,” and websites claiming coconut oil is a cure would be pushed 30 pages deep. Search for “vaccinations and autism” and you would have to scroll for what seems like an eternity before finding a page asserting a link between the two. Returned search results would be ranked by how well those sites’ facts matched the real world.

Today, page rankings are based in part on the popularity of an indexed website, measured by the number and type of external links pointing to that page. Recently, however, researchers at Google published a controversial paper showing how rankings in a web search could be determined by something entirely different — the accuracy of the facts websites contain.

Online marketers are now in conflict over what counts as a “fact” as they scramble to grasp the consequences of a truth-based Internet.

“In this paper, we address the fundamental question of estimating how trustworthy a given web source is. Informally, we define the trustworthiness or accuracy of a web source as the probability that it contains the correct value for a fact (such as Barack Obama’s nationality), assuming that it mentions any value for that fact…,” said the authors of Google’s recent paper.

To achieve their goal, the researchers devised a “knowledge-based trust” evaluation algorithm to estimate any site’s accuracy. The researchers said:

“We extract a plurality of facts from many pages using information extraction techniques. We then jointly estimate the correctness of these facts and the accuracy of the sources using inference in a probabilistic model. Furthermore, we show how to initialize our estimate of the accuracy of sources based on authoritative information, in order to ensure that this iterative process converges to a good solution.”
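To make that description concrete, here is a minimal sketch in Python of the kind of back-and-forth estimation the paper describes. The source names, the single fact, and the simple weighted-vote update are all hypothetical illustrations; Google’s actual system relies on large-scale information extraction and a far more sophisticated probabilistic model.

from collections import defaultdict

# Hypothetical claims: each source asserts a value for one or more
# (subject, attribute) facts. Names and values are made up for illustration.
claims = {
    "site-a.example": {("Barack Obama", "nationality"): "USA"},
    "site-b.example": {("Barack Obama", "nationality"): "USA"},
    "site-c.example": {("Barack Obama", "nationality"): "Kenya"},
}

def estimate_trust(claims, iterations=10):
    """Alternate between guessing the true value of each fact and
    re-scoring each source by how often it agrees with those guesses."""
    # Start by assuming every source is moderately accurate.
    accuracy = {source: 0.8 for source in claims}
    best = {}

    for _ in range(iterations):
        # Step 1: weighted vote -- each candidate value is scored by the
        # summed accuracy of the sources that assert it.
        votes = defaultdict(lambda: defaultdict(float))
        for source, facts in claims.items():
            for fact, value in facts.items():
                votes[fact][value] += accuracy[source]
        best = {fact: max(candidates, key=candidates.get)
                for fact, candidates in votes.items()}

        # Step 2: a source's accuracy becomes the fraction of its claims
        # that match the current best guesses.
        for source, facts in claims.items():
            agree = sum(1 for fact, value in facts.items() if best[fact] == value)
            accuracy[source] = agree / len(facts)

    return accuracy, best

accuracy, best = estimate_trust(claims)
print(best)      # {('Barack Obama', 'nationality'): 'USA'}
print(accuracy)  # site-c.example ends up with the lowest estimated accuracy

The real system initializes source accuracy from authoritative information and operates over billions of extracted facts, but the alternation between estimating fact correctness and source accuracy is the core of the idea.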

It is a unique approach, based on math and science, and it appears to work. According to the paper, Google applied the methodology to a database of 2.8 billion facts extracted from the web, and thereby estimated the trustworthiness of nearly 120 million webpages. “Manual evaluation of a subset of the results confirmed the effectiveness of the method,” said the authors.

According to a recent article in New Scientist: “There are already lots of apps that try to help internet users unearth the truth. ‘Lazy Truth’ is a browser extension that skims inboxes to weed out the fake or hoax emails that do the rounds. ‘Emergent,’ a project from the Tow Center for Digital Journalism at Columbia University, New York, pulls in rumors from trashy sites, then verifies or rebuts them by cross-referencing to other sources.”

If such truth-based web searches ever become a reality, the Internet would be changed forever, and online content marketing would be transformed along with it. There is a lot riding on this possibility, and for companies looking to preserve or improve their rankings, professional expertise in online content marketing will be an essential element of driving traffic.

