Cillian Bracken Conway
14th Dec, 2015

Updates to Google's algorithms have always been something that marketers and website owners both dread and anticipate. Try as they might to prepare for an update, they usually end up reeling from the shock and struggling to bounce back from a steep drop in search engine rankings.

But Google was less brutal when it released Panda 4.2. The rollout was slow, spread over the course of a month, which, according to Google, is how updates will be released from now on. The search giant is moving towards continuous change instead of serving a one-time painful blow. It also disclosed what percentage of queries would be affected by Panda 4.2.

But for all the noise it caused, Panda 4.2 turned out to be not much of an update at all. There were slow gains in traffic, which all vanished in August, prompting speculation that the update had been reversed.

Whatever the case, questions now arise about Google's algorithm updates for 2016. Will the search giant stick with slow rollouts? Will Penguin and Pigeon see updates in 2016?

Rather than leave everyone guessing, Google is sparing us shocking surprises next year.

A Penguin update will be released before 2015 ends.


Webmasters who were badly affected by the last Penguin update in December 2014 can finally recover; the algorithm has not been refreshed since then, leaving penalised sites with no opportunity to bounce back. The new update will also run in real time, which means any impact is felt immediately. The opposite is also true: recovery is immediate as well.

One of the things the update will detect is spammy links. Such links still exist, and many webmasters still make these mistakes, regardless of how hard Google works to keep everything above board.

In preparation for the upcoming Penguin update, Search Engine Land recommends an audit of your website's link profile. Last-minute checks can spare you from possible penalties. Plenty of tools will automate the process instead of you doing it manually, including Check My Links, Link Research Tools, and Screaming Frog. Among the things to look at are anchor text distribution, sudden spikes in link growth, and too many referring pages.
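To make the anchor text check concrete, here is a minimal sketch (not any of the tools above, just an illustration of the idea): given a backlink export, it measures each anchor's share of the profile and flags anchors that dominate it, since heavily repeated exact-match anchors are a classic sign of manipulative link building. The 30% threshold is an assumption for illustration, not a figure from Google.

```python
from collections import Counter

def anchor_text_distribution(backlinks):
    """Given (source_url, anchor_text) pairs, e.g. from a backlink
    export, return each anchor's share of the total profile."""
    counts = Counter(anchor.strip().lower() for _, anchor in backlinks)
    total = sum(counts.values())
    return {anchor: count / total for anchor, count in counts.items()}

def flag_over_optimised(distribution, threshold=0.30):
    """Flag anchors that dominate the profile. The 30% threshold
    is illustrative only, not a documented Google limit."""
    return [a for a, share in distribution.items() if share >= threshold]

links = [
    ("http://blog-a.example", "cheap widgets"),
    ("http://blog-b.example", "cheap widgets"),
    ("http://blog-c.example", "cheap widgets"),
    ("http://news.example", "Acme Widgets"),
]
dist = anchor_text_distribution(links)
print(flag_over_optimised(dist))  # ['cheap widgets'] -- 75% of all anchors
```

A real audit would run this over thousands of exported rows, but the principle is the same: one phrase accounting for most of your anchors is worth investigating before Penguin does.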

What about disavowing links? Although it pays to tell Google that you wash your hands of some of the spammy links pointing to your website, it might not add to whatever benefit you gain from cleaning up links. Penguin is not a manual penalty, after all.

Do not ignore internal links. They do have an impact on the way search engines crawl your website, even if they have less impact on rankings. Since you have control over them, you can definitely keep your internal link profile clean.
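Since internal links shape how crawlers move through a site, a quick way to start auditing them is to collect every link on a page that points back to your own host. The sketch below uses only Python's standard library; the sample page and URLs are made up for illustration.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class InternalLinkCollector(HTMLParser):
    """Collects links on a page that point back to the same host --
    the internal links that search engines follow when crawling."""
    def __init__(self, base_url):
        super().__init__()
        self.base = base_url
        self.host = urlparse(base_url).netloc
        self.internal = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        # Resolve relative links against the page URL, then keep
        # only those whose host matches our own.
        absolute = urljoin(self.base, href)
        if urlparse(absolute).netloc == self.host:
            self.internal.append(absolute)

page = '<a href="/about">About</a> <a href="http://other.example/">Out</a>'
collector = InternalLinkCollector("http://example.com/")
collector.feed(page)
print(collector.internal)  # ['http://example.com/about']
```

Running this across your pages gives you the raw internal link graph, which you can then check for dead ends, orphaned pages, or over-deep sections.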

In 2016, facts will also take a leading role.

Google has always valued truth, which is why Wikipedia often ranks high in its search results. Research by Intelligent Positioning in 2012 showed that Wikipedia appeared on the first page of results for 99% of searches, in positions 1 to 5 for 96%, and at No. 1 for 56%. This is because the site offers fact-based content and an accurate source for online users, not because of preferential treatment from Google, as some pundits have claimed.


Next year, the company is taking steps to ensure everyone else shares its views and objectives. Earlier this year, the search giant unveiled a system that evaluates the trustworthiness of content. Knowledge-Based Trust (KBT) allows the company's research team to compare facts presented on a website with those found in Google's Knowledge Vault, a repository of facts the company has collected from across the web.

KBT looks at how many incorrect facts are found on a website, and then assigns a trustworthiness score, or Knowledge-Based Trust score. This will show online users whether a website is reliable. Scores, however, still depend on the kind of website being checked. Poetry websites, for example, would not receive a low score, as they are not supposed to present facts and figures; the only facts checked would relate to the author, publication, poem structure, and the like.
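The core idea can be sketched in a few lines. This is a toy illustration only: Google's actual system involves probabilistic fact extraction and per-source error modelling, and none of the names or data below come from Google. It just shows the "more correct facts means a higher trust score" intuition, including the case of a site with nothing to verify.

```python
def knowledge_based_trust(page_facts, knowledge_vault):
    """Toy sketch of the KBT idea: score a page by the fraction of
    its factual claims that agree with a trusted reference set.
    Both arguments map (subject, predicate) -> value. Hypothetical
    simplification of Google's published approach, not their API."""
    checkable = [k for k in page_facts if k in knowledge_vault]
    if not checkable:
        return None  # nothing to verify -- e.g. a poetry site
    correct = sum(1 for k in checkable
                  if page_facts[k] == knowledge_vault[k])
    return correct / len(checkable)

# Made-up reference facts standing in for the Knowledge Vault.
vault = {("Eiffel Tower", "located_in"): "Paris",
         ("Mount Everest", "located_in"): "Nepal"}
page = {("Eiffel Tower", "located_in"): "Paris",    # correct
        ("Mount Everest", "located_in"): "Peru"}    # incorrect
print(knowledge_based_trust(page, vault))  # 0.5
```

Under this simplification, a page whose checkable claims all match the vault scores 1.0, while a poetry page with no checkable claims gets no score at all rather than a penalty, which mirrors how the article describes KBT treating fact-free sites.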

Google says that sites with little to no factual content will not be penalised. But with KBT scores supplementing PageRank scores, untrustworthy websites could still take a hit from online users.

KBT should also bring back transparency online, because value would be placed on real knowledge shared. Website popularity based purely on backlinks would no longer fly. That will be the day search results provide value rather than fluff, and people no longer complain about top-ranking websites that are rarely helpful.

It is unclear whether KBT will be integrated into any of Google's algorithms. But it is clear that it will affect the future of content and how websites are optimised. With the company literally fact-checking content, it might be time to look at how you deliver content to your readers.

High-quality content has always been vital to search engine rankings, but KBT scores, if actually implemented, would be a major blow to websites engaging in dodgy SEO tactics. The fact-checking system would also reward helpful content that is given away for free: Google is pushing for accurate information to be freely available and for gated content to be minimised, if not eliminated.

What should webmasters and marketers do?

Put out high-quality, factual content instead of focusing on how to beat or stay safe from Google's algorithm updates. The company has published guidelines on how it wants content to be served; as long as you stick to them, you will have less to worry about. With KBT scores set to take the spotlight in 2016, companies should publish content that is top quality and rich in facts.