Penguin 4.0 – Google is now Sentient….

The Frosty Touch of the Penguin 4.0: How will it Affect You?

What you currently understand about Google's policing of the search results has now changed.

All of us know Google has always ranked URLs based on their own credibility and that of the site they belong to, applying manual penalties at the URL level, at the folder level (every page under that folder) and at the domain level (site-wide).

What you knew about the above, and what the SEO community understands about the methodology of search result ranking, has changed. Google is aware… Google has become Sentient.

The Snowbird Cometh…

After a few years in development, Google have just released the latest offering in state-of-the-art anti-SEO defence: a constantly updating, scrutinising, data-ravenous beast known as Penguin 4.0, designed to be self-learning. This latest instalment has already altered the SEO playing field forevermore. This harbinger looms formidably, casting a shadow of portent over all future updates.

How it’s Going Down – The Simple Version

Google now detects the footprints left behind by SEO operatives at the URL level, not just the domain or folder level, as follows:

How it’s Going Down – Detailed Version

Whilst Google have always assessed individual URLs on a site, they are now adopting a much more granular approach in an attempt to pick up metrics that wouldn’t have previously flagged as footprints.

They are assessing URLs in unison as well as testing parts of the domain as a whole. For small businesses who have a handful of pages, this is of little consequence. Those enterprises who have hundreds, thousands, maybe even millions of URLs under the umbrella domain however, will have their work cut out for them.

Here’s Why!

Backlink data and inbound links carry risks, but these risks are rarely detected proactively at the most granular level. Even today, individual URLs often receive only a light analysis.

But now the tables have turned and they will be thrust into the spotlight for all to see!

This level of public display will force everyone, especially the more concerned amongst you, to re-evaluate the depth of detail applied when assessing a domain's entire backlink profile, as well as internal link patterns.

This is also going to be the time when the disavow file will see some overtime, as people start to shift URLs that could attract that dirty black mark from Google.

Prevention is always better than cure and we know URLs are coming under close scrutiny. Now is the time to see what the search engines can see and change what they see.

You must appear to be operating within the walls of the great Google think-tank, abiding by their guidelines. Google will make note of this and, if fortune is smiling, give you a wide berth.

How to React!

All businesses have their objectives, be that traffic or sales. The first step is to use a rank volatility index (a measure of the movement in the search results pages) across the search terms that prove most profitable; then the analysis can commence to safeguard that revenue.
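A rank volatility index can be as simple as the standard deviation of a keyword's daily rank positions. Below is a minimal sketch of that idea; the keywords and rank figures are purely illustrative, and you would feed in daily positions exported from your own rank tracker.

```python
# A minimal sketch of a rank volatility index, assuming you already
# export daily rank positions per keyword from a rank tracker.
# All keyword names and numbers here are illustrative.
from statistics import pstdev

# keyword -> list of daily rank positions (1 = top result)
daily_ranks = {
    "buy blue widgets": [3, 3, 4, 3, 9, 12, 11],    # volatile: investigate
    "widget repair london": [5, 5, 5, 6, 5, 5, 5],  # stable
}

def volatility_index(ranks):
    """Population standard deviation of rank positions:
    a higher value means more movement in the results pages."""
    return round(pstdev(ranks), 2)

# Sort the profitable terms by volatility so analysis starts
# with the keywords whose revenue is most at risk.
report = sorted(
    ((kw, volatility_index(r)) for kw, r in daily_ranks.items()),
    key=lambda item: item[1],
    reverse=True,
)

for keyword, vol in report:
    print(f"{keyword}: volatility {vol}")
```

Standard deviation is only one choice; the point is to track the same profitable terms on a fixed schedule so any Penguin-driven movement shows up as a spike in the index.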

Adopting a motte-and-bailey approach, you safeguard the important revenue-generating sections of your portfolio and then work outwards from there. This secures your business financially and gives you the strategic start position for planning a counter-attack.

With this new algorithm, Google are now able to, and frequently will, look at every URL with the equivalent of a high-powered electron microscope. Everything will be laid bare with definition and clarity. They are able to identify the commonalities of natural and unnatural backlink equity – commonalities that many modern-day SEO operatives fail to consider would pose a risk in the future.

The way to deal with this is through detailed, high-level management of the disavow file: no longer just disavowing at the domain level, but strategically disavowing at the URL level. Remember, the URLs that link to you are being scrutinised in the same way as the URLs they link to; the chain is only as strong as its weakest link, and your objective is to find your weak links.

Once these links have been identified, they can be assessed and either disavowed or repurposed, then reinforced through strategic social interaction or a more natural approach to link accrual – effectively diluting the signals that initially made them weak.
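The mixed domain-level and URL-level approach described above can be sketched as a small script that assembles a disavow file. The links listed are purely illustrative; the file format itself (one full URL or one `domain:` entry per line, with `#` comments) is Google's documented disavow format.

```python
# A minimal sketch of building a disavow file that mixes strategic
# URL-level entries with domain-level entries. All domains and URLs
# below are illustrative placeholders.

# Domains so toxic that every link from them should be disavowed.
toxic_domains = ["spammy-directory.example"]

# Individual weak URLs on otherwise acceptable domains: disavow
# only the offending page, preserving the rest of that domain's
# link equity.
weak_urls = [
    "https://mixed-quality-blog.example/paid-links-page",
    "https://mixed-quality-blog.example/old-widget-roundup",
]

lines = ["# URL-level entries keep the good links intact"]
lines += [f"domain:{d}" for d in toxic_domains]
lines += weak_urls

disavow_file = "\n".join(lines) + "\n"
print(disavow_file)
```

The resulting text is what you would upload through Google's disavow links tool; the strategic decision is simply which links earn a surgical URL entry versus a blanket `domain:` entry.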

But where do you start?

Once your fortifications are in place for your priority URLs (the traffic and/or revenue drivers), you can start to look at the URLs with the greatest number of risk-centric footprints. These are the fundamentals of your defensive strategy: planning ahead and preparing BEFORE Google reaches you.
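Ranking URLs by "risk-centric footprints" amounts to counting risk signals across each URL's inbound links and working on the worst offenders first. The sketch below assumes a few common footprint signals (exact-match anchors, sitewide links, shared hosting blocks); these checks and the example data are illustrative assumptions, not a list Google has published.

```python
# A minimal sketch of prioritising URLs by risk-centric footprints.
# The signals checked and the backlink data are illustrative
# assumptions, not a published list of Penguin criteria.

def risk_score(link):
    """Count simple footprint signals on a single inbound link."""
    score = 0
    if link["exact_match_anchor"]:
        score += 1  # over-optimised anchor text
    if link["sitewide"]:
        score += 1  # sitewide (footer/sidebar) link
    if link["shared_hosting_block"]:
        score += 1  # hosting footprint shared with other linking sites
    return score

# target URL -> inbound links with their observed signals
backlinks = {
    "https://yoursite.example/money-page": [
        {"exact_match_anchor": True, "sitewide": True,
         "shared_hosting_block": False},
    ],
    "https://yoursite.example/blog/post": [
        {"exact_match_anchor": False, "sitewide": False,
         "shared_hosting_block": False},
    ],
}

# Highest total risk first: these URLs get defensive work first.
priority = sorted(
    backlinks,
    key=lambda url: sum(risk_score(bl) for bl in backlinks[url]),
    reverse=True,
)
print(priority[0])  # the most exposed URL
```

In practice the signal list would come from your own backlink audit tooling; the ordering logic is the part that matters for deciding where the disavow and outreach effort goes first.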

Once Google commences their scrutiny of your URLs, the rank volatility will start to show, which will help you organise, identify and react to the problem areas before they become potentially lethal to the domain's natural exposure.

“Part of this process is identifying those links that pose the greatest risk at the URL level; this will go towards clearing the way for the domain to manage the scrutiny in the best possible way for your business.”

Taking all of these variables into consideration will put you on the front foot for facing the newest leviathan in Google's arsenal – one designed to render the SEO community defunct.

Considering the level to which Google has developed its search result algorithm, the SEO community must be of the mindset to ensure they stay one step ahead of a system that is now entering a full-on sprint.

This race has just begun and yet the SEO community are soon to find themselves perspiring, while in contrast, Penguin 4.0 hasn’t even broken a sweat yet!

The worst thing anyone can do now is to ignore the facts this warning is based upon. If you doubt this article, we implore you to research this across many other well-known and respected SEO information sources. You need to be more prepared than ever before.