Google is often criticized for how it handles spammy links, but columnist Ian Bowden believes this criticism may not be fair. Here, he takes a look at the challenges Google might confront in handling the continuing issue of paid links.
Prior to the recent arrival of Penguin 4.0, it had been nearly two years since Penguin was last updated. By the summer, some in the industry had given up on Google ever releasing Penguin 4.0. But why did it take so long?
I’d argue that criticism directed at Google is in many instances unjustified, as critics often take an overly simplistic view of the task the search engine faces.
Detecting and dealing with paid links is far more difficult than many people believe, and there are likely good reasons why Google took longer than hoped to release the next iteration of Penguin.
Here are some of the challenges Google may have faced in pushing out the latest Penguin update:
1. It has to be good at finding paid links
To set up and run an effective Penguin update, Google has to be able to determine (algorithmically and at scale) which links break its guidelines. It’s not clear to what extent Google is capable of this; there are lots of case studies showing that links that break the guidelines continue to work.
However, not all paid links are created equal.

Some are blatant and carry obvious on-page signs of being paid for.

Others may have no telltale signs on the page at all, so determining whether or not they are paid links comes down to discovering patterns across many pages and sites.
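To make the idea of pattern-based detection concrete, here is a minimal, purely illustrative sketch. All of the signal names, fields and thresholds are hypothetical inventions for this example; real detection operates at web scale with vastly richer features. The point it demonstrates is that no single page-level signal is conclusive, so a link is flagged only when several independent signals agree.

```python
# Illustrative only: a toy heuristic for flagging links that *might* be paid.
# Every field name and threshold here is a made-up assumption for the sketch.

def paid_link_signals(link):
    """Return the list of suspicion signals present for a single link (a dict)."""
    signals = []
    # Exact-match commercial anchor text is a classic paid-link footprint.
    if link["anchor_text"].lower() == link["target_keyword"].lower():
        signals.append("exact_match_anchor")
    # Pages stuffed with external links often sell placements.
    if link["external_links_on_page"] > 50:
        signals.append("high_outbound_density")
    # One site linking out with optimized anchors to many unrelated domains is
    # a pattern-level signal that is invisible on any single page.
    if link["unrelated_domains_linked"] > 10:
        signals.append("links_to_many_unrelated_domains")
    return signals

def looks_paid(link, threshold=2):
    """Flag a link only when several independent signals agree."""
    return len(paid_link_signals(link)) >= threshold
```

The design choice mirrors the article’s argument: a single footprint (say, exact-match anchor text) is weak evidence on its own, which is exactly why sophisticated paid links with no on-page tells are so much harder to catch.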
The reality is that these more sophisticated paid links will be challenging for Google to penalize or to devalue.
Penguin has historically targeted very low-quality web spam, as it is simpler to identify and classify, but the level above this remains an opportunity. Google must have confidence in its detection capability before implementing a filter, given the severity of the consequences.
2. Google depends on links for the best-quality search results
Maybe, just maybe, Google is actually capable of detecting paid links but chooses not to devalue all of them.
Most people will be familiar with third-party tools that perform link audits to assess which links are “toxic” and potentially harming search performance. Users understand that these tools sometimes get it wrong, but they are normally quite good.
Google has experimented with removing links from its ranking algorithm, with negative effects on the quality of search results. It would be fascinating to see how the quality of search results varies as Penguin’s threshold for spammy links is adjusted.
It’s not impossible that even though particular links aren’t compliant with the webmaster guidelines, they assist Google in its own number one aim: returning the best-quality search results to users. For the time being, such links might be useful to Google.
3. Negative SEO remains a reality
Even if Google is convinced that a link has been orchestrated, it is very difficult for the search engine to be sure whether this was done by the webmaster or by someone else carrying out a negative SEO campaign.
If a penalty or drop in visibility were as simple to incur from a handful of paid links as it is in theory, it would be pretty straightforward to perform negative SEO on competitors. The barriers to doing this are quite low, and the footprint is minimal.
Google has attempted to mitigate this problem with the launch of the disavow tool, but it isn’t realistic to expect all webmasters to know about the tool, let alone use it accurately. This remains a challenge for Google in tackling paid links.
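For webmasters who do know about the tool, the disavow file itself is just a plain-text list uploaded through Google Search Console: comment lines start with `#`, individual URLs are listed one per line, and a `domain:` prefix disavows every link from an entire domain. The domains below are placeholders:

```text
# Disavow file uploaded via Google Search Console.
# Lines beginning with "#" are comments and are ignored.

# Disavow a single spammy page:
http://spam.example.com/paid-links-page.html

# Disavow every link from an entire domain:
domain:link-seller.example.net
```

The simplicity of the format is deceptive: deciding *which* links belong in it accurately is exactly the judgment call many webmasters get wrong.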
4. It draws unwanted attention and a PR backlash
When rolling out large algorithm updates, it’s inevitable that there will be false positives or harsh punishments for minor violations. After any rollout, there will be a number of “corrections” as Google measures the impact of the update and attempts to tweak it.
Despite these corrections, a large number of companies will suffer as a consequence of these updates. Those who frequently join Google Webmaster Hangouts will be used to business owners, almost in tears, pleading for more information and describing the overwhelming impact of a recent update.
While the vast majority of Google users are most likely unaware of, or indifferent to, the fallout from algorithm updates, these scenarios provide Google with a certain amount of negative PR. Any noise suggesting that Google wields too much power is unwanted attention.
On a related note, occasionally penalties are simply not viable for Google. When people walk down Main Street, they expect to see certain retailers. It’s precisely the same with search results: users going to Google expect to see the top brands.
The user doesn’t really care whether a brand is missing because of a penalty. Users will take it as a reflection on the quality of Google rather than on the brand’s non-compliance with the guidelines.
To be clear, that’s not to say that Google never penalizes huge brands — Sprint, JCPenney, the BBC and plenty of other large brands have all received high-profile manual penalties in the past.
But Google does have to consider the impact on the user experience when deciding how to weight various sorts of links. If users don’t see the websites they expect in search results, they may switch to another search engine.
How Google deals with the problem
The preceding four points highlight some of the challenges Google faces. Few things are more important to Google than meeting its objective of returning the most useful results to its users, so it has a massive interest in dealing with paid links.
Here are some ways Google could address the challenges it faces:
1. Favor devaluing links and issue fewer penalties
Penalties act as a deterrent to breaking the guidelines, and they serve to improve the quality of search results by demoting results that were artificially boosted. Much of the risk of “getting it wrong” can be mitigated simply by devaluing links rather than imposing manual penalties.
In the case of a negative SEO attack, the spammy links, rather than triggering a penalty for a website, would simply not be counted. In theory, this is the goal of a disavow file. Penalties could be reserved for only the most egregious offenders.
2. Do a slow rollout combined with other updates
Slowly rolling out the Penguin 4.0 update gives Google two advantages. First, it softens the shock of the update: there is no single week in which some high-profile brands suddenly lose visibility, drawing attention to the update.
Second, it enables Google to examine the impact of the update and adjust it over time. If the update is too brutal, Google can tweak the parameters.
To add to the confusion and make it more difficult to understand the impact of Penguin 4.0, it is likely that Google will roll out other updates at the same time.
If you cast your mind back two years to the introduction of Panda 4.1 and Penguin 3.0, they were rolled out almost in conjunction. This made it more difficult to understand what their respective impacts were.
There was lots of SERP fluctuation this September. It is possible that part of this change can be attributed to Penguin 4.0 testing, but there is no way to be sure, given the number of other updates happening at the same time (such as the local update dubbed “Possum”).
3. Encourage a culture of fear
While the threat of receiving a penalty is the same as it was five years ago, the fear and anxiety of receiving one is considerably greater among brands. High-profile penalties have not only served their purpose of punishing the offending brand; they have also provided a powerful deterrent to anyone else considering this kind of strategy.
The transition to content marketing, and SEO becoming less of a black box, have helped in this, but the culture of fear has been a large driver of the decrease in paid link activity.
Google is frequently criticized for not doing more to handle paid links, but I believe that criticism is unfair. One can be more forgiving when one considers the challenges search engines face in handling paid links.
However, the fact that Penguin now runs in real time will make it more difficult for webmasters to know whether a loss in rankings is due to spammy links or something else — so webmasters will have to be vigilant about monitoring the health of their backlink profiles.
I suspect that Google will continue to make tweaks and adjustments to Penguin after the rollout is complete, and I expect to see a continued shift from penalties to devaluing links over time.