Over the last four-plus years, we've learned a lot about Penguin. When it was first announced in April 2012, we were told that this algorithm update, designed to combat web spam, would impact three percent of queries.
More recently, we've seen frustration from penalized website owners at having to wait over a year for an update, especially after Google expressly noted in October 2015 that one was coming "soon."
In all the years of discussion around Penguin, however, I don't think any update has been more filled with confusing statements and misinformation than Penguin 4.0, the most recent one. The biggest offender here is Google itself, which hasn't been consistent in its messaging.
And that is the subject of this post: peeling away some of the recently misstated or misunderstood facets of this update, and more to the point, what it means for website owners and their SEOs.
So, let's begin.
What is Penguin?
Note: You can also browse Search Engine Land’s Penguin Update section for all the posts written here on this issue.
The Penguin algorithm update was first announced on April 24, 2012, and the official explanation was that it targeted web spam in general. However, since the largest losses were incurred by those engaged in manipulative link schemes, the algorithm came to be viewed as one designed to punish sites with bad link profiles.
I'll work from the premise that I shouldn't bore you with further details on what the algorithm was designed to do, and leave it at that. Let's move on now to the confusion.
Where’s the confusion?
Until Penguin 4.0 rolled out on September 23, 2016, there actually wasn't a lot of confusion around the algorithm. The whole SEO community, and even many outside it, understood that the Penguin update demoted sites with bad links, and that an affected website couldn't expect any semblance of recovery until the algorithm was next updated.
However, things got more complicated with this recent update — not because the algorithm itself got any harder to understand, but because the messaging from the people at Google did.
In essence, there were just a couple of major changes with this update:
Penguin now runs in real time. Webmasters impacted by Penguin will no longer have to wait for the next update to see the results of their improvement efforts — changes will be noticeable considerably more quickly, generally not long after a page is recrawled and reindexed.
Penguin 4.0 is "more granular," meaning it can now impact individual pages or sections of a site in addition to entire domains; previously, it acted as a site-wide penalty, impacting rankings for an entire website.
At first glance, it would appear there isn't much room for confusion here. However, when the folks at Google started giving advice and adding details, things got muddier. So let's look at those details to get a better understanding of what we're expected to do.
Disavow files
The confusion here stems from a change in how Penguin 4.0 deals with bad links: Google now devalues the links themselves rather than demoting the website they're pointing to.
Now, that seems pretty clear. If you read Gary Illyes' statements on the subject, there are a few takeaways:
Spam is devalued, rather than sites being demoted.
There's less need to use a disavow file for Penguin-related ranking issues.
Using the disavow file for Penguin-related issues can help Google help you, but it is especially useful for websites under manual review.
The takeaway here is that the more things change, the more they stay the same. In practice, nothing has changed. If you've used unethical link-building strategies in the past and are contemplating submitting a disavow file — good, you should do that.
If you haven't used such strategies, then you shouldn't need to; if Google finds bad links pointing to your site, it will simply devalue them.
Of course, it was also claimed that negative SEO doesn't work, implying that a disavow wasn't necessary for bad links you didn't build yourself. That was clearly not the case — negative SEO did work (and may well still).
So you should continue to monitor your links for bad ones and add them to your disavow file periodically. After all, if poor links couldn't negatively affect your website, there would be no need for a disavow tool at all.
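For reference, a disavow file is just a plain-text list submitted through Google Search Console: one full URL or one `domain:` directive per line, with lines starting with `#` treated as comments. The domains below are made-up placeholders, not real examples:

```
# Hypothetical disavow file (example domains only)
# Disavow all links from an entire domain:
domain:spammy-directory.example.com
# Disavow links from a single page:
http://low-quality-blog.example.net/paid-links.html
```

Keep in mind that uploading a new file replaces the previous one, so maintain a single running list rather than submitting only the newly found links.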
And so, the more things change, the more they stay the same.
The source site?
In a recent podcast over on Marketing Land, Gary Illyes clarified that under Penguin, it's the source site that matters, not the target site. And this doesn't just cover the links themselves, but other signals a page sends that indicate it's likely spam.
So what we were just told is that the value of a link comes from the site and page it sits on, not from where it's pointing. In other words, when you're judging your inbound links, make sure you look at the source page and domain of those links.
The more things change, the more they stay the same.
Your links are labeled
In the same podcast on Penguin, it came to light that Google places links on a page into groups, including things like:
footer;
Penguin-influenced; and
disavowed.
It was suggested that there are other groups as well, though they weren't named. What does this actually mean?
It largely confirms what we've understood to be going on for about a decade.
We also already understood that disavowed links were flagged as such.
There is one aspect that is new, however.
Essentially, it seems that where previously content as a whole may have been categorized, with links included in that categorization, each link is now given one or possibly multiple labels of its own.
The link tagging takeaway
Knowing whether a link is being labeled by its position on the page, or whether it has or hasn't been disavowed, isn't especially actionable. But from an SEO's standpoint, we have to ask ourselves, "What has actually changed?"
Nothing. You should still be working to build highly visible links, placed contextually where possible and on relevant websites. If that strays far from what you were doing, you probably weren't doing your link building correctly to begin with. I repeat: the more things change, the more they stay the same.
But Penguin penalties are different now, right? Or… ?
It turns out that Penguin penalties are treated very differently in 4.0 than they were previously. In a conversation with Google's Gary Illyes, he disclosed that there is no sandbox for a website penalized by Penguin.
So basically, if you get hit with a Penguin penalty, there is no trust-based delay in recovery: once you fix the issue and your site is recrawled, you should bounce back.
That said, there's something ominous about Illyes' final statement. Penguin itself will not require or impose a sandbox or trust-based delay… but that's not to say there aren't other functions in Google's algorithm that do.
In other words, you can still incur penalties, and while they may not be Penguin-related, there may or may not be delays in recovering from them. Sound familiar?
The more things change, the more they remain the same
While this was a major update with a few significant changes, what it ultimately means is that our SEO process hasn't really changed at all. Our links will get picked up faster (both the good and the bad), and penalties will likely be doled out and rolled back considerably more reliably; however, the links we need to build and how they're weighted stay pretty much the same (if not identical).
I have no doubt this will eventually change.
But when that time comes, the machines will be looking for link quality signals and for maximized relevancy and user experience.