There is quite a buzz in the SEO community: Google may have ended the long wait since Penguin 3.0, way back in October 2014. Is it Penguin, or just the newest expression of the changes we have seen so often in the last couple of months?
According to MozCast and a bunch of other tracking tools, something definitely happened on Friday.
One of My Websites Impacted
One of my websites was impacted by it, mostly positively. It’s an affiliate website, but a high-quality one at that.
- Keywords positively impacted: these were keywords that already had some links. I am seeing the biggest jump for two keywords, each with only a handful of links (fewer than five). These jumped from, e.g., position 15 to position 5, which is quite significant. On the other hand, I saw a similar jump for another keyword a couple of weeks ago, which makes me lean towards a (gradual) Panda update. The smaller improvements were probably the result of websites above mine being devalued.
- Keywords negatively impacted: only two keywords moved down, and not by much. It’s probably because other websites have been promoted above mine. Still, it’s a little painful to fall back from number 6 to number 10.
What is Penguin about?
It is a non-manual “penalty” that Google won’t call a penalty. It’s basically another algorithm layered on top of all the others, and it analyzes linking data. What exactly about that data? Nobody knows for sure, but we have pretty good guesses.
Anchor Text Concentration
Before Penguin it was very easy to spam links all over the internet with the exact anchor text you wanted to rank for. There is a problem with that: if you check big websites, their anchor text distribution is very diverse. There are also a lot of “useless” anchors occurring naturally, like “click here”, “this”, etc.
Based on my personal experience, Penguin definitely looked at this. Whether there is a more sophisticated part to it is hard to tell.
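To make the idea concrete, here is a minimal sketch of how anchor text concentration could be measured on your own backlink export. All names and the sample data are hypothetical; the 30% threshold is an illustrative assumption, not a known Google value.

```python
from collections import Counter

def anchor_text_distribution(anchors):
    """Return each anchor text's share of the total link profile."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {text: count / total for text, count in counts.items()}

# A hypothetical link profile, heavily concentrated on one money keyword.
profile = (["best blue widgets"] * 70 + ["click here"] * 10 +
           ["example.com"] * 10 + ["this"] * 5 + ["Example Site"] * 5)

dist = anchor_text_distribution(profile)
# Flag any anchor holding more than, say, 30% of all links.
suspicious = {text: share for text, share in dist.items() if share > 0.30}
print(suspicious)  # {'best blue widgets': 0.7}
```

A natural profile would spread its weight across branded, naked-URL, and “useless” anchors; a single exact-match anchor dominating like this is the pattern Penguin was widely believed to target.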
They can analyze:
- link velocity: how many links you have acquired in what time period, and what the trend is,
- the relevance of the linking page,
- the onsite quality of the linking page (e.g. design, content length, grammar),
- its link profile (contagious bad links),
- and many more signals along these lines.
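The first of those signals, link velocity, is easy to sketch: bucket your link acquisition dates by month and look at the trend. This is a hypothetical illustration on made-up dates, not how Google actually computes it.

```python
from datetime import date

def monthly_link_velocity(link_dates):
    """Count new links per (year, month) to expose the acquisition trend."""
    velocity = {}
    for d in link_dates:
        key = (d.year, d.month)
        velocity[key] = velocity.get(key, 0) + 1
    return dict(sorted(velocity.items()))

# Hypothetical acquisition dates: a sudden burst in March stands out.
dates = [date(2016, 1, 5), date(2016, 1, 20),
         date(2016, 2, 11),
         date(2016, 3, 1), date(2016, 3, 2), date(2016, 3, 3),
         date(2016, 3, 4), date(2016, 3, 5)]

print(monthly_link_velocity(dates))
# {(2016, 1): 2, (2016, 2): 1, (2016, 3): 5}
```

A steady trickle looks natural; a sudden spike like March’s is the kind of pattern an algorithm looking at velocity would presumably treat as suspicious.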
What is Panda About?
Panda is all about the onsite quality of websites. Think design, content, user experience and usage metrics.
Again, nobody knows exactly except a small group of Googlers.
What’s weird about the gradual rollout (and I have seen nobody mention it) is that Panda is a sitewide filter. How can they apply a sitewide filter gradually if it’s based on the whole site’s evaluation? My only guess is that they still evaluate the whole website, then artificially slow down the rollout so you have a hard time reverse engineering what’s going on.
So, Is This Penguin?
So was this Penguin? I would love to say yes (because I don’t like the looming danger), but I have to say it probably wasn’t. I have seen these kinds of changes for the last couple of weeks (even months), although on a smaller scale.
My guess is it’s either Panda or, more likely, a core algorithm update.
Update 07.09: Google’s John Mueller said yesterday in a YouTube live session that the weekend update was not Penguin.
A lot of people are saying that this update helps in-depth articles and negatively impacts shallow ones. It could also be a RankBrain update, interpreting user intent (whether the user is looking for an in-depth article).