The latest update to Google’s Panda algorithm, which affected roughly 1.6% of searches according to Google’s tweet, claimed a substantial chunk of traffic from Google’s own blogging service, Blogspot.com. The image below shows the traffic profile for Blogspot.com versus its counterpart, WordPress.com. Both are open blogging platforms used by thousands of publishers around the world.
The image suggests that the drop in traffic Blogspot experienced right around the time the latest iteration of Panda rolled out was caused by lost organic rankings on some pages hosted on Blogspot. (Note: Alexa traffic stats lag by about one week, so keep this in mind if you replicate this image over a shorter period.) From what we know about Panda, we can assume that those pages lacked quality.
While the focus of this article is not on how Google Panda “calculates” quality metrics for a page or how it operates domain-wide, this case does allow a few general observations about Panda.
First of all, Blogspot is an open publishing platform. Just like on article directory sites, most of which have been hit hard by Panda, anyone can publish their content on Blogspot. Thus, we can expect to find some great posts on Blogspot and some that are not so great. The Google webmaster blog is hosted on Blogspot and if you need an example of a great quality blog, check it out now!
Panda uses a machine learning algorithm to sort poor quality pages from good quality ones. We know that it takes a close look at on-page factors first and off-page factors second. These on-page factors include many little details that you may not notice right away but they are there and they count when Panda comes around.
An interesting question is why WordPress.com was not affected. It seems unlikely that it has fewer low quality blogs hosted on it than Blogspot. What do you think?
Both platforms use the same subdomain structure – take note. Hubpages.com switched to subdomains after it was severely punished by Panda and staged a respectable comeback, while the famous Ezinearticles.com did not and is still in the doldrums.

Why does this matter? I think a domain-wide Panda penalty works somewhat differently depending on whether a UGC site uses subdomains. At the same time, it doesn’t really make sense to penalize an entire domain and all of its content just because it hosts a few poor quality posts. This is actually a central dilemma for Google, and one that shapes its corrective measures against spam. At this point, it seems that once a site crosses a threshold for the percentage of bad quality pages, the entire domain gets a slap (a trust score adjustment). There are indications, however, that this is changing and that Google is becoming more sensitive to individual pages.
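To make the threshold idea concrete, here is a toy sketch in Python. It is purely illustrative – Google’s actual scoring and thresholds are not public, and the quality scores, threshold value, and penalty multiplier below are all made-up assumptions – but it shows why a subdomain structure changes the outcome of a domain-wide threshold rule: each blog gets judged on its own pages instead of sinking or swimming with the whole platform.

```python
# Toy model of a domain-wide quality threshold (illustrative only --
# all numbers here are invented assumptions, not Google's real values).

def trust_adjustment(pages, threshold=0.2):
    """Return a hypothetical trust multiplier for a set of pages:
    penalize when the share of low-quality pages exceeds the threshold."""
    low = sum(1 for quality in pages if quality < 0.5)
    return 0.5 if low / len(pages) > threshold else 1.0

# One flat domain: good pages share the penalty with bad ones.
flat_domain = [0.9, 0.8, 0.3, 0.2, 0.1]      # 3 of 5 pages are low quality
print(trust_adjustment(flat_domain))          # 0.5 -- whole domain slapped

# Subdomain structure: each blog is scored on its own pages.
subdomains = {
    "goodblog": [0.9, 0.8, 0.85],
    "spamblog": [0.3, 0.2, 0.1],
}
for name, pages in subdomains.items():
    print(name, trust_adjustment(pages))      # goodblog 1.0, spamblog 0.5
```

Under this toy rule, the good blog keeps its full trust score even though it lives next door to a spam blog on the same platform.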
Let’s dig a little deeper to see whether Blogspot actually experienced a downward domain-wide trust score adjustment – a Panda penalty, if you will. It is not difficult to check in Alexa where visitors spend most of their time on Blogspot. A natural candidate is, of course, Google’s own blog, googleblog.blogspot.com, which attracts about 0.4% of Blogspot visitors. The test we are about to run is quite simple.
There is no doubt that Google’s own blog is considered high quality by everybody and, above all, by Googlers themselves. Therefore, the blog should not have lost rankings due to the latest Panda iteration unless a domain-wide penalty was applied to Blogspot. To run this test, we will need to look at before-and-after rankings for Googleblog.
I am going to use the iSpionage keyword tracking tool – a must-have for every serious SEO. It reports the average organic Google position of any domain or subdomain for a particular keyword. Since we are lucky enough to have caught this traffic change for Blogspot before the tracked positions were updated, we can simply compare them against the current positions in Google. How can I be sure the positions have not yet been updated? I ran the same check for another blog hosted on Blogspot that was “pandalized”. The results are plotted below.
As you can see, the “pandalized” blog (name undisclosed) was pushed down several notches for most keywords in my random sample (I did make sure to include a few highly competitive keywords in both samples). For Google’s official blog, this was not the case: there were only three keywords for which the blog’s rankings shifted down more than three positions. These changes can plausibly be attributed to stronger performance by other sites competing for the keywords in question. You cannot make the same case for the “pandalized” blog.
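The comparison above boils down to counting, for each blog, how many keywords dropped by more than a few positions between the two snapshots. A minimal sketch of that check in Python follows – the keyword names and position numbers here are hypothetical stand-ins, since the real data came from the iSpionage tool and the actual keywords are undisclosed:

```python
# Sketch of the before/after ranking comparison described above.
# All keyword data below is invented for illustration.

def ranking_drops(before, after, threshold=3):
    """Count keywords whose average position worsened by more than
    `threshold` places (a higher position number = a worse ranking)."""
    return sum(1 for kw in before if after[kw] - before[kw] > threshold)

# Hypothetical sample for a "pandalized" blog: broad, large drops.
pandalized_before = {"kw1": 4, "kw2": 7, "kw3": 2, "kw4": 11}
pandalized_after  = {"kw1": 12, "kw2": 15, "kw3": 9, "kw4": 13}

# Hypothetical sample for Google's official blog: mostly stable.
googleblog_before = {"kw1": 1, "kw2": 3, "kw3": 5, "kw4": 2}
googleblog_after  = {"kw1": 2, "kw2": 3, "kw3": 9, "kw4": 2}

print(ranking_drops(pandalized_before, pandalized_after))  # drops on most keywords
print(ranking_drops(googleblog_before, googleblog_after))  # only an isolated drop
```

A domain-wide penalty would show up as the first pattern on both subdomains; what we actually observe is the first pattern on the “pandalized” blog and the second on Googleblog.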
From this analysis, which is certainly not free of bias, we can conclude that Blogspot was not penalized at the domain level in the sense understood by the many webmasters who experienced Panda’s penalties first-hand. This should not come as a surprise. The Googlers who created Panda are the same people who designed Blogspot’s structure and who host their own blogs on Blogspot. It would defy logic to apply a harsh domain-wide penalty to the site that hosts their own high quality blogs – hence the subdomain structure.
Furthermore, the Panda algorithm seems to be becoming more sensitive to individual pages at the query (keyword) level. This means that your article posted on, say, Ezinearticles.com or eHow.com – domains that Panda doesn’t like – can still rank well on its own if it meets certain quality metrics used by Panda.
Finally, if you do not yet have a blog on Blogspot, it is time to fire one up…