Google’s Over-Optimization Penalty, How to Avoid It, and How to Fix It

Comments: 16

  • Interesting article. I’ve been working on building my website for a while and now I will go back and see if I’ve done any of the items listed above.

  • Very well written. I’m glad to see Google voicing official steps about what it likes and what it doesn’t, so people stop obsessing over SEO checklists, the ridiculous but persistent keyword-density issue, and the role of website usability in SEO.
    Personally, I see things converging around usability and content-quality metrics. For those who learned how to write a good research paper, content marketing looks like a bright future. For the rest, I guess it means brushing up on their research and writing skills.
    The interesting thing is that Google addresses the “ugly site design” issue. Granted, aesthetics is difficult to measure, especially since Google sees HTML code, not how the site actually looks. So I’m itching to find out how they will quantify aesthetics. My guess is they may rely on CSS color-palette use and clean, functional navigation, and probably weave in the recent “avoid too many ads above the fold” advice I’ve read somewhere.
    Interesting times, to say the least. Things look quite promising for people who have a strong work ethic in their company ethos.

  • @Alex Havian – negative SEO does not exist. For one, the gig you mentioned is gone. And two, Google’s method of penalization is to de-index the sites your links come from, NOT to penalize your site. I saw this firsthand when my site dropped in rankings because I had so many links from BuildMyRank. My site has since bounced back (evidence of no penalty) after I diversified my link-building strategy.

    @Alex Wall – I see you’ve re-published this article, and neither your source nor you has mentioned where the list of bullets that makes up your “what qualifies for over-optimization” list comes from. Has Cutts or anyone else from Google published such a list?

    Other than suspicion, what grounds are there for providing such a dramatic list without backup?

    • Mike, I syndicate articles from my blog, Searchcore, to B2C, so the “source” to which you are referring is me. I hope that clears one thing up.

      Furthermore, it’s established knowledge that Google uses over 200 signals in its algorithm to filter between good and bad links, and the signals listed are assembled based on my experience, expertise, and general knowledge.

      For reasons apparent, Google is tight-lipped about the qualifiers it uses to parse good from bad, and conclusions drawn from testing and experience are often all we have to go on. I’m sure you can find other lists with similar items, perhaps some different, because all SEOs are not built the same.

      That said, if you don’t want to believe it — don’t. I stand by my findings.

  • How Google is going to find the abusers and differentiate between the good and the bad is a question we are all waiting to have answered. Not only will it be difficult, but I believe it will refocus attention on the SEO basics of on-page optimization, and there is room to define inbound-link relevancy as well. Google will also need to look for websites that have good content but zero SEO, yet surprisingly high PR and high SERP rankings.

    While this is good news for all those who believe in websites with highly relevant content, it could mean another drop in rankings for those who have not understood the search engine’s move to increase relevancy since the Panda update.

    • You’re absolutely right. Things could get ugly from here on out, but I don’t believe that the competing forces of Google and the SEO/SEM community are really going to let that happen. I think that in the long run, these are positive changes, even if they represent short-term complications.

  • So this leaves it open for black-hat operators to spam competitors’ sites and drop them from the rankings? This could be a big mess and make a lot of legitimate businesses suffer.

  • Very informative, Alex. I’m an SEOer but I still appreciate that quality content is moving up the list of things that make a site rank.

    “…is a peninsula of a Pangaea-like continent.” Sorry Alex, you lost me on what I am sure is a very clever metaphor.

  • To the guy who said negative SEO doesn’t exist, I would challenge him to put a URL he owns and a target keyword where his mouth is. It is far easier to drop a site into a black hole than it is to move it higher.

    • Quite so, Mark, and I’m inclined to agree with you. Particularly with these new penalties, which have been masked as a “webspam algorithm update” (also now known as Google Penguin), even associating with questionable sites can tank you in the rankings.

  • These updates by Google really penalized so many websites. I would like to thank Google that my websites were not penalized, because I always follow the rules.
