Recently, Search Metrics released a white paper on its findings for Google’s SEO ranking factors in 2013. It shows the correlation between particular metrics, such as ‘Facebook Likes’ and ‘Backlinks’, and a page’s ability to rank within Google. The 70-page document also includes comparisons with 2012’s figures where that data is available.
To the surprise of few marketers, a lot has changed. According to Search Metrics, it’s time to get social! In this article I touch on some of the most important areas, along with my thoughts on what Google is playing at – for better and for worse.
*The data from Search Metrics is based on correlation research and is not 100% factual. Nor are my opinions facts; they are statements based on what I believe to be happening.
On-Site: Coding & Meta Data
Length of URL
In this graph you can see that the length of the URL is the highest on-page ranking factor. Why would that be? Surely it’s not important how “long” a URL is. Unless, of course, it’s a ranking factor because Google isn’t always trying to rank the best websites – shock horror!
One might just say that this is proof of Google’s underlying attempt to educate and clean up the web. I have a strong suspicion that a lot of Google’s updates aren’t necessarily just for themselves, but instead for a much greater good.
With great power comes great responsibility – something Google has in abundance. This also ties in with my thoughts on the Disavow tool, which I believe is Google’s way of making us do a lot of its groundwork. Why would Google clean up all those links and spend six months changing its algorithm when it could have every webmaster and their grandfather running scared with mops and brooms, cleaning up the linking mess they made, and then simply submitting a list of the remaining faulty links to Google?
Touché Google, touché.
Existence of Description
I found this one particularly interesting – it was certainly a shock to see. Meta descriptions were once upon a time the holy grail (alongside meta keywords) before they were stuffed and devalued by spammers (like everything else, ever).
Because of this, they were discarded as a ranking factor altogether and served no purpose, ranking-wise. Of course, a well-written meta description was still of value to the reader and would help to improve click-through rates. Now it seems that Google has once again brought in an interesting factor – their mere existence. Why is that? And why does it outrank H tags, site speed, and keywords?
I’m thinking my theory of a cleaner, more beautiful web is Google’s aim here. If they make it a ranking factor, webmasters must put more effort into ensuring that part of their website is as good as possible. A great meta description = beautiful search results pages = happier users = more love for Google = more money for Google.
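For anyone who hasn’t looked at one in a while, a meta description is a single tag in a page’s <head>. A hand-written one might look something like this (the site and wording below are purely my own illustration):

  <head>
    <title>Example Widgets | Acme</title>
    <!-- A concise, human-written summary – this is what can appear as the snippet under your listing -->
    <meta name="description" content="Compare Acme's example widgets, read reviews and find the right one for your budget.">
  </head>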
I’ll let you make your own decisions on the others. Moving on…
On-Site: Content
Number of Internal Links
Surprised to see this one at the top? I’m not. The more internal links a website has, the easier it is for Google to understand it and attribute rankings accordingly. Internal linking has always been a strong indicator of which areas of a website matter most – but never before has it been more important than the actual textual content itself.
Then again, what use is the content if Google can’t find it?
HTML Length
The length of a page’s HTML has become an increasingly important ranking factor. Googlebot is no fan of clunky HTML; it has a limited amount of content and code it can absorb when making its decisions. The more unnecessary HTML it has to trawl through, the less content and useful information it absorbs – making it less efficient.
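To give a rough (and deliberately exaggerated) illustration of what I mean by clunky versus clean markup – both snippets below are my own examples, not Search Metrics’:

  <!-- Clunky: presentation crammed into the markup that Googlebot has to wade through -->
  <table><tr><td><font color="red" size="5"><b>Welcome to our site</b></font></td></tr></table>

  <!-- Cleaner: semantic markup, with the styling left to an external stylesheet -->
  <h1 class="welcome">Welcome to our site</h1>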
This once more points in the direction of a much cleaner web environment.
Number of External Links
This is another new high riser – never before has it been so important to link out. In the industry as a whole, links are treated as extremely valuable, and many webmasters seem to think they should never link to other websites – this makes absolutely NO sense.
If you link to others, you are helping Google understand the web – be it for good reason or not. They will always favour a website which links out to many others. Over time Google will begin to perceive your website as a credible source of information for that given subject as well as a source for finding other credible websites within that subject field.
Equally, linking out to other websites is usually practised because a webmaster is looking to give a user more information. Whether it’s a link to learn more about the subject, a sign-up page or a helpful video – it usually involves pointing someone in the direction of ‘more’. If you are giving more, Google sees your website as a great destination for users.
Off-Site: Backlinks
% Backlinks rel=nofollow
For those of you who don’t know, the nofollow attribute was initially introduced as an agreement between the major search engines to help combat spam, mainly from areas like comments and forums.
The general rule was that a website might still be crawled through a nofollow link, but no value would be passed to the receiving website. It was a way of telling search engines – “sure, visit this website, but I don’t particularly vote for them at all”.
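In the markup itself, that ‘vote’ (or lack of one) is nothing more than a rel attribute on the link – something like this (example.com is used purely for illustration):

  <!-- Crawlers may still follow this link, but I'm not vouching for the destination -->
  <a href="http://example.com/" rel="nofollow">a site I don't particularly vote for</a>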
Of course, this too has now been abused, with webmasters being somewhat stingy and nofollow-ing all of their outbound links, so Google has had to take a step back and look at it realistically. A nofollow attribute no longer means the linked website shouldn’t get any credit; it could simply mean that a particular webmaster has used it incorrectly.
Lastly, the attribute is used so widely that if a website has no nofollow links pointing to it at all, Google can tell its backlink profile has been heavily manipulated, which is unnatural. No nofollow links = unnatural link profile = poor rankings. Some nofollow links = natural link profile = good rankings.
SEO Visibility of backlinking URL
I don’t actually have much to say about this except that it’s probably about right – and about time, too! If a link is to pass value to another website, it stands to reason that the linking site’s own visibility in the rankings should be taken into account.
% Backlinks with Stopword
I’m not sure how I feel about this one; it’s not exactly ‘correct’, is it? If my website were called ‘The Website’ instead of ‘Website’, it would rank better for “website” because it would have far more brand and stopword backlinks. I think that’s pretty poor from Google – not cool.
I do agree with the idea that Google should take into account additional words within a backlink.
I do not agree with the idea that Google will only accept it dependent on stopwords.
What if a link contains four words, such as “get awesome car insurance” – will Google only take note of “car insurance” as keywords and treat “get” as a stopword?
From what it sounds like, they’re treating any word that isn’t a stopword (the, is, get, and, etc.) as a traffic-driving, monetised word – which isn’t necessarily correct.
Off-Site: Social
Google +1s
This comes as no shock, as Google keeps piling on the pressure for users – and especially marketers – to use its Google+ service. Making it one of the core ranking factors has been, and will continue to be, a major incentive to hype up the service; with that happening across the industry, it’s only a matter of time before the public begins to use it too.
Facebook: Total
It’s interesting to see just how much trust they’re putting in Facebook to deliver accurate results – a community of people who “like” a picture of a cat doing something stupid rather than re-sharing something important, like a message about gun crime.
A place where avoiding your parents is the correct thing to do, where you can share your thoughts, pictures and ideas – but not your lust for porn or gambling, because those are taboo.
True, I’m generalising – but I don’t feel it’s necessarily the best way to establish whether one website deserves to rank above another. These social signals should be taken into account, for sure, but they should not become the judge and jury of ranking factors. Maybe Google is going for a ‘follow the masses’ approach? I guess it wouldn’t be a surprise if SEO ends up like the mainstream music industry.
Anyway, I digress…
Final Thoughts
Google is playing a dangerous game, and has been for a while. It understands its strength and knows it can say “jump” and we’ll all ask “how high?” For better or worse, it is using the sheer size and dominance it holds over the web to pull strings and manipulate us puppets.
It’ll be interesting to see what happens over the course of 2013 and into 2014 – drastic changes have been affecting the SEO industry for a while.
And in true Viva La Bam style we ask… “What will Google do next? Whatever the f*** they want!”