In this article, we’re looking at how website domains can affect rankings in Google’s search results. We’re exploring some of the factors Google accounts for when cataloguing new domains, such as ownership history, IP address and domain neighbourhood. We’ll then suggest some ways website owners can clean up their domain and make the most of Google’s search results pages.

Photo c/o Jenn Durfey

One of the primary considerations Google uses in determining the ranking of a website is its domain. For businesses, domain quality is very important. Websites with keywords in their domain or subdomain usually rank better, although exact-match domains do not necessarily result in higher rankings. Country code top-level domains (for example, “.de”) help websites to rank higher in a particular country, but may also limit the site’s ability to rank well globally. Domain age is also taken into consideration. “The difference between a domain that’s six months old versus one year old is really not that big”, says Matt Cutts; however, older domains tend to compete for rankings slightly better.

How else does Google measure domain quality? Firstly, it can identify a particularly mischievous spammer and punish other domains owned by them. And if a website uses “private whois” protection, Google might consider it to have something to hide. “Having whois privacy turned on isn’t automatically bad, but once you get several of these factors all together, you’re often talking about a very different type of webmaster than the fellow who just has a single site or so”, said Matt Cutts back in 2006.

Domain registration length is another important factor. “Valuable (legitimate) domains are often paid for several years in advance, while doorway (illegitimate) domains rarely are used for more than a year. Therefore, the date when a domain expires in the future can be used as a factor in predicting the legitimacy of a domain”, reads Google’s “Information Retrieval on Historical Data” patent. In other words, a domain’s registration length should not damage its reputation, but it could potentially help it.

Finally, Google downgrades websites with a dodgy past, “drops” or a volatile ownership history. As a consequence, it can take a long time – and a great deal of effort – for new domain owners to wipe the slate clean.

It happens all the time: a website owner purchases a new domain, only to discover that it has a dodgy history and a bad relationship with Google. It may, for instance, have been temporarily banned or restricted from Google because a past owner used unethical black hat SEO tactics. The waters are muddied further if someone buys the website content and links too.

In a recent episode of his webmaster blog, head of Google’s webspam team, Matt Cutts, addressed the controversial issue of domain history and how it affects a website’s rank in Google’s search results.

“How can we check to see if a domain (bought from a registrar) was previously in trouble with Google? I recently bought, and unbeknownst to me the domain isn’t being indexed and I’ve had to do a reconsideration request. How could I have prevented this?”

That question comes courtesy of “Wally from Reno” – a good one, because it gets to the crux of the matter.

Purchasing a new domain can be a stressful time. It’s like moving your shop into new high street premises: can you guarantee customers will walk through the door? Were the previous owners run out of town? Did it have a rodent infestation?! Of course, website owners always have a choice: you can buy and register a fresh domain, or purchase an existing one from someone else or from a registrar. But, Cutts asks, how can you guarantee that you’ll get what you paid for? And how can you be sure a previous owner didn’t do some damage and then “do a runner”?

Matt Cutts outlines a few important strategies that website owners can adopt to better understand their domain history. “First off, do a search for the domain and do it in a couple of ways”, he says. “Do a “site:” search, so “” – whatever it is you want to buy. If there are no results at all from that domain, even if there’s content on that domain, then that’s a pretty bad sign.”

Google tries to withdraw parked domains from SERPs, so there might not always be results – but if there are no results for a “site:” search then the signs are bad. “Also, just search for the domain name, or the name of the domain minus the “.com”, because you can often find out a little bit of the reputation”, he continues. “Were people spamming with that domain name? Were they talking about it? Were they talking about it in a bad way, like ‘This guy was sending me unsolicited email and leaving spam comments on my blog’?”
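Cutts’ checks boil down to a handful of searches. As a quick reference – using “example.com” as a hypothetical stand-in for the domain you are vetting – they look like this:

```
site:example.com    – is anything from the domain indexed at all?
example.com         – what is being said about the full domain name?
example             – the name minus the ".com", to check its wider reputation
```

If the first query returns nothing despite the site having content, that is the warning sign Cutts describes.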

It’s also a good idea to search for the domain name in other search engines. A website might appear in Bing’s results but not in Google’s, which is obviously bad news (users can also plug in typical spam keywords).

Internet Archive is another useful tool because it visibly shows how previous versions of the website looked. “If the site looked like it was spamming, then that’s definitely a reason to be a lot more cautious, and maybe steer clear of buying that domain name, because that probably means the previous owner has dug the domain into a hole and you have to do a lot of work even to get back to level ground”, Matt comments. “If it’s auto-generated spammy content that’s a really bad sign and may be a reason to avoid that domain.” Occasionally, old website content is blocked from the archive, which might mean that a previous owner or disreputable seller has disabled content access using “robots.txt” – another reason to avoid buying a particular domain.

Photo c/o Wikimedia Commons

Potential domain buyers can also ask the current owner for analytics statistics and Webmaster Tools data. Analytics data enables the buyer to see traffic flow over time and, if it has significantly dropped, that might be symptomatic of dubious activity. “This type of request is definitely becoming more common, especially because of all the headaches it can take to have a successful reconsideration request, especially when you’re in the dark about what exactly has been done to it in the past”, writes Jennifer Slegg from Search Engine Watch.

It’s worth the hassle of research before attempting to get a website back into Google’s index. Before filing a reconsideration request, warns Matt Cutts, “I would ask yourself whether you are buying the domain just because you like the domain name, or are you buying it because of all the previous content, or the links that were coming to it.” Links do not always carry over, and Cutts recommends disavowing all of them if the previous owner was a notorious spammer. A website owner who purchases a previously used domain is running a risk, because cleaning up a bad domain requires a lot of hoop-jumping.


How long do penalties on expired domains last?

Matt Cutts has also spoken about the differences between a “manual action” on an expired domain and an algorithmic action. “If a domain hasn’t really been on the web since 2001, I would expect any manual webspam actions to have expired long ago”, he said. “It’s possible that the domain did some things in 2001 that would lead to algorithmic ranking issues, but the web typically changes enough in 12 years that I’d be surprised if you ran into issues. Typically when you buy a site and run into problems, it’s because someone was spamming more recently with the domain.”

In response, Barry Schwartz at Search Engine Roundtable wrote, “So clearly, if the domain expired years ago, you probably don’t need to worry about a manual action. But, to be safe, log in to Webmaster Tools and see if it still has a manual action. If so, then submit a reconsideration request. I wouldn’t be surprised if, where a domain does have a manual action, some algorithm is also impacting it.”

In his forum post, Matt Cutts suggests that Google’s ranking algorithm probably will not affect a decade-old expired domain, though this is not certain. If the domain is still being penalised, it may be time to find a completely new one.


How do you clean up a bad domain and ask for a reconsideration request?

Websites have to comply with Google’s quality guidelines. Google imposes manual penalty notices on nonconforming websites, and cleaning up a bad domain can be hard work. Website owners have to file a reconsideration request to have penalties removed and return the site to the SERPs.

The first task is to clean up the site. For example, if Google has penalised a website because of an unnatural link profile, those links need to be identified and either removed or disavowed (making clear to Google that you’ve submitted a disavow file). Similarly, if a website has been penalised for an on-site issue then that needs to be addressed. It’s then important to provide proof to Google that you’ve made efforts to clean up the domain; documentation should be thorough and official, otherwise it may be ignored by Google’s webspam team.
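For reference, the disavow file is a plain text file uploaded through Google’s Disavow Links tool. It supports comments, whole-domain entries and individual URLs; the domains below are hypothetical placeholders:

```
# Contacted these sites to request link removal on 2013-10-01; no response.
# Disavow everything from a known spam domain:
domain:spamdomain-example.com
# Disavow a single offending URL:
http://anotherspam-example.com/bad-links-page.html
```

Keeping the comment lines as a record of your removal attempts also doubles as the documentation Google expects to see in a reconsideration request.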

New website owners need to demonstrate that they will be more committed to following Google’s rules than the previous owners.


What are “bad neighbourhoods”?  

Web hosting services often host several different websites on the same server. Search engines identify websites by their IP address (domain names are only an add-on), and IP addresses can be either shared or dedicated. Association with a “bad neighbourhood” can have an adverse effect on the search ranking of an honest website – just as a bad domain history can. However, unlike in real life, a website might not know it has bad neighbours until it has faced an unexpected SERP demotion.

Below our SMG technical team talks you through how to do a “neighbourhood” check. Is yours good or bad?

“Firstly, find the website you want to check. We’ll use

Press the Windows key to open the Start menu, type “CMD” and press Enter.

A black command prompt should appear.

Type “ping” followed (after a space) by the website you want to test. So, in this case, type “ping” followed by “” (omit the “http://” at the front; it must be the root domain only, with no “/” on the end).

Press enter. The number after your website address is the IP address – remember it.

Now go to

In the search-box, type “ip:” followed by the IP address (without any spaces).

So, in our case: “ip:”.

Done. Meet your new neighbours.”
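The lookup step above can also be scripted. Here is a minimal Python sketch that mirrors the “ping” step – it resolves a root domain to its IP address, stripping the scheme and trailing slash as the instructions require (“example.com” is a hypothetical placeholder):

```python
import socket

def resolve_ip(domain):
    """Resolve a root domain to its IPv4 address, mirroring the "ping" step.

    The domain must be the root domain only, so we strip any "http://"
    scheme and trailing "/" before resolving, as the instructions above say.
    """
    domain = domain.removeprefix("http://").rstrip("/")
    return socket.gethostbyname(domain)

# Hypothetical usage -- substitute the domain you are vetting:
# resolve_ip("example.com")
```

The returned address can then be pasted into the “ip:” search described above to list the other sites sharing that server.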

In March 2013, a study by the University of Twente’s Centre for Telematics and Information Technology found that, of 42,000 surveyed Internet Service Providers (ISPs), just twenty were responsible for half of all spammy internet addresses.

The study proposes that a small number of “bad neighbourhoods” are accountable for the majority of online spam, phishing and other dodgy activity. “Just like in the real world, the internet also has ‘bad neighbourhoods’ whose streets are not safe and where crime rates are higher than in other districts”, explains Science Daily.

The chief researcher, Giovane Moura, monitored and analysed network data in order to investigate malicious hosts, and concluded that malicious activity on the internet is concentrated in small areas – particularly where IP addresses show similarities. “For instance, [Moura] found that 62% of the addresses at one ISP were related to spam [and] this knowledge can be used to link security measures to specific ISPs.”

Moura highlights the importance of distinguishing between individual IP addresses – which might launch one-off attacks – and larger “bad” neighbourhoods which constantly launch attacks. Moreover, the study also identifies that certain types of undesirable activity are associated with certain parts of the world. Phishing, for example, is common in the USA and other more developed countries, because they are home to the majority of data centres and cloud providers. Spam, on the other hand, mainly originates from countries in southern Asia. Alarmingly, the BRIC countries are a ticking spam bomb: if India had the same internet penetration rate as the US while maintaining its current proportion of malicious IP addresses, there would be 200% more spammy IP addresses on the web.

It’s not difficult to avoid linking your website to a bad neighbourhood. Website owners should check outbound links regularly, although Google is unlikely to punish you for a few minor mishaps. A website can also be linked to a bad neighbour by proxy, by linking to a site which itself has questionable links – so website owners need to be aware of that too. Of course, you can always use “nofollow” if you do not trust the site you are linking to, or “Google Safe Browsing” to find out whether a site has been flagged as unsafe or has a penalty imposed upon it.
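In practice, “nofollow” is just an attribute on the link itself. A minimal example, with a placeholder URL standing in for the untrusted site:

```html
<!-- rel="nofollow" tells search engines not to pass ranking credit through this link -->
<a href="http://example.com/untrusted-page" rel="nofollow">a site we don't vouch for</a>
```

This lets you reference a site without vouching for it in Google’s eyes.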



In short, there are numerous ways to check the health and reputation of a website domain. Bad neighbours can drive down the property values of an entire neighbourhood; nobody wants that. Therefore, it is certainly worth investing in a dedicated host server and regular link audits.

The web is built on links and when they are used properly everyone can benefit. Google is very quick to punish sites associated with spammy on-page SEO techniques, as well as dodgy backlink and interlink profiles. Website owners who remain vigilant, however, should have little to worry about.
