A really great website is one that delivers a personalized user experience. Google knows this and is working to personalize its search engine results. The more your content caters to the user searching for a solution you provide, the more likely Google is to drive traffic to your site.
In this week’s Ask An SEO Expert, Jesse Laffen outlines four points to keep in mind when personalizing your site to ensure no harm is done to your SEO efforts.
Here’s a question from Michelle: “As a site is developed to deliver a personalized experience, what steps should be taken to ensure there is no harm done from an SEO perspective?”
This is a nuanced question, and a really good one. There are a lot of reasons to want to personalize your pages. Google does it all the time: it shows me Google+ results, things it thinks my friends like, and results tailored to the part of the country I’m searching from, to make sure they’re relevant to me.
Over time, Google has pushed more and more toward personalized results, because it’s a better experience. People want information that is germane to them and to what they’re looking for right now. So personalization is certainly a great step to take.
The first point here is the risk/reward. The risk you run is that Google and other search engines have always been a little hesitant to trust sites that show different content to a user than to a search engine. They want to know that whatever their spider sees and indexes is what the site is actually showing its end users. It makes a lot of sense: if I show up to sell you a Corvette and when you get there it’s a Pinto, you’re going to say, “What’s going on, Jesse? Why did you do that?”
Search engines are the same way. They’ve always been a little distrustful of that practice, which is called cloaking. There’s another side to this, though: certain innovations in web technology are genuinely much better for the user, so Google, Bing, and other search engines try to index and understand more and more content in order to catch up to that really good user experience.
Given Google’s preference for personalizing its own results, I have to believe personalization hits that threshold: it’s something happening on the Web that search engines really want to understand. Overall, there’s probably not a ton of risk here, but there is a little, because you are potentially showing two different things to two different people, and maybe another thing entirely to a search engine.
In order to minimize that risk, which is point number two, try not to show radically different pages on the same URL to two different viewers. If somebody says, “I’m really interested in giraffes,” don’t serve up a page about horses. That just doesn’t make sense. Search engines are very good at understanding patterns; that’s what they do. Make sure that you compartmentalize the part of the content that actually changes, if at all possible.
If I’m a search engine and I see a million pages that all look about the same except for one section that changes based on preference, that’s one thing. If all of the content on the entire page is always different, that’s something else entirely. As a search engine, I probably don’t trust the latter page, whereas the former I can start to understand and pick apart. That’s a good thing.
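One way to picture that compartmentalization is a page template where the core content is identical for every viewer and personalization lives in one clearly separated block. This is only a sketch, with made-up page content and helper names, not a prescription:

```python
def render_page(topic_body, visitor_prefs=None):
    """Render a page whose core content is stable for every viewer
    (including search engine spiders); personalization is confined
    to one clearly separated block."""
    personalized = ""
    if visitor_prefs:
        interests = ", ".join(visitor_prefs.get("interests", []))
        personalized = "<aside>Recommended for you: {}</aside>".format(interests)
    # The <main> block is identical for everyone; only the <aside> varies.
    return "<main>{}</main>{}".format(topic_body, personalized)

# A spider and a logged-in visitor see the same core content:
print(render_page("All about giraffes"))
# -> <main>All about giraffes</main>
print(render_page("All about giraffes", {"interests": ["giraffes", "zebras"]}))
# -> <main>All about giraffes</main><aside>Recommended for you: giraffes, zebras</aside>
```

The point of the design is that a crawler with no preferences and a returning visitor with preferences still agree on the bulk of the page, which is the pattern a search engine can understand and pick apart.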
Number three: if you are using tracking IDs, dynamic URLs, or otherwise varying URLs that serve similar content, make sure you can canonicalize them back to one source. That’s an easy way to tell the search engine, “Yes, we understand there’s duplicate content here, but it’s okay, because here’s its home, and we had a good reason to do it.” Search engines like that, because they can say, “All right, I don’t have to worry about storing these vast amounts of similar content, because you’ve already shown me exactly what I need to serve up in my search results.”
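To make that concrete, here is a minimal sketch of mapping URL variants back to one canonical home and emitting the standard rel="canonical" tag. The tracking parameter names and example URLs are hypothetical; substitute whatever your site actually appends:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical tracking parameters this site appends; adjust to your own setup.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "sessionid", "ref"}

def canonical_url(url):
    """Strip tracking parameters so every URL variant maps back to one source."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

def canonical_tag(url):
    """Emit the <link rel="canonical"> tag to place in the page's <head>."""
    return '<link rel="canonical" href="{}" />'.format(canonical_url(url))

print(canonical_tag("https://example.com/giraffes?sessionid=abc123&page=2"))
# -> <link rel="canonical" href="https://example.com/giraffes?page=2" />
```

Note that meaningful parameters (like pagination) survive, while session and campaign noise is collapsed, which is exactly the “here’s the home of this content” signal described above.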
Step number four: specify your static pages in a sitemap. If you are going to change a lot of the content on your pages, or serve up different URLs with different content on them, make sure you’re also serving a clean, easy-to-understand XML sitemap, so a search engine can parse it and know, “These are the places where I can go to find the original sources of this content.”
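A bare-bones version of that sitemap can be generated with the standard library. The URLs here are made-up examples standing in for the canonical, non-personalized homes of your content:

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap listing the static, original-source pages."""
    NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=NS)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical static pages -- the canonical homes of the content,
# not the personalized or tracking-ID variants.
print(build_sitemap([
    "https://example.com/giraffes",
    "https://example.com/horses",
]))
```

Listing only the canonical URLs here, rather than every personalized variant, reinforces the canonicalization from point three.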
On the whole, I think that as long as you minimize how much of the page changes and really focus on your user, it’s hard to go wrong with a search engine, because they want to return great pages all the time. Personalized pages are usually really good for users, as long as you’re doing it well, and I can’t imagine a search engine having a big problem with that. Just be careful to let them know what’s going on as much as you can.