Proper site indexation is crucial for any website, but if your online presence is your source of income and your place in the ecommerce space, indexation issues become more than an obstacle: they become a gap between your business and your target audience. If search engine bots can't crawl your pages, you effectively have no audience at all. A bot that can't find a page can't make a copy of it or index it, so the page never appears in search results and never drives traffic to the website. What's the root of the problem?
Common reasons for indexation issues:
- The website has too many pages, so search engine bots would need too much time to index all of them and never crawl the entire site;
- The website is updated very frequently. For example, when products are constantly added and removed, the site structure changes dynamically and bots cannot index the new pages within a short period of time;
- There are no links to the page. For example, a page was created but never linked from the website, neither from the navigation menu nor from other pages, and it wasn't submitted to search engines either (search engines provide a service that lets you submit a page or a whole website for indexation). Such a page is invisible to bots: without a link to it they can't find it, so it won't be indexed;
- Disallowed content. This happens when a webmaster blocks content from indexation by mistake, usually in the robots.txt file, which can be viewed for any website on the Internet by typing domain/robots.txt, e.g. http://sysiq.com/robots.txt. Robots.txt can close parts of a website from indexing by listing a category of pages, e.g. Disallow: */books/, as in the sketch below.
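For illustration, here is a minimal robots.txt sketch. The paths are hypothetical, and note that wildcard rules like */books/ are an extension honored by major crawlers rather than part of the original robots.txt standard:

```
# Rules for all crawlers
User-agent: *
# Block every URL whose path contains /books/
Disallow: */books/
# Block a single directory
Disallow: /checkout/
# All other pages remain open for crawling
```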
So how do you fix it?
- Create a sitemap. A fairly big website should have a proper sitemap. Details on how to generate one can be found at http://www.sitemaps.org/protocol.html, and a minimal example follows this list;
- Use proper interlinking: make sure every new product page has at least one link pointing to it, for example from the category it belongs to;
- Recheck that any page you want indexed is not disallowed in robots.txt (see the automated check after this list);
- Build external links to your categories, subcategories, and even product pages to ensure proper indexation and speed it up;
- Review all of your pages to find broken (with invalid HTML), old, and out-of-date ones, so the site structure stays clean and bots don't waste time indexing irrelevant pages instead of the ones you want indexed (a rough status-code checker also follows this list).
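To expand on the sitemap point, here is a minimal sitemap in the XML format defined by the sitemaps.org protocol linked above; the URLs, dates, and frequencies are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want crawled -->
  <url>
    <loc>http://www.example.com/books/</loc>
    <lastmod>2015-06-01</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>http://www.example.com/books/example-title/</loc>
    <lastmod>2015-05-20</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```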
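For the robots.txt recheck, Python's standard library ships a parser that can answer "is this URL blocked?" directly. A minimal sketch, assuming a hypothetical example.com domain and paths:

```python
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("http://www.example.com/robots.txt")
parser.read()  # downloads and parses the live robots.txt

for path in ["/books/", "/books/example-title/"]:
    url = "http://www.example.com" + path
    # can_fetch() applies the parsed rules for the given user agent
    if parser.can_fetch("*", url):
        print("crawlable:", url)
    else:
        print("blocked by robots.txt:", url)
```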
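And for the page review, a rough sketch of a status-code checker. It assumes you can export a list of page URLs, for example from your CMS or sitemap; the URLs shown are placeholders:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

# Placeholder list; in practice, export it from your CMS or sitemap
pages = [
    "http://www.example.com/",
    "http://www.example.com/discontinued-product/",
]

for url in pages:
    try:
        # A HEAD request is enough to read the status code
        with urlopen(Request(url, method="HEAD"), timeout=10) as resp:
            print(resp.status, url)
    except HTTPError as e:
        # 404s and 410s are candidates for removal or a redirect
        print(e.code, url)
    except URLError as e:
        print("unreachable:", url, e.reason)
```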
Discovering and fixing indexing issues is not an easy task, but it is well worth the effort. First, evaluate the problem and outline the steps to resolve it. Whether it's a broken link or disallowed content, be patient in identifying the exact issue and work it through carefully. After all, proper indexation is the road to your customers and to the success of your ecommerce business.