Regardless of the size of your website, you should work through a technical SEO checklist before you deploy it to the web. Technical SEO is easily overlooked because of its cost.
But that cost ultimately falls on the owner of the website: when a technical SEO checklist isn’t used, your site will not rank as well. In this post, I want to talk about how to incorporate a technical SEO checklist into your website’s development.
Google’s algorithms advanced a lot in the years leading up to Penguin, and they have kept advancing since. Take a look at this timeline:
- 2003 — Florida (anti-keyword stuffing spam)
- 2005 — Jagger (anti-link spam)
- 2009 — Caffeine (near real-time indexation)
- 2010 — MayDay (anti-thin content)
- 2011 — Panda (“quality”), Google starts using SSL in search, & Freshness (prioritizing fresh content)
- 2012 — Penguin (anti-link “spam”), Knowledge Graph, & EMD (downgrade exact match domain)
- 2013 — Phantom (quality update) & Hummingbird (core algorithm overhaul)
- 2015 — RankBrain (contextual search)
What is Technical SEO?
Technical SEO is a term that is gaining momentum as the web becomes more crowded with SEO companies. Simply defined, it is the process of optimizing your website so that search engines can crawl and index it as effectively as possible.
Due to the rapid progression of Google’s search engine (see the timeline above) and the crackdown on backlink spam, technical SEO has become one of the main jobs in SEO today.
Big technical SEO topics
- Crawling: log file analysis, XML / HTML sitemaps, mobile bot crawl behavior
- Mobile: AMP, Progressive Web Apps, Responsive design
- SSL: HTTPS, HTTP/2
- Structured data: Schema markup, Microdata & JSON-LD, Rich Snippets
- Migrations: domain migrations, relaunches, CMS changes, HTTP to HTTPS
- Page speed: rich media and script compression, CSS sprites, CDNs, server speed optimization, parallel downloads and minify, caching
- Rendering: critical rendering path/lazy loading, DOM rendering, JavaScript framework rendering
- Content optimization: entity optimization, duplicate content, thin content
- Status codes: 3xx, 4xx, 5xx
- Indexation: canonicalization, robots.txt, meta-tags
- Site structure: internal linking, URL structure, taxonomy
Check your indexing (crawling)
The first thing you should do is check whether your site is indexed. If it isn’t indexed, you are not in the search results pages at all. If you are on WordPress, go to Settings > Reading and make sure the “Search Engine Visibility” box (“Discourage search engines from indexing this site”) is unchecked.
Next, type site:domain.com into Google (or any other search engine) to see exactly which pages are indexed. The approximate count of indexed pages (and posts) appears at the top left of the results.
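If you are comfortable with a little scripting, you can also spot-check individual pages for the signals that keep them out of the index (the WordPress setting above, for instance, works by adding a noindex directive). Here is a minimal Python sketch using the requests and beautifulsoup4 packages; the URLs are placeholders for your own pages.

```python
# Minimal indexability spot-check: looks for noindex in the X-Robots-Tag header
# and in the robots meta tag of each page.
import requests
from bs4 import BeautifulSoup

URLS = [  # hypothetical examples; replace with your own pages
    "https://example.com/",
    "https://example.com/blog/",
]

for url in URLS:
    resp = requests.get(url, timeout=10)
    blocked = []

    # 1. An X-Robots-Tag response header can carry a noindex directive
    if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
        blocked.append("X-Robots-Tag header")

    # 2. <meta name="robots" content="noindex"> in the HTML
    soup = BeautifulSoup(resp.text, "html.parser")
    robots_meta = soup.find("meta", attrs={"name": "robots"})
    if robots_meta and "noindex" in robots_meta.get("content", "").lower():
        blocked.append("robots meta tag")

    status = "blocked via " + ", ".join(blocked) if blocked else "indexable"
    print(f"{resp.status_code}  {url}  ->  {status}")
```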
SSL Encryption (HTTPS)
Google now flags sites that do not have an SSL certificate. In the fight for a more secure Internet, Google considers sites served over HTTPS better for users, and that translates into an important SEO signal. Most SSL certificates are free if you are hosting with a managed hosting provider, though the encryption they provide is basic. That isn’t bad in itself; whether it is enough depends on how sensitive the data your site holds is.
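If you want to verify the basics yourself, a tiny script can confirm that plain HTTP requests redirect to HTTPS and that the certificate is valid. This is just a sketch; example.com stands in for your own domain.

```python
# Rough HTTPS sanity check: does plain HTTP end up on HTTPS, and is the cert valid?
import requests

DOMAIN = "example.com"  # hypothetical; replace with your own domain

# requests verifies the SSL certificate by default and raises SSLError if it is invalid.
resp = requests.get(f"http://{DOMAIN}/", timeout=10, allow_redirects=True)

if resp.url.startswith("https://"):
    print(f"OK: http://{DOMAIN}/ redirects to {resp.url}")
else:
    print(f"Warning: {DOMAIN} is still being served over plain HTTP ({resp.url})")
```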
Delete duplicate pages
Duplicate content is not a good thing to have on your site, as it is misleading to users, and duplicate title tags and meta descriptions are also bad. The search engine crawls everything on your site, and it only allocates so many crawls to it (your crawl budget). To stop wasting that crawl budget, delete these duplicate pages.
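Finding duplicate title tags and meta descriptions by hand is tedious, so here is a rough sketch that groups pages by those two fields. The URL list is a placeholder; in practice you would feed it your sitemap or crawl output.

```python
# Group pages by title and meta description and report any duplicates.
from collections import defaultdict
import requests
from bs4 import BeautifulSoup

URLS = [  # hypothetical examples; replace with your own pages
    "https://example.com/",
    "https://example.com/about/",
    "https://example.com/about-us/",
]

titles = defaultdict(list)
descriptions = defaultdict(list)

for url in URLS:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    meta = soup.find("meta", attrs={"name": "description"})
    desc = meta.get("content", "").strip() if meta else ""
    titles[title].append(url)
    descriptions[desc].append(url)

for label, groups in (("title", titles), ("meta description", descriptions)):
    for text, urls in groups.items():
        if text and len(urls) > 1:
            print(f"Duplicate {label} '{text[:60]}' on: {', '.join(urls)}")
```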
Delete zombie pages
Zombie pages are dead pages on your site. They hold no SEO value, so they are not worth keeping. They are easy to spot: an SEO audit will find them, and so will Google Search Console. If a page doesn’t rank in at least the top 20 results, you can remove it. Pages with thin content should also be considered for removal.
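If you want a quick, rough way to surface thin pages, you can count the words left after stripping scripts and styles. The 300-word cutoff below is an arbitrary assumption of mine, not an official threshold; adjust it for your content.

```python
# Rough thin-content check: flag pages with very little visible text.
import requests
from bs4 import BeautifulSoup

URLS = [  # hypothetical examples; replace with your own pages
    "https://example.com/old-announcement/",
    "https://example.com/tag/misc/",
]
MIN_WORDS = 300  # arbitrary cutoff, not a Google-published number

for url in URLS:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    # Drop script/style tags so they don't inflate the word count
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()
    words = len(soup.get_text(separator=" ").split())
    if words < MIN_WORDS:
        print(f"Possible zombie/thin page ({words} words): {url}")
```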
Audit internal links
If you have multiple pages on your site, you should check your click depth. If it takes more than three clicks to reach a piece of content, revise your structure to reduce this. Next, check for redirects (301s) and deliberately removed pages (410s), and fix any broken links (404s).
While nothing I’ve found shows that a 404 by itself is bad for SEO, having too many of them is: they give users a bad experience on your website, and that experience is believed to feed into what Google RankBrain measures.
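A crawling tool will do this audit for you, but if you prefer a script, a small breadth-first crawl can report each internal page’s status code, whether it was reached through a redirect, and how many clicks it sits from the homepage. This is only a sketch with an arbitrary page limit, and example.com is a placeholder for your own site.

```python
# Breadth-first internal crawl: reports status code, redirects, and click depth.
from collections import deque
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

START = "https://example.com/"   # hypothetical start page
MAX_PAGES = 200                  # arbitrary safety limit for the sketch

host = urlparse(START).netloc
seen = {START: 0}                # URL -> click depth from the homepage
queue = deque([START])

while queue and len(seen) <= MAX_PAGES:
    url = queue.popleft()
    depth = seen[url]
    try:
        resp = requests.get(url, timeout=10)
    except requests.RequestException as exc:
        print(f"ERR  depth={depth}  {url}  ({exc})")
        continue

    notes = []
    if resp.history:                      # request was redirected (301/302)
        notes.append(f"redirected via {resp.history[0].status_code}")
    if resp.status_code >= 400:           # broken page (404, 410, 5xx, ...)
        notes.append("broken")
    if depth > 3:
        notes.append("more than three clicks deep")
    print(f"{resp.status_code}  depth={depth}  {url}  {'; '.join(notes)}")

    if resp.status_code != 200 or "text/html" not in resp.headers.get("Content-Type", ""):
        continue

    # Queue internal links we haven't seen yet, one click deeper
    for a in BeautifulSoup(resp.text, "html.parser").find_all("a", href=True):
        link = urljoin(url, a["href"]).split("#")[0]
        if urlparse(link).netloc == host and link not in seen:
            seen[link] = depth + 1
            queue.append(link)
```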
Orphaned Content
Orphaned content, as defined by Yoast (the WordPress SEO plugin), is content that has no other pages on the same website linking to it. If you have a WordPress website, add Yoast SEO and you’ll get an audit of the pages that are orphaned. Using the plugin, you can begin to fix those pages (or posts) by linking them from other relevant content, raising your site’s SEO.
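If your site isn’t on WordPress, you can approximate the same audit yourself: fetch every page listed in your sitemap, record every internal link you find, and flag the sitemap pages that nothing links to. The sketch below assumes a single flat sitemap (not a sitemap index) at a hypothetical location.

```python
# Approximate orphan finder: sitemap pages that no crawled page links to.
from urllib.parse import urljoin
import xml.etree.ElementTree as ET
import requests
from bs4 import BeautifulSoup

SITEMAP = "https://example.com/sitemap.xml"   # hypothetical location
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP, timeout=10).content)
pages = {loc.text.strip() for loc in root.findall(".//sm:loc", NS)}

linked_to = set()
for page in pages:
    soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
    for a in soup.find_all("a", href=True):
        linked_to.add(urljoin(page, a["href"]).split("#")[0].rstrip("/"))

for page in sorted(pages):
    if page.rstrip("/") not in linked_to:
        print(f"Possibly orphaned: {page}")
```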
Update your sitemap
It is critical to keep your website’s sitemap updated as you create and publish more content. The sitemap tells Google’s crawlers which pages you want indexed. Without a sitemap, your site has less of a chance to improve its SEO.
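While you’re at it, it’s worth confirming that every URL in the sitemap still resolves, so you aren’t pointing crawlers at dead entries. The same sitemap-parsing approach as in the orphan check above works here; the sitemap URL is again a placeholder.

```python
# Quick sitemap health check: flag entries that no longer return 200.
import xml.etree.ElementTree as ET
import requests

SITEMAP = "https://example.com/sitemap.xml"   # hypothetical; use your real sitemap URL
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP, timeout=10).content)
for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    status = requests.head(url, timeout=10, allow_redirects=True).status_code
    if status != 200:
        print(f"Stale sitemap entry ({status}): {url}")
```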
Page Speed
Site speed has become a huge SEO ranking factor. Faster sites get crawled more often, and users love them because they aren’t left waiting for content to load. You can run a speed test with Google PageSpeed Insights or GTmetrix.
A quick story: the agency I work for had a client who was obsessed with their site’s speed but didn’t understand that hosting costs play a huge part in it. Another issue was that the images they were uploading were not only poor quality but also extremely large. Large files will slow a site’s performance, so examine your site’s images and other files and take steps to reduce their size.
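One easy check you can automate is the weight of the images on a page. The sketch below flags anything over roughly 200 KB, which is an arbitrary threshold of mine rather than an official limit; the page URL is a placeholder.

```python
# Rough image-weight audit for a single page via HEAD requests.
from urllib.parse import urljoin
import requests
from bs4 import BeautifulSoup

PAGE = "https://example.com/"   # hypothetical page to audit
MAX_BYTES = 200 * 1024          # arbitrary threshold, adjust to taste

soup = BeautifulSoup(requests.get(PAGE, timeout=10).text, "html.parser")
for img in soup.find_all("img", src=True):
    src = urljoin(PAGE, img["src"])
    head = requests.head(src, timeout=10, allow_redirects=True)
    # Servers that omit Content-Length are simply skipped in this sketch
    size = int(head.headers.get("Content-Length", 0))
    if size > MAX_BYTES:
        print(f"{size / 1024:.0f} KB  {src}")
```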
Mobile Friendliness
Google has now updated its algorithm to use mobile-first indexing. This is not a separate algorithm just for mobile-responsive websites; it applies to all sites. Your site must be mobile-ready, or you could see your results drop like a rock or disappear altogether.
Typically, when doing responsive web design, designers use CSS to hide elements that are difficult to render on mobile devices. This is no longer recommended: if you can’t make an element work on mobile, don’t use it. Search engines don’t like display: none; when it is used too liberally.
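If you want a rough sense of how much hiding is going on, you can count display:none declarations in a page’s inline and linked CSS. It won’t tell you whether each use is legitimate, but a high count is worth a manual review; the page URL below is a placeholder.

```python
# Crude scan for display:none in a page's inline styles and linked stylesheets.
import re
from urllib.parse import urljoin
import requests
from bs4 import BeautifulSoup

PAGE = "https://example.com/"    # hypothetical page to check
pattern = re.compile(r"display\s*:\s*none", re.IGNORECASE)

soup = BeautifulSoup(requests.get(PAGE, timeout=10).text, "html.parser")

# Collect CSS from <style> blocks and inline style="" attributes
css_chunks = [style.get_text() for style in soup.find_all("style")]
css_chunks += [tag["style"] for tag in soup.find_all(style=True)]

# Add any external stylesheets referenced by the page
for link in soup.find_all("link", href=True):
    if "stylesheet" in (link.get("rel") or []):
        css_chunks.append(requests.get(urljoin(PAGE, link["href"]), timeout=10).text)

hits = sum(len(pattern.findall(chunk)) for chunk in css_chunks)
print(f"Found {hits} occurrence(s) of display:none on {PAGE}")
```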