Technical SEO Checklist

Technical SEO refers to the process of optimizing the technical aspects of your website to ensure that it can be effectively crawled, indexed, and understood by search engines. It focuses on improving the infrastructure and backend elements of a website rather than content or link-building, and aims to enhance both user experience and search engine rankings. Here are the main components of technical SEO:

  1. Crawling and Indexing

Crawlability: Ensuring that search engines can find and access all the important pages on your website. This includes having a clear site structure, an XML sitemap, and ensuring no critical pages are blocked by robots.txt. Indexability: Making sure that pages intended for search engines can be indexed. This involves checking for “noindex” tags and using canonical tags to avoid duplicate content issues.
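
A quick way to spot-check a single page is to fetch it and look for indexing blockers. A minimal Python sketch, assuming the requests and beautifulsoup4 packages and a hypothetical URL:

import requests
from bs4 import BeautifulSoup

# Hypothetical URL used for illustration
url = "https://example.com/page"

resp = requests.get(url, timeout=10)
print("Status code:", resp.status_code)

# A noindex directive can arrive via an HTTP header ...
print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag", "not set"))

# ... or via a meta robots tag in the HTML
soup = BeautifulSoup(resp.text, "html.parser")
robots_meta = soup.find("meta", attrs={"name": "robots"})
if robots_meta and "noindex" in robots_meta.get("content", "").lower():
    print("Page carries a noindex meta tag")
else:
    print("No noindex meta tag found")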

  2. Website Speed

Page Load Time: Faster websites provide better user experiences, which is a ranking factor for Google. Technical SEO involves optimizing elements like image sizes, server response times, browser caching, and minifying CSS, JavaScript, and HTML to improve speed.
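
Server response time and compression/caching headers are easy to inspect from a script. A rough sketch, assuming the requests package and a hypothetical URL (the elapsed time is only a coarse proxy for proper lab measurements):

import requests

url = "https://example.com/"  # hypothetical URL

resp = requests.get(url, timeout=10)

# elapsed covers the request/response round trip (a rough TTFB proxy)
print("Response time:", resp.elapsed.total_seconds(), "s")

# Compression and caching headers are quick wins for load time
print("Content-Encoding:", resp.headers.get("Content-Encoding", "none"))
print("Cache-Control:", resp.headers.get("Cache-Control", "not set"))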

  3. Mobile-Friendliness

Responsive Design: Ensuring the website is responsive, meaning it adjusts and displays well on different devices and screen sizes. Mobile usability is a key part of technical SEO, especially since Google primarily uses mobile-first indexing.
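
Presence of a viewport meta tag is a cheap first check for responsiveness. A minimal sketch, assuming requests and beautifulsoup4 and a hypothetical URL:

import requests
from bs4 import BeautifulSoup

resp = requests.get("https://example.com/", timeout=10)  # hypothetical URL
soup = BeautifulSoup(resp.text, "html.parser")

# Responsive pages declare a viewport so mobile browsers scale correctly
viewport = soup.find("meta", attrs={"name": "viewport"})
print("Viewport:", viewport["content"] if viewport else "MISSING")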

  4. Secure Website (HTTPS)

HTTPS: Using SSL certificates to ensure that your site is secure is important for user trust and is also a ranking factor. Technical SEO involves ensuring all pages are accessible via HTTPS and redirecting HTTP pages to HTTPS.
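
The HTTP-to-HTTPS redirect can be verified by requesting the insecure URL without following redirects. A sketch, assuming the requests package and a hypothetical host:

import requests

# Request the insecure version without following redirects
resp = requests.get("http://example.com/", allow_redirects=False, timeout=10)

location = resp.headers.get("Location", "")
if resp.status_code == 301 and location.startswith("https://"):
    print("OK: HTTP 301-redirects to", location)
else:
    print("Check this:", resp.status_code, location or "no Location header")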

  5. URL Structure

Clean URLs: Technical SEO ensures that URLs are short, descriptive, and free of unnecessary parameters. Proper URL structuring helps search engines and users understand the content of each page more easily.

  6. Fixing Crawl Errors

Crawl Errors: Using tools like Google Search Console to identify and fix crawl errors, such as 404 errors (page not found) or broken links, is part of technical SEO. Ensuring that there are no dead ends or orphaned pages is crucial.

  7. Structured Data (Schema Markup)

Schema Markup: Adding structured data (e.g., JSON-LD or microdata) helps search engines understand the content on your site better, which can lead to rich snippets in search results (e.g., star ratings, recipes, product details).
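
To see what structured data a page already carries, you can pull its JSON-LD blocks. A sketch, assuming requests and beautifulsoup4 and a hypothetical URL:

import json
import requests
from bs4 import BeautifulSoup

resp = requests.get("https://example.com/", timeout=10)  # hypothetical URL
soup = BeautifulSoup(resp.text, "html.parser")

# JSON-LD lives in <script type="application/ld+json"> blocks
for script in soup.find_all("script", type="application/ld+json"):
    try:
        data = json.loads(script.string or "")
        print("Found schema:", data.get("@type", "unknown type"))
    except json.JSONDecodeError:
        print("Malformed JSON-LD block found")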

  8. Canonical Tags

Canonicalization: Preventing duplicate content issues by using canonical tags to specify the preferred version of a page. This ensures that search engines understand which page to index, especially if similar content exists on multiple URLs.
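
Canonical tags can be verified the same way. A sketch, assuming requests and beautifulsoup4; the parameterized URL is hypothetical:

import requests
from bs4 import BeautifulSoup

url = "https://example.com/page?utm_source=newsletter"  # hypothetical URL

resp = requests.get(url, timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")

canonical = soup.find("link", rel="canonical")
if canonical:
    print("Canonical:", canonical["href"])
    print("Points at the clean URL:", canonical["href"] == url.split("?")[0])
else:
    print("No canonical tag found")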

  9. XML Sitemaps and Robots.txt

XML Sitemap: Creating an XML sitemap helps search engines understand the structure of your website and discover all important pages. Robots.txt: Configuring the robots.txt file allows you to control which pages search engines are allowed to crawl, keeping irrelevant or duplicate pages out of search engine indexes.
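
Sitemaps follow a fixed XML schema, so listing their URLs takes only the standard library plus requests. A sketch against a hypothetical sitemap location:

import requests
import xml.etree.ElementTree as ET

resp = requests.get("https://example.com/sitemap.xml", timeout=10)
root = ET.fromstring(resp.content)

# Standard sitemap namespace
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text for loc in root.findall(".//sm:loc", ns)]
print(len(urls), "URLs listed, e.g.:", urls[:3])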

  10. Website Architecture and Internal Linking

Internal Linking: A well-planned internal linking structure distributes link equity throughout the website and makes it easier for search engines to crawl. Flat Site Architecture: Technical SEO aims to keep important pages only a few clicks from the homepage, so both users and crawlers can reach them quickly.

  11. Error Pages and Redirects

404 Pages: Customizing 404 error pages ensures users get helpful navigation if they end up on a broken link, reducing bounce rates. Redirects: Setting up appropriate 301 redirects when URLs change helps retain link equity and prevents broken links.
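
Redirects can be audited by following the chain and checking each hop. A sketch, assuming the requests package; the old/new URL pair is hypothetical:

import requests

# Hypothetical old -> new URL pair after a site restructure
old_url = "https://example.com/old-page"
expected = "https://example.com/new-page"

resp = requests.get(old_url, allow_redirects=True, timeout=10)

# resp.history holds each hop; ideally a single 301, not a chain
hops = [(r.status_code, r.url) for r in resp.history]
print("Redirect chain:", hops)
print("Lands on expected URL:", resp.url == expected)
print("All permanent (301):", all(r.status_code == 301 for r in resp.history))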

  12. JavaScript and CSS Optimization

JavaScript Crawling: Technical SEO also involves ensuring that content generated by JavaScript is crawlable by search engines. Improper use of JavaScript can lead to important content being missed. CSS Optimization: Minify CSS files and ensure that CSS doesn’t block page rendering, which can impact page speed and SEO performance.
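
Render-blocking resources in the head can be spotted by parsing the raw HTML. A sketch, assuming requests and beautifulsoup4 and a hypothetical URL:

import requests
from bs4 import BeautifulSoup

resp = requests.get("https://example.com/", timeout=10)  # hypothetical URL
head = BeautifulSoup(resp.text, "html.parser").head

# Scripts in <head> without defer/async block HTML parsing
for s in head.find_all("script", src=True):
    if not (s.has_attr("defer") or s.has_attr("async")):
        print("Render-blocking script:", s["src"])

# Stylesheets in <head> also block rendering until downloaded
for link in head.find_all("link", rel="stylesheet"):
    print("Blocking stylesheet:", link.get("href"))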

  13. Core Web Vitals

Core Web Vitals: Technical SEO focuses on optimizing metrics related to user experience, including Largest Contentful Paint (LCP), First Input Delay (FID, since superseded by Interaction to Next Paint, INP), and Cumulative Layout Shift (CLS). These are important factors that Google uses to assess page experience.
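
Field data for these metrics can be pulled from the PageSpeed Insights API (v5). A sketch, assuming the requests package; the URL is hypothetical and the response field names reflect the API shape at the time of writing:

import requests

# PageSpeed Insights API (v5); an API key is optional for light use
endpoint = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://example.com/"}  # hypothetical URL

data = requests.get(endpoint, params=params, timeout=60).json()

# Field data (real-user CrUX metrics), when available for the URL
metrics = data.get("loadingExperience", {}).get("metrics", {})
for name, m in metrics.items():
    print(name, "->", m.get("percentile"), "/", m.get("category"))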

Why Technical SEO Is Important

Technical SEO ensures that search engines can efficiently crawl and index your site. If search engines have trouble understanding your site’s content or structure, it can negatively impact your visibility and ranking. Addressing technical SEO issues improves not only search engine accessibility but also user experience, leading to higher rankings and increased traffic.

Indexing

Indexing is the process of adding crawled web pages to Google’s search index. Once a page is crawled, Google evaluates the content and determines whether it should be included in the index and shown in search results. Check the indexing (Pages) report in Google Search Console to see which pages have been indexed.

Crawlability

Crawling is the process where Googlebot (Google’s web crawler) visits your website to discover new and updated pages. Googlebot follows links from known pages to find new URLs and gathers information about your website.

Check page crawlability

  • Check the Crawl stats report in Google Search Console to see which pages have been crawled.

Check pages that have

  • A noindex directive. Pages with this directive can be crawled but will not be indexed.
  • A status code other than 200. Pages returning a non-200 status code will not be indexed.

Ensure this is the correct directive for these pages. Screaming Frog is a great tool for this.

Indexability

Configuration of Sitemap

example.com/sitemap.xml

  • Ensure weak or thin content is left out of the sitemap
  • Ensure important content is included so it gets crawled

Configuration of Canonicals

  • Any thin or duplicate content that has a better version elsewhere should be canonicalized to that version
  • Self-canonicalize pages when no other canonical target is needed

Configuration of Robots.txt

example.com/robots.txt

  • Ensure important content is allowed to be crawled

Exclude internal search results and parameter URLs

Add disallow rules to robots.txt:

User-agent: *
# Block the CMS back office
Disallow: /umbraco
# Block internal site-search results
Disallow: /search
# Block tracking-parameter URLs (path rules must start with /)
Disallow: /*utm_source*
# Block raw query-string URLs
Disallow: /?post_type=product
Disallow: /?p=
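
The standard-library robot parser can sanity-check these rules, with the caveat that it implements prefix matching only, not Google's * wildcards, so wildcard rules still need a dedicated tester such as Search Console. A sketch against a hypothetical host:

from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")  # hypothetical host
rp.read()

# Prefix rules like /search are evaluated correctly by the stdlib parser
for path in ["https://example.com/search", "https://example.com/blog/post"]:
    print(path, "->", "allowed" if rp.can_fetch("*", path) else "blocked")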

Fix

  • 404s (not found)
  • 503s (service unavailable)

Crawl Depth

Crawl/Page depth is a measure of how many clicks away a page is from the homepage of a website. It is an important factor in SEO because it affects how easily search engine crawlers can access and index a page. The deeper a page is, the harder it is for search engines to find and rank it.

Use Screaming Frog to get the crawl depth of pages; there is a dedicated column for it.
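
If you want the same number without Screaming Frog, a breadth-first crawl from the homepage yields click depth directly. A simplified sketch, assuming requests and beautifulsoup4, capped at 200 pages; the start URL is hypothetical:

from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

start = "https://example.com/"  # hypothetical homepage
host = urlparse(start).netloc

depth = {start: 0}
queue = deque([start])

# Breadth-first crawl: a page's depth is its click distance from the homepage
while queue and len(depth) < 200:  # cap the crawl for the sketch
    url = queue.popleft()
    try:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    except requests.RequestException:
        continue
    for a in soup.find_all("a", href=True):
        link = urljoin(url, a["href"]).split("#")[0]
        if urlparse(link).netloc == host and link not in depth:
            depth[link] = depth[url] + 1
            queue.append(link)

for url, d in sorted(depth.items(), key=lambda kv: kv[1]):
    print(d, url)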

Mobile Menu

As Google crawls the mobile version of the site, it's important to check the mobile menu to ensure it contains the relevant links.

  • Ensure the mobile menu matches the desktop menu; this is important because Google crawls the mobile site only
  • Ensure the mobile menu works with JavaScript disabled (see the sketch after this list)
  • Check Google Search Console for any errors when crawling mobile; GSC has a section just for mobile.
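
Fetching the raw HTML approximates what a crawler sees with JavaScript disabled. A sketch that lists the nav links, assuming requests and beautifulsoup4 and a hypothetical URL:

import requests
from bs4 import BeautifulSoup

# Raw HTML, no JavaScript execution - roughly what a crawler sees first
resp = requests.get("https://example.com/", timeout=10)  # hypothetical URL
soup = BeautifulSoup(resp.text, "html.parser")

# If the menu is injected by JavaScript, these links won't appear here
nav = soup.find("nav")
if nav:
    for a in nav.find_all("a", href=True):
        print(a.get_text(strip=True), "->", a["href"])
else:
    print("No <nav> element in the raw HTML - menu may depend on JavaScript")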

Strategy

  • Determine the focus keywords for pages

Schema

  • Schema is correct for each page. Each page should be reviewed and assigned a specific schema type
  • Generate specific schemas with this tool (https://technicalseo.com/tools/schema-markup-generator/). Rather than generating a generic ‘LocalBusiness’ schema, be as specific as possible, e.g. @type: Florist (see the sketch after this list).
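
As a sketch of the end result, the JSON-LD for a specific type like Florist can be assembled and printed with the standard library; all the business details below are hypothetical placeholders:

import json

# Hypothetical business details; @type "Florist" is a real schema.org type
schema = {
    "@context": "https://schema.org",
    "@type": "Florist",
    "name": "Example Flower Shop",
    "url": "https://example.com/",
    "telephone": "+1-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Example St",
        "addressLocality": "Example City",
        "postalCode": "00000",
    },
}

# Emit the <script> block to paste into the page's <head>
print('<script type="application/ld+json">')
print(json.dumps(schema, indent=2))
print("</script>")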

Content

Optimize

  • Title tags
  • H1
  • Meta description
  • Internal linking
  • Keyword optimization of pages (see the audit sketch after this list)
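
These on-page elements can be audited per URL with a short script. A sketch, assuming requests and beautifulsoup4 and a hypothetical URL; the title-length and single-H1 checks reflect common conventions, not hard rules:

import requests
from bs4 import BeautifulSoup

resp = requests.get("https://example.com/page", timeout=10)  # hypothetical URL
soup = BeautifulSoup(resp.text, "html.parser")

title = soup.title.get_text(strip=True) if soup.title else ""
print("Title:", title, f"({len(title)} chars)")  # roughly 50-60 chars is typical

h1s = soup.find_all("h1")
print("H1 count:", len(h1s))  # one H1 per page is the usual convention

desc = soup.find("meta", attrs={"name": "description"})
print("Meta description:", desc["content"] if desc else "MISSING")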

Thin content

  • Identify pages with less than 1,000 words of content (see the sketch below)
  • Optimize titles for target keywords
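
A rough word count after stripping boilerplate elements flags candidates. A sketch, assuming requests and beautifulsoup4; the URL list is hypothetical and would normally come from the sitemap or a crawl export:

import requests
from bs4 import BeautifulSoup

# Hypothetical URL list; in practice, feed in the sitemap or a crawl export
for url in ["https://example.com/a", "https://example.com/b"]:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup(["script", "style", "nav", "footer"]):
        tag.decompose()  # strip non-content elements before counting
    words = len(soup.get_text(separator=" ").split())
    if words < 1000:
        print("Thin:", url, f"({words} words)")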


This post is licensed under CC BY 4.0 by the author.