Technical SEO Checklist 2023

Are you ready to make sure your website is taking advantage of the latest in technical SEO? Whether you’re a marketer, analyst, or developer – ensuring that your website architecture and coding are optimized for search engine success should be on your list of priorities. That’s why this blog post explores the essential Technical SEO Checklist 2023 – outlining key elements and tests to address the Technical SEO of a website to secure long-term organic search visibility. From metadata to canonical tags, read on as we break down everything you need to know about achieving technical SEO success in 2023!

Make Sure Your Site is on HTTPS

HTTPS encrypts the connection between a visitor's browser and your website, keeping sensitive data safe from sneaky hackers. Browsers signal a secure connection with a padlock icon in the address bar; if that lock is open or absent, the connection is not secure and could be harmful, since hackers can intercept any data sent to the site, including passwords and email addresses. That makes HTTPS an essential part of technical SEO – keeping your website safe and secure for all users.
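How you enforce HTTPS depends on your host – many providers offer a one-click redirect. As a rough illustration only, here is a minimal sketch of a site-wide 301 redirect in a Node.js/Express server (your stack will differ):

  import express from "express";

  const app = express();
  app.set("trust proxy", true); // needed when TLS terminates at a proxy or load balancer

  // Redirect every plain-HTTP request to its HTTPS equivalent (301 = permanent).
  app.use((req, res, next) => {
    if (req.secure) return next();
    res.redirect(301, `https://${req.headers.host}${req.originalUrl}`);
  });

  app.listen(80);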

Website URL Version

There can be many versions of your website’s URLs. To explain that, let’s assume domain.com is your website. There can be many variations to this website URL, out of which a few are given below:

  • www.domain.com
  • domain.com
  • www.domain.com/index
  • www.domain.com/index.php
  • www.domain.com/index.html
  • domain.com/index
  • domain.com/index.php
  • domain.com/index.html

The bad part about having more than one live URL version is that Google can treat them as separate pages and split their SEO value between them. The more URL variations you have, the more duplicate-content issues you'll come across. To counter this setback, pick one variation as your primary URL and permanently redirect all the others to it – we suggest choosing either www.domain.com or domain.com as your primary.
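As a sketch of what that redirect might look like, again assuming an Express server and domain.com as the chosen primary (both are placeholders):

  import express from "express";

  const app = express();
  const PRIMARY_HOST = "domain.com"; // placeholder: your chosen primary version

  // Collapse www. and /index(.php|.html) variations onto one canonical URL.
  app.use((req, res, next) => {
    const host = req.headers.host ?? PRIMARY_HOST;
    const path = req.path.replace(/\/index(\.php|\.html)?$/, "/");
    if (host !== PRIMARY_HOST || path !== req.path) {
      return res.redirect(301, `https://${PRIMARY_HOST}${path}`);
    }
    next();
  });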

Website Responsive Test

By utilizing responsive web design, your website becomes more user-friendly and easier to navigate, encouraging visitors to stick around – and an exceptional user experience brings repeat visitors. Google provides a Mobile-Friendly Test to check whether your page counts as mobile-friendly in the Google index. Note that testing non-canonical pages doesn't truly reflect the mobile-friendliness of your page in the index: during indexing, Google only checks the mobile-friendliness of the canonical URL, particularly when there are duplicate pages. So even if the non-canonical versions are mobile-friendly, the page will still have issues in Google's eyes if the canonical version isn't.
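Responsiveness starts in the page markup: the viewport meta tag tells mobile browsers to render at device width instead of a zoomed-out desktop layout. A minimal example (the breakpoint and class name are hypothetical):

  <meta name="viewport" content="width=device-width, initial-scale=1" />
  <style>
    /* Hypothetical breakpoint: hide a sidebar on narrow screens */
    @media (max-width: 600px) {
      .sidebar { display: none; }
    }
  </style>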

Website Speed and UX Insight

Have you ever visited a website that took forever to load? Google measures real-world page experience with Core Web Vitals – a report card for a website's performance. Core Web Vitals includes three metrics: First Input Delay (FID), Largest Contentful Paint (LCP), and Cumulative Layout Shift (CLS). To pass the Core Web Vitals assessment, the 75th percentile of all three metrics must be in the "good" range. If there's insufficient data for FID, the website can still pass if LCP and CLS are good; but if either LCP or CLS doesn't have enough data, the website can't be assessed at all.
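You can collect these metrics from real visitors with Google's open-source web-vitals JavaScript library. A minimal sketch, assuming web-vitals v3 and a placeholder analytics endpoint:

  import { onCLS, onFID, onLCP } from "web-vitals";

  // Beacon each Core Web Vital to your analytics backend as it's measured.
  function report(metric: { name: string; value: number }) {
    navigator.sendBeacon("/analytics", JSON.stringify(metric)); // placeholder endpoint
  }

  onCLS(report);
  onFID(report);
  onLCP(report);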

First Input Delay (FID)

FID tracks the time between a user's first interaction with your page and when the browser can actually start processing the event handlers in response. This tells you how quickly your site reacts to user input and can inform your optimization efforts. A First Input Delay of 100 ms or less is considered good, whereas more than 300 ms is considered poor; anything in between needs improvement.
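FID usually degrades when long JavaScript tasks block the main thread during load. A common mitigation, sketched below, is to split work into chunks and yield between them so input events can run (processItem is a hypothetical stand-in for your own work):

  // Yield control back to the browser so queued input can be handled.
  const yieldToMain = () => new Promise<void>((resolve) => setTimeout(resolve, 0));

  async function processAll(items: unknown[]) {
    for (const item of items) {
      processItem(item);   // hypothetical per-item work
      await yieldToMain(); // let pending user input run between items
    }
  }

  declare function processItem(item: unknown): void;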

Largest Contentful Paint (LCP)

Largest Contentful Paint (LCP) is like a stopwatch for the biggest image or text block in the viewport while a page loads: it measures how quickly that element renders, counted from the moment the page starts loading. To keep users happy and engaged, websites should aim for an LCP of 2.5 seconds or less. If it takes longer than 4 seconds, it's considered a poor experience.
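If your LCP element is a hero image, one common optimization is asking the browser to fetch it early and at high priority. A sketch with a placeholder filename:

  <link rel="preload" as="image" href="/hero.jpg" fetchpriority="high" />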

Cumulative Layout Shift (CLS)

CLS, or Cumulative Layout Shift, measures how much a webpage's content unexpectedly jumps around while it loads and while you use it. Whenever a visible element suddenly moves on the page, it counts as a layout shift. Shifts that happen in quick succession are grouped into a session window (each shift within a second of the previous one, with a window lasting at most five seconds), and the session window with the highest total shift score is the page's CLS. So, in short, CLS tells you how visually stable a web page is while you're using it. A CLS score of 0.1 or less is considered good.
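The most common CLS fix is reserving space for content that loads late, for instance by giving images explicit dimensions so the browser can lay out the page before the file arrives (the filename and sizes below are placeholders):

  <img src="/banner.jpg" alt="Product banner"
       width="1200" height="400" />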

First Contentful Paint (FCP)

The First Contentful Paint (FCP) is a metric that calculates the time it takes for any part of a website’s content – whether it’s text, images, or other design elements – to appear on the screen after the page begins to load. The ideal FCP time is 1.8 seconds or less to ensure a positive user experience. Essentially, FCP gives website owners insight into how quickly their content loads and appears on the user’s screen, letting them optimize and improve their site’s performance.
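In the browser, you can read FCP directly from the Performance API; the paint entry fires once the first text or image renders:

  new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) {
      if (entry.name === "first-contentful-paint") {
        console.log(`FCP: ${entry.startTime} ms`);
      }
    }
  }).observe({ type: "paint", buffered: true });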

Interaction to Next Paint (INP)

Interaction to Next Paint (INP) measures a page's overall responsiveness by tracking every interaction you make – clicks, taps, and keyboard inputs – and using the longest of them as the INP value. But what exactly counts as an "interaction"? INP measures the duration of the group of event handlers that runs between the start of the interaction and the next time the page shows visual feedback. If your INP is at or below 200 milliseconds, congratulations – your page is super responsive. Above 200 and up to 500 milliseconds, it could use a little work. And above 500 milliseconds, your page is, unfortunately, pretty unresponsive.
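A common way to improve INP is to paint cheap visual feedback immediately and defer heavy work until after the next frame. A sketch, where the element and the expensive function are both hypothetical:

  declare const button: HTMLButtonElement;       // hypothetical element
  declare function runExpensiveUpdate(): void;   // hypothetical heavy work

  button.addEventListener("click", () => {
    button.classList.add("is-busy"); // cheap, immediate visual feedback
    // Wait for the next paint, then yield, so the feedback renders first.
    requestAnimationFrame(() => setTimeout(runExpensiveUpdate, 0));
  });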

Time to First Byte (TTFB)

Time to First Byte (TTFB) measures the time between requesting a resource and receiving the very first byte of the response. It’s a sum of various phases, including redirect time, DNS lookup, connection and TLS negotiation, and the request itself. By improving your connection setup time and backend, you can significantly cut down on TTFB.
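You can read TTFB for the current page from the Navigation Timing API:

  // responseStart is measured from the start of the navigation, so for
  // the page itself it equals the Time to First Byte.
  const [nav] = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];
  if (nav) console.log(`TTFB: ${nav.responseStart} ms`);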

Schema Markup

Google suggests beginning with the Rich Results Test to determine what types of Google-rich results your webpage may generate. But if you want to test all types of schema.org markup, not just those specific to Google, turn to the Schema Markup Validator. It’s a simple way to ensure your webpage is up-to-date and optimized for search engines.
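Structured data usually goes in a JSON-LD script block in the page head. A minimal Article example – every value below is a placeholder:

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Technical SEO Checklist 2023",
    "author": { "@type": "Person", "name": "Jane Doe" },
    "datePublished": "2023-01-15"
  }
  </script>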

Rich Results Test

Rich results can include images, carousels, and other visual or non-textual elements. To discover which rich results can be shown for your publicly accessible page, run it through Google's Rich Results Test.

Find & Fix Broken Links

Did you know that broken links on your website can harm your SEO? Not only can they prevent Google from correctly crawling and indexing your site, but they can also make it harder for visitors to navigate your pages. When Google crawls a website, it follows all of the links on the site. So if there are broken links, Google may not be able to properly crawl the site, which could result in fewer indexed pages. And the fewer indexed pages, the less content your users will see. Plus, broken links could negatively impact the overall quality of your website in the eyes of Google. This is why it's crucial to regularly check your site for any broken links and fix them promptly.
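Dedicated crawlers and Search Console both report broken links, but for a quick spot check, a small script will do. A rough sketch in TypeScript (the URL list is a placeholder – a real check would walk your sitemap):

  const urls = ["https://domain.com/", "https://domain.com/about"]; // placeholders

  for (const url of urls) {
    try {
      const res = await fetch(url, { method: "HEAD" });
      if (!res.ok) console.warn(`Broken: ${url} (HTTP ${res.status})`);
    } catch {
      console.warn(`Unreachable: ${url}`);
    }
  }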

HTML Validation Check

By validating your HTML code, you can improve more than just the appearance of your website. It can also help boost your SEO ranking by making your site easier for search engine crawlers to understand. These bots scour your site to index your pages, and they heavily rely on your HTML code to determine your page’s value and relevance. So, taking the time to ensure your HTML is error-free could significantly improve your website’s visibility and credibility online.
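The W3C's Nu HTML Checker is the standard validator, and it exposes a JSON interface you can script against. A hedged sketch – check the validator's current documentation before automating against it:

  const html = "<!doctype html><title>Test</title><p>Hello";

  const res = await fetch("https://validator.w3.org/nu/?out=json", {
    method: "POST",
    headers: { "Content-Type": "text/html; charset=utf-8" },
    body: html,
  });
  const report = await res.json();
  console.log(report.messages); // each message carries a type, location, and text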

Sitemap.xml

If you want Google to easily find and crawl your website, having proper links between your pages is key. This means providing navigation tools like menus and links that lead to all of your important pages. But for larger or more complex sites, a sitemap can also be helpful. Sitemaps provide a map of all the URLs on your site, making it easier for search engines to discover them. Note that having a sitemap isn’t a guarantee that all the items on it will be crawled and indexed, but in most cases, it’s still beneficial to have one.
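A sitemap is a plain XML file at your site root, following the sitemaps.org protocol. A minimal example with placeholder URLs and dates:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://domain.com/</loc>
      <lastmod>2023-01-15</lastmod>
    </url>
    <url>
      <loc>https://domain.com/about</loc>
    </url>
  </urlset>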

Do you need a sitemap.xml file?

If your website is large, it can be a challenge to ensure that every page is properly linked, so creating a sitemap helps Google discover all your new pages. Similarly, if your site is new and doesn't yet have many external links pointing to it, a sitemap helps Googlebot find your pages. And if your site features rich media content or appears in Google News, a sitemap can provide extra information for search. That said, not all websites need one! If you have a small site with 500 pages or fewer and everything important is linked internally, you can likely skip the sitemap.

Robots.txt

If you don’t want Google’s crawler to crawl unimportant pages on your site, a robots.txt file can help you manage traffic. While your web page may still appear in search results if it’s blocked with a robots.txt file, the search result won’t have a description, and non-HTML files like images, videos, and PDFs won’t be crawled. If you need to reverse this decision, remove the robots.txt entry blocking the page. However, to keep the page hidden permanently, you must use a different method.
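robots.txt also lives at the site root. A minimal hypothetical example that blocks two low-value sections and points crawlers at the sitemap (the paths are placeholders):

  User-agent: *
  Disallow: /cart/
  Disallow: /internal-search/

  Sitemap: https://domain.com/sitemap.xml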

Note: 

Using robots.txt to keep your web pages hidden from Google search results may not be as effective as you think. Even if other pages link to your site with meaningful content descriptions, Google may still recognize and index the URL without visiting the page. To block your page from search results, consider other methods like setting a password or using the “noindex” tag.
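The noindex directive is a meta tag in the page head; note that the page must stay crawlable (not blocked in robots.txt) for Google to see it:

  <meta name="robots" content="noindex" />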

Find & Fix Doorway Pages

Doorways are sneaky pages designed to rank high in search results for specific queries. They serve as middlemen, leading users through a less valuable page before ultimately getting them where they want to go. Common examples include having multiple domain names or URLs that target specific regions, generating pages purely to funnel visitors to the usable part of your site, and creating pages that look almost identical to search results. It's a shady tactic that may win short-term rankings, but it violates Google's spam policies and isn't worth sacrificing the user experience.

Image Optimization

Before uploading your image, make sure to follow these image optimization tips. It’s crucial to select the right file format and reduce file size for faster page load speeds. Make the image and alternative text relevant to the page, and pair your on-page SEO elements with your image. Lastly, consider creating an image sitemap or ensuring your images are featured in your sitemap for crawlability.
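Pulling those tips together, a well-optimized image tag might look like this (the filename, alt text, and dimensions are placeholders):

  <img src="/images/blue-widget.webp"
       alt="Blue widget displayed on a white desk"
       width="800" height="600"
       loading="lazy" />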

URL Inspection Tool

The URL Inspection tool in Google Search Console lets you take a sneak peek at what Google thinks of a particular webpage. You can use it to see whether a URL is indexable, and it surfaces all sorts of useful information about a page, like structured data, videos, linked AMP, and indexing status.

What does URL Inspection do?

Get the scoop on how Google views your website: check whether a page can be indexed, request a crawl from Google, and view a screenshot of the page as Googlebot sees it. You can also pull up more detailed information, like the resources and page code Google fetched, with just a few clicks. If you're experiencing indexing issues, this helps troubleshoot the problem: explore the indexed version of your page and check the Google-selected canonical field to see which version Google considers the true source.
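The same checks are also exposed programmatically through the Search Console URL Inspection API. A hedged sketch – the endpoint and response fields below follow the public API docs as of 2023, and the token and URLs are placeholders:

  declare const ACCESS_TOKEN: string; // obtained via Google OAuth beforehand

  const res = await fetch(
    "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect",
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${ACCESS_TOKEN}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        inspectionUrl: "https://domain.com/page", // placeholder page
        siteUrl: "https://domain.com/",           // placeholder property
      }),
    }
  );
  const data = await res.json();
  console.log(data.inspectionResult?.indexStatusResult);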

Conclusion

To sum it up – HTTPS is an essential layer of security for any website, so make sure your site is properly encrypted. Simplify your website's URLs, since multiple versions of the same URL can confuse Google and erode SEO performance. It also pays to review how your website looks on mobile devices, and to check how it fares on Core Web Vitals, as passing that assessment can support higher rankings. To pass, all three metrics must be good at the 75th percentile; if there isn't enough FID data, a page can still pass on good LCP and CLS alone. Also take care of sitemap.xml and robots.txt as per Google's guidelines. Adhering to these suggestions will help ensure your website is secure and optimized for users everywhere.
