What Is a Technical SEO Audit?

A technical SEO audit is like giving your website a checkup to make sure it works well with search engines. It’s all about looking at the technical parts of your site to ensure that search engines like Google can easily explore, understand, and rank your pages.

By doing these checks regularly, you can find and fix any technical problems on your website. This helps your site do better in search engine results over time.

To do a technical check on your website, you’ll need two main tools:

  1. Google Search Console
  2. A tool that crawls your site, like Semrush’s Site Audit

If you’re new to Search Console, check out our guide to set it up. We’ll discuss its reports below. If you’re new to Site Audit, you can sign up for free and start quickly.

The Site Audit tool scans your website and reports data on every page it can crawl. The resulting report helps you surface a wide range of technical SEO issues.

Here’s a quick overview:

  1. Set up a Project: Create a project first.
  2. Use the Site Audit Tool: Go to the Site Audit tool and choose your domain.
  3. Settings: A window will pop up. Configure the basics for your first crawl.
  4. Start Audit: Click “Start Site Audit” at the bottom of the window.

After the tool scans your site, it shows an overview of your site’s health with the Site Health score, from 0 to 100. This score tells you how your site compares with others in your field.

You’ll also see issues sorted by severity (“Errors,” “Warnings,” and “Notices”) or explore specific technical SEO areas with “Thematic Reports.”

Now, go to the “Issues” tab. There, you’ll find a list of all issues and how many pages they affect.

Each issue has a “Why and how to fix it” link. Click it, and you’ll get a short description, tips on fixing it, and useful links to tools or resources.

Issues fall into two categories:

  1. Issues you can fix on your own
  2. Issues needing help from a developer or system administrator

If you’re new to auditing, it might seem overwhelming. That’s why we made this guide to help beginners and ensure they don’t miss anything important.

How to Find and Fix Technical SEO Issues

We suggest doing a technical SEO check on any new site. After that, do it at least once every quarter (preferably monthly), or when you notice a drop in rankings.

1. Identify and Resolve Issues with Crawlability and Indexability

For search engines like Google to rank your webpages, they first need to be able to explore and index them. This makes crawlability and indexability crucial for SEO.

To check if your site has any problems with crawlability or indexability, follow these steps in the Site Audit:

  1. Go to the “Issues” tab.
  2. Click on “Category” and choose “Crawlability.”
  3. Do the same for the “Indexability” category.

Issues related to crawlability and indexability often appear at the top of the results, specifically in the “Errors” section. These problems are usually serious and impact how well your site performs on search engines. In this guide, we’ll tackle various issues related to crawlability and indexability since many technical SEO issues are linked to these aspects.

Now, let’s focus on two crucial website files—robots.txt and sitemap.xml. These files play a significant role in how search engines discover and navigate your site.

Check and Resolve Robots.txt Concerns

The robots.txt file is a text document on your website that guides search engines on which pages to crawl or avoid. You can typically find it in the root folder of your site: https://domain.com/robots.txt.

The robots.txt file helps you:

  1. Direct search engine bots away from private folders.
  2. Prevent bots from putting too much load on your server resources.
  3. Specify where your sitemap is located.

Just one line in robots.txt can stop search engines from crawling your entire site. So, it’s crucial to ensure that your robots.txt file doesn’t block any folder or page you want to show up in search results.
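If you want to verify this yourself outside of Site Audit, Python’s standard library ships a robots.txt parser. Here’s a minimal sketch; domain.com and the URL list are placeholders for your own site:

```python
# Check whether key URLs are crawlable under your robots.txt rules.
# Uses only the Python standard library; replace the placeholder domain.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://domain.com/robots.txt")
parser.read()  # fetches and parses the live file

# URLs you expect search engines to be able to crawl
urls_to_check = [
    "https://domain.com/",
    "https://domain.com/blog/",
    "https://domain.com/products/",
]

for url in urls_to_check:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{'OK' if allowed else 'BLOCKED'} {url}")
```

If any URL you care about prints as BLOCKED, review the matching Disallow rule before doing anything else.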

To check your robots.txt file in Site Audit:

  1. Scroll down to the “Robots.txt Updates” box at the bottom.
  2. Check if the crawler has detected the robots.txt file on your website.
  3. If the file status is “Available,” review it by clicking the link icon.
  4. Alternatively, focus on the changes since the last crawl by clicking “View changes.”

Reviewing and fixing the robots.txt file in depth requires some technical knowledge. Always follow Google’s robots.txt guidelines, and see our robots.txt guide to learn its syntax and best practices.

To uncover more issues, go to the “Issues” tab and search for “robots.txt.” Possible problems include format errors, not indicating the sitemap.xml in robots.txt, or blocking internal resources. Click the link with the number of found issues to inspect and learn how to fix them.

Additional Guidance for Search Engines

In addition to the robots.txt file, there are two other ways to give instructions to search engine crawlers: the robots meta tag and the X-Robots-Tag HTTP header. Site Audit will notify you about any issues related to these tags. To learn how to use them effectively, refer to our guide on robots meta tags.

Identify and Resolve XML Sitemap Problems

An XML sitemap is a file that lists all the pages you want search engines to index and, ideally, rank.

Make it a routine to review your XML sitemap during every technical SEO audit to ensure it includes all the pages you want to rank.

Conversely, it’s crucial to verify that the sitemap doesn’t include pages you don’t want to appear in the search engine results pages (SERPs), such as login pages, customer account pages, or gated content.

Note: If your site lacks a sitemap.xml file, check our guide on creating one.

Afterward, confirm that your sitemap is functioning correctly.
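One quick way to sanity-check a sitemap yourself is to fetch it and confirm every listed URL returns a 200 status. Below is a rough sketch that assumes a single standard sitemap file (a sitemap index would need one more level of parsing) and the requests package; domain.com is a placeholder:

```python
# Fetch sitemap.xml, extract each <loc> URL, and flag any that
# don't return HTTP 200. Requires the `requests` package.
import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "https://domain.com/sitemap.xml"  # placeholder domain
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
urls = [loc.text.strip() for loc in root.findall(".//sm:loc", NS) if loc.text]
print(f"Sitemap lists {len(urls)} URLs")

for url in urls:
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    if status != 200:  # redirects and errors don't belong in a sitemap
        print(status, url)
```

Anything that prints here is either a page to fix or a URL to remove from the sitemap.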

The Site Audit tool can identify common sitemap-related issues, including:

  1. Incorrect pages in your sitemap
  2. Format errors in your sitemap

Simply go to the “Issues” tab and type “sitemap” in the search field to locate and address these issues.

You can also use Google Search Console to spot sitemap problems.

Visit the “Sitemaps” report to submit your sitemap to Google, check your submission history, and review any errors. Find it under the “Indexing” section in the left-hand menu.

If “Success” is listed next to your sitemap, there are no errors. However, if it says “Has errors” or “Couldn’t fetch,” there’s an issue.

In case of problems, the report will flag them individually. Follow Google’s troubleshooting guide to fix them.

For additional reading: Explore more about XML sitemaps.

2. Evaluate Your Site’s Structure

Site architecture involves the arrangement of your webpages and how they are linked. It’s crucial to organize your website logically for users and easy maintenance as your site grows.

Good site architecture matters for two reasons:

  1. It aids search engines in crawling and understanding the connections between your pages.
  2. It helps users navigate your site effectively.

Let’s explore three key aspects of site architecture.

Site Hierarchy: Site hierarchy, or site structure, is the organization of your pages into subfolders. To assess your site’s hierarchy:

  1. Go to the “Crawled Pages” tab in Site Audit.
  2. Switch the view to “Site Structure.”

Review subdomains and subfolders for an organized hierarchy. Aim for a flat site architecture, where users can reach any page within three clicks of the homepage. If it takes more clicks, your hierarchy may be too deep, and search engines may treat deeper pages as less important. To find pages that are buried too deep, go to “Crawled Pages,” switch to the “Pages” view, click “More filters,” and set “Crawl Depth” to “4+ clicks.”
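If you’d like a rough do-it-yourself view of click depth, a small breadth-first crawl from the homepage can approximate it. This sketch assumes the requests and beautifulsoup4 packages, uses a placeholder domain, and caps the crawl so it stays polite:

```python
# Estimate click depth with a small breadth-first crawl from the
# homepage. A rough sketch: it only follows same-site <a href> links
# and caps the number of pages discovered.
from collections import deque
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

START = "https://domain.com/"  # placeholder domain
MAX_PAGES = 200

site = urlparse(START).netloc
depth = {START: 0}
queue = deque([START])

while queue and len(depth) < MAX_PAGES:
    url = queue.popleft()
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        continue
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        link = urljoin(url, a["href"]).split("#")[0]
        if urlparse(link).netloc == site and link not in depth:
            depth[link] = depth[url] + 1
            queue.append(link)

# Pages four or more clicks deep may be treated as less important
for url, d in sorted(depth.items(), key=lambda kv: -kv[1]):
    if d >= 4:
        print(d, url)
```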

Navigation: Your site’s navigation, like menus and breadcrumbs, should simplify user navigation. Ensure it is:

  1. Simple, avoiding complex menus or unusual menu names.
  2. Logical, reflecting the page hierarchy, often achieved with breadcrumbs.

Messy architecture makes navigation challenging, while a clear navigation system helps both users and search engine bots understand your site. Review your menus manually against UX best practices to keep them user-friendly.

URL Structure: Similar to hierarchy, a site’s URL structure should be consistent and straightforward. For example:

If a user navigates: Homepage > Children > Girls > Footwear

The URL should reflect the structure: domain.com/children/girls/footwear.

Consider using a URL structure indicating relevance to a specific country, like “domain.com/ca” for Canada.

Ensure user-friendly URL slugs and use Site Audit to spot common URL issues, such as underscores, excessive parameters, or overly long URLs.
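As a quick illustration, here’s a hypothetical helper that flags those common slug problems. The thresholds (two query parameters, 100 characters) are illustrative rules of thumb, not official limits:

```python
# Flag common URL-slug problems: underscores, too many query
# parameters, and overly long URLs. Thresholds are illustrative.
from urllib.parse import urlparse, parse_qs

def url_issues(url: str) -> list[str]:
    parsed = urlparse(url)
    issues = []
    if "_" in parsed.path:
        issues.append("underscores in slug (prefer hyphens)")
    if len(parse_qs(parsed.query)) > 2:
        issues.append("excessive query parameters")
    if len(url) > 100:
        issues.append("URL longer than 100 characters")
    return issues

print(url_issues("https://domain.com/children/girls/footwear"))  # []
print(url_issues("https://domain.com/kids_shoes?sort=asc&color=red&size=4"))
```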

3. Resolve Internal Linking Problems

Internal links are connections from one page to another within your website. They play a crucial role in building a solid website architecture, distributing link equity (also known as “link juice” or “authority”) across your pages to help search engines identify important content.

As you enhance your site’s structure, ensuring both search engines and users can easily find content, it’s essential to check and maintain the health of your site’s internal links.

Refer to the Site Audit report and click “View details” under your “Internal Linking” score. This report provides a breakdown of internal link issues.

Tip: Explore Semrush’s study on common internal linking mistakes and how to fix them.

One common issue is broken internal links, which point to pages that no longer exist. Click the number of issues in the “Broken internal links” error, and manually update the broken links listed.
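If you want to spot-check a page yourself, a short script can fetch it, collect its internal links, and flag any that return an error status. A minimal sketch assuming requests and beautifulsoup4, with a placeholder page URL:

```python
# Spot-check broken internal links: fetch one page, collect its
# same-site links, and report any that return a 4XX or 5XX status.
# A small sketch, not a full crawler.
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

PAGE = "https://domain.com/blog/"  # placeholder page to check
site = urlparse(PAGE).netloc

soup = BeautifulSoup(requests.get(PAGE, timeout=10).text, "html.parser")
links = {urljoin(PAGE, a["href"]) for a in soup.find_all("a", href=True)}

for link in sorted(links):
    if urlparse(link).netloc != site:
        continue  # only check internal links
    try:
        status = requests.head(link, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = "unreachable"
    if status == "unreachable" or status >= 400:
        print(status, link)
```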

Another fixable issue is orphaned pages—pages without any links pointing to them. Check the “Internal Links” graph for pages with zero links and add at least one internal link to each.

Lastly, use the “Internal Link Distribution” graph to assess Internal LinkRank (ILR) distribution. A higher ILR indicates stronger internal linking. Identify pages that could benefit from more internal links to distribute link equity effectively.

To avoid such issues in the future, follow internal linking best practices:

  1. Integrate internal linking into your content creation strategy.
  2. When creating a new page, link to it from existing pages.
  3. Avoid linking to URLs with redirects; link to the redirect destination instead.
  4. Link to relevant pages with appropriate anchor text.
  5. Use internal links to highlight important pages for search engines.
  6. Exercise caution with the number of internal links—common sense applies.
  7. Understand and correctly use nofollow attributes.

By adhering to these best practices, you can maintain a healthy and effective internal linking structure on your website.

4. Identify and Resolve Duplicate Content Problems

Duplicate content occurs when multiple webpages have identical or very similar content, leading to various issues, such as:

  1. Incorrect page versions displaying in search engine results pages (SERPs).
  2. Poor SERP performance or indexing problems for affected pages.

Site Audit flags pages as duplicate content if their content is at least 85% identical. This typically happens for two reasons:

1. Multiple Versions of URLs: Common instances include HTTP and HTTPS versions or www and non-www versions. Google treats these as different sites. To resolve:

  • Choose a preferred version.
  • Set up a sitewide 301 redirect to ensure only one version is accessible.

2. URL Parameters: These are additional elements in a URL used for filtering or sorting content. Recognizable by a question mark and equal sign, they’re often used for product pages with slight variations.

  • Google usually groups these pages and selects the best one for search results.
  • Reduce unnecessary parameters and use canonical tags pointing to URLs without parameters to minimize potential issues.
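To confirm your sitewide redirect is working, you can request each URL variant and verify it lands on the preferred version via a single 301. A small sketch using the requests package, with placeholder URLs:

```python
# Confirm that every URL variant (HTTP/HTTPS, www/non-www) 301-redirects
# to a single preferred version. Placeholder domain; adjust as needed.
import requests

PREFERRED = "https://domain.com/"
variants = [
    "http://domain.com/",
    "http://www.domain.com/",
    "https://www.domain.com/",
]

for url in variants:
    r = requests.get(url, allow_redirects=True, timeout=10)
    hops = [h.status_code for h in r.history]  # redirect chain statuses
    ok = r.url == PREFERRED and all(code == 301 for code in hops)
    print(f"{'OK ' if ok else 'FIX'} {url} -> {r.url} via {hops}")
```

Ideally each variant reaches the preferred URL in exactly one 301 hop; anything longer is a redirect chain worth cleaning up.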

During your SEO audit, customize the “Remove URL parameters” section to exclude pages with parameters from crawling. This ensures the Site Audit tool analyzes only the pages you want.

Access these settings later by clicking the gear icon in the top-right corner, then “Crawl sources: Website” under Site Audit settings. Following these steps helps maintain content integrity and improves your site’s performance in search engine rankings.

5. Evaluate Your Site’s Performance

Site speed is a crucial aspect of the overall page experience and a key factor in Google rankings. It involves two main measurements:

  1. Page Speed: The time it takes for a single webpage to load.
  2. Site Speed: The average page speed for a sample set of page views on a site.

Improving individual page speed contributes to enhanced overall site speed. Google’s PageSpeed Insights is a specialized tool for this purpose, focusing on three critical metrics called Core Web Vitals:

  1. Largest Contentful Paint (LCP): Measures how quickly the main content of your page loads.
  2. First Input Delay (FID): Measures how swiftly your page becomes interactive.
  3. Cumulative Layout Shift (CLS): Measures how visually stable your page is.
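PageSpeed Insights also has a public v5 API, so you can pull lab values for LCP and CLS programmatically (FID is a field metric, so it only appears when Google has enough real-user data for your page). A minimal sketch with a placeholder URL; an API key is generally only needed for heavier use:

```python
# Query the PageSpeed Insights v5 API for lab values of LCP and CLS.
# The page URL is a placeholder; add an API key for heavier usage.
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
resp = requests.get(
    API,
    params={"url": "https://domain.com/", "strategy": "mobile"},
    timeout=60,
).json()

# Lab metrics live under lighthouseResult; these audit ids are standard
audits = resp["lighthouseResult"]["audits"]
for metric in ("largest-contentful-paint", "cumulative-layout-shift"):
    print(metric, "=", audits[metric]["displayValue"])
```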

To get a comprehensive view of your site’s performance, you can utilize Google Search Console or a website audit tool like Semrush’s Site Audit.

In Site Audit, go to the “Issues” tab and choose the “Site Performance” category. This section reveals all pages affected by specific issues, such as slow load speed.

Two detailed reports—“Site Performance” and “Core Web Vitals”—provide further insights. The “Site Performance” report includes a “Site Performance Score” and a breakdown of pages by load speed. The “Core Web Vitals” report breaks down Core Web Vitals metrics based on 10 URLs, allowing you to track performance over time.

For a varied analysis covering different page types on your site (e.g., blog posts, landing pages, and product pages), click “Edit list” in the “Analyzed pages” section.

Further Reading: Site performance is a critical aspect of technical SEO. For more in-depth knowledge, explore our page speed guide and detailed guide to Core Web Vitals.

6. Uncover Mobile-Friendliness Concerns

With more than half of web traffic occurring on mobile devices, ensuring your website functions seamlessly on mobile is crucial. Google primarily indexes the mobile version of websites through mobile-first indexing.

To assess mobile-friendliness, Google Search Console offers a valuable “Mobile Usability” report. It categorizes your pages into “Not Usable” and “Usable,” and a section titled “Why pages aren’t usable on mobile” details detected issues.

For a quick check of mobile usability for a specific URL, you can use Google’s Mobile-Friendly Test. In Semrush’s Site Audit tool, explore the “Mobile SEO” category in the “Issues” tab to assess two vital aspects: the viewport meta tag and Accelerated Mobile Pages (AMPs).

1. Viewport Meta Tag:

  • This HTML tag tells browsers how to scale your page to different screen sizes, so a responsive design adjusts automatically to the user’s device (a quick check is sketched at the end of this section).

2. Accelerated Mobile Pages (AMPs):

  • These are streamlined versions of your pages designed to load quickly on mobile devices. Google serves AMP pages from its cache, which enhances mobile performance.

For AMP pages, Site Audit tests for issues in three categories: AMP HTML issues, AMP style and layout issues, and AMP templating issues. Regularly audit your AMP pages to ensure correct implementation and optimize mobile visibility.
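Here’s the quick viewport check mentioned above: fetch a page and confirm it declares a responsive viewport meta tag. A sketch assuming requests and beautifulsoup4, with a placeholder URL:

```python
# Check whether a page declares a responsive viewport meta tag.
# Requires requests + beautifulsoup4; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

soup = BeautifulSoup(requests.get("https://domain.com/", timeout=10).text,
                     "html.parser")
tag = soup.find("meta", attrs={"name": "viewport"})

if tag and "width=device-width" in tag.get("content", ""):
    print("Responsive viewport tag found:", tag.get("content"))
else:
    print("Missing or incomplete viewport meta tag")
```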

Further Reading: Explore more about Accelerated Mobile Pages (AMPs) for additional insights into enhancing mobile performance.

7. Identify and Resolve Code Issues

Despite how a webpage appears to human eyes, search engines interpret it solely as code. Proper syntax, relevant tags, and attributes are crucial for search engines to understand your site. During your technical SEO audit, pay attention to HTML, JavaScript, and structured data in your website code.

Meta Tag Issues:

  • Meta tags offer additional data to search engine bots about a page’s content. Key meta tags include the title tag (forming the clickable link in search results) and meta description (providing a brief page snippet). In Site Audit, explore the “Meta tags” category in the “Issues” tab to identify and address issues related to these tags.

Canonical Tag Issues:

  • Canonical tags indicate the main version of a page to search engines, crucial for handling duplicate or similar content. Site Audit detects issues such as missing or broken canonical tags. Navigate to “Issues” and select the “Canonicalization” category to view and resolve these concerns.

Hreflang Attribute Issues:

  • Hreflang attributes specify a page’s target region and language, aiding search engines in serving the correct variations based on user location and language preferences. In Site Audit, check the “International SEO” thematic report for an overview of hreflang issues.
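The three tag types above are easy to review together. This sketch pulls the title, meta description, canonical URL, and hreflang annotations from a single page for manual inspection; it assumes requests and beautifulsoup4 and uses a placeholder URL:

```python
# Pull the title, meta description, canonical URL, and hreflang
# annotations from a page for a quick manual review.
import requests
from bs4 import BeautifulSoup

soup = BeautifulSoup(requests.get("https://domain.com/", timeout=10).text,
                     "html.parser")

desc = soup.find("meta", attrs={"name": "description"})
canonical = soup.find("link", rel="canonical")

print("Title:      ", soup.title.string if soup.title else "MISSING")
print("Description:", desc.get("content") if desc else "MISSING")
print("Canonical:  ", canonical.get("href") if canonical else "MISSING")
for link in soup.find_all("link", rel="alternate", hreflang=True):
    print("Hreflang:   ", link["hreflang"], "->", link.get("href"))
```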

JavaScript Issues:

  • JavaScript, used for interactive page elements, is crucial for proper page indexing. Site Audit identifies broken JavaScript files, helping you ensure Google renders pages correctly. In Google Search Console, utilize the “URL Inspection Tool” to check how Google renders JavaScript-enabled pages.

Structured Data Issues:

  • Structured data, organized in a specific code format like Schema.org, provides additional information to search engines. It enhances indexing and enables SERP features. Use Google’s Rich Results Test tool to check eligibility for rich results and the “Markup” thematic report in Site Audit to identify structured data issues.
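For a quick look at which schema types a page declares, you can list its JSON-LD blocks. A minimal sketch (placeholder URL, requests and beautifulsoup4 assumed); invalid JSON here is exactly the kind of issue the reports above flag:

```python
# List JSON-LD structured data blocks on a page to see which schema
# types it declares.
import json
import requests
from bs4 import BeautifulSoup

soup = BeautifulSoup(requests.get("https://domain.com/", timeout=10).text,
                     "html.parser")

for script in soup.find_all("script", type="application/ld+json"):
    try:
        data = json.loads(script.string or "")
    except json.JSONDecodeError:
        print("Invalid JSON-LD block found")  # a fixable markup issue
        continue
    items = data if isinstance(data, list) else [data]
    for item in items:
        print("@type:", item.get("@type"))
```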

Note: Each issue highlighted in the audit report includes links to relevant documentation, aiding in issue resolution. Regularly addressing these code-related issues enhances your site’s visibility and performance in search engine rankings.

8. Ensure Secure HTTPS Implementation

Your website should operate on the HTTPS protocol, which indicates a secure server with an SSL certificate. This establishes the site’s legitimacy and builds trust with users, shown as a padlock icon in the browser’s address bar.

HTTPS is a confirmed Google ranking signal, enhancing your site’s visibility. During your technical SEO audit, address HTTPS issues using the following steps:

  1. Access the HTTPS Report:
    • Open the “HTTPS” report in the Site Audit overview. This report provides a comprehensive list of HTTPS-related issues, including affected URLs and guidance on resolution.
  2. Common HTTPS Issues:
    • The report highlights common issues, such as:
    • Expired Certificate: Indicates if your security certificate needs renewal.
    • Old Security Protocol Version: Informs you if your site uses outdated SSL or TLS protocols.
    • No Server Name Indication (SNI): Checks if your server supports SNI for enhanced security.
    • Mixed Content: Identifies unsecured content, triggering a “not secure” warning in browsers.
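You can check certificate expiry yourself with Python’s standard library. A minimal sketch; domain.com is a placeholder hostname:

```python
# Check when a site's TLS certificate expires, using only the
# standard library. The hostname is a placeholder.
import socket
import ssl
from datetime import datetime

HOST = "domain.com"

ctx = ssl.create_default_context()
with ctx.wrap_socket(socket.create_connection((HOST, 443)),
                     server_hostname=HOST) as sock:
    cert = sock.getpeercert()

# notAfter looks like "May 30 00:00:00 2025 GMT"
expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
days_left = (expires - datetime.utcnow()).days
print(f"Certificate for {HOST} expires {expires:%Y-%m-%d} ({days_left} days left)")
```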

Implementing HTTPS is crucial not only for security but also for SEO, aligning with Google’s ranking preferences. Regularly monitor and resolve HTTPS issues to maintain a secure and trusted online presence.

9. Identify and Resolve HTTP Status Code Issues

HTTP status codes communicate a website server’s response to a browser’s request to load a page. While 1XX and 2XX statuses indicate informational and successful responses, respectively, your audit should focus on issues related to 3XX, 4XX, and 5XX statuses.

  1. Access the HTTP Status Report:
    • Open the “Issues” tab in Site Audit and select the “HTTP Status” category in the top filter.
  2. Review HTTP Status Issues:
    • Explore the listed issues and warnings related to HTTP statuses. Click on a specific issue to view affected pages.
  3. 3XX Status Codes: Redirects
    • 3XX codes signify redirects, guiding users and search engine crawlers to a new page. Ensure correct usage to prevent problems. Common redirect issues include:
      • Redirect chains: Multiple redirects between the original and final URL.
      • Redirect loops: Circular redirection between two URLs.
    • Audit redirects using Site Audit and address any flagged issues.
  4. 4XX Status Codes: Page Not Found
    • 4XX errors, notably 404 (Page not found), indicate inaccessible pages. If Site Audit detects 4XX pages:
      • Open the issue to see affected URLs.
      • Click “View broken links” to identify internal links pointing to the 4XX pages.
      • Remove or replace those internal links to resolve the 4XX errors.
  5. 5XX Status Codes: Server-Side Errors
    • 5XX errors occur on the server side, indicating server unavailability or configuration issues. Common causes include temporary downtime, incorrect configuration, or server overload.
    • Investigate the reasons behind 5XX errors and implement fixes where possible.
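To trace a redirect chain by hand, request each hop without letting the client follow redirects automatically. A sketch using the requests package; the starting URL is a placeholder:

```python
# Trace a redirect chain hop by hop and flag chains and loops.
# Requires the requests package; the starting URL is a placeholder.
from urllib.parse import urljoin
import requests

def trace(url: str, max_hops: int = 10) -> None:
    seen = []
    while len(seen) < max_hops:
        r = requests.get(url, allow_redirects=False, timeout=10)
        print(r.status_code, url)
        if r.status_code not in (301, 302, 303, 307, 308):
            break
        url = urljoin(url, r.headers["Location"])
        if url in seen:
            print("Redirect loop detected at", url)
            break
        seen.append(url)
    if len(seen) > 1:
        print(f"Chain of {len(seen)} redirects; link directly to the final URL")

trace("http://www.domain.com/old-page")  # placeholder
```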

Understanding and resolving HTTP status code issues is essential for maintaining a smooth user experience and ensuring search engine compatibility. Regularly monitor and address these issues to uphold the integrity of your website.

10. Perform Log File Analysis for Comprehensive SEO Insights

Log file analysis provides valuable insights into your website’s interactions with users and search engine bots. Utilizing a tool like Semrush’s Log File Analyzer streamlines the process, offering a detailed report on Googlebot activity. Follow these steps to conduct log file analysis:

  1. Access Log File:
    • Retrieve a copy of your website’s access log file. You can find it in your server’s file manager or use an FTP client.
  2. Upload to Log File Analyzer:
    • Use Semrush’s Log File Analyzer tool to upload your log file for analysis.
  3. Analysis Report:
    • The tool will generate a comprehensive report, providing insights into various aspects of your website’s performance.
    • Key questions addressed by the analysis include:
      • Are errors hindering complete website crawling?
      • Which pages receive the highest crawl frequency?
      • Are there pages that Googlebot is not crawling?
      • Do structural issues impact page accessibility?
      • How efficiently is your crawl budget allocated?
  4. Actionable Insights:
    • Leverage the insights gained from log file analysis to refine your SEO strategy and address specific issues affecting crawling, indexing, and user experience.
  5. Example Use Case:
    • If errors preventing Googlebot from crawling your site are identified, take necessary steps to resolve these errors. This proactive approach ensures optimal website performance.
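If you prefer to poke at the raw log yourself first, a short script can summarize Googlebot activity. This sketch assumes a combined-format access log named access.log (adjust the path) and simply matches lines containing the Googlebot user agent string; for a rigorous audit you’d also verify the bot via reverse DNS:

```python
# Summarize Googlebot activity from a combined-format access log:
# hits per URL and the status codes returned. Log path is a placeholder.
import re
from collections import Counter

LINE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) [^"]*" (?P<status>\d{3})')

paths, statuses = Counter(), Counter()
with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        m = LINE.search(line)
        if m:
            paths[m.group("path")] += 1
            statuses[m.group("status")] += 1

print("Status codes:", dict(statuses))
print("Most-crawled paths:")
for path, hits in paths.most_common(10):
    print(f"{hits:5d}  {path}")
```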

Conclusion: A thorough technical SEO audit, including log file analysis, can significantly improve your search engine performance. Use tools like Semrush’s Site Audit and Log File Analyzer to identify, address, and monitor issues, ultimately enhancing your website’s SEO health. Get started today and watch your site’s search engine visibility improve over time.