How to Fix Crawl Errors in Google Search Console

Crawl errors are one of those technical SEO issues that can quietly suppress your rankings for months without you realising it. They happen when Google tries to visit a page on your website and gets back an unexpected or unhelpful response. The result is that pages you want to rank either don't get indexed or get indexed inconsistently, and pages with errors dilute your overall site quality signals.

What Is a Crawl Error?

A crawl error occurs when Googlebot attempts to access a URL on your website and encounters a problem — a page that no longer exists, a server that’s temporarily unavailable, a URL that redirects incorrectly, or a page blocked by robots.txt. Not all crawl errors are equally serious. A handful of 404s on old pages nobody links to is completely normal. But systematic errors affecting important pages, or a high volume of errors across the site, needs prompt attention.

Where to Find Crawl Errors

The primary place to find and monitor crawl errors is Google Search Console. Navigate to the Indexing section, then look at the Pages report. This shows which pages are indexed and groups non-indexed pages into categories explaining why. Also use a crawl tool like Screaming Frog or Sitebulb for a more granular view — broken internal links, redirect chains, unexpected status codes. These tools can also help you find and fix broken links across your entire site.

The Most Common Crawl Errors and How to Fix Them

404 Errors (Page Not Found)

A 404 error means the server couldn’t find the page at that URL. 404s become a problem when they appear on URLs that other sites link to, that users have bookmarked, or that internal pages still link to — inbound links to 404 pages are wasted link equity. Fix: if the page has moved, set up a 301 redirect from the old URL to the new one. A 301 tells Google the page has permanently moved and passes the link equity along. Don’t redirect every 404 to your homepage — this creates a “soft 404.”
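Once redirects are in place, it's worth verifying that each old URL really does return a 301 pointing at the right destination. A minimal sketch, assuming a hypothetical `check_redirects` helper; the `fetch` callable is injected (in practice it might wrap `urllib.request` or a crawler export) so the logic itself needs no network access:

```python
# Sketch: verify that moved URLs return a 301 to the intended new URL.
# `fetch(url)` must return (status_code, location_header_or_None).
# All names here are illustrative, not from a specific library.

def check_redirects(redirect_map, fetch):
    """redirect_map: {old_url: expected_new_url}.
    Returns a list of (url, problem) tuples; empty means all good."""
    problems = []
    for old, expected in redirect_map.items():
        status, location = fetch(old)
        if status != 301:
            problems.append((old, f"expected 301, got {status}"))
        elif location != expected:
            problems.append((old, f"redirects to {location}, not {expected}"))
    return problems
```

Injecting `fetch` keeps the check testable against a stub and lets you reuse whatever HTTP client or crawl data you already have.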

Server Errors (5xx Status Codes)

5xx errors indicate a server-side problem. Consistent 5xx errors can cause Google to reduce its crawl rate for your site, slowing how quickly new content gets indexed. Check your server logs to identify what’s triggering the errors. The culprit is usually hosting resources being maxed out, database connection limits, or a plugin causing server crashes. This typically requires your developer or hosting provider.
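When digging through server logs, a quick tally of 5xx responses per URL shows which pages are failing most often. A rough sketch for combined-format (Apache/Nginx-style) access logs; the regex is deliberately simple and may need adjusting for your log format:

```python
import re
from collections import Counter

# Sketch: count 5xx responses per URL path from combined-format access
# log lines, to spot which pages fail most often. The pattern assumes
# the standard '"METHOD /path HTTP/x.y" STATUS' layout.
LOG_RE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def count_5xx(log_lines):
    counts = Counter()
    for line in log_lines:
        m = LOG_RE.search(line)
        if m and m.group("status").startswith("5"):
            counts[m.group("path")] += 1
    return counts
```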

Redirect Errors

A redirect chain is when URL A redirects to URL B, which redirects to URL C. Each hop dilutes the link equity being passed, so best practice is to redirect directly to the final destination. A redirect loop is when a chain eventually circles back to a URL it has already visited (for example, URL A redirects to URL B, which redirects back to URL A), trapping the crawler in an infinite cycle. Fix both by identifying them in Screaming Frog, then updating your .htaccess or CMS redirect settings.
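If you have a redirect map exported from your crawler or CMS, chains and loops can be found programmatically. A minimal sketch, with a hypothetical `trace_redirect` function operating on a plain `{source: destination}` dictionary:

```python
# Sketch: follow a {source: destination} redirect map and classify the
# result. A single hop is fine; more than one hop is a chain worth
# flattening; revisiting a URL means a loop. Names are illustrative.

def trace_redirect(start, redirect_map, max_hops=10):
    """Returns (path, verdict) where verdict is 'ok', 'chain', or 'loop'."""
    path = [start]
    seen = {start}
    url = start
    while url in redirect_map and len(path) <= max_hops:
        url = redirect_map[url]
        path.append(url)
        if url in seen:
            return path, "loop"
        seen.add(url)
    return path, "ok" if len(path) <= 2 else "chain"
```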

Blocked by Robots.txt

If your robots.txt has a Disallow rule that matches a URL Googlebot is trying to crawl, it stops and records it as blocked. Robots.txt blocks are often intentional — but the problem is when they accidentally block pages you want indexed. This is most common after site migrations where old robots.txt rules are copied across without review. A technical SEO audit always includes a full robots.txt review.
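You can check which URLs a given robots.txt would block using Python's standard library parser, which is handy when reviewing rules before or after a migration. The rules below are illustrative, not from any real site:

```python
from urllib.robotparser import RobotFileParser

# Sketch: check whether specific URLs would be blocked by a robots.txt,
# using the standard library's parser. Rules here are example-only.
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /search
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

for url in ("https://example.com/admin/settings",
            "https://example.com/blog/post"):
    print(url, "allowed" if parser.can_fetch("*", url) else "blocked")
```

Running every URL in your sitemap through a check like this after a migration catches copied-across Disallow rules before they cost you indexed pages.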

Soft 404s

A soft 404 is when a page returns a 200 status code but the content signals the page doesn’t exist — for example, a search results page returning “no results found.” Google detects these and excludes them from the index. Having lots of them wastes crawl budget and signals poor site quality. Fix: redirect to a relevant alternative, or return a proper 404 status.
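For triaging a crawl export, a crude heuristic can flag likely soft 404s before you inspect them by hand. A rough sketch; the phrase list and word-count threshold are illustrative guesses, not Google's actual detection logic:

```python
# Sketch: flag pages that return 200 but look like "not found" pages.
# Phrase list and min_words threshold are illustrative assumptions.
SOFT_404_PHRASES = ("no results found", "page not found", "nothing matched")

def looks_like_soft_404(status_code, page_text, min_words=30):
    if status_code != 200:
        return False  # a real 404/410 is not a *soft* 404
    text = page_text.lower()
    if any(phrase in text for phrase in SOFT_404_PHRASES):
        return True
    # Very thin pages are another common soft-404 signal.
    return len(text.split()) < min_words
```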

How to Prioritise Crawl Error Fixes

Focus first on errors affecting important business pages. Fix errors on pages that have inbound links — a 404 with backlinks is wasted authority that a redirect can recover. Investigate patterns of 5xx errors as these indicate server problems that worsen over time. Then clean up redirect chains and loops systematically.

Staying on Top of Crawl Errors

Check Google Search Console’s Pages report at least once a month and run a fresh crawl whenever you make significant changes. Set up email alerts in Search Console so you’re notified of new crawl issues. That way, you catch problems early rather than discovering months later that a key page has been returning a 404 since your last site update.
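Between full audits, a lightweight script can confirm your key pages still return 200. A minimal sketch; `check_key_pages` and the URL list are hypothetical, and `fetch_status` is injected so the logic is testable (in practice it might wrap `urllib.request.urlopen`):

```python
# Sketch of a lightweight between-audit check: confirm that important
# pages still return 200. URLs and function names are illustrative.

KEY_URLS = [
    "https://example.com/",
    "https://example.com/services",
]

def check_key_pages(urls, fetch_status):
    """fetch_status(url) -> int HTTP status.
    Returns {url: status} for every page NOT returning 200."""
    return {url: status for url in urls
            if (status := fetch_status(url)) != 200}
```

Run on a schedule (cron, CI, or an uptime monitor), a check like this surfaces a broken key page in days rather than at the next audit.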

More from the Technical SEO Series

Crawl errors are one piece of the technical SEO puzzle. For a complete picture, read The Complete Guide to Technical SEO and explore the rest of the series covering Core Web Vitals, sitemaps, canonicalisation, page speed, mobile-first indexing, and structured data.