
Robots.txt Tester

Paste a robots.txt and a URL to verify which user-agent rule applies.


The Robots.txt Tester accepts any robots.txt file and any URL, and tells you exactly which rule applies and whether the URL is allowed or blocked.

What is the Robots.txt Tester?

Manually reading robots.txt is error-prone — order, wildcards, and longest-match rules trip up even experienced developers. This tester applies the same matching logic Google uses, so you can validate changes before they break crawl coverage.

How to use the Robots.txt Tester

Steps

  1. Paste the contents of your robots.txt.
  2. Paste the URL you want to test.
  3. Optionally choose a user-agent.
  4. Click Test for an Allowed / Blocked verdict and the matching rule.
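The verdict the tool produces can be sketched in plain Python. This is a simplified illustration of Google-style matching, not the tool's actual implementation, and the function names are ours: each pattern is translated to a regex (`*` becomes `.*`, `$` becomes an end anchor), the longest matching pattern decides, and Allow beats Disallow on a tie.

```python
import re

def pattern_to_regex(pattern):
    """Translate a robots.txt path pattern into a compiled regex:
    '*' matches any character sequence, '$' anchors the end,
    everything else is matched literally."""
    out = []
    for ch in pattern:
        if ch == "*":
            out.append(".*")
        elif ch == "$":
            out.append("$")
        else:
            out.append(re.escape(ch))
    return re.compile("".join(out))

def is_allowed(rules, path):
    """rules: (directive, pattern) pairs from the group that matched the
    chosen user-agent, e.g. [("disallow", "/private/")].
    The longest matching pattern wins; Allow beats Disallow on a tie."""
    best = None
    for directive, pattern in rules:
        if pattern and pattern_to_regex(pattern).match(path):
            candidate = (len(pattern), directive.lower() == "allow")
            if best is None or candidate > best:
                best = candidate
    return True if best is None else best[1]

rules = [("disallow", "/search"), ("allow", "/search/about")]
print(is_allowed(rules, "/search/about"))  # True: the Allow rule is longer
print(is_allowed(rules, "/search?q=x"))    # False
```

A URL with no matching rule is allowed by default, which is why an empty or missing robots.txt blocks nothing.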

Benefits

  • Validate robots.txt before deploying.
  • Debug crawl errors in Search Console.
  • Avoid costly site-wide blocks.

Use cases

  • Pre-deploy verification.
  • Investigating dropped pages in GSC.
  • Site migrations, where changed URL paths can silently collide with old rules.

Pro tips

  • Test the most-restrictive rule first.
  • Remember: the longest path-match wins.
  • The wildcard * and the end-anchor $ follow Google’s extended syntax.

Frequently asked questions

Does the order of rules matter?

For Google, the longest matching path wins regardless of order.
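A quick sketch makes this concrete (assuming simple prefix rules with no wildcards; `verdict` is a hypothetical helper, not part of the tool): reversing the rule list leaves the result unchanged, because only pattern length matters.

```python
def verdict(rules, path):
    # Longest matching pattern wins; on a tie, Allow beats Disallow.
    hits = [(len(p), d == "allow") for d, p in rules if path.startswith(p)]
    return (not hits) or max(hits)[1]

a = [("disallow", "/fish"), ("allow", "/fish/salmon")]
b = list(reversed(a))
print(verdict(a, "/fish/salmon"), verdict(b, "/fish/salmon"))  # True True
```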

How do wildcards work?

* matches any sequence of characters; $ anchors the end of the URL.
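In regex terms, the translation looks roughly like this (a sketch under the assumption of Google's extended syntax; `to_regex` is our name). The `.pdf` example shows why the `$` anchor matters: without it, a query string would still match.

```python
import re

def to_regex(pattern):
    # '*' -> '.*', '$' -> end-of-string anchor, everything else literal
    parts = [".*" if c == "*" else ("$" if c == "$" else re.escape(c))
             for c in pattern]
    return re.compile("".join(parts))

pdf_rule = to_regex("/*.pdf$")  # e.g. from "Disallow: /*.pdf$"
print(bool(pdf_rule.match("/files/report.pdf")))      # True
print(bool(pdf_rule.match("/files/report.pdf?v=2")))  # False: '$' anchors the end
```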

Will Googlebot always obey?

Googlebot obeys robots.txt crawl directives, but blocking a URL does not remove it from Google's index: a blocked page can still appear in search results if other pages link to it. Use noindex or the removals tool for de-indexing.
