Technical SEO
Robots.txt Tester
Paste a robots.txt and a URL to verify which user-agent rule applies.
The Robots.txt Tester accepts any robots.txt file and any URL, and tells you exactly which rule applies and whether the URL is allowed or blocked.
What is the Robots.txt Tester?
Manually reading robots.txt is error-prone — order, wildcards, and longest-match rules trip up even experienced developers. This tester applies the same matching logic Google uses, so you can validate changes before they break crawl coverage.
How to use the Robots.txt Tester
Steps
- Paste the contents of your robots.txt.
- Paste the URL you want to test.
- Optionally choose a user-agent.
- Click Test for an Allowed / Blocked verdict and the matching rule.
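If you would rather script this check before deploying, Python's standard library includes a basic robots.txt parser. A minimal sketch (the `robots_txt` sample and URLs below are illustrative): note that `urllib.robotparser` applies rules in file order with first match winning and does not implement Google's `*` and `$` wildcard syntax, so treat it as a rough pre-deploy check rather than a full emulation of Google's matcher.

```python
# Quick allowed/blocked check with Python's standard library.
# Caveats: urllib.robotparser uses plain prefix matching, applies rules
# in file order (first match wins), and does not support Google's
# * and $ wildcards. Place Allow lines before overlapping Disallow
# lines when testing with this sketch.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Allow: /private/press/
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/private/press/"))  # True
print(parser.can_fetch("Googlebot", "https://example.com/private/notes"))   # False
```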
Benefits
- Validate robots.txt before deploying.
- Debug crawl errors in Search Console.
- Avoid costly site-wide blocks.
Use cases
- Pre-deploy verification.
- Investigating dropped pages in GSC.
- Migrations.
Pro tips
- Test your most-restrictive Disallow rules first; a single over-broad pattern can block the whole site.
- Remember: the longest path-match wins.
- Wildcards (*, $) follow Google’s extended syntax.
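The longest-match and wildcard rules above can be sketched in a few lines. This is a simplified illustration, not the tool's implementation: it translates `*` and `$` into a regex, picks the longest matching path, and breaks ties in favor of Allow (Google's least-restrictive tiebreak). User-agent grouping and percent-encoding are ignored, and the `rules` sample is made up.

```python
import re

def google_match(url_path, rules):
    """Return True if url_path is allowed under longest-match semantics.

    rules: (directive, path) pairs, e.g. ("Disallow", "/private/").
    Simplified sketch: user-agent grouping and percent-encoding ignored.
    """
    best_len, best_allow = -1, True  # no matching rule => allowed
    for directive, path in rules:
        if not path:  # an empty Disallow: allows everything
            continue
        # Translate Google's wildcard syntax into a regex:
        # '*' matches any run of characters, '$' anchors the URL's end.
        pattern = "".join(
            ".*" if ch == "*" else "$" if ch == "$" else re.escape(ch)
            for ch in path
        )
        if re.match(pattern, url_path):
            allow = directive.lower() == "allow"
            # Longest matching path wins regardless of order;
            # on a length tie, the least restrictive rule (Allow) wins.
            if len(path) > best_len or (len(path) == best_len and allow):
                best_len, best_allow = len(path), allow
    return best_allow

rules = [
    ("Disallow", "/private/"),
    ("Allow", "/private/press/"),
    ("Disallow", "/*.pdf$"),
]
print(google_match("/private/press/launch", rules))  # True: Allow path is longer
print(google_match("/private/notes", rules))         # False
print(google_match("/docs/report.pdf", rules))       # False: /*.pdf$ matches
```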
Frequently asked questions
Does the order of rules matter?
For Google, the longest matching path wins regardless of order.
How do wildcards work?
* matches any sequence of characters; $ anchors the end of the URL. For example, Disallow: /*.pdf$ blocks /report.pdf but not /report.pdf?v=2, because the query string means the URL no longer ends in .pdf.
Will Googlebot always obey?
Googlebot honors robots.txt crawl directives, but blocking a URL does not remove it from the index; already-indexed pages can still appear in results. Use noindex or a removal request for deindexing.
Related SEO tools
- Robots.txt Generator — Build a clean robots.txt for any site.
- XML Sitemap Generator — Build an XML sitemap from your URL list.
- Canonical Tag Checker — Inspect the canonical tag of any URL.
- Schema Markup Generator — Generate JSON-LD for Article, Product, FAQ, and more.