
Robots.txt Generator

Generate a valid robots.txt file with allow/disallow rules and a sitemap reference.


The Robots.txt Generator builds a clean, valid robots.txt file with allow/disallow rules and a sitemap reference — ready to paste into your site root.

What is the Robots.txt Generator?

robots.txt tells search engine crawlers which parts of your site they may and may not crawl. A misconfigured file can quietly block crawlers from your entire site. This tool produces a syntactically clean file that follows the Robots Exclusion Protocol as formalized in RFC 9309 (2022).
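For reference, a minimal valid robots.txt that allows all crawlers and points to a sitemap looks like this (the domain is a placeholder):

```
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml
```

An empty `Disallow:` value means "nothing is disallowed" for that user-agent group.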

How to use the Robots.txt Generator

Steps

  1. Choose whether to allow or block all crawlers by default.
  2. Add disallow paths (one per line).
  3. Paste your sitemap URL.
  4. Click Generate and copy the result to /robots.txt.
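Following the steps above with a default allow, two disallow paths, and a sitemap URL produces output along these lines (paths and domain are illustrative):

```
User-agent: *
Disallow: /search/
Disallow: /tmp/

Sitemap: https://example.com/sitemap.xml
```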

Benefits

  • Avoid accidental site-wide blocks.
  • Direct crawlers to your sitemap.
  • Block low-value paths from being crawled.

Use cases

  • New site launch.
  • WooCommerce sites blocking /cart/ and /checkout/.
  • Keeping crawlers off a staging environment.
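For the WooCommerce case above, a common pattern is to disallow the cart and checkout flows while leaving product pages crawlable (`/my-account/` and the sitemap URL are typical placeholders; adjust to your setup):

```
User-agent: *
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/

Sitemap: https://example.com/sitemap.xml
```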

Pro tips

  • Always include a Sitemap directive.
  • Never block /wp-content/uploads/; doing so keeps your images out of image search results.
  • Test changes with the Robots.txt Tester before deploying.
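Combining the first two tips, a WordPress-style sketch can block most of /wp-content/ while keeping uploads crawlable. Under RFC 9309 matching, the most specific (longest) matching rule wins, so the `Allow` line overrides the broader `Disallow`:

```
User-agent: *
Disallow: /wp-content/
Allow: /wp-content/uploads/

Sitemap: https://example.com/sitemap.xml
```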

Frequently asked questions

Where does robots.txt live?

It must be at the site root: https://example.com/robots.txt.

Can robots.txt deindex pages?

No. robots.txt blocks crawling, not indexing; a blocked URL can still be indexed if other pages link to it. To remove a page from the index, use a noindex meta tag or header on a page that crawlers are allowed to fetch.

Should I block AI crawlers?

That is a policy choice. The generator supports custom user-agent rules.
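If you do choose to block them, each crawler gets its own user-agent group. GPTBot and CCBot are two commonly cited AI crawler user-agents; verify current names against each vendor's documentation before deploying:

```
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: *
Disallow:
```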
