Technical SEO
Robots.txt Generator
Generate a valid robots.txt file with allow/disallow rules and a sitemap reference.
The Robots.txt Generator builds a clean, valid robots.txt file with allow/disallow rules and a sitemap reference — ready to paste into your site root.
What is the Robots.txt Generator?
robots.txt tells crawlers which parts of your site they may and may not fetch. A misconfigured robots.txt can quietly block crawlers from your entire site. This tool produces a syntactically clean file that follows the original Robots Exclusion Protocol and its standardized version, RFC 9309 (2022).
How to use the Robots.txt Generator
Steps
- Choose whether to allow or block all crawlers by default.
- Add disallow paths (one per line).
- Paste your sitemap URL.
- Click Generate and copy the result to /robots.txt.
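Following the steps above, the generator's output looks something like this (the /admin/ and /search/ paths and the sitemap URL are placeholders — substitute your own):

```
User-agent: *
Disallow: /admin/
Disallow: /search/

Sitemap: https://example.com/sitemap.xml
```

Blank lines separate rule groups, and the Sitemap directive can appear anywhere in the file.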
Benefits
- Avoid accidental site-wide blocks.
- Direct crawlers to your sitemap.
- Block low-value paths from being crawled.
Use cases
- New site launch.
- WooCommerce sites blocking /cart/ and /checkout/.
- Staging cleanup.
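For the WooCommerce case above, a typical rule set blocks the cart, checkout, and account pages (the exact paths depend on your permalink settings):

```
User-agent: *
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
```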
Pro tips
- Always include a Sitemap directive.
- Never block /wp-content/uploads/ — that prevents crawlers from fetching your images and can remove them from image search.
- Test changes with the Robots.txt Tester before deploying.
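Before deploying, you can also sanity-check your rules locally with Python's standard-library robots.txt parser. This sketch parses a hypothetical rule set (the paths and URLs are examples, not output from this tool) and checks which URLs a crawler may fetch:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules resembling the generator's output.
rules = """\
User-agent: *
Disallow: /cart/
Disallow: /checkout/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch(user_agent, url) returns True if the URL is crawlable.
print(parser.can_fetch("TestBot", "https://example.com/cart/"))      # blocked
print(parser.can_fetch("TestBot", "https://example.com/products/"))  # allowed
```

Note that different search engines interpret edge cases (wildcards, longest-match precedence) slightly differently, so a live tester remains the final check.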
Frequently asked questions
Where does robots.txt live?
It must be at the site root: https://example.com/robots.txt.
Can robots.txt deindex pages?
No — it blocks crawling, not indexing. A blocked URL can still be indexed if other pages link to it. Use noindex tags to keep pages out of the index.
Should I block AI crawlers?
That is a policy choice. The generator supports custom user-agent rules.
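If you do choose to block them, each AI crawler gets its own user-agent group. A sketch, using GPTBot (OpenAI) and CCBot (Common Crawl) as example tokens — check each vendor's documentation for its current crawler names:

```
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: *
Allow: /
```

Under RFC 9309, a crawler obeys the most specific group that matches its name, so the final wildcard group still applies to everyone else.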
Related SEO tools
- Robots.txt Tester — Test if any URL is blocked by your robots.txt.
- XML Sitemap Generator — Build an XML sitemap from your URL list.
- Canonical Tag Checker — Inspect the canonical tag of any URL.
- Schema Markup Generator — Generate JSON-LD for Article, Product, FAQ, and more.