Robots.txt Generator

Build a robots.txt file visually by adding user-agent rules, Allow and Disallow paths, an optional crawl-delay, and a sitemap URL — then download the finished file. The output updates in real time so you can see the effect of each rule as you add it. Useful when setting up a new site or refining crawler access without memorising the robots.txt syntax.
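For reference, a file using each of the features above (user-agent rules, Allow/Disallow paths, crawl-delay, and a sitemap URL) might look like this; the domain and paths are placeholders:

```
User-agent: *
Allow: /admin/public/
Disallow: /admin/
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
```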

All processing happens in your browser. No data is sent to any server.

Frequently Asked Questions

What is a robots.txt file?
robots.txt is a plain-text file at the root of your website that instructs web crawlers which pages or directories they may or may not access.
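You can check how a set of rules will be interpreted using Python's standard-library parser. A minimal sketch, with hypothetical example.com rules (note that crawlers differ on how they resolve Allow/Disallow conflicts — Python's parser applies rules in order, so the more specific Allow is listed first):

```python
import urllib.robotparser

# Hypothetical rules like those the generator produces.
rules = """\
User-agent: *
Allow: /admin/public/
Disallow: /admin/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://example.com/admin/secret"))       # False: disallowed
print(rp.can_fetch("*", "https://example.com/admin/public/page"))  # True: Allow matches first
print(rp.can_fetch("*", "https://example.com/blog/post"))          # True: no rule matches
```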
Does Disallow prevent a page from being indexed?
Disallow blocks compliant crawlers from fetching the page, but if other sites link to it, search engines may still index the bare URL without its content. To fully prevent indexing, use a noindex meta tag instead — and leave the page crawlable, since the tag can only take effect if crawlers are able to read it.
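The noindex directive is a standard meta tag placed in the page's head:

```html
<meta name="robots" content="noindex">
```

It can also be sent as an X-Robots-Tag HTTP response header, which is useful for non-HTML resources such as PDFs.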
Where should the robots.txt file be placed?
It must be at the root of your domain — for example, https://example.com/robots.txt.

Related Tools