Robots.txt Generator

Create robots.txt files for your website.

Robots.txt generator for SEO and web crawl control

The robots.txt file tells web crawlers which parts of your website they may or may not crawl. A well-configured robots.txt is essential for technical SEO: it keeps search engines from wasting crawl budget on irrelevant pages.

The main directives are User-agent, Disallow, Allow, and Sitemap. You can create specific rules for different bots, for example blocking AI crawlers like GPTBot while still allowing Googlebot.
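For illustration, a minimal robots.txt combining these directives could look like the following (the paths and sitemap URL are placeholders, not recommendations):

```text
# Default rule: applies to all crawlers
User-agent: *
Disallow: /admin/
Allow: /admin/public/

# Bot-specific rule: block OpenAI's crawler entirely
User-agent: GPTBot
Disallow: /

# Point crawlers to the sitemap
Sitemap: https://example.com/sitemap.xml
```

Rules are grouped by User-agent, and a crawler follows the most specific group that matches its name, falling back to the `*` group otherwise.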

Our generator lets you create the file visually without memorizing the syntax. It includes presets for common bots and frequently blocked paths.

Frequently asked questions

Where should I place the robots.txt file?

The robots.txt file should be in the root of your domain, accessible at the URL example.com/robots.txt. Bots look for the file at this exact location. If you use subdomains, each one needs its own robots.txt.

Does robots.txt prevent my pages from appearing on Google?

Not completely. Robots.txt blocks crawling, but Google can still index a URL it discovers through links from other sites, even though it cannot see the page's content. To prevent indexing, use the noindex meta tag on the page itself. Note that for noindex to work, the page must remain crawlable: Google has to fetch the page to see the tag, so do not also block it in robots.txt. Robots.txt and noindex serve different, complementary purposes.
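The noindex directive mentioned above is a standard meta tag placed in the page's HTML head:

```html
<!-- Keeps this page out of search results; the page must stay crawlable
     so the crawler can actually see this tag -->
<meta name="robots" content="noindex">
```

For non-HTML resources such as PDFs, the equivalent is the `X-Robots-Tag: noindex` HTTP response header.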

Can I block AI bots like ChatGPT?

Yes. You can add specific rules for User-agent: GPTBot and User-agent: ChatGPT-User with Disallow: / to block OpenAI's crawlers. Similarly, ClaudeBot is Anthropic's user-agent. Keep in mind, however, that robots.txt is purely advisory, and not all AI bots respect it.
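The rules described above would look like this in robots.txt, one group per bot:

```text
# Block OpenAI's training crawler
User-agent: GPTBot
Disallow: /

# Block the crawler used for ChatGPT browsing
User-agent: ChatGPT-User
Disallow: /

# Block Anthropic's crawler
User-agent: ClaudeBot
Disallow: /
```

Because compliance is voluntary, these rules only deter well-behaved bots; blocking at the server or firewall level is the stricter option.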

Want to learn more? Read our complete guide