Robots.txt Generator
Generate a robots.txt file to control search engine crawling.
User-agent
Crawl delay (seconds)
Disallow paths (one per line)
Allow paths (one per line)
Sitemap URL
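The generator assembles the fields above into a robots.txt file in a fixed order. A minimal sketch of that assembly logic in Python (the function name and parameters are hypothetical, not the tool's actual implementation):

```python
def build_robots_txt(user_agent="*", crawl_delay=None,
                     disallow=(), allow=(), sitemap=None):
    # Assemble robots.txt lines from the generator's inputs.
    lines = [f"User-agent: {user_agent}"]
    if crawl_delay is not None:
        lines.append(f"Crawl-delay: {crawl_delay}")
    for path in allow:
        lines.append(f"Allow: {path}")
    for path in disallow:
        lines.append(f"Disallow: {path}")
    if sitemap:
        lines.append(f"Sitemap: {sitemap}")
    return "\n".join(lines) + "\n"

print(build_robots_txt(allow=["/"],
                       disallow=["/admin/", "/private/", "/api/"],
                       sitemap="https://example.com/sitemap.xml"))
```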
robots.txt
User-agent: *
Allow: /
Disallow: /admin/
Disallow: /private/
Disallow: /api/
Sitemap: https://example.com/sitemap.xml
Formula / How it works
A robots.txt file, placed at the root of your site, tells search engine crawlers which paths they may crawl and which to skip. User-agent: * applies the rules that follow to all bots; a separate User-agent line can target a specific crawler. Disallow: /admin/ blocks that path and everything under it, Allow: / permits access, and Sitemap: points crawlers to your sitemap URL. Crawl-delay sets the number of seconds a bot should wait between requests, though not all crawlers honor it. Note that robots.txt is advisory: well-behaved crawlers follow it, but it is not an access control mechanism.