Free SEO Tool

Robots.txt Generator

Generate a valid robots.txt file for your website with custom allow/disallow rules, crawl delay, and sitemap URL support.

Custom disallow paths

Sitemap URL support

Copy or download instantly

Robots.txt Settings

Configure crawl directives and generate a clean robots.txt file.

Generated Robots.txt

Copy the file or download it directly.

User-agent: *
Allow: /
Disallow: /admin/
Disallow: /private/
Disallow: /api/
Crawl-delay: 1
Sitemap: https://example.com/sitemap.xml

How to Use This Tool

1. Add Your Site URL

Enter your domain so the sitemap line is generated correctly.

2. Set Crawl Rules

Choose allow/disallow behavior and list restricted paths.

3. Copy or Download

Copy the final output or download it directly as robots.txt.

Robots.txt SEO Tips

Keep robots.txt simple and intentional. Block only low-value or private sections, and leave pages you want indexed crawlable.

Always include a sitemap reference when possible. This improves crawl discovery and helps search engines find important URLs faster.

Recheck directives after site migrations or URL structure changes to avoid accidental crawl blocking.
