Robots.txt Generator

Generate a robots.txt file for your website to control search engine crawlers.


How to Use


A robots.txt file tells search engine crawlers which URLs they can access on your site. This tool helps you generate a robots.txt file with common configurations.

Basic Usage

  1. Choose a User-agent (the search engine bot the rules apply to)
  2. Add Allow/Disallow rules for specific paths
  3. Optionally set a crawl delay and a sitemap URL
  4. Generate and download the robots.txt file
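The four steps above produce a file like the following (the paths and sitemap URL are illustrative placeholders, not defaults of this tool):

```
User-agent: *
Disallow: /admin/
Allow: /blog/
Crawl-delay: 10
Sitemap: https://example.com/sitemap.xml
```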

Common Rules

  • User-agent: * applies the rules that follow to all bots
  • Disallow: /admin/ blocks access to everything under /admin/
  • Allow: /blog/ permits access to /blog/ even if a broader rule disallows it
  • Crawl-delay: 10 asks bots to wait 10 seconds between requests (not all crawlers honor this directive)
  • Sitemap: https://example.com/sitemap.xml declares the sitemap location
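A generated file can be sanity-checked with Python's standard-library `urllib.robotparser` before deploying it. The sketch below parses the rules from the list above (example.com is a placeholder domain) and queries which URLs a bot may fetch:

```python
from urllib.robotparser import RobotFileParser

# Rules matching the examples listed above (placeholder domain).
rules = """\
User-agent: *
Disallow: /admin/
Allow: /blog/
Crawl-delay: 10
Sitemap: https://example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Disallowed path: blocked for all bots.
print(rp.can_fetch("*", "https://example.com/admin/settings"))  # False
# Explicitly allowed path.
print(rp.can_fetch("*", "https://example.com/blog/post"))       # True
# Declared crawl delay and sitemap location.
print(rp.crawl_delay("*"))                                      # 10
print(rp.site_maps())
```

Note that `robotparser` only answers "may this agent fetch this URL?" per the file's rules; whether a real crawler respects Crawl-delay is up to the crawler.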