Free Robots.txt Generator

Create, test and validate your robots.txt file instantly. Configure crawl rules for Googlebot, Bingbot, Yandex and 50+ bots — then fetch any live website's robots.txt to analyse it in real time.

✓ 50+ Bot Presets ✓ Live Robots.txt Tester ✓ Fetch Any Website ✓ Crawl-Delay Support ✓ Download Instantly ✓ No Sign-up
Fetch & Analyse Live Robots.txt
Enter any website URL to retrieve and parse its real-world robots.txt file
💡 This fetches the actual robots.txt file from the target website in real time using a CORS proxy — no simulated data.
Select User-Agents (Bots)
Choose which crawlers to configure — each gets its own rule block
Crawl Rules (Disallow / Allow)
Add paths to block or allow — applied to all selected user-agents unless overridden
⚠️ Disallow: / blocks the entire site. An empty Disallow: allows everything. Rules match by path prefix; when several rules match a URL, the longest (most specific) rule wins.
# | Type | Path | Applies To | Note (optional)
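As an illustration of how prefix matching resolves conflicts, consider a rule block like this (the paths are hypothetical):

```
User-agent: *
Disallow: /private/        # blocks /private/ and everything beneath it
Allow: /private/press/     # longer prefix, so this subfolder stays crawlable
```

A URL such as /private/press/2024/ matches both rules; the Allow rule has the longer matching prefix, so it wins and the page remains crawlable.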
Global Options
Crawl-delay, sitemap URL, host directive and advanced settings
Block all bad / scraper bots
Adds Disallow: / for AhrefsBot, SemrushBot, DotBot and other aggressive scrapers
Block AI training bots
Adds Disallow: / for GPTBot, Google-Extended, CCBot, anthropic-ai, cohere-ai
Block admin & login paths
Disallow /admin/, /login/, /wp-admin/, /dashboard/ for all bots. Note: Disallow stops crawling but does not guarantee de-indexing; use a noindex meta tag or X-Robots-Tag header for pages that must stay out of search results.
Block search/filter URLs
Disallow /*?* to prevent crawling of faceted search and parameter pages
Add comment header
Prepend a comment block with generator info and timestamp
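Combined, the toggles above might produce a file along these lines (the domain, bots and values are placeholders):

```
# Generated by ToollLive Robots.txt Generator
User-agent: *
Disallow: /admin/
Disallow: /*?*

User-agent: GPTBot
Disallow: /

User-agent: AhrefsBot
Crawl-delay: 10
Disallow: /

Sitemap: https://example.com/sitemap.xml
```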
Generated robots.txt
Ready to download
How to Use This Robots.txt Generator
  1. Fetch a live site — Enter any website URL in the Fetch panel above to retrieve and analyse its real robots.txt file, then optionally import it into the editor.
  2. Select bots — Choose which crawlers to configure. Click a preset (e.g. Googlebot, Bingbot) or select All / Bad Bots only.
  3. Add rules — Add Disallow and Allow rules in the Rules Builder. Each rule maps to a path prefix. Leave path empty to allow everything.
  4. Set global options — Add your sitemap.xml URL, set crawl-delay for Yandex/Bing, block AI training bots, and configure advanced toggles.
  5. Generate & test — Click Generate to preview the robots.txt output. Use the URL Tester to verify any path is correctly allowed or blocked.
  6. Download & deploy — Download robots.txt and upload it to the root of your website at https://yoursite.com/robots.txt
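The URL Tester's allow/block check can be reproduced offline with Python's standard-library parser. This is a sketch with made-up rules; note that Python's parser applies rules in file order (first match wins), unlike Google's longest-match behaviour, so Allow exceptions are listed before the broader Disallow here:

```python
from urllib import robotparser

# Sample rules, parsed from a list of lines (no network needed).
# The Allow exception comes first because urllib.robotparser
# uses first-match semantics, not longest-match.
rules = """\
User-agent: *
Allow: /admin/help/
Disallow: /admin/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# can_fetch(user_agent, url) answers "may this bot crawl this URL?"
print(rp.can_fetch("Googlebot", "https://example.com/admin/settings"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/admin/help/faq"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))       # True
```

Paths with no matching rule default to allowed, which is why the blog URL passes.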
Frequently Asked Questions
What is a robots.txt file and why do I need one?
A robots.txt file sits at the root of your domain and tells web crawlers which parts of your site to crawl or skip. It helps prevent search engines from indexing duplicate content, admin pages, or private areas — improving your crawl budget and SEO efficiency.
How do I block Googlebot from specific pages?
Select "Googlebot" in the bot presets, then add a Disallow rule with the path you want blocked (e.g. /private/ or /admin/). The generator will produce the correct User-agent: Googlebot / Disallow: /private/ block automatically.
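For example, blocking two hypothetical paths for Googlebot yields:

```
User-agent: Googlebot
Disallow: /private/
Disallow: /admin/
```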
Does robots.txt affect SEO?
Yes — a well-configured robots.txt directs search engine crawlers to focus on your most important pages, preserving crawl budget. Blocking duplicate or low-value pages can improve rankings by keeping Google focused on your best content.
What is crawl-delay in robots.txt?
Crawl-delay tells a bot how many seconds to wait between requests. Useful for servers with limited resources. Note: Google ignores Crawl-delay in robots.txt — use Google Search Console's crawl rate settings for Googlebot. Yandex and Bing do honour it.
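A per-bot crawl-delay block looks like this (the delay values are examples, not recommendations):

```
User-agent: Bingbot
Crawl-delay: 10

User-agent: Yandex
Crawl-delay: 5
```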
How do I block AI training bots from scraping my site?
Enable "Block AI training bots" in Global Options. This adds Disallow: / for GPTBot (OpenAI), Google-Extended, CCBot (Common Crawl), anthropic-ai, cohere-ai, and other AI training crawlers that follow robots.txt.
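The resulting output blocks each AI crawler with its own rule block, for example:

```
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: anthropic-ai
Disallow: /
```

Keep in mind this only deters crawlers that honour robots.txt; it is a request, not an enforcement mechanism.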

About ToollLive Free Robots.txt Generator

ToollLive's free robots.txt generator is the most complete online tool for creating, testing and validating your robots.txt file — no sign-up required. Generate robots.txt for Googlebot, Bingbot, Yandex, DuckDuckBot and 50+ other crawlers. Use our live robots.txt tester to fetch any website's existing file in real time, analyse its rules, and import it into the editor. Add Disallow and Allow rules, configure crawl-delay, set your sitemap URL, block AI training bots like GPTBot and Google-Extended, and block aggressive scrapers. Test any URL path instantly to verify it's correctly allowed or blocked. Download your completed robots.txt and deploy it to your site root in seconds. Explore our full suite of free SEO tools including our XML Sitemap Generator.