Robots.txt Generator
Create a perfect robots.txt file for your website in seconds. Control which search engines can crawl your pages — 100% free.
Website Details
Default Crawl Rule
Search Engine Bots
Block These Pages/Folders
Commonly blocked paths include /admin/, /login/, /cart/, /checkout/, /private/, and /wp-admin/.
Force Allow These Pages
robots.txt Output
How to Use robots.txt
Step 1: Generate your robots.txt using this tool.
Step 2: Download the file or copy the content.
Step 3: Upload the file to your website’s root folder.
Step 4: Access it at: https://yoursite.com/robots.txt
Step 5: Test it in Google Search Console.
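As a concrete example, a generated file that blocks a few of the common paths listed above while allowing everything else might look like this (the paths and sitemap URL are placeholders — substitute your own):

```
User-agent: *
Disallow: /admin/
Disallow: /login/
Disallow: /cart/
Allow: /

Sitemap: https://yoursite.com/sitemap.xml
```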
What is robots.txt?
A robots.txt file tells search engine crawlers which pages or files they can or cannot request from your website.
It helps keep crawlers out of private or low-value pages and conserves your crawl budget. Note that blocking a page in robots.txt does not guarantee it stays out of Google's index — if other sites link to it, it may still appear in results; use a noindex directive to prevent indexing.
⚠️ Note: robots.txt is a suggestion, not an enforcement mechanism. Well-behaved crawlers honor it, but malicious bots may ignore it entirely.
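Before uploading, you can sanity-check your rules with Python's standard-library robots.txt parser. This is a minimal sketch — the rules and URLs below are placeholders, and note that Python's parser applies rules in file order, whereas Google's crawler uses longest-match precedence:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rule set, e.g. pasted from this generator's output.
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /login/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch(user_agent, url) reports whether that agent may crawl the URL.
print(parser.can_fetch("*", "https://yoursite.com/admin/secret.html"))  # False
print(parser.can_fetch("*", "https://yoursite.com/blog/post-1"))        # True
```

For the authoritative check against your live file, use the robots.txt report in Google Search Console, which evaluates rules the same way Googlebot does.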