Robots.txt Generator
Create robots.txt files to control how search engines crawl your website. Block sensitive pages, set crawl delays, and add your sitemap.
Build Your Robots.txt
Note: Google ignores the Crawl-delay directive; use Google Search Console to manage Googlebot's crawl rate instead.
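Some other crawlers, such as Bingbot, do still honor Crawl-delay. A minimal sketch, where the 10-second value is an arbitrary example:

    # Bingbot reads Crawl-delay; value is in seconds (example value)
    User-agent: Bingbot
    Crawl-delay: 10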
Generated robots.txt
# Robots.txt generated by BetterUtils
# https://betterutils.com/tools/robots-txt-generator
User-agent: *
📋 How to Use
- Copy the generated robots.txt content
- Create a file named robots.txt
- Paste the content into the file
- Upload the file to your website's root directory
- Verify it loads at yoursite.com/robots.txt
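Once uploaded, a minimal finished file might look like the sketch below; the blocked path and sitemap URL are placeholders, and your generated rules will differ:

    User-agent: *
    Disallow: /admin/    # placeholder rule
    Sitemap: https://yoursite.com/sitemap.xml    # placeholder URL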
Common Paths to Block
- /admin/: Admin area
- /wp-admin/: WordPress admin
- /api/: API endpoints
- /private/: Private pages
- /tmp/: Temporary files
- /cgi-bin/: CGI scripts
- /*.pdf$: PDF files
- /search: Search results
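As a sketch, a robots.txt that applies several of these patterns to all crawlers (include only the paths that actually exist on your site):

    # Example rules only; adjust to your site's structure
    User-agent: *
    Disallow: /wp-admin/
    Disallow: /api/
    Disallow: /search
    Disallow: /*.pdf$

Note that the * and $ wildcards are supported by major crawlers such as Googlebot and Bingbot, but they are not part of the original robots.txt standard, so smaller crawlers may ignore them.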
FAQs
What is robots.txt?
Robots.txt is a text file that tells search engine crawlers which pages or files they can or cannot request from your site. It's placed in your website's root directory.
Does robots.txt block pages from Google?
Robots.txt blocks crawling, but not indexing. Google might still index a URL if other pages link to it. Use noindex meta tags to prevent indexing.
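A noindex directive goes in the page's HTML head (or in an X-Robots-Tag HTTP header), for example:

    <!-- placed in <head>; prevents indexing even when the page is crawled -->
    <meta name="robots" content="noindex">

Keep in mind that Google must be able to crawl the page to see this tag, so don't also block that page in robots.txt.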
Where should I put robots.txt?
The robots.txt file must be in your website's root directory, accessible at yoursite.com/robots.txt. It won't work in subdirectories.
What does User-agent: * mean?
The asterisk (*) is a wildcard that applies the rules to all web crawlers. You can specify individual bots like Googlebot for bot-specific rules.
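For instance, a sketch with an extra rule for Googlebot (the paths are illustrative):

    # Rules for all crawlers
    User-agent: *
    Disallow: /private/

    # Rules for Googlebot only
    User-agent: Googlebot
    Disallow: /private/
    Disallow: /tmp/

A crawler follows only the most specific group that matches it, which is why /private/ is repeated in the Googlebot group rather than inherited from the wildcard group.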