Free Tool Guide • 4 min read

Check your crawler rules and ensure search engines can access your content.
The robots.txt file tells search engine crawlers which parts of your site they can and can't access. A misconfigured robots.txt can accidentally block crawlers from important pages, keeping them out of search results, or worse, block your entire site. The Robots.txt Validator checks your file for common errors.
robots.txt is a plain text file at your domain's root (example.com/robots.txt) that contains directives for web crawlers. Common directives include:
# Example robots.txt
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /
Sitemap: https://example.com/sitemap.xml
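If you want to test rules like these yourself, Python's standard library ships a parser for them. Here is a minimal sketch that feeds the example file above to urllib.robotparser and asks which URLs a crawler may fetch (the URLs are illustrative):

# Parse the example rules above with Python's built-in robots.txt parser.
from urllib import robotparser

rules = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /
Sitemap: https://example.com/sitemap.xml
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(rules)  # parse() takes the file's lines directly, no network needed

print(parser.can_fetch("*", "https://example.com/blog/post"))    # True
print(parser.can_fetch("*", "https://example.com/admin/users"))  # False
print(parser.site_maps())  # ['https://example.com/sitemap.xml']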
Step 1: Enter Your Domain
Type your domain name (e.g., "example.com"). The tool fetches your robots.txt file automatically.
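The tool's internals aren't published, but the fetch step is straightforward. A minimal sketch in Python, assuming the file lives at the standard https://<domain>/robots.txt location:

import urllib.request

def fetch_robots_txt(domain: str) -> str:
    # Crawlers only honor robots.txt at the root of the host.
    url = f"https://{domain}/robots.txt"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

print(fetch_robots_txt("example.com"))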
Step 2: Review the Score
See an overall health score based on syntax correctness and best practices.
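The validator's exact scoring formula isn't documented; this hypothetical sketch only illustrates the general idea of starting at 100 and deducting points per issue, with errors weighted more heavily than best-practice warnings:

def health_score(errors: int, warnings: int) -> int:
    # Hypothetical weights: a syntax error costs more than a warning.
    return max(100 - 15 * errors - 5 * warnings, 0)

print(health_score(errors=1, warnings=2))  # 75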
Step 3: Check for Errors
The tool highlights syntax errors, unknown directives, and problematic rules like blocking all crawlers.
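A hedged sketch of the kinds of checks described here: flag directives outside the common set, and warn when "Disallow: /" appears under "User-agent: *", which blocks the whole site. The directive list and messages are illustrative, not the tool's actual rules:

KNOWN = {"user-agent", "disallow", "allow", "sitemap", "crawl-delay"}

def lint(robots_txt: str) -> list[str]:
    issues = []
    current_agent = None
    for n, raw in enumerate(robots_txt.splitlines(), start=1):
        line = raw.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue
        if ":" not in line:
            issues.append(f"line {n}: missing ':' separator")
            continue
        field, value = (part.strip() for part in line.split(":", 1))
        if field.lower() not in KNOWN:
            issues.append(f"line {n}: unknown directive '{field}'")
        elif field.lower() == "user-agent":
            current_agent = value
        elif field.lower() == "disallow" and value == "/" and current_agent == "*":
            issues.append(f"line {n}: 'Disallow: /' under 'User-agent: *' blocks the whole site")
    return issues

print(lint("User-agent: *\nDisalow: /admin/\nDisallow: /"))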
Step 4: Verify Sitemaps
See all sitemap URLs declared in your robots.txt. Click to verify they're accessible.
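The accessibility check can be sketched the same way, assuming "accessible" means the declared URL answers a HEAD request with HTTP 200:

import urllib.request

def check_sitemaps(robots_txt: str) -> dict[str, bool]:
    results = {}
    for line in robots_txt.splitlines():
        if line.strip().lower().startswith("sitemap:"):
            url = line.split(":", 1)[1].strip()
            req = urllib.request.Request(url, method="HEAD")
            try:
                with urllib.request.urlopen(req, timeout=10) as resp:
                    results[url] = resp.status == 200
            except OSError:  # covers HTTP errors, DNS failures, timeouts
                results[url] = False
    return results

print(check_sitemaps("Sitemap: https://example.com/sitemap.xml"))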
Ready to validate your robots.txt?
Check any site's crawler rules—free, no signup required.
Open Robots.txt Validator