Robots.txt & Sitemap Inspector
Check your robots.txt directives and sitemap instantly. Catch wildcard blocks and missing sitemaps before they hurt SEO.
Run your first inspection to visualize robots.txt rules and sitemap coverage here.
What are robots.txt and sitemaps?
robots.txt is a text file that tells search engine crawlers which pages or sections of your website they can or cannot access. It uses directives like User-agent, Disallow, Allow, and Sitemap. A sitemap is an XML file that lists all important pages on your website, helping search engines discover and index your content faster. Both files work together to control how search engines crawl and index your site.
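For example, a small robots.txt that combines these directives might look like the following (example.com is a placeholder domain):

```
User-agent: *
Disallow: /admin/
Allow: /admin/help.html

Sitemap: https://example.com/sitemap.xml
```

Here every crawler is told to skip /admin/, except for the single help page the Allow line re-permits, and the Sitemap line tells crawlers where to find the XML sitemap.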
Why use this robots.txt and sitemap checker?
This free inspector analyzes both your robots.txt file and sitemap in one check. It detects dangerous wildcard blocks like Disallow: /, verifies that your sitemap is referenced in robots.txt, checks if the sitemap URL is accessible, and ensures search engines can access all important pages. Fix crawl and indexing issues instantly to maintain visibility in search results.
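As a rough illustration of these checks (not the inspector's actual implementation), the sketch below uses Python's standard urllib.robotparser module to look for a site-wide Disallow, confirm a Sitemap directive exists, and verify that the referenced sitemap responds. The https://example.com URLs are placeholders for your own site.

```python
import urllib.error
import urllib.request
import urllib.robotparser

SITE = "https://example.com"  # placeholder: the site to inspect

# Fetch and parse robots.txt.
parser = urllib.robotparser.RobotFileParser(SITE + "/robots.txt")
parser.read()

# 1. Detect a site-wide wildcard block (Disallow: / for all crawlers).
if not parser.can_fetch("*", SITE + "/"):
    print("Warning: robots.txt blocks the entire site for all crawlers")

# 2. Verify that at least one Sitemap directive is present.
sitemaps = parser.site_maps() or []  # site_maps() requires Python 3.8+
if not sitemaps:
    print("Warning: no Sitemap directive found in robots.txt")

# 3. Check that every referenced sitemap URL is reachable.
for url in sitemaps:
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            if resp.status != 200:
                print(f"Warning: sitemap {url} returned HTTP {resp.status}")
    except urllib.error.URLError as exc:
        print(f"Warning: sitemap {url} is not accessible: {exc}")
```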
Robots.txt Directives
- User-agent — Specifies which crawler the rules apply to
- Disallow — Blocks specific paths
- Allow — Overrides Disallow for specific paths
- Sitemap — Points to XML sitemap location
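When Allow and Disallow both match a URL, the modern robots.txt standard (RFC 9309) picks the rule with the longest matching path, and Allow wins a tie. The sketch below shows that precedence in simplified form; it ignores wildcards and is only meant to make the rule concrete.

```python
def is_allowed(path: str, rules: list[tuple[str, str]]) -> bool:
    """Simplified RFC 9309 precedence: longest matching path wins, Allow wins ties.
    Wildcards (*) and end anchors ($) are intentionally not handled."""
    best_len = -1
    allowed = True  # no matching rule means the path may be crawled
    for directive, prefix in rules:
        if not path.startswith(prefix):
            continue
        is_allow = directive.lower() == "allow"
        if len(prefix) > best_len or (len(prefix) == best_len and is_allow):
            best_len = len(prefix)
            allowed = is_allow
    return allowed

rules = [("Disallow", "/admin/"), ("Allow", "/admin/help.html")]
print(is_allowed("/admin/help.html", rules))  # True: the longer Allow rule wins
print(is_allowed("/admin/settings", rules))   # False: only the Disallow matches
```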
Sitemap & Common Issues
- Sitemap.xml — Lists all important URLs
- Wildcard blocks — Disallow: / blocks everything
- Missing sitemap — No sitemap referenced in robots.txt, or the sitemap URL is unreachable
- Accidental blocks — Important pages blocked by robots.txt (see the check below)
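To see how an accidental block is caught, the sketch below cross-checks every URL listed in a sitemap against the site's robots.txt rules, again using only Python's standard library. The example.com URLs are placeholders, and this is an illustration rather than the inspector's own code.

```python
import urllib.request
import urllib.robotparser
import xml.etree.ElementTree as ET

ROBOTS_URL = "https://example.com/robots.txt"    # placeholder URLs
SITEMAP_URL = "https://example.com/sitemap.xml"

# Parse robots.txt so each sitemap URL can be tested against it.
parser = urllib.robotparser.RobotFileParser(ROBOTS_URL)
parser.read()

# Fetch the sitemap and extract every <loc> entry.
with urllib.request.urlopen(SITEMAP_URL, timeout=10) as resp:
    tree = ET.parse(resp)
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text.strip() for loc in tree.findall(".//sm:loc", ns) if loc.text]

# Flag URLs the sitemap advertises but robots.txt disallows.
blocked = [u for u in urls if not parser.can_fetch("*", u)]
print(f"{len(urls)} URLs in sitemap, {len(blocked)} blocked by robots.txt")
for u in blocked:
    print("Accidentally blocked:", u)
```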
Check everything at once, and much more
Launch a free analysis in under 30 seconds. Get performance, SEO, and trust scores with actionable fixes—all in one dashboard.
Everything you need to understand
Get answers to the most common questions about robots.txt, sitemaps, and crawl control