Robots.txt tester
Paste your robots.txt and instantly check whether a specific URL is crawlable by Googlebot, Bingbot, or any other user agent. Handles wildcard patterns and Allow/Disallow precedence.
What is Robots.txt Tester?
A robots.txt tester is an SEO debugging tool that lets you verify whether search engine crawlers like Googlebot, Bingbot, and others are allowed or blocked from accessing specific URLs on your website. The robots.txt file sits at your domain root and controls which pages search engines are permitted to crawl. A single misconfigured rule can accidentally block your entire site from Google, or expose private admin pages to crawlers.

This tool parses your robots.txt content, evaluates Allow and Disallow directives with the correct precedence rules, handles wildcard patterns (the * wildcard and the $ end-of-string anchor), and tests against multiple user agents. It follows the same matching logic that real search engine crawlers use, so you can catch crawl-blocking issues before they tank your rankings. Whether you are launching a new site, migrating URLs, or debugging indexing problems, testing your robots.txt is one of the first steps in any technical SEO audit.
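Under the hood, Google-style matching treats each rule as a prefix pattern, expands * and $, and resolves conflicts by letting the longest matching rule win, with Allow winning ties. A minimal Python sketch of that logic, using hypothetical `rule_to_regex` and `is_allowed` helpers:

```python
import re

def rule_to_regex(rule: str) -> re.Pattern:
    """Translate a robots.txt path rule into an anchored regex:
    '*' matches any run of characters; a trailing '$' pins the end of the URL."""
    anchored = rule.endswith("$")
    body = rule[:-1] if anchored else rule
    pattern = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    if not anchored:
        pattern += ".*"  # rules are prefix matches unless $-anchored
    return re.compile("^" + pattern + "$")

def is_allowed(rules: list[tuple[str, str]], path: str) -> bool:
    """rules: (directive, path_rule) pairs, e.g. ('Disallow', '/admin').
    Longest matching rule wins; Allow wins ties; no match means allowed."""
    best_len, verdict = -1, True
    for directive, rule in rules:
        if not rule:
            continue  # an empty Disallow blocks nothing
        allow = directive.lower() == "allow"
        if rule_to_regex(rule).match(path):
            if len(rule) > best_len or (len(rule) == best_len and allow):
                best_len, verdict = len(rule), allow
    return verdict

rules = [("Disallow", "/admin"), ("Allow", "/admin/public")]
print(is_allowed(rules, "/admin/settings"))    # False: only /admin matches
print(is_allowed(rules, "/admin/public/faq"))  # True: the longer Allow wins
```

Real crawlers also percent-decode paths and treat rules as case-sensitive; this sketch skips those details.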
How to Use Robots.txt Tester
1. Paste your robots.txt content
Copy your robots.txt file content and paste it into the editor on the left. You can paste the entire file, including multiple User-agent blocks and Allow, Disallow, and Sitemap directives.
2. Enter a URL path and select a user agent
Type the URL path you want to test (e.g., /admin/settings) and choose a crawler from the dropdown -- Googlebot, Bingbot, DuckDuckBot, Yandex, Baiduspider, or the wildcard * agent.
3. Check the result instantly
The tool immediately shows whether the path is Allowed or Blocked for that user agent, which specific rule matched, and a full breakdown of all parsed rules, color-coded by type.
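The check in step 3 can also be reproduced offline with Python's standard library. Note that `urllib.robotparser` follows the original robots.txt convention (first matching rule wins, no * or $ wildcard support), so its results can differ from Google's longest-match behavior on wildcard rules. The file content below is a hypothetical example:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content to test against
robots_txt = """\
User-agent: *
Disallow: /admin/

User-agent: Googlebot
Allow: /admin/help
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "/admin/help"))  # the Allow rule matches
print(parser.can_fetch("Bingbot", "/admin/help"))    # falls back to the * group
```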
Features
- Tests Allow vs Disallow precedence using longest-match-wins logic, matching real crawler behavior
- Supports wildcard patterns including * (match anything) and $ (end-of-URL anchor)
- Pre-configured user agents for Googlebot, Bingbot, DuckDuckBot, Yandex, and Baiduspider
- Parses and displays all rules organized by User-agent block with color-coded Allow/Disallow labels
- Shows the exact matched rule so you can trace why a URL is blocked or allowed
- Runs entirely in your browser -- your robots.txt content never leaves your device
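Before any path rule is evaluated, a crawler first decides which User-agent block applies to it: the most specific matching group wins, and * is only the fallback. A small sketch of that selection step, assuming a hypothetical `select_group` helper and a pre-parsed dict of groups; the prefix-based token matching is a simplification of what real crawlers do:

```python
def select_group(groups: dict[str, list], user_agent: str) -> list:
    """Pick the rule group for a crawler: the longest User-agent token that
    prefixes the crawler's name (case-insensitively) wins; otherwise fall
    back to the '*' group."""
    ua = user_agent.lower()
    best, best_len = groups.get("*", []), 0
    for token, rules in groups.items():
        if token != "*" and ua.startswith(token.lower()) and len(token) > best_len:
            best, best_len = rules, len(token)
    return best

groups = {
    "*": [("Disallow", "/")],
    "googlebot": [("Allow", "/")],
    "googlebot-image": [("Disallow", "/photos/")],
}
print(select_group(groups, "Googlebot-Image"))  # most specific group wins
print(select_group(groups, "Bingbot"))          # no match, so '*' applies
```

This is why a Googlebot-specific block completely replaces the * rules for Googlebot rather than being merged with them.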
Related Tools
Meta Tag Generator
Create perfect SEO meta tags with OG + Twitter cards. Live character counters.
Meta Tag Analyzer
Analyze any URL's meta tags, OG tags, Twitter cards and SEO elements with scoring.
Open Graph Tag Generator
Generate Open Graph meta tags with live Facebook and LinkedIn preview.
Open Graph Checker
Check any URL's Open Graph tags and preview how links appear on social media.
Twitter Card Generator
Generate Twitter Card meta tags with live preview for summary and large image cards.
Robots.txt Generator
Build a robots.txt file with a visual editor. Add rules for any user agent.