Paste your robots.txt and instantly check if a specific URL is crawlable by Googlebot, Bingbot, or any user agent. Handles wildcard patterns and Allow/Disallow precedence.
A robots.txt tester is a critical SEO debugging tool that lets you verify whether search engine crawlers like Googlebot, Bingbot, and others are allowed or blocked from accessing specific URLs on your website. The robots.txt file sits at your domain root and controls which pages search engines can crawl and index. A single misconfigured rule can accidentally block your entire site from Google, or expose private admin pages to crawlers. This tool parses your robots.txt content, evaluates Allow and Disallow directives with proper precedence rules, handles the * wildcard and the $ end-of-string anchor, and tests against multiple user agents. It follows the same matching logic that real search engine crawlers use, so you can catch crawl-blocking issues before they tank your rankings. Whether you are launching a new site, migrating URLs, or debugging indexing problems, testing your robots.txt is one of the first steps in any technical SEO audit.
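To make the precedence rules concrete, here is a minimal sketch of the matching logic in Python. It follows the RFC 9309 / Google-style convention (longest matching pattern wins; on a tie, Allow beats Disallow; no match means allowed). The function names and rule format are illustrative, not this tool's actual internals:

```python
import re

def rule_to_regex(pattern: str) -> re.Pattern:
    # Map robots.txt wildcards onto regex: '*' matches any character
    # sequence, a trailing '$' anchors the end of the URL path.
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    return re.compile("^" + regex + ("$" if anchored else ""))

def is_allowed(rules: list[tuple[str, str]], path: str) -> bool:
    """rules: list of ("allow" | "disallow", pattern) for one user agent.

    Longest-match precedence: the most specific (longest) matching
    pattern wins; on a tie, Allow beats Disallow. No match => allowed.
    """
    best_len, best_verdict = -1, True
    for kind, pattern in rules:
        if not pattern:  # "Disallow:" with an empty value matches nothing
            continue
        if rule_to_regex(pattern).match(path):
            length = len(pattern)
            allow = (kind == "allow")
            if length > best_len or (length == best_len and allow):
                best_len, best_verdict = length, allow
    return best_verdict

# Example: /admin/ is blocked, but /admin/public/ is carved back out
rules = [("disallow", "/admin/"), ("allow", "/admin/public/")]
print(is_allowed(rules, "/admin/settings"))   # False (blocked)
print(is_allowed(rules, "/admin/public/a"))   # True (allowed)
```

Note that this ordering differs from the original 1994 convention, where the first matching rule won; modern major crawlers use longest-match.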
Copy your robots.txt file content and paste it into the editor on the left. You can paste the entire file including multiple User-agent blocks, Allow, Disallow, and Sitemap directives.
Type the URL path you want to test (e.g., /admin/settings) and choose a crawler from the dropdown: Googlebot, Bingbot, DuckDuckBot, Yandex, Baiduspider, or the wildcard * agent.
The tool immediately shows whether the path is Allowed or Blocked for that user agent, which specific rule matched, and a full breakdown of all parsed rules color-coded by type.
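For instance, a hypothetical robots.txt like the one below exercises all of the behaviors described above: per-agent blocks, an Allow carve-out inside a Disallowed section, a * wildcard with a $ end anchor, and a Sitemap directive:

```
User-agent: Googlebot
Disallow: /search
Allow: /search/about
Disallow: /*.pdf$

User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
```

Testing /search/about as Googlebot should report Allowed (the more specific Allow rule wins), while /files/report.pdf should report Blocked by the /*.pdf$ pattern.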
Create perfect SEO meta tags with OG + Twitter cards. Live character counters.
Analyze any URL's meta tags, OG tags, Twitter cards and SEO elements with scoring.
Generate Open Graph meta tags with live Facebook and LinkedIn preview.
Check any URL's Open Graph tags and preview how links appear on social media.
Generate Twitter Card meta tags with live preview for summary and large image cards.
Build a robots.txt file with a visual editor. Add rules for any user agent.