Web Robots Parse
Parses a robots.txt file into structured rules: per-user-agent allow and disallow directives, crawl delays, and sitemap URLs.
Direct JSON or Markdown
Submit the input as JSON directly, or as Markdown that can be converted into JSON.
Work steps
Inputs
| Name | Type | Required | Description |
|---|---|---|---|
| text | text | Yes | The raw robots.txt content to parse |
Outputs
| Name | Type | Description |
|---|---|---|
| groups | array | User-agent groups with their allow, disallow, and crawl-delay rules |
| sitemaps | array | Sitemap URLs declared in the file |
| work | array | Work |
Sample request
```json
{
  "text": "User-agent: *\nDisallow: /private\nSitemap: https://example.com/sitemap.xml"
}
```
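The parsing this tool performs can be sketched roughly as follows. This is a minimal illustration, not the tool's actual implementation: the exact field names inside `groups` (`user_agents`, `rules`, `crawl_delay`) are assumptions chosen to mirror the outputs listed above, and the grouping rule (consecutive `User-agent` lines share one group) follows the common robots.txt convention.

```python
def parse_robots(text: str) -> dict:
    """Parse robots.txt text into groups and sitemaps (illustrative sketch)."""
    groups, sitemaps = [], []
    current = None
    for raw in text.splitlines():
        # Strip comments and surrounding whitespace.
        line = raw.split("#", 1)[0].strip()
        if not line or ":" not in line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            # Consecutive User-agent lines join the same group; a User-agent
            # line after any rule starts a new group.
            if current is None or current["rules"] or current["crawl_delay"]:
                current = {"user_agents": [], "rules": [], "crawl_delay": None}
                groups.append(current)
            current["user_agents"].append(value)
        elif field in ("allow", "disallow") and current is not None:
            current["rules"].append({"type": field, "path": value})
        elif field == "crawl-delay" and current is not None:
            current["crawl_delay"] = float(value)
        elif field == "sitemap":
            # Sitemap is independent of any user-agent group.
            sitemaps.append(value)
    return {"groups": groups, "sitemaps": sitemaps}


result = parse_robots(
    "User-agent: *\nDisallow: /private\nSitemap: https://example.com/sitemap.xml"
)
```

Running this on the sample request above yields one group for `*` with a single disallow rule, plus the sitemap URL, matching the output shape described in the table.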