robots.txt vs llms.txt vs BotConsent policy
Compare traditional crawler instructions, AI-oriented context files, and richer consent policies.
robots.txt
robots.txt is the long-standing Robots Exclusion Protocol file (standardized in RFC 9309). It is widely understood by traditional web crawlers and useful for path-level allow and disallow rules. It is intentionally simple, which makes it durable but limited for AI-specific use cases such as distinguishing model training from search indexing.
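A minimal sketch of the path-level rules described above. The crawler names are real user-agent tokens; the paths are placeholders for illustration:

```text
# robots.txt — served at the site root
# Most crawlers may index everything except /private/
User-agent: *
Disallow: /private/

# Block one AI crawler entirely (GPTBot is OpenAI's crawler token)
User-agent: GPTBot
Disallow: /
```

Note the limitation the text describes: rules apply per user-agent and per path, with no way to say "index this, but do not train on it."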
llms.txt
llms.txt is an emerging convention: a Markdown file served at the site root that gives language models curated context and pointers to key content. It can be helpful for documentation and content discovery, but it is not a complete consent ledger and does not record permissions.
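A short sketch of the curated-context shape the convention proposes, assuming a hypothetical site and documentation URLs:

```markdown
# Example Site

> One-sentence summary of what the site offers, for model context.

## Docs

- [Getting started](https://example.com/docs/start.md): installation and setup
- [API reference](https://example.com/docs/api.md): endpoints and parameters
```

As the text notes, this guides models toward useful content; it says nothing about whether that content may be trained on, which is where a consent policy comes in.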
BotConsent policy
A BotConsent policy is designed to express use-case permissions (for example, training versus retrieval), policy history, contact routes, and developer-readable metadata. It complements robots.txt and llms.txt rather than replacing them.
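To make the four elements above concrete, here is a hypothetical JSON sketch. The field names (`permissions`, `use`, `history`, `contact`) and use-case labels are illustrative assumptions, not a published BotConsent schema:

```json
{
  "version": "1.0",
  "contact": "mailto:ai-policy@example.com",
  "permissions": [
    { "use": "search-indexing", "allow": true },
    { "use": "model-training", "allow": false },
    { "use": "inference-retrieval", "allow": true, "paths": ["/docs/"] }
  ],
  "history": [
    { "date": "2025-01-01", "change": "Disallowed model-training site-wide" }
  ]
}
```

The point of the sketch is the contrast with the two files above: per-use-case permissions and an auditable change history, which neither robots.txt nor llms.txt can express.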