What Is llms.txt?
llms.txt is an emerging web standard that provides structured information about a website specifically for large language models. Similar to how robots.txt tells search crawlers what they can access, llms.txt tells AI systems what your site is about, what content is most important, and how to interpret your pages.
The file sits at your domain root (yourdomain.com/llms.txt) and contains structured metadata including your site’s purpose, primary content areas, key pages, and preferred citation format.
The format is simple: a plain text file with a title, a brief description, and a categorized list of important pages, kept under 100 lines.
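As a rough illustration, here is what such a file might look like. The emerging proposal commonly formats it with Markdown-style headings and link lists; the domain, pages, and descriptions below are hypothetical placeholders, not a prescribed layout.

```
# Example Docs Site

> Product documentation and tutorials for the Example platform.

## Core documentation
- https://yourdomain.com/docs/getting-started: Quick-start guide for new users
- https://yourdomain.com/docs/api: Full API reference

## Guides
- https://yourdomain.com/blog/ai-visibility: How we approach AI search visibility
```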
Who Supports llms.txt?
- Anthropic (Claude): Supported. Claude's web browsing features can read and use llms.txt files.
- Perplexity: Partial. Perplexity reads llms.txt when available.
- OpenAI (ChatGPT): No official support. GPTBot may read it during crawling, but there is no confirmed use.
- Google AI: No confirmed support. There is no evidence that llms.txt factors into AI Overviews source selection.
The standard is still evolving. Early adoption carries minimal risk with potential upside as more AI systems adopt it.
Should You Implement It?
Short answer: Yes, if you have 30 minutes to spare. The implementation cost is trivial, the maintenance burden is minimal, and the potential upside grows as adoption increases.
When to prioritize
- You have a content-heavy site with many pages
- You want to guide AI toward your most important content
- You've already optimized the fundamentals
When to deprioritize
- You haven’t addressed basic AI visibility issues yet
- Your site has fewer than 10 pages
Implementation: Create a plain text file at yourdomain.com/llms.txt with your site title, a brief description, and a categorized list of your most important pages. Keep it under 100 lines. Update it when you add major new content sections.
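For ongoing maintenance, a small script can confirm the file is still reachable at your domain root and hasn't drifted past the 100-line guideline. This is a minimal sketch using Python's requests library; the URL is a placeholder, not a real endpoint.

```python
import requests

# Hypothetical domain; replace with your own.
LLMS_TXT_URL = "https://yourdomain.com/llms.txt"
MAX_LINES = 100  # Per the guideline above: keep the file short.


def check_llms_txt(url: str = LLMS_TXT_URL) -> None:
    # Fetch the file and fail loudly if it is missing or blocked.
    response = requests.get(url, timeout=10)
    response.raise_for_status()

    lines = response.text.splitlines()
    print(f"Fetched {url}: {len(lines)} lines")

    if len(lines) > MAX_LINES:
        print(f"Warning: file exceeds {MAX_LINES} lines; consider trimming it.")


if __name__ == "__main__":
    check_llms_txt()
```

Running this after each content update is a cheap way to catch an accidentally deleted or bloated file before AI crawlers do.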
Frequently Asked Questions
What is llms.txt?
llms.txt is an emerging web standard that provides structured information about a website specifically for large language models. The file sits at your domain root and contains metadata about your site's purpose, primary content areas, and key pages, helping AI systems understand your site without crawling every page.
Is llms.txt required for AI search visibility?
No. llms.txt is not required and is not yet universally supported. The fundamentals, including robots.txt access, schema markup, content quality, and E-E-A-T signals, have far more impact on AI visibility. Treat llms.txt as a low-cost addition after you've addressed the basics.
Audit Your AI Search Visibility
See exactly how AI systems view your content and what to fix. Join the waitlist to get early access.