Part of the AI Readiness audit
Check if your site has an llms.txt file
The llms.txt file tells AI systems what your site is about and where to find key content. SiteCurl checks whether your site has one and if it is formatted correctly.
No signup required. Results in under 60 seconds.
What this check does
SiteCurl checks for a file at /llms.txt on your site. This file is a plain-text summary of your site designed for large language models. It typically includes a brief description of the site, a list of key pages, and links to the most important content sections.
If the file exists, SiteCurl verifies that it returns a 200 status code and is served as plain text. If it is missing, SiteCurl flags it as an opportunity to improve your AI discoverability.
The llms.txt standard is emerging and not yet widely adopted. Having one puts your site ahead of most competitors in AI discovery.
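The logic behind this kind of check can be sketched in a few lines of Python. This is a simplified illustration of the idea, not SiteCurl's actual implementation; the evaluate_llms_txt and check_site helper names are hypothetical.

```python
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def evaluate_llms_txt(status_code, content_type):
    """Classify an llms.txt response (hypothetical helper, not SiteCurl's code)."""
    if status_code != 200:
        return "missing"                # 404, 500, or other failures
    if not content_type.startswith("text/plain"):
        return "wrong content type"     # served as HTML is a common misconfiguration
    return "ok"

def check_site(base_url):
    """Fetch /llms.txt from a site and evaluate the response."""
    url = base_url.rstrip("/") + "/llms.txt"
    try:
        with urlopen(url, timeout=10) as resp:
            return evaluate_llms_txt(resp.status, resp.headers.get("Content-Type", ""))
    except HTTPError as err:
        return evaluate_llms_txt(err.code, "")
    except URLError:
        return "unreachable"
```

The key point is that the check looks at both the status code and the content type: a file that exists but is served as HTML still fails.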
How this shows up in the real world
The llms.txt file is inspired by robots.txt but serves a different purpose. While robots.txt tells crawlers what they cannot access, llms.txt tells AI systems what they should focus on. It is a guide for machines, written in plain text, that summarizes your site and points to the most important pages.
A typical llms.txt file contains a one-paragraph description of the site, a list of key pages with brief descriptions, and links organized by category. Think of it as a table of contents for AI systems. When an AI crawler visits your site, llms.txt tells it: 'Here is what this site is about, and here are the pages that matter most.'
The standard was proposed in 2024 and is gaining adoption among forward-thinking publishers and SaaS companies. It is not required by any search engine, but early adopters may benefit from better AI understanding of their content as AI search tools evolve to use it.
Because llms.txt is plain text, it is trivial to create and maintain. A 20-line file that describes your site and links to your 10 most important pages is a reasonable starting point. Update it when you add or remove significant content.
Why it matters
AI systems discover content through crawling, but they benefit from explicit guidance about what a site offers. An llms.txt file provides that guidance in a format designed for machine consumption. It is like giving an AI a map of your site instead of making it wander and guess.
As AI search tools evolve, files like llms.txt may become a standard signal for content discovery. Early adopters position themselves to benefit first as the standard gains wider support.
Creating an llms.txt file also forces you to think about your site from an AI's perspective. Which pages are the most important? What is the best one-sentence summary of your site? Answering these questions improves your content strategy regardless of whether AI tools use the file directly.
Who this impacts most
SaaS companies with documentation, feature pages, and pricing information benefit from an llms.txt file that directs AI systems to the right pages. When a user asks an AI 'What does SiteCurl cost?', the llms.txt file can point the AI directly to the pricing page.
Content publishers with large archives benefit the most. An llms.txt file can highlight the most authoritative articles rather than letting AI systems index everything equally. This guides AI toward your best content for citations.
Any site that wants to be cited in AI-generated answers benefits from making its structure explicit. The easier you make it for AI to understand your site, the more likely your content is to be selected as a source.
How to fix it
Step 1: Create the file. Create a plain text file named llms.txt in your site's root directory. Start with a brief description of your site (1 to 2 sentences), followed by a list of your most important pages with short descriptions.
Step 2: Organize by section. Group your pages under headings like 'Products,' 'Documentation,' 'Blog,' and 'Company.' For each page, include the full URL and a one-sentence description of what the page covers.
Step 3: Include your most important pages. Focus on pages you want AI systems to cite: your home page, pricing page, key product pages, and your most authoritative blog posts. You do not need to list every page, just the ones that represent your site's core value.
Step 4: Keep it updated. When you add a major new page or remove an old one, update llms.txt. Treat it like a curated sitemap for AI systems.
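Put together, a starter llms.txt following the steps above might look like this (the site, URLs, and descriptions are made-up examples):

```text
Example Co builds scheduling software for small clinics.
This file lists the pages that best explain what we do.

Products
https://example.com/pricing - Plans and pricing for all tiers
https://example.com/features - Full feature overview

Documentation
https://example.com/docs/getting-started - Setup guide for new accounts
https://example.com/docs/api - API reference for integrations

Company
https://example.com/about - Who we are and why we built this
```

Each section heading is a plain text line, and each page is a full URL followed by a one-sentence description.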
Common mistakes when fixing this
Listing every page on the site. The llms.txt file should be a curated guide, not a complete sitemap. Include your 10 to 30 most important pages. If you list everything, AI systems have no signal about what matters most.
Using HTML or markdown instead of plain text. The file should be plain text. No HTML tags, no markdown formatting. Keep it simple: headings as text lines, pages as URL plus description.
Creating the file and never updating it. An llms.txt file that references pages that no longer exist or misses your newest product sends the wrong signals. Review it quarterly or when you make significant site changes.
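The first two mistakes above can be caught with a quick static check before you publish the file. A rough sketch in Python (the lint_llms_txt function and its rules are illustrative, not part of any official tool):

```python
def lint_llms_txt(text):
    """Flag common llms.txt mistakes (illustrative rules, not an official linter)."""
    issues = []
    lines = [line.strip() for line in text.splitlines() if line.strip()]
    urls = [line for line in lines if line.startswith("http")]
    if not lines:
        issues.append("file is empty")
    if len(urls) > 30:
        issues.append("more than 30 pages listed; curate to the 10 to 30 that matter most")
    if any(line.startswith("<") for line in lines):
        issues.append("looks like HTML; llms.txt should be plain text")
    if any(line.startswith("#") or "](" in line for line in lines):
        issues.append("markdown formatting detected; use plain text lines")
    return issues
```

The third mistake, stale links, cannot be caught statically; that requires fetching each listed URL, which is another reason to review the file on a regular schedule.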
How to verify the fix
After creating the file, visit https://yoursite.com/llms.txt in your browser. You should see your plain text summary. Run another SiteCurl scan to confirm the check passes.
Check that the file returns a 200 status code with curl -sI https://yoursite.com/llms.txt. Verify the content type is text/plain, not text/html.
The bottom line
The llms.txt file is a simple way to help AI systems understand your site. It takes 15 minutes to create and positions your site for better AI discovery as the standard gains adoption. List your most important pages with brief descriptions and keep it updated.
Example findings from a scan
llms.txt file found at /llms.txt
No llms.txt file detected
llms.txt file returns 404
Frequently asked questions
What is llms.txt?
It is a plain text file at your site's root that tells AI systems what your site is about and where to find key content. Think of it as a curated table of contents designed for machines rather than humans.
Is llms.txt required for AI readiness?
No. It is an optional signal that can improve AI discovery of your content. Most sites do not have one yet, so adding one puts you ahead of competitors.
Can I check for llms.txt without signing up?
Yes. The free audit checks for an llms.txt file as part of a full seven-category scan. No signup needed.
How long should my llms.txt file be?
Keep it concise. A 1-2 sentence site description followed by 10 to 30 key pages with brief descriptions is a good starting point. The file should be easy for an AI to parse in one pass.
Check your AI discoverability now