# Generated Files

After building, aeo.js generates these files in your output directory:

`robots.txt`: AI-crawler-aware robots directives. Includes rules for all known AI crawlers (GPTBot, ClaudeBot, PerplexityBot, etc.) with sensible defaults:

```
User-agent: *
Allow: /

User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /
```
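
The directive format is simple enough to build from a list of user agents. A minimal sketch of that idea (the `buildRobotsTxt` helper and the agent list are illustrative, not aeo.js internals):

```javascript
// Illustrative sketch: build robots.txt-style directives from a list of
// AI crawler user agents. The agents below are examples, not the full
// list aeo.js knows about.
function buildRobotsTxt(agents) {
  const groups = ['User-agent: *\nAllow: /'];
  for (const agent of agents) {
    groups.push(`User-agent: ${agent}\nAllow: /`);
  }
  return groups.join('\n\n') + '\n';
}

const robots = buildRobotsTxt(['GPTBot', 'ClaudeBot', 'PerplexityBot']);
console.log(robots);
```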

`llms.txt`: a concise, LLM-readable summary of your site. This is the first file AI crawlers look for; it tells them what your site is about and where to find content:

```markdown
# My Site

A site optimized for AI discovery

## Pages

- [Home](https://mysite.com/)
- [About](https://mysite.com/about)
- [Blog](https://mysite.com/blog)
```

`llms-full.txt`: the full concatenated content of all pages in a single file. Useful for LLMs that want to ingest your entire site at once.

`sitemap.xml`: a standard XML sitemap for search engines and AI crawlers.
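
For reference, a typical entry in the generated sitemap follows the standard sitemaps.org format (URLs here are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://mysite.com/</loc>
  </url>
  <url>
    <loc>https://mysite.com/about</loc>
  </url>
</urlset>
```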

A structured documentation manifest:

```json
{
  "name": "My Site",
  "description": "A site optimized for AI discovery",
  "baseUrl": "https://mysite.com",
  "totalDocs": 5,
  "docs": [
    {
      "title": "Home",
      "path": "/",
      "markdownUrl": "https://mysite.com/index.md",
      "htmlUrl": "https://mysite.com/",
      "content": "..."
    }
  ]
}
```
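
Consumers can treat the manifest as a table of contents. A minimal sketch of reading it (`listDocs` is a hypothetical helper, not part of aeo.js):

```javascript
// Illustrative sketch: turn a parsed manifest (shape as in the example
// above) into a reading list of title -> markdown URL pairs.
function listDocs(manifest) {
  return manifest.docs.map((doc) => `${doc.title} -> ${doc.markdownUrl}`);
}

const manifest = {
  name: 'My Site',
  baseUrl: 'https://mysite.com',
  totalDocs: 1,
  docs: [
    { title: 'Home', path: '/', markdownUrl: 'https://mysite.com/index.md' },
  ],
};
console.log(listDocs(manifest)); // → [ 'Home -> https://mysite.com/index.md' ]
```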

An AI-optimized content index designed for RAG (Retrieval-Augmented Generation) pipelines:

```json
[
  {
    "id": "index",
    "url": "https://mysite.com/",
    "title": "Home",
    "content": "Full page content in markdown...",
    "description": "Page description",
    "keywords": ["keyword1", "keyword2"]
  }
]
```
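
As a sketch of how a RAG pipeline might consume this index, here is a naive keyword scorer over the entries. The scoring function is illustrative only; a real pipeline would embed `content` and rank by vector similarity:

```javascript
// Illustrative sketch: rank AI-index entries against a query by counting
// matches in the title and keywords. A production RAG pipeline would use
// embeddings over `content` instead of substring matching.
function rankEntries(entries, query) {
  const terms = query.toLowerCase().split(/\s+/);
  return entries
    .map((entry) => {
      const haystack = [entry.title, ...(entry.keywords || [])]
        .join(' ')
        .toLowerCase();
      const score = terms.filter((t) => haystack.includes(t)).length;
      return { entry, score };
    })
    .filter((r) => r.score > 0)
    .sort((a, b) => b.score - a.score)
    .map((r) => r.entry);
}

const index = [
  { id: 'index', title: 'Home', keywords: ['welcome'] },
  { id: 'blog', title: 'Blog', keywords: ['posts', 'writing'] },
];
console.log(rankEntries(index, 'blog posts').map((e) => e.id)); // → [ 'blog' ]
```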

One `.md` file per page, extracted from your rendered HTML:

```
public/
  index.md   # Markdown for /
  about.md   # Markdown for /about
  blog.md    # Markdown for /blog
```
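
The mapping from a page route to its markdown file is mechanical. A hypothetical helper (not part of aeo.js) that follows the layout above, where `/` maps to `index.md` and `/about` maps to `about.md`:

```javascript
// Illustrative sketch: derive the markdown file path for a page route,
// mirroring the output layout shown above.
function markdownPathFor(route) {
  if (route === '/') return '/index.md';
  return route.replace(/\/$/, '') + '.md';
}

console.log(markdownPathFor('/'));      // → '/index.md'
console.log(markdownPathFor('/about')); // → '/about.md'
```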

You can enable or disable individual generators:

```js
{
  generators: {
    robotsTxt: true,
    llmsTxt: true,
    llmsFullTxt: true,
    rawMarkdown: true,
    manifest: true,
    sitemap: true,
    aiIndex: true,
    schema: true,
  }
}
```
}
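
One way to picture how these flags might be resolved is a merge over all-on defaults. This is a sketch under an assumption (that every generator defaults to enabled), not documented aeo.js behavior:

```javascript
// Illustrative sketch: merge user generator flags over all-enabled defaults.
// The default-on assumption is ours; check aeo.js's docs for actual defaults.
const DEFAULT_GENERATORS = {
  robotsTxt: true, llmsTxt: true, llmsFullTxt: true, rawMarkdown: true,
  manifest: true, sitemap: true, aiIndex: true, schema: true,
};

function resolveGenerators(userConfig = {}) {
  return { ...DEFAULT_GENERATORS, ...(userConfig.generators || {}) };
}

console.log(resolveGenerators({ generators: { aiIndex: false } }).aiIndex); // → false
console.log(resolveGenerators({}).sitemap); // → true
```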