From the GEO Audit Report (discussion #30401): the docs site (github.github.com/gh-aw/) scored 0/18 on both the robots.txt and llms.txt checks, which together represent 36 potential points.
Approach
1. Add robots.txt to docs site
Create docs/public/robots.txt that explicitly allows major AI crawlers:
User-agent: *
Allow: /

User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /

Sitemap: https://github.github.com/gh-aw/sitemap-index.xml
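The rules above can be sanity-checked offline before deploying, using Python's standard-library `urllib.robotparser`. This is a minimal sketch; the rules string simply mirrors the proposed file, and `SomeOtherBot` is a made-up agent name used to confirm the wildcard group also allows unlisted crawlers:

```python
from urllib import robotparser

# The proposed docs/public/robots.txt content (mirrors the block above).
ROBOTS_TXT = """\
User-agent: *
Allow: /

User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: Google-Extended
Allow: /

Sitemap: https://github.github.com/gh-aw/sitemap-index.xml
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Every listed AI crawler, and any other agent via the "*" group,
# should be allowed to fetch any docs path.
for agent in ("GPTBot", "ClaudeBot", "PerplexityBot",
              "Google-Extended", "SomeOtherBot"):
    allowed = parser.can_fetch(agent, "https://github.github.com/gh-aw/")
    print(f"{agent}: {'allowed' if allowed else 'blocked'}")
```

Running this with a draft of the file catches typos in `User-agent` group names before they reach production.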
2. Add llms.txt to docs site
Create docs/public/llms.txt summarizing the project for AI consumption. Mirror the structure from the README's llms.txt but tailored to the docs site, including:
- Project overview
- Key concepts (workflows, engines, MCP servers)
- Links to main docs sections
- Installation and quick-start
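The sections above could be laid out in the conventional llms.txt shape (H1 title, blockquote summary, then link lists per section). A hypothetical skeleton, with placeholder descriptions and page paths that would need to be replaced with the real docs URLs:

```markdown
# GitHub Agentic Workflows

> Hypothetical one-paragraph project summary goes here: what the
> project does, who it is for, and where the docs live.

## Key Concepts

- [Workflows](https://github.github.com/gh-aw/...): placeholder description
- [Engines](https://github.github.com/gh-aw/...): placeholder description
- [MCP Servers](https://github.github.com/gh-aw/...): placeholder description

## Getting Started

- [Installation](https://github.github.com/gh-aw/...): placeholder description
- [Quick Start](https://github.github.com/gh-aw/...): placeholder description
```

The actual wording should be mirrored from the README's llms.txt, per the approach above.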
Files to Create
docs/public/robots.txt
docs/public/llms.txt
Acceptance Criteria
- https://github.github.com/gh-aw/robots.txt returns 200 with Allow rules for AI bots
- https://github.github.com/gh-aw/llms.txt returns 200 with the project summary
- llms.txt has at minimum 500 words and covers core concepts

Generated by Plan Command for discussion #30401
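The llms.txt criteria could be verified with a small checker once the response has been fetched (e.g. with `curl` or `urllib`). A sketch under that assumption; `check_llms_txt` is a hypothetical helper, not part of the repo, and the 500-word threshold mirrors the criterion above:

```python
def check_llms_txt(status: int, body: str, min_words: int = 500) -> list[str]:
    """Return the list of failed acceptance criteria (empty means all pass)."""
    failures = []
    if status != 200:
        failures.append(f"expected HTTP 200, got {status}")
    word_count = len(body.split())
    if word_count < min_words:
        failures.append(f"body has {word_count} words, need at least {min_words}")
    return failures

# Example with placeholder content: a 600-word body over HTTP 200 passes.
sample = "word " * 600
print(check_llms_txt(200, sample))
print(check_llms_txt(404, "too short"))
```

The "covers core concepts" criterion is harder to automate; a simple extension would be asserting that headings such as "Workflows" or "MCP Servers" appear in the body.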