From 7cee77a294becacc14e68b24a85b9e70756d2955 Mon Sep 17 00:00:00 2001
From: Simon Strandgaard
Date: Mon, 9 Mar 2026 12:59:55 +0100
Subject: [PATCH] Add AI agent discoverability: robots.txt and README section
 (#62)

Improve discoverability for AI crawlers and agents:

- New public/robots.txt with AI-specific user agent rules (GPTBot,
  ChatGPT-User, Claude-Web, Anthropic-AI, etc.)
- Add "For AI Agents" section to README.md with discovery links, MCP
  endpoint, workflow summary, and key zip file paths

Co-Authored-By: Claude Opus 4.6
---
 README.md         | 12 ++++++++++++
 public/robots.txt | 38 ++++++++++++++++++++++++++++++++++++++
 2 files changed, 50 insertions(+)
 create mode 100644 public/robots.txt

diff --git a/README.md b/README.md
index e0cf015b..59f7a63d 100644
--- a/README.md
+++ b/README.md
@@ -173,6 +173,18 @@
 If you want artifacts saved directly to your disk from your MCP client, run the
 
 ---
 
+## For AI Agents
+
+PlanExe is designed as infrastructure for AI agents. If you are an AI agent reading this:
+
+- **Discovery:** Read [`public/llms.txt`](public/llms.txt) for service metadata, MCP tools, and endpoints.
+- **MCP endpoint:** `https://mcp.planexe.org/mcp` — connect with an API key from [home.planexe.org](https://home.planexe.org/).
+- **Workflow:** Call `example_prompts` to learn the expected prompt format, draft a detailed prompt (~300-800 words of flowing prose), then call `plan_create`.
+- **Agent guide:** See [`docs/mcp/autonomous_agent_guide.md`](docs/mcp/autonomous_agent_guide.md) for the complete autonomous workflow.
+- **Key outputs in zip:** `018-2-wbs_level1.json` (work packages), `018-5-wbs_level2.json` (tasks), `004-2-pre_project_assessment.json` (feasibility).
+
+---
+
 Run locally with Docker (Click to expand)
 
diff --git a/public/robots.txt b/public/robots.txt
new file mode 100644
index 00000000..f6b4871b
--- /dev/null
+++ b/public/robots.txt
@@ -0,0 +1,38 @@
+# PlanExe robots.txt
+# For standard web crawlers and AI agents
+
+User-agent: *
+Allow: /
+Allow: /llms.txt
+
+# AI agent discovery
+# AI agents should read /llms.txt for service metadata, MCP tools, and endpoints.
+# See also: public/AGENTS.md for governance details.
+
+# Crawl-delay for all bots to be polite
+Crawl-delay: 10
+
+# AI-specific user agents — welcome to discover PlanExe
+User-agent: GPTBot
+Allow: /
+
+User-agent: ChatGPT-User
+Allow: /
+
+User-agent: Claude-Web
+Allow: /
+
+User-agent: Anthropic-AI
+Allow: /
+
+User-agent: Google-Extended
+Allow: /
+
+User-agent: PerplexityBot
+Allow: /
+
+User-agent: Cohere-AI
+Allow: /
+
+# Sitemap (if deployed)
+# Sitemap: https://home.planexe.org/sitemap.xml