
Fix default Ollama overlay missing api field for llmspy routing #181

Merged

OisinKyne merged 2 commits into main from fix/llmspy-default-ollama-api on Feb 18, 2026

Conversation

@bussyjd (Collaborator) commented Feb 18, 2026

Summary

  • The default Ollama overlay routes through llmspy (http://llmspy.llm.svc.cluster.local:8000/v1), which only serves POST /v1/chat/completions.
  • Without an explicit api field, OpenClaw falls back to openai-responses and sends requests to POST /v1/responses, a route llmspy doesn't handle, so every request 404s.
  • The cloud provider paths (buildLLMSpyRoutedOverlay) already set api: openai-completions correctly; only the hardcoded default Ollama block was missing it.
  • Adds api: openai-completions to the default overlay, a one-line fix (see the sketch after this list).
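For concreteness, here is a minimal sketch of the fixed overlay, assuming the obol CLI assembles it as a Go map. Only the llmspy base URL and the api value come from this PR; the variable, function, and other field names are illustrative assumptions, not the CLI's actual code.

```go
package main

import "fmt"

// defaultOllamaOverlay sketches the default Ollama overlay after the fix.
// Only the llmspy base URL and the "api" value are taken from the PR; the
// map shape and the other field name are illustrative assumptions.
func defaultOllamaOverlay() map[string]string {
	return map[string]string{
		"baseURL": "http://llmspy.llm.svc.cluster.local:8000/v1",
		// The one-line fix: without an explicit api, OpenClaw falls back to
		// openai-responses and POSTs /v1/responses, which llmspy does not
		// serve, so every request 404s.
		"api": "openai-completions",
	}
}

func main() {
	fmt.Println(defaultOllamaOverlay())
}
```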

Test plan

  • Fresh obol agent init with Ollama running → verify inference works through llmspy (an in-cluster smoke-test sketch follows this list)
  • obol openclaw setup with Anthropic via the model gateway → verify inference works
  • obol openclaw setup with direct Anthropic → verify inference works (regression check)
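Beyond the CLI flows above, a quick in-cluster request can confirm llmspy answers on the route the overlay now targets. A minimal Go sketch, assuming it runs inside the cluster where the llmspy service DNS name resolves; the model name is a placeholder, not something this PR specifies.

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"strings"
)

func main() {
	// Endpoint taken from the PR; llmspy only serves POST /v1/chat/completions.
	url := "http://llmspy.llm.svc.cluster.local:8000/v1/chat/completions"
	// Placeholder model name: substitute whatever model Ollama is serving.
	body := `{"model":"PLACEHOLDER_MODEL","messages":[{"role":"user","content":"ping"}]}`

	resp, err := http.Post(url, "application/json", strings.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	out, _ := io.ReadAll(resp.Body)
	// Expect a 200 here. A 404 would mean requests are still hitting a
	// route llmspy does not serve (e.g. /v1/responses).
	fmt.Println(resp.Status, string(out))
}
```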

The default Ollama overlay routes through llmspy, which only serves
POST /v1/chat/completions. Without an explicit api field, OpenClaw
falls back to openai-responses (POST /v1/responses) — a route llmspy
doesn't handle, resulting in a 404.

The llmspy cloud provider paths (buildLLMSpyRoutedOverlay) already set
api: openai-completions correctly. This adds the same field to the
hardcoded default Ollama block.
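To show the parallel with the cloud provider paths, here is a hedged sketch of what buildLLMSpyRoutedOverlay sets. The function name, the llmspy base URL, and the api value come from the PR; its parameters and the remaining fields are illustrative assumptions.

```go
package main

import "fmt"

// buildLLMSpyRoutedOverlay sketches the cloud provider path that already
// sets api correctly. Only the function name, the llmspy base URL, and the
// api value come from the PR; everything else here is an assumption.
func buildLLMSpyRoutedOverlay(provider, model string) map[string]string {
	return map[string]string{
		"provider": provider,
		"model":    model,
		"baseURL":  "http://llmspy.llm.svc.cluster.local:8000/v1",
		// Already present on the cloud provider paths; the PR adds the same
		// line to the hardcoded default Ollama block so both behave alike.
		"api": "openai-completions",
	}
}

func main() {
	fmt.Println(buildLLMSpyRoutedOverlay("anthropic", "example-model"))
}
```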
The second commit makes Sonnet the default model for agent tasks: it offers a
better cost/performance ratio for typical OpenClaw workloads.
OisinKyne merged commit 4fddc23 into main on Feb 18, 2026
5 checks passed
OisinKyne deleted the fix/llmspy-default-ollama-api branch on February 18, 2026 at 14:08
