docs: add Foundry Local as a BYOK provider#393

Open
thegovind wants to merge 5 commits into github:main from thegovind:docs/add-foundry-local-byok-provider
Conversation

@thegovind thegovind commented Feb 6, 2026

Adds Foundry Local (Microsoft on-device inference) as a documented BYOK provider in docs/auth/byok.md.

Foundry Local exposes an OpenAI-compatible API locally — the same pattern as the existing Ollama entry. Four insertions, zero deletions:

  • Supported Providers table — new row for Foundry Local (type: "openai")
  • Example configuration — new `### Foundry Local (On-Device)` section with SDK bootstrap snippet and endpoint config
  • Provider-Specific Limitations table — new row noting local-only, install requirement, and preview status
  • Troubleshooting — new `### Connection Refused (Foundry Local)` section with install and diagnostic commands

Details

Foundry Local runs AI models on-device without requiring an Azure subscription. It serves an OpenAI-compatible REST API at http://localhost:5272/v1, making it a natural fit alongside the existing Ollama documentation.

The new sections cover:

  1. Provider table entry — identifies Foundry Local as type: "openai" with a brief description
  2. Configuration example — shows how to bootstrap the service using foundry-local-sdk and configure the Copilot SDK provider block, including the default endpoint and SDK-provided API key
  3. Limitations — local-only, requires separate install, model catalog scoped to Foundry Local models, REST API in preview
  4. Troubleshooting — `foundry --version`, `foundry model ls`, `foundry model run`, a `curl` health check, and install commands for Windows (winget) and macOS (brew)

Changes

  • docs/auth/byok.md — 45 lines added, 0 lines removed, no existing content modified
  • No other files changed

Add Foundry Local (Microsoft on-device inference) to the BYOK documentation:
- Supported Providers table entry
- Example configuration section with SDK bootstrap snippet
- Provider-Specific Limitations table entry
- Troubleshooting section for connection issues
@thegovind thegovind requested a review from a team as a code owner February 6, 2026 13:50
Copilot AI review requested due to automatic review settings February 6, 2026 13:50
Contributor

Copilot AI left a comment

Pull request overview

Adds Foundry Local as an additional documented BYOK provider option in the authentication docs, aligning it with existing OpenAI-compatible local provider guidance (e.g., Ollama).

Changes:

  • Added “Foundry Local” to the Supported Providers table as an "openai"-type provider.
  • Added a Foundry Local configuration example and a provider-specific limitations entry.
  • Added troubleshooting steps for “Connection Refused (Foundry Local)”.

The TypeScript snippet contains bootstrap comments and a bare object literal that isn't standalone-valid TS — the same pattern used elsewhere in the file (e.g. the Azure endpoint type confusion examples). Use manager.endpoint instead of a hardcoded URL, add wireApi: completions, and link to the full integration guide and working sample in the Foundry-Local repo (microsoft/Foundry-Local#417).