diff --git a/README.md b/README.md index d0f0033a10..10dcce40ed 100644 --- a/README.md +++ b/README.md @@ -1,10 +1,10 @@ # Claude Code Best V5 (CCB) -[![GitHub Stars](https://img.shields.io/github/stars/claude-code-best/claude-code?style=flat-square&logo=github&color=yellow)](https://github.com/claude-code-best/claude-code/stargazers) -[![GitHub Contributors](https://img.shields.io/github/contributors/claude-code-best/claude-code?style=flat-square&color=green)](https://github.com/claude-code-best/claude-code/graphs/contributors) -[![GitHub Issues](https://img.shields.io/github/issues/claude-code-best/claude-code?style=flat-square&color=orange)](https://github.com/claude-code-best/claude-code/issues) -[![GitHub License](https://img.shields.io/github/license/claude-code-best/claude-code?style=flat-square)](https://github.com/claude-code-best/claude-code/blob/main/LICENSE) -[![Last Commit](https://img.shields.io/github/last-commit/claude-code-best/claude-code?style=flat-square&color=blue)](https://github.com/claude-code-best/claude-code/commits/main) +[![GitHub Stars](https://img.shields.io/github/stars/claude-code-best/claude-code-mix?style=flat-square&logo=github&color=yellow)](https://github.com/claude-code-best/claude-code-mix/stargazers) +[![GitHub Contributors](https://img.shields.io/github/contributors/claude-code-best/claude-code-mix?style=flat-square&color=green)](https://github.com/claude-code-best/claude-code-mix/graphs/contributors) +[![GitHub Issues](https://img.shields.io/github/issues/claude-code-best/claude-code-mix?style=flat-square&color=orange)](https://github.com/claude-code-best/claude-code-mix/issues) +[![GitHub License](https://img.shields.io/github/license/claude-code-best/claude-code-mix?style=flat-square)](https://github.com/claude-code-best/claude-code-mix/blob/main/LICENSE) +[![Last 
Commit](https://img.shields.io/github/last-commit/claude-code-best/claude-code-mix?style=flat-square&color=blue)](https://github.com/claude-code-best/claude-code-mix/commits/main) [![Bun](https://img.shields.io/badge/runtime-Bun-black?style=flat-square&logo=bun)](https://bun.sh/) [![Discord](https://img.shields.io/badge/Discord-Join-5865F2?style=flat-square&logo=discord)](https://discord.gg/uApuzJWGKX) @@ -27,6 +27,7 @@ | **Poor Mode** | 穷鬼模式,关闭记忆提取和键入建议,大幅度减少并发请求 | /poor 可以开关 | | **Channels 频道通知** | MCP 服务器推送外部消息到会话(飞书/Slack/Discord/微信等),`--channels plugin:name@marketplace` 启用 | [文档](https://ccb.agent-aura.top/docs/features/channels) | | **自定义模型供应商** | OpenAI/Anthropic/Gemini/Grok 兼容 (`/login`) | [文档](https://ccb.agent-aura.top/docs/features/all-features-guide) | +| **/mix 模型混合模式** | `/mix true` 启用独立 `ccbsettings.json`,Opus/Sonnet/Haiku 可分别配置 provider、Base URL、API Key 和模型名 | [使用说明](#mix-模型混合模式) | | Voice Mode | 语音输入,支持豆包语言输入(`/voice doubao`) | [文档](https://ccb.agent-aura.top/docs/features/voice-mode) | | Computer Use | 屏幕截图、键鼠控制 | [文档](https://ccb.agent-aura.top/docs/features/computer-use) | | Chrome Use | 浏览器自动化、表单填写、数据抓取 | [自托管](https://ccb.agent-aura.top/docs/features/chrome-use-mcp) [原生版](https://ccb.agent-aura.top/docs/features/claude-in-chrome-mcp) | @@ -123,7 +124,7 @@ powershell -c "irm bun.sh/install.ps1 | iex" ### 📥 安装 ```bash -cd /path/to/claude-code +cd /path/to/claude-code-mix bun install ``` @@ -161,6 +162,31 @@ bun run build - ⌨️ **Tab / Shift+Tab** 切换字段,**Enter** 确认并跳到下一个,最后一个字段按 Enter 保存 +### `/mix` 模型混合模式 + +默认情况下,`/login` 配置会保存到原版共享配置文件 `settings.json`,Opus、Sonnet、Haiku 三种模型共用同一组 provider、API URL 和 API Key。 + +如果你希望三种模型分别使用不同的 API 地址、密钥或协议,先在 REPL 中开启混合模式: + +```text +/mix true +``` + +开启后: + +- 配置会保存到独立的 `ccbsettings.json`,不再写入共享的 `settings.json` +- 再运行 `/login` 时,会先选择要配置的模型族:`Opus`、`Sonnet` 或 `Haiku` +- 选择模型族后,再进入原来的 provider 类型选择菜单,例如 Anthropic Compatible、OpenAI Compatible、Gemini API +- 每个模型族都可以单独保存自己的 provider、Base URL、API Key 和模型名 + +常用命令: 
+ +```text +/mix true # 开启模型混合模式,使用独立 ccbsettings.json +/mix status # 查看当前是否开启 mix 模式以及正在使用的配置文件 +/mix false # 关闭模型混合模式,恢复使用原版共享 settings.json +``` + > ℹ️ 支持所有 Anthropic API 兼容服务(如 OpenRouter、AWS Bedrock 代理等),只要接口兼容 Messages API 即可。 ## Feature Flags @@ -217,21 +243,21 @@ TUI (REPL) 模式需要真实终端,无法直接通过 VS Code launch 启动 ## 相关文档及网站 - **在线文档(Mintlify)**: [ccb.agent-aura.top](https://ccb.agent-aura.top/) — 文档源码位于 [`docs/`](docs/) 目录,欢迎投稿 PR -- **DeepWiki**: [https://deepwiki.com/claude-code-best/claude-code](https://deepwiki.com/claude-code-best/claude-code) +- **DeepWiki**: [https://deepwiki.com/claude-code-best/claude-code-mix](https://deepwiki.com/claude-code-best/claude-code-mix) ## Contributors - + Contributors ## Star History - + - - - Star History Chart + + + Star History Chart diff --git a/README_EN.md b/README_EN.md index 6769ff2a9a..9bdd15a7ad 100644 --- a/README_EN.md +++ b/README_EN.md @@ -1,10 +1,10 @@ # Claude Code Best V5 (CCB) -[![GitHub Stars](https://img.shields.io/github/stars/claude-code-best/claude-code?style=flat-square&logo=github&color=yellow)](https://github.com/claude-code-best/claude-code/stargazers) -[![GitHub Contributors](https://img.shields.io/github/contributors/claude-code-best/claude-code?style=flat-square&color=green)](https://github.com/claude-code-best/claude-code/graphs/contributors) -[![GitHub Issues](https://img.shields.io/github/issues/claude-code-best/claude-code?style=flat-square&color=orange)](https://github.com/claude-code-best/claude-code/issues) -[![GitHub License](https://img.shields.io/github/license/claude-code-best/claude-code?style=flat-square)](https://github.com/claude-code-best/claude-code/blob/main/LICENSE) -[![Last Commit](https://img.shields.io/github/last-commit/claude-code-best/claude-code?style=flat-square&color=blue)](https://github.com/claude-code-best/claude-code/commits/main) +[![GitHub 
Stars](https://img.shields.io/github/stars/claude-code-best/claude-code-mix?style=flat-square&logo=github&color=yellow)](https://github.com/claude-code-best/claude-code-mix/stargazers) +[![GitHub Contributors](https://img.shields.io/github/contributors/claude-code-best/claude-code-mix?style=flat-square&color=green)](https://github.com/claude-code-best/claude-code-mix/graphs/contributors) +[![GitHub Issues](https://img.shields.io/github/issues/claude-code-best/claude-code-mix?style=flat-square&color=orange)](https://github.com/claude-code-best/claude-code-mix/issues) +[![GitHub License](https://img.shields.io/github/license/claude-code-best/claude-code-mix?style=flat-square)](https://github.com/claude-code-best/claude-code-mix/blob/main/LICENSE) +[![Last Commit](https://img.shields.io/github/last-commit/claude-code-best/claude-code-mix?style=flat-square&color=blue)](https://github.com/claude-code-best/claude-code-mix/commits/main) [![Bun](https://img.shields.io/badge/runtime-Bun-black?style=flat-square&logo=bun)](https://bun.sh/) > Which Claude do you like? The open source one is the best. @@ -32,6 +32,7 @@ Sponsor placeholder. 
- [x] Custom Sentry error reporting support [Docs](https://ccb.agent-aura.top/docs/internals/sentry-setup) - [x] Custom GrowthBook support (GB is open source — configure your own feature flag platform) [Docs](https://ccb.agent-aura.top/docs/internals/growthbook-adapter) - [x] Custom login mode — configure Claude models your way + - [x] `/mix` mixed model mode — use an independent `ccbsettings.json` so Opus, Sonnet, and Haiku can each have their own provider, Base URL, API key, and model name - [ ] V6: Large-scale refactoring, full modular packaging - [ ] V6 will be a new branch; main branch will be archived as a historical version @@ -105,7 +106,7 @@ powershell -c "irm bun.sh/install.ps1 | iex" ### Install ```bash -cd /path/to/claude-code +cd /path/to/claude-code-mix bun install ``` @@ -143,6 +144,31 @@ Fields to fill in: - Model fields auto-fill from current environment variables - Configuration saves to `~/.claude/settings.json` under the `env` key, effective immediately +### `/mix` Mixed Model Mode + +By default, `/login` saves provider settings to the shared `settings.json` file, so Opus, Sonnet, and Haiku use the same provider, API URL, and API key. 
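The exact schema of `ccbsettings.json` is not spelled out in this diff — only that `updateSettingsForSource('userSettings', { mix: enabled })` writes a `mix` flag, and that `getMixedModelConfig(family, settings)?.env?.[key]` reads a per-family `env` map. Under those assumptions, a per-family layout along these lines is plausible (the top-level family keys and `provider` field name are guesses; the `env` variable names do appear in this diff):

```json
{
  "mix": true,
  "opus": {
    "provider": "anthropic",
    "env": {
      "ANTHROPIC_BASE_URL": "https://api.example.com",
      "ANTHROPIC_API_KEY": "sk-...",
      "ANTHROPIC_DEFAULT_OPUS_MODEL": "my-opus-model"
    }
  },
  "sonnet": {
    "provider": "openai",
    "env": {
      "OPENAI_BASE_URL": "https://api.example.com/v1",
      "OPENAI_API_KEY": "sk-...",
      "OPENAI_DEFAULT_SONNET_MODEL": "my-sonnet-model"
    }
  }
}
```

This is a sketch for orientation only; the authoritative shape is whatever `createMixedModelSettingsPatch` in `src/utils/model/mix.ts` emits.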
+ +If you want each model family to use a different API endpoint, key, or protocol, enable mixed model mode in the REPL first: + +```text +/mix true +``` + +After enabling it: + +- Configuration is saved to the independent `ccbsettings.json` file instead of the shared `settings.json` +- Running `/login` first asks which model family to configure: `Opus`, `Sonnet`, or `Haiku` +- After selecting the model family, the existing provider menu appears, such as Anthropic Compatible, OpenAI Compatible, or Gemini API +- Each model family can store its own provider, Base URL, API key, and model name + +Common commands: + +```text +/mix true # Enable mixed model mode with independent ccbsettings.json +/mix status # Show whether mix mode is enabled and which config file is active +/mix false # Disable mixed model mode and return to shared settings.json +``` + You can also edit `~/.claude/settings.json` directly: ```json @@ -188,21 +214,21 @@ The TUI (REPL) mode requires a real terminal and cannot be launched directly via ## Documentation & Links - **Online docs (Mintlify)**: [ccb.agent-aura.top](https://ccb.agent-aura.top/) — source in [`docs/`](docs/), PR contributions welcome -- **DeepWiki**: https://deepwiki.com/claude-code-best/claude-code +- **DeepWiki**: https://deepwiki.com/claude-code-best/claude-code-mix ## Contributors - - + + ## Star History - + - - - Star History Chart + + + Star History Chart diff --git a/package.json b/package.json index cd32559f00..4689204eb8 100644 --- a/package.json +++ b/package.json @@ -6,11 +6,11 @@ "author": "claude-code-best ", "repository": { "type": "git", - "url": "git+https://github.com/claude-code-best/claude-code.git" + "url": "git+https://github.com/claude-code-best/claude-code-mix.git" }, - "homepage": "https://github.com/claude-code-best/claude-code#readme", + "homepage": "https://github.com/claude-code-best/claude-code-mix#readme", "bugs": { - "url": "https://github.com/claude-code-best/claude-code/issues" + "url": 
"https://github.com/claude-code-best/claude-code-mix/issues" }, "keywords": [ "claude", diff --git a/src/commands.ts b/src/commands.ts index 33c1c75f0f..b37759caf7 100644 --- a/src/commands.ts +++ b/src/commands.ts @@ -193,6 +193,7 @@ import stickers from './commands/stickers/index.js' import advisor from './commands/advisor.js' import autonomy from './commands/autonomy.js' import provider from './commands/provider.js' +import mix from './commands/mix.js' import { logError } from './utils/log.js' import { toError } from './utils/errors.js' import { logForDebugging } from './utils/debug.js' @@ -212,6 +213,7 @@ import { import memoize from 'lodash-es/memoize.js' import { isUsing3PServices, isClaudeAISubscriber } from './utils/auth.js' import { isFirstPartyAnthropicBaseUrl } from './utils/model/providers.js' +import { isMixModeEnabled } from './utils/model/mix.js' import env from './commands/env/index.js' import exit from './commands/exit/index.js' import exportCommand from './commands/export/index.js' @@ -300,6 +302,7 @@ const COMMANDS = memoize((): Command[] => [ advisor, autonomy, provider, + mix, agents, branch, btw, @@ -380,7 +383,7 @@ const COMMANDS = memoize((): Command[] => [ hooks, exportCommand, sandboxToggle, - ...(!isUsing3PServices() ? [logout, login()] : []), + ...(!isUsing3PServices() || isMixModeEnabled() ? [logout, login()] : []), passes, ...(peersCmd ? [peersCmd] : []), ...(attachCmd ? 
[attachCmd] : []), diff --git a/src/commands/mix.ts b/src/commands/mix.ts new file mode 100644 index 0000000000..f58e417b2e --- /dev/null +++ b/src/commands/mix.ts @@ -0,0 +1,99 @@ +import type { Command } from '../commands.js' +import type { LocalCommandCall } from '../types/command.js' +import { MIX_MODE_ENV, isMixModeEnabled } from '../utils/model/mix.js' +import { + getSettings_DEPRECATED, + getSettingsFilePathForSource, + updateSettingsForSource, +} from '../utils/settings/settings.js' + +const TRUE_VALUES = new Set(['true', 'on', 'enable', 'enabled', '1', 'yes']) +const FALSE_VALUES = new Set(['false', 'off', 'disable', 'disabled', '0', 'no']) + +function getMixUsageText( + enabled: boolean, + settingsPath: string | undefined, +): string { + return [ + `Mix mode is ${enabled ? 'enabled' : 'disabled'}.`, + `Settings file: ${settingsPath ?? 'unknown'}`, + '', + 'Usage:', + ' /mix true Enable mixed model mode', + ' /mix false Disable mixed model mode', + ' /mix status Show current mixed model mode status', + '', + 'Mixed model mode uses the independent ccbsettings.json config file instead of the shared settings.json.', + 'When mixed model mode is enabled, Opus, Sonnet, and Haiku can each be configured separately.', + 'After running /mix true, run /login and choose which model family you want to configure first.', + 'Each model family stores its own provider, API URL, API key, and model name in ccbsettings.json.', + ].join('\n') +} + +const call: LocalCommandCall = async args => { + const arg = args.trim().toLowerCase() + const settingsPath = getSettingsFilePathForSource('userSettings') + + if (!arg || arg === 'status') { + const settings = getSettings_DEPRECATED() || {} + const enabled = isMixModeEnabled(settings) + return { + type: 'text', + value: getMixUsageText(enabled, settingsPath), + } + } + + if (!TRUE_VALUES.has(arg) && !FALSE_VALUES.has(arg)) { + return { + type: 'text', + value: getMixUsageText( + isMixModeEnabled(getSettings_DEPRECATED() || {}), + 
settingsPath, + ), + } + } + + const enabled = TRUE_VALUES.has(arg) + const previousMixEnv = process.env[MIX_MODE_ENV] + if (enabled) { + process.env[MIX_MODE_ENV] = '1' + } + const { error } = updateSettingsForSource('userSettings', { mix: enabled }) + if (error) { + if (previousMixEnv === undefined) { + delete process.env[MIX_MODE_ENV] + } else { + process.env[MIX_MODE_ENV] = previousMixEnv + } + return { + type: 'text', + value: `Failed to update mix mode: ${error.message}`, + } + } + + process.env[MIX_MODE_ENV] = enabled ? '1' : '0' + return { + type: 'text', + value: enabled + ? [ + 'Mix mode enabled.', + '', + 'Mixed model mode uses the independent ccbsettings.json config file.', + 'Next step: run /login, then select Opus, Sonnet, or Haiku to configure that model family.', + 'Each family can use its own provider, API URL, API key, and model name.', + ].join('\n') + : 'Mix mode disabled. /login will use the shared API configuration flow.', + } +} + +const mix = { + type: 'local', + name: 'mix', + description: + 'Enable or disable mixed model mode using an independent config file; Opus, Sonnet, and Haiku can be configured separately', + argumentHint: '[true|false|status]', + supportsNonInteractive: true, + load: () => Promise.resolve({ call }), +} satisfies Command + +export default mix diff --git a/src/components/ConsoleOAuthFlow.tsx b/src/components/ConsoleOAuthFlow.tsx index 9ca4641b3c..3dc00cd70c 100644 --- a/src/components/ConsoleOAuthFlow.tsx +++ b/src/components/ConsoleOAuthFlow.tsx @@ -13,7 +13,19 @@ import { OAuthService } from '../services/oauth/index.js'; import { getOauthAccountInfo, validateForceLoginOrg } from '../utils/auth.js'; import { logError } from '../utils/log.js'; +import { + MIX_MODE_ENV, + MODEL_FAMILIES, + createMixedModelSettingsPatch, + getMixedModelConfig, + getModelFamilyLabel, + isMixModeEnabled, + normalizeModelFamily, + type MixedModelProvider, + type ModelFamily, +} from '../utils/model/mix.js'; import { getSettings_DEPRECATED, 
updateSettingsForSource } from '../utils/settings/settings.js'; +import type { SettingsJson } from '../utils/settings/types.js'; import { Select } from './CustomSelect/select.js'; import { Spinner } from './Spinner.js'; import TextInput from './TextInput.js'; @@ -26,6 +38,7 @@ type Props = { }; type OAuthStatus = + | { state: 'selecting_mix_model' } | { state: 'idle' } // Initial state, waiting to select login method | { state: 'platform_setup' } // Show platform setup info (Bedrock/Vertex/Foundry) | { @@ -67,6 +80,40 @@ type OAuthStatus = }; const PASTE_HERE_MSG = 'Paste code here if prompted > '; + +type LoginConfigField = 'base_url' | 'api_key' | 'haiku_model' | 'sonnet_model' | 'opus_model'; + +function getModelFieldForFamily(family: ModelFamily): Extract { + return `${family}_model` as Extract; +} + +function getLoginConfigFields(mixEnabled: boolean, family: ModelFamily | null): LoginConfigField[] { + if (mixEnabled && family) { + return ['base_url', 'api_key', getModelFieldForFamily(family)]; + } + return ['base_url', 'api_key', 'haiku_model', 'sonnet_model', 'opus_model']; +} + +function getFamilyModelEnvKey(prefix: 'ANTHROPIC' | 'OPENAI' | 'GEMINI', family: ModelFamily): string { + return `${prefix}_DEFAULT_${family.toUpperCase()}_MODEL`; +} + +function buildProviderSettingsPatch( + mixEnabled: boolean, + family: ModelFamily | null, + provider: MixedModelProvider, + env: Record, +): SettingsJson { + if (mixEnabled && family) { + process.env[MIX_MODE_ENV] = '1'; + return createMixedModelSettingsPatch(family, provider, env); + } + return { + modelType: provider, + env, + }; +} + export function ConsoleOAuthFlow({ onDone, startingMessage, @@ -74,6 +121,7 @@ export function ConsoleOAuthFlow({ forceLoginMethod: forceLoginMethodProp, }: Props): React.ReactNode { const settings = getSettings_DEPRECATED() || {}; + const mixEnabled = mode === 'login' && isMixModeEnabled(settings); const forceLoginMethod = forceLoginMethodProp ?? 
settings.forceLoginMethod; const orgUUID = settings.forceLoginOrgUUID; const forcedMethodMessage = @@ -92,8 +140,12 @@ export function ConsoleOAuthFlow({ if (forceLoginMethod === 'claudeai' || forceLoginMethod === 'console') { return { state: 'ready_to_start' }; } + if (mixEnabled) { + return { state: 'selecting_mix_model' }; + } return { state: 'idle' }; }); + const [mixModelFamily, setMixModelFamily] = useState(null); const [pastedCode, setPastedCode] = useState(''); const [cursorOffset, setCursorOffset] = useState(0); @@ -262,7 +314,10 @@ export function ConsoleOAuthFlow({ throw new Error((orgResult as { valid: false; message: string }).message); } // Reset modelType to anthropic when using OAuth login - updateSettingsForSource('userSettings', { modelType: 'anthropic' } as any); + updateSettingsForSource( + 'userSettings', + buildProviderSettingsPatch(mixEnabled, mixModelFamily, 'anthropic', {}), + ); setOAuthStatus({ state: 'success' }); void sendNotification( @@ -288,7 +343,7 @@ export function ConsoleOAuthFlow({ ssl_error: sslHint !== null, }); } - }, [oauthService, setShowPastePrompt, loginWithClaudeAi, mode, orgUUID]); + }, [oauthService, setShowPastePrompt, loginWithClaudeAi, mode, orgUUID, mixEnabled, mixModelFamily]); const pendingOAuthStartRef = useRef(false); @@ -372,6 +427,10 @@ export function ConsoleOAuthFlow({ handleSubmitCode={handleSubmitCode} setOAuthStatus={setOAuthStatus} setLoginWithClaudeAi={setLoginWithClaudeAi} + settings={settings} + mixEnabled={mixEnabled} + mixModelFamily={mixModelFamily} + setMixModelFamily={setMixModelFamily} onDone={onDone} /> @@ -394,6 +453,10 @@ type OAuthStatusMessageProps = { handleSubmitCode: (value: string, url: string) => void; setOAuthStatus: (status: OAuthStatus) => void; setLoginWithClaudeAi: (value: boolean) => void; + settings: SettingsJson; + mixEnabled: boolean; + mixModelFamily: ModelFamily | null; + setMixModelFamily: (family: ModelFamily | null) => void; }; function OAuthStatusMessage({ @@ -410,9 
+473,47 @@ function OAuthStatusMessage({ handleSubmitCode, setOAuthStatus, setLoginWithClaudeAi, + settings, + mixEnabled, + mixModelFamily, + setMixModelFamily, onDone, }: OAuthStatusMessageProps): React.ReactNode { + const mixModelLabel = mixModelFamily ? getModelFamilyLabel(mixModelFamily) : null; + const getLoginEnvValue = (key: string): string => { + if (mixEnabled && mixModelFamily) { + return getMixedModelConfig(mixModelFamily, settings)?.env?.[key] ?? process.env[key] ?? ''; + } + return process.env[key] ?? ''; + }; + switch (oauthStatus.state) { + case 'selecting_mix_model': + return ( + + Select model to configure: + + { const idx = FIELDS.indexOf(activeField); @@ -739,13 +846,15 @@ function OAuthStatusMessage({ return ( - Anthropic Compatible Setup + + {mixModelLabel ? `${mixModelLabel} Anthropic Compatible Setup` : 'Anthropic Compatible Setup'} + {renderRow('base_url', 'Base URL ')} {renderRow('api_key', 'API Key ', { mask: true })} - {renderRow('haiku_model', 'Haiku ')} - {renderRow('sonnet_model', 'Sonnet ')} - {renderRow('opus_model', 'Opus ')} + {FIELDS.includes('haiku_model') && renderRow('haiku_model', 'Haiku ')} + {FIELDS.includes('sonnet_model') && renderRow('sonnet_model', 'Sonnet ')} + {FIELDS.includes('opus_model') && renderRow('opus_model', 'Opus ')} ↑↓/Tab to switch · Enter on last field to save · Esc to go back @@ -754,7 +863,7 @@ function OAuthStatusMessage({ case 'openai_chat_api': { type OpenAIField = 'base_url' | 'api_key' | 'haiku_model' | 'sonnet_model' | 'opus_model'; - const OPENAI_FIELDS: OpenAIField[] = ['base_url', 'api_key', 'haiku_model', 'sonnet_model', 'opus_model']; + const OPENAI_FIELDS = getLoginConfigFields(mixEnabled, mixModelFamily) as OpenAIField[]; const op = oauthStatus as { state: 'openai_chat_api'; activeField: OpenAIField; @@ -833,13 +942,19 @@ function OAuthStatusMessage({ } if (finalVals.api_key) env.OPENAI_API_KEY = finalVals.api_key; - if (finalVals.haiku_model) env.OPENAI_DEFAULT_HAIKU_MODEL = 
finalVals.haiku_model; - if (finalVals.sonnet_model) env.OPENAI_DEFAULT_SONNET_MODEL = finalVals.sonnet_model; - if (finalVals.opus_model) env.OPENAI_DEFAULT_OPUS_MODEL = finalVals.opus_model; - const { error } = updateSettingsForSource('userSettings', { - modelType: 'openai' as any, - env, - } as any); + if (mixEnabled && mixModelFamily) { + const modelField = getModelFieldForFamily(mixModelFamily); + const modelValue = finalVals[modelField]; + if (modelValue) env[getFamilyModelEnvKey('OPENAI', mixModelFamily)] = modelValue; + } else { + if (finalVals.haiku_model) env.OPENAI_DEFAULT_HAIKU_MODEL = finalVals.haiku_model; + if (finalVals.sonnet_model) env.OPENAI_DEFAULT_SONNET_MODEL = finalVals.sonnet_model; + if (finalVals.opus_model) env.OPENAI_DEFAULT_OPUS_MODEL = finalVals.opus_model; + } + const { error } = updateSettingsForSource( + 'userSettings', + buildProviderSettingsPatch(mixEnabled, mixModelFamily, 'openai', env), + ); if (error) { setOAuthStatus({ state: 'error', @@ -859,7 +974,7 @@ function OAuthStatusMessage({ setOAuthStatus({ state: 'success' }); void onDone(); } - }, [activeField, openaiInputValue, openaiDisplayValues, setOAuthStatus, onDone]); + }, [activeField, openaiInputValue, openaiDisplayValues, setOAuthStatus, onDone, mixEnabled, mixModelFamily]); const handleOpenAIEnter = useCallback(() => { const idx = OPENAI_FIELDS.indexOf(activeField); @@ -939,14 +1054,16 @@ function OAuthStatusMessage({ return ( - OpenAI Compatible API Setup + + {mixModelLabel ? `${mixModelLabel} OpenAI Compatible API Setup` : 'OpenAI Compatible API Setup'} + Configure an OpenAI Chat Completions compatible endpoint (e.g. Ollama, DeepSeek, vLLM). 
{renderOpenAIRow('base_url', 'Base URL ')} {renderOpenAIRow('api_key', 'API Key ', { mask: true })} - {renderOpenAIRow('haiku_model', 'Haiku ')} - {renderOpenAIRow('sonnet_model', 'Sonnet ')} - {renderOpenAIRow('opus_model', 'Opus ')} + {OPENAI_FIELDS.includes('haiku_model') && renderOpenAIRow('haiku_model', 'Haiku ')} + {OPENAI_FIELDS.includes('sonnet_model') && renderOpenAIRow('sonnet_model', 'Sonnet ')} + {OPENAI_FIELDS.includes('opus_model') && renderOpenAIRow('opus_model', 'Opus ')} ↑↓/Tab to switch · Enter on last field to save · Esc to go back @@ -955,7 +1072,7 @@ function OAuthStatusMessage({ case 'gemini_api': { type GeminiField = 'base_url' | 'api_key' | 'haiku_model' | 'sonnet_model' | 'opus_model'; - const GEMINI_FIELDS: GeminiField[] = ['base_url', 'api_key', 'haiku_model', 'sonnet_model', 'opus_model']; + const GEMINI_FIELDS = getLoginConfigFields(mixEnabled, mixModelFamily) as GeminiField[]; const gp = oauthStatus as { state: 'gemini_api'; activeField: GeminiField; @@ -1008,7 +1125,25 @@ function OAuthStatusMessage({ const doGeminiSave = useCallback(() => { const finalVals = { ...geminiDisplayValues, [activeField]: geminiInputValue }; - if (!finalVals.haiku_model || !finalVals.sonnet_model || !finalVals.opus_model) { + if (mixEnabled && mixModelFamily) { + const modelField = getModelFieldForFamily(mixModelFamily); + if (!finalVals[modelField]) { + setOAuthStatus({ + state: 'error', + message: `Gemini setup requires a ${getModelFamilyLabel(mixModelFamily)} model name.`, + toRetry: { + state: 'gemini_api', + baseUrl: finalVals.base_url, + apiKey: finalVals.api_key, + haikuModel: finalVals.haiku_model, + sonnetModel: finalVals.sonnet_model, + opusModel: finalVals.opus_model, + activeField, + }, + }); + return; + } + } else if (!finalVals.haiku_model || !finalVals.sonnet_model || !finalVals.opus_model) { setOAuthStatus({ state: 'error', message: 'Gemini setup requires Haiku, Sonnet, and Opus model names.', @@ -1028,13 +1163,19 @@ function 
OAuthStatusMessage({ const env: Record = {}; if (finalVals.base_url) env.GEMINI_BASE_URL = finalVals.base_url; if (finalVals.api_key) env.GEMINI_API_KEY = finalVals.api_key; - if (finalVals.haiku_model) env.GEMINI_DEFAULT_HAIKU_MODEL = finalVals.haiku_model; - if (finalVals.sonnet_model) env.GEMINI_DEFAULT_SONNET_MODEL = finalVals.sonnet_model; - if (finalVals.opus_model) env.GEMINI_DEFAULT_OPUS_MODEL = finalVals.opus_model; - const { error } = updateSettingsForSource('userSettings', { - modelType: 'gemini' as any, - env, - } as any); + if (mixEnabled && mixModelFamily) { + const modelField = getModelFieldForFamily(mixModelFamily); + const modelValue = finalVals[modelField]; + if (modelValue) env[getFamilyModelEnvKey('GEMINI', mixModelFamily)] = modelValue; + } else { + if (finalVals.haiku_model) env.GEMINI_DEFAULT_HAIKU_MODEL = finalVals.haiku_model; + if (finalVals.sonnet_model) env.GEMINI_DEFAULT_SONNET_MODEL = finalVals.sonnet_model; + if (finalVals.opus_model) env.GEMINI_DEFAULT_OPUS_MODEL = finalVals.opus_model; + } + const { error } = updateSettingsForSource( + 'userSettings', + buildProviderSettingsPatch(mixEnabled, mixModelFamily, 'gemini', env), + ); if (error) { setOAuthStatus({ state: 'error', @@ -1054,7 +1195,7 @@ function OAuthStatusMessage({ setOAuthStatus({ state: 'success' }); void onDone(); } - }, [activeField, geminiInputValue, geminiDisplayValues, onDone, setOAuthStatus]); + }, [activeField, geminiInputValue, geminiDisplayValues, onDone, setOAuthStatus, mixEnabled, mixModelFamily]); const handleGeminiEnter = useCallback(() => { const idx = GEMINI_FIELDS.indexOf(activeField); @@ -1134,7 +1275,7 @@ function OAuthStatusMessage({ return ( - Gemini API Setup + {mixModelLabel ? `${mixModelLabel} Gemini API Setup` : 'Gemini API Setup'} Configure a Gemini Generate Content compatible endpoint. Base URL is optional and defaults to Google's v1beta API. 
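Taken out of its JSX context, the per-family env-key derivation that this diff reuses across the Anthropic, OpenAI, and Gemini flows is small enough to sketch standalone (same signature as the added `getFamilyModelEnvKey` helper):

```typescript
type ModelFamily = 'opus' | 'sonnet' | 'haiku';

// Mirrors getFamilyModelEnvKey from the ConsoleOAuthFlow changes above:
// in mix mode only the selected family's model env var is written,
// e.g. configuring Sonnet against OpenAI touches only OPENAI_DEFAULT_SONNET_MODEL.
function getFamilyModelEnvKey(
  prefix: 'ANTHROPIC' | 'OPENAI' | 'GEMINI',
  family: ModelFamily,
): string {
  return `${prefix}_DEFAULT_${family.toUpperCase()}_MODEL`;
}

console.log(getFamilyModelEnvKey('GEMINI', 'opus')); // "GEMINI_DEFAULT_OPUS_MODEL"
```

This is why the non-mix branches keep writing all three `*_DEFAULT_{HAIKU,SONNET,OPUS}_MODEL` keys while the mix branch writes exactly one.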
@@ -1142,9 +1283,9 @@ function OAuthStatusMessage({ {renderGeminiRow('base_url', 'Base URL ')} {renderGeminiRow('api_key', 'API Key ', { mask: true })} - {renderGeminiRow('haiku_model', 'Haiku ')} - {renderGeminiRow('sonnet_model', 'Sonnet ')} - {renderGeminiRow('opus_model', 'Opus ')} + {GEMINI_FIELDS.includes('haiku_model') && renderGeminiRow('haiku_model', 'Haiku ')} + {GEMINI_FIELDS.includes('sonnet_model') && renderGeminiRow('sonnet_model', 'Sonnet ')} + {GEMINI_FIELDS.includes('opus_model') && renderGeminiRow('opus_model', 'Opus ')} ↑↓/Tab to switch · Enter on last field to save · Esc to go back diff --git a/src/services/api/claude.ts b/src/services/api/claude.ts index 528c60938a..31c75f9e58 100644 --- a/src/services/api/claude.ts +++ b/src/services/api/claude.ts @@ -89,6 +89,10 @@ import { getSmallFastModel, isNonCustomOpusModel, } from '../../utils/model/model.js' +import { + applyMixedModelConfigForModel, + getAPIProviderForModel, +} from '../../utils/model/mix.js' import { asSystemPrompt, type SystemPrompt, @@ -1094,8 +1098,12 @@ async function* queryModel( // Also naturally handles rollback/undo since removed messages won't be in the array. const previousRequestId = getPreviousRequestIdFromMessages(messages) + const apiProvider = + applyMixedModelConfigForModel(options.model) ?? + getAPIProviderForModel(options.model) + const resolvedModel = - getAPIProvider() === 'bedrock' && + apiProvider === 'bedrock' && options.model.includes('application-inference-profile') ? ((await getInferenceProfileBackingModel(options.model)) ?? options.model) @@ -1215,7 +1223,7 @@ async function* queryModel( // Header differs by provider: 1P/Foundry use advanced-tool-use, Vertex/Bedrock use tool-search-tool // For Bedrock, this header must go in extraBodyParams, not the betas array const toolSearchHeader = useToolSearch ? 
getToolSearchBetaHeader() : null - if (toolSearchHeader && getAPIProvider() !== 'bedrock') { + if (toolSearchHeader && apiProvider !== 'bedrock') { if (!betas.includes(toolSearchHeader)) { betas.push(toolSearchHeader) } @@ -1362,7 +1370,7 @@ async function* queryModel( // OpenAI-compatible provider: delegate to the OpenAI adapter layer // after shared preprocessing (message normalization, tool filtering, // media stripping) but before Anthropic-specific logic (betas, thinking, caching). - if (getAPIProvider() === 'openai') { + if (apiProvider === 'openai') { const { queryModelOpenAI } = await import('./openai/index.js') // OpenAI emulates Anthropic's dynamic tool loading client-side. It needs // the full tool pool so ToolSearchTool can search deferred MCP tools that @@ -1377,7 +1385,7 @@ async function* queryModel( return } - if (getAPIProvider() === 'gemini') { + if (apiProvider === 'gemini') { const { queryModelGemini } = await import('./gemini/index.js') yield* queryModelGemini( messagesForAPI, @@ -1390,7 +1398,7 @@ async function* queryModel( return } - if (getAPIProvider() === 'grok') { + if (apiProvider === 'grok') { const { queryModelGrok } = await import('./grok/index.js') yield* queryModelGrok( messagesForAPI, @@ -1521,7 +1529,7 @@ async function* queryModel( if ( !cacheEditingHeaderLatched && cachedMCEnabled && - getAPIProvider() === 'firstParty' && + apiProvider === 'firstParty' && options.querySource === 'repl_main_thread' ) { cacheEditingHeaderLatched = true @@ -1617,7 +1625,7 @@ async function* queryModel( enablePromptCaching, options.querySource, cachedMCEnabled && - getAPIProvider() === 'firstParty' && + apiProvider === 'firstParty' && options.querySource === 'repl_main_thread', consumedCacheEdits as any, consumedPinnedEdits as any, @@ -1655,7 +1663,7 @@ async function* queryModel( // For Bedrock, include both model-based betas and dynamically-added tool search header const bedrockBetas = - getAPIProvider() === 'bedrock' + apiProvider === 'bedrock' 
      ? [
          ...getBedrockExtraBodyParamsBetas(retryContext.model),
          ...(toolSearchHeader ? [toolSearchHeader] : []),
@@ -1780,7 +1788,7 @@ async function* queryModel(
   if (
     cacheEditingHeaderLatched &&
     cacheEditingBetaHeader &&
-    getAPIProvider() === 'firstParty' &&
+    apiProvider === 'firstParty' &&
     options.querySource === 'repl_main_thread' &&
     !betasParams.includes(cacheEditingBetaHeader)
   ) {
@@ -1919,7 +1927,7 @@ async function* queryModel(
   // server request ID) can still be correlated with server logs.
   // First-party only — 3P providers don't log it (inc-4029 class).
   clientRequestId =
-    getAPIProvider() === 'firstParty' && isFirstPartyAnthropicBaseUrl()
+    apiProvider === 'firstParty' && isFirstPartyAnthropicBaseUrl()
       ? randomUUID()
       : undefined
@@ -2545,7 +2553,7 @@ async function* queryModel(
   // (Bedrock) expose their own throttle headers — let their adapter
   // overwrite the store with its bucket(s). Anthropic's adapter runs
   // inside extractQuotaStatusFromHeaders.
-  if (getAPIProvider() === 'bedrock') {
+  if (apiProvider === 'bedrock') {
     updateProviderBuckets(
       'bedrock',
       bedrockAdapter.parseHeaders(resp.headers),
diff --git a/src/services/api/grok/client.ts b/src/services/api/grok/client.ts
index 060d126363..8b1835807f 100644
--- a/src/services/api/grok/client.ts
+++ b/src/services/api/grok/client.ts
@@ -11,16 +11,28 @@ import { getProxyFetchOptions } from 'src/utils/proxy.js'
 const DEFAULT_BASE_URL = 'https://api.x.ai/v1'
 
 let cachedClient: OpenAI | null = null
+let cachedClientKey: string | null = null
 
 export function getGrokClient(options?: {
   maxRetries?: number
   fetchOverride?: typeof fetch
   source?: string
 }): OpenAI {
-  if (cachedClient) return cachedClient
-
   const apiKey = process.env.GROK_API_KEY || process.env.XAI_API_KEY || ''
   const baseURL = process.env.GROK_BASE_URL || DEFAULT_BASE_URL
+  const clientKey = JSON.stringify({
+    apiKey,
+    baseURL,
+    maxRetries: options?.maxRetries ?? 0,
+    timeout: process.env.API_TIMEOUT_MS || String(600 * 1000),
+  })
+  if (
+    !options?.fetchOverride &&
+    cachedClient &&
+    cachedClientKey === clientKey
+  ) {
+    return cachedClient
+  }
 
   const client = new OpenAI({
     apiKey,
@@ -34,6 +46,7 @@ export function getGrokClient(options?: {
 
   if (!options?.fetchOverride) {
     cachedClient = client
+    cachedClientKey = clientKey
   }
 
   return client
@@ -41,4 +54,5 @@ export function getGrokClient(options?: {
 
 export function clearGrokClientCache(): void {
   cachedClient = null
+  cachedClientKey = null
 }
diff --git a/src/services/api/openai/client.ts b/src/services/api/openai/client.ts
index 5ee37cd414..651d3c6382 100644
--- a/src/services/api/openai/client.ts
+++ b/src/services/api/openai/client.ts
@@ -13,6 +13,7 @@ import { getProxyFetchOptions } from 'src/utils/proxy.js'
  */
 
 let cachedClient: OpenAI | null = null
+let cachedClientKey: string | null = null
 
 /**
  * Wrap a fetch so that every response's rate-limit headers are fed into the
@@ -41,10 +42,23 @@ export function getOpenAIClient(options?: {
   fetchOverride?: typeof fetch
   source?: string
 }): OpenAI {
-  if (cachedClient) return cachedClient
-
   const apiKey = process.env.OPENAI_API_KEY || ''
   const baseURL = process.env.OPENAI_BASE_URL
+  const clientKey = JSON.stringify({
+    apiKey,
+    baseURL,
+    maxRetries: options?.maxRetries ?? 0,
+    timeout: process.env.API_TIMEOUT_MS || String(600 * 1000),
+    organization: process.env.OPENAI_ORG_ID,
+    project: process.env.OPENAI_PROJECT_ID,
+  })
+  if (
+    !options?.fetchOverride &&
+    cachedClient &&
+    cachedClientKey === clientKey
+  ) {
+    return cachedClient
+  }
 
   const baseFetch =
     options?.fetchOverride ?? (globalThis.fetch as typeof fetch)
   const wrappedFetch = wrapFetchForUsage(baseFetch)
@@ -67,6 +81,7 @@ export function getOpenAIClient(options?: {
 
   if (!options?.fetchOverride) {
     cachedClient = client
+    cachedClientKey = clientKey
   }
 
   return client
@@ -75,4 +90,5 @@
 /** Clear the cached client (useful when env vars change). */
 export function clearOpenAIClientCache(): void {
   cachedClient = null
+  cachedClientKey = null
 }
diff --git a/src/utils/model/mix.ts b/src/utils/model/mix.ts
new file mode 100644
index 0000000000..edc4ba2d86
--- /dev/null
+++ b/src/utils/model/mix.ts
@@ -0,0 +1,279 @@
+import { isEnvTruthy } from '../envUtils.js'
+import { getSettings_DEPRECATED } from '../settings/settings.js'
+import type { SettingsJson } from '../settings/types.js'
+import { getAPIProvider, type APIProvider } from './providers.js'
+
+export const MIX_MODE_ENV = 'CCB_MIX'
+
+export const MODEL_FAMILIES = ['opus', 'sonnet', 'haiku'] as const
+
+export type ModelFamily = (typeof MODEL_FAMILIES)[number]
+
+export type MixedModelProvider = 'anthropic' | 'openai' | 'gemini' | 'grok'
+
+type MixedModelConfig = NonNullable<
+  NonNullable<SettingsJson['mixedModelConfigs']>[ModelFamily]
+>
+
+const MODEL_FAMILY_LABELS: Record<ModelFamily, string> = {
+  opus: 'Opus',
+  sonnet: 'Sonnet',
+  haiku: 'Haiku',
+}
+
+const mixedEnvOriginalValues = new Map<string, string | undefined>()
+let lastAppliedMixedEnvKeys = new Set<string>()
+
+const MIXED_PROVIDER_ENV_KEYS = [
+  'ANTHROPIC_API_KEY',
+  'ANTHROPIC_AUTH_TOKEN',
+  'ANTHROPIC_BASE_URL',
+  'ANTHROPIC_DEFAULT_HAIKU_MODEL',
+  'ANTHROPIC_DEFAULT_OPUS_MODEL',
+  'ANTHROPIC_DEFAULT_SONNET_MODEL',
+  'ANTHROPIC_MODEL',
+  'ANTHROPIC_SMALL_FAST_MODEL',
+  'GEMINI_API_KEY',
+  'GEMINI_BASE_URL',
+  'GEMINI_DEFAULT_HAIKU_MODEL',
+  'GEMINI_DEFAULT_OPUS_MODEL',
+  'GEMINI_DEFAULT_SONNET_MODEL',
+  'GEMINI_MODEL',
+  'GEMINI_SMALL_FAST_MODEL',
+  'GROK_API_KEY',
+  'GROK_BASE_URL',
+  'GROK_DEFAULT_HAIKU_MODEL',
+  'GROK_DEFAULT_OPUS_MODEL',
+  'GROK_DEFAULT_SONNET_MODEL',
+  'GROK_MODEL',
+  'OPENAI_API_KEY',
+  'OPENAI_BASE_URL',
+  'OPENAI_DEFAULT_HAIKU_MODEL',
+  'OPENAI_DEFAULT_OPUS_MODEL',
+  'OPENAI_DEFAULT_SONNET_MODEL',
+  'OPENAI_MODEL',
+  'OPENAI_ORG_ID',
+  'OPENAI_PROJECT_ID',
+  'OPENAI_SMALL_FAST_MODEL',
+  'XAI_API_KEY',
+] as const
+
+export function getModelFamilyLabel(family: ModelFamily): string {
+  return MODEL_FAMILY_LABELS[family]
+}
+
+export function isModelFamily(value: string): value is ModelFamily {
+  return (MODEL_FAMILIES as readonly string[]).includes(value)
+}
+
+export function normalizeModelFamily(value: string): ModelFamily | null {
+  const normalized = value.trim().toLowerCase()
+  return isModelFamily(normalized) ? normalized : null
+}
+
+export function providerToAPIProvider(
+  provider: MixedModelProvider | undefined,
+): APIProvider | undefined {
+  if (!provider) return undefined
+  if (provider === 'anthropic') return 'firstParty'
+  return provider
+}
+
+export function isMixModeEnabled(
+  settings: Pick<SettingsJson, 'mix'> = getSettings_DEPRECATED() || {},
+): boolean {
+  return settings.mix === true || isEnvTruthy(process.env[MIX_MODE_ENV])
+}
+
+export function getMixedModelConfig(
+  family: ModelFamily,
+  settings: Pick<
+    SettingsJson,
+    'mixedModelConfigs'
+  > = getSettings_DEPRECATED() || {},
+): MixedModelConfig | undefined {
+  return settings.mixedModelConfigs?.[family]
+}
+
+export function getMixedModelEnv(
+  family: ModelFamily,
+  key: string,
+  settings: Pick<
+    SettingsJson,
+    'mix' | 'mixedModelConfigs'
+  > = getSettings_DEPRECATED() || {},
+): string | undefined {
+  if (!isMixModeEnabled(settings)) return undefined
+  return getMixedModelConfig(family, settings)?.env?.[key]
+}
+
+export function getAPIProviderForModelFamily(
+  family: ModelFamily,
+  settings: Pick<
+    SettingsJson,
+    'mix' | 'mixedModelConfigs' | 'modelType'
+  > = getSettings_DEPRECATED() || {},
+): APIProvider {
+  if (isMixModeEnabled(settings)) {
+    const provider = providerToAPIProvider(
+      getMixedModelConfig(family, settings)?.provider,
+    )
+    if (provider) return provider
+  }
+  return getAPIProvider(settings)
+}
+
+function stripModelTags(model: string): string {
+  return model
+    .toLowerCase()
+    .replace(/\[1m\]$/i, '')
+    .trim()
+}
+
+function getConfiguredModelEnvKeys(family: ModelFamily): string[] {
+  const upper = family.toUpperCase()
+  return [
+    `ANTHROPIC_DEFAULT_${upper}_MODEL`,
+    `OPENAI_DEFAULT_${upper}_MODEL`,
+    `GEMINI_DEFAULT_${upper}_MODEL`,
+    `GROK_DEFAULT_${upper}_MODEL`,
+  ]
+}
+
+function modelMatchesConfiguredFamily(
+  model: string,
+  family: ModelFamily,
+  settings: Pick<SettingsJson, 'mixedModelConfigs'>,
+): boolean {
+  const config = getMixedModelConfig(family, settings)
+  if (!config?.env) return false
+  const normalizedModel = stripModelTags(model)
+  for (const key of getConfiguredModelEnvKeys(family)) {
+    const configured = config.env[key]
+    if (configured && stripModelTags(configured) === normalizedModel) {
+      return true
+    }
+  }
+  return false
+}
+
+export function getModelFamilyForModel(
+  model: string,
+  settings: Pick<
+    SettingsJson,
+    'mixedModelConfigs'
+  > = getSettings_DEPRECATED() || {},
+): ModelFamily | null {
+  const normalizedModel = stripModelTags(model)
+  if (normalizedModel.includes('opus')) return 'opus'
+  if (normalizedModel.includes('sonnet')) return 'sonnet'
+  if (normalizedModel.includes('haiku')) return 'haiku'
+
+  for (const family of MODEL_FAMILIES) {
+    if (modelMatchesConfiguredFamily(normalizedModel, family, settings)) {
+      return family
+    }
+  }
+
+  return null
+}
+
+export function getAPIProviderForModel(
+  model: string,
+  settings: Pick<
+    SettingsJson,
+    'mix' | 'mixedModelConfigs' | 'modelType'
+  > = getSettings_DEPRECATED() || {},
+): APIProvider {
+  const family = getModelFamilyForModel(model, settings)
+  if (family) return getAPIProviderForModelFamily(family, settings)
+  return getAPIProvider(settings)
+}
+
+function rememberOriginalEnvValue(key: string): void {
+  if (!mixedEnvOriginalValues.has(key)) {
+    mixedEnvOriginalValues.set(key, process.env[key])
+  }
+}
+
+function getKeysToManage(
+  nextEnv: Record<string, string | undefined>,
+): Set<string> {
+  return new Set([
+    ...MIXED_PROVIDER_ENV_KEYS,
+    ...lastAppliedMixedEnvKeys,
+    ...Object.keys(nextEnv),
+  ])
+}
+
+function restorePreviousMixedEnv(): void {
+  for (const key of lastAppliedMixedEnvKeys) {
+    const originalValue = mixedEnvOriginalValues.get(key)
+    if (originalValue === undefined) {
+      delete process.env[key]
+    } else {
+      process.env[key] = originalValue
+    }
+  }
+  lastAppliedMixedEnvKeys = new Set()
+}
+
+export function applyMixedModelConfigForFamily(
+  family: ModelFamily,
+  settings: Pick<
+    SettingsJson,
+    'mix' | 'mixedModelConfigs'
+  > = getSettings_DEPRECATED() || {},
+): APIProvider | undefined {
+  if (!isMixModeEnabled(settings)) return undefined
+  const config = getMixedModelConfig(family, settings)
+  if (!config) return undefined
+
+  const env = config.env || {}
+  const keysToManage = getKeysToManage(env)
+  for (const key of keysToManage) {
+    rememberOriginalEnvValue(key)
+    const value = env[key]
+    if (value === undefined) {
+      delete process.env[key]
+    } else {
+      process.env[key] = value
+    }
+  }
+  lastAppliedMixedEnvKeys = keysToManage
+
+  return providerToAPIProvider(config.provider)
+}
+
+export function applyMixedModelConfigForModel(
+  model: string,
+  settings: Pick<
+    SettingsJson,
+    'mix' | 'mixedModelConfigs'
+  > = getSettings_DEPRECATED() || {},
+): APIProvider | undefined {
+  if (!isMixModeEnabled(settings)) {
+    restorePreviousMixedEnv()
+    return undefined
+  }
+  const family = getModelFamilyForModel(model, settings)
+  if (!family) {
+    restorePreviousMixedEnv()
+    return undefined
+  }
+  return applyMixedModelConfigForFamily(family, settings)
+}
+
+export function createMixedModelSettingsPatch(
+  family: ModelFamily,
+  provider: MixedModelProvider,
+  env: Record<string, string>,
+): Pick<SettingsJson, 'mix' | 'mixedModelConfigs'> {
+  return {
+    mix: true,
+    mixedModelConfigs: {
+      [family]: {
+        provider,
+        env,
+      },
+    },
+  }
+}
diff --git a/src/utils/model/model.ts b/src/utils/model/model.ts
index a43d101bb4..3dd9121af1 100644
--- a/src/utils/model/model.ts
+++ b/src/utils/model/model.ts
@@ -25,6 +25,7 @@ import { formatModelPricing, getOpus46CostTier } from '../modelCost.js'
 import { getSettings_DEPRECATED } from '../settings/settings.js'
 import type { PermissionMode } from '../permissions/PermissionMode.js'
 import { getAPIProvider, isFirstPartyAnthropicBaseUrl } from './providers.js'
+import { getAPIProviderForModelFamily, getMixedModelEnv } from './mix.js'
 import { LIGHTNING_BOLT } from '../../constants/figures.js'
 import { isModelAllowed } from './modelAllowlist.js'
 import { type ModelAlias, isModelAlias } from './aliases.js'
@@ -35,16 +36,26 @@ export type ModelName = string
 export type ModelSetting = ModelName | ModelAlias | null
 
 export function getSmallFastModel(): ModelName {
-  const provider = getAPIProvider()
+  const provider = getAPIProviderForModelFamily('haiku')
   // Provider-specific small fast model
-  if (provider === 'openai' && process.env.OPENAI_SMALL_FAST_MODEL) {
-    return process.env.OPENAI_SMALL_FAST_MODEL
-  }
-  if (provider === 'gemini' && process.env.GEMINI_SMALL_FAST_MODEL) {
-    return process.env.GEMINI_SMALL_FAST_MODEL
+  const openaiSmallFastModel =
+    getMixedModelEnv('haiku', 'OPENAI_SMALL_FAST_MODEL') ||
+    process.env.OPENAI_SMALL_FAST_MODEL
+  if (provider === 'openai' && openaiSmallFastModel) {
+    return openaiSmallFastModel
+  }
+  const geminiSmallFastModel =
+    getMixedModelEnv('haiku', 'GEMINI_SMALL_FAST_MODEL') ||
+    process.env.GEMINI_SMALL_FAST_MODEL
+  if (provider === 'gemini' && geminiSmallFastModel) {
+    return geminiSmallFastModel
   }
   // Anthropic-specific or fallback
-  return process.env.ANTHROPIC_SMALL_FAST_MODEL || getDefaultHaikuModel()
+  return (
+    getMixedModelEnv('haiku', 'ANTHROPIC_SMALL_FAST_MODEL') ||
+    process.env.ANTHROPIC_SMALL_FAST_MODEL ||
+    getDefaultHaikuModel()
+  )
 }
 
 export function isNonCustomOpusModel(model: ModelName): boolean {
@@ -114,18 +125,27 @@ export function getBestModel(): ModelName {
 
 // @[MODEL LAUNCH]: Update the default Opus model (3P providers may lag so keep defaults unchanged).
 export function getDefaultOpusModel(): ModelName {
-  const provider = getAPIProvider()
+  const provider = getAPIProviderForModelFamily('opus')
   // For OpenAI provider, check OPENAI_DEFAULT_OPUS_MODEL first
-  if (provider === 'openai' && process.env.OPENAI_DEFAULT_OPUS_MODEL) {
-    return process.env.OPENAI_DEFAULT_OPUS_MODEL
+  const openaiModel =
+    getMixedModelEnv('opus', 'OPENAI_DEFAULT_OPUS_MODEL') ||
+    process.env.OPENAI_DEFAULT_OPUS_MODEL
+  if (provider === 'openai' && openaiModel) {
+    return openaiModel
   }
   // For Gemini provider, check GEMINI_DEFAULT_OPUS_MODEL
-  if (provider === 'gemini' && process.env.GEMINI_DEFAULT_OPUS_MODEL) {
-    return process.env.GEMINI_DEFAULT_OPUS_MODEL
+  const geminiModel =
+    getMixedModelEnv('opus', 'GEMINI_DEFAULT_OPUS_MODEL') ||
+    process.env.GEMINI_DEFAULT_OPUS_MODEL
+  if (provider === 'gemini' && geminiModel) {
+    return geminiModel
   }
   // Anthropic-specific override (for first-party and other 3P providers)
-  if (process.env.ANTHROPIC_DEFAULT_OPUS_MODEL) {
-    return process.env.ANTHROPIC_DEFAULT_OPUS_MODEL
+  const anthropicModel =
+    getMixedModelEnv('opus', 'ANTHROPIC_DEFAULT_OPUS_MODEL') ||
+    process.env.ANTHROPIC_DEFAULT_OPUS_MODEL
+  if (anthropicModel) {
+    return anthropicModel
   }
   // 3P providers (Bedrock, Vertex, Foundry) all publish Opus 4.7 in sync
   // with firstParty as of 2026-04-17 (AWS Bedrock, Google Vertex AI, and
@@ -139,18 +159,27 @@ export function getDefaultOpusModel(): ModelName {
 
 // @[MODEL LAUNCH]: Update the default Sonnet model (3P providers may lag so keep defaults unchanged).
 export function getDefaultSonnetModel(): ModelName {
-  const provider = getAPIProvider()
+  const provider = getAPIProviderForModelFamily('sonnet')
   // For OpenAI provider, check OPENAI_DEFAULT_SONNET_MODEL first
-  if (provider === 'openai' && process.env.OPENAI_DEFAULT_SONNET_MODEL) {
-    return process.env.OPENAI_DEFAULT_SONNET_MODEL
+  const openaiModel =
+    getMixedModelEnv('sonnet', 'OPENAI_DEFAULT_SONNET_MODEL') ||
+    process.env.OPENAI_DEFAULT_SONNET_MODEL
+  if (provider === 'openai' && openaiModel) {
+    return openaiModel
   }
   // For Gemini provider, check GEMINI_DEFAULT_SONNET_MODEL
-  if (provider === 'gemini' && process.env.GEMINI_DEFAULT_SONNET_MODEL) {
-    return process.env.GEMINI_DEFAULT_SONNET_MODEL
+  const geminiModel =
+    getMixedModelEnv('sonnet', 'GEMINI_DEFAULT_SONNET_MODEL') ||
+    process.env.GEMINI_DEFAULT_SONNET_MODEL
+  if (provider === 'gemini' && geminiModel) {
+    return geminiModel
   }
   // Anthropic-specific override (for first-party and other 3P providers)
-  if (process.env.ANTHROPIC_DEFAULT_SONNET_MODEL) {
-    return process.env.ANTHROPIC_DEFAULT_SONNET_MODEL
+  const anthropicModel =
+    getMixedModelEnv('sonnet', 'ANTHROPIC_DEFAULT_SONNET_MODEL') ||
+    process.env.ANTHROPIC_DEFAULT_SONNET_MODEL
+  if (anthropicModel) {
+    return anthropicModel
   }
   // Default to Sonnet 4.5 for 3P since they may not have 4.6 yet
   if (provider !== 'firstParty') {
@@ -161,18 +190,27 @@ export function getDefaultSonnetModel(): ModelName {
 
 // @[MODEL LAUNCH]: Update the default Haiku model (3P providers may lag so keep defaults unchanged).
 export function getDefaultHaikuModel(): ModelName {
-  const provider = getAPIProvider()
+  const provider = getAPIProviderForModelFamily('haiku')
   // For OpenAI provider, check OPENAI_DEFAULT_HAIKU_MODEL first
-  if (provider === 'openai' && process.env.OPENAI_DEFAULT_HAIKU_MODEL) {
-    return process.env.OPENAI_DEFAULT_HAIKU_MODEL
+  const openaiModel =
+    getMixedModelEnv('haiku', 'OPENAI_DEFAULT_HAIKU_MODEL') ||
+    process.env.OPENAI_DEFAULT_HAIKU_MODEL
+  if (provider === 'openai' && openaiModel) {
+    return openaiModel
   }
   // For Gemini provider, check GEMINI_DEFAULT_HAIKU_MODEL
-  if (provider === 'gemini' && process.env.GEMINI_DEFAULT_HAIKU_MODEL) {
-    return process.env.GEMINI_DEFAULT_HAIKU_MODEL
+  const geminiModel =
+    getMixedModelEnv('haiku', 'GEMINI_DEFAULT_HAIKU_MODEL') ||
+    process.env.GEMINI_DEFAULT_HAIKU_MODEL
+  if (provider === 'gemini' && geminiModel) {
+    return geminiModel
   }
   // Anthropic-specific override (for first-party and other 3P providers)
-  if (process.env.ANTHROPIC_DEFAULT_HAIKU_MODEL) {
-    return process.env.ANTHROPIC_DEFAULT_HAIKU_MODEL
+  const anthropicModel =
+    getMixedModelEnv('haiku', 'ANTHROPIC_DEFAULT_HAIKU_MODEL') ||
+    process.env.ANTHROPIC_DEFAULT_HAIKU_MODEL
+  if (anthropicModel) {
+    return anthropicModel
   }
   // Haiku 4.5 is available on all platforms (first-party, Foundry, Bedrock, Vertex)
@@ -350,7 +388,7 @@ export function renderDefaultModelSetting(
 }
 
 export function getOpusPricingSuffix(fastMode: boolean): string {
-  if (getAPIProvider() !== 'firstParty') return ''
+  if (getAPIProviderForModelFamily('opus') !== 'firstParty') return ''
   const pricing = formatModelPricing(getOpus46CostTier(fastMode))
   const fastModeIndicator = fastMode ? ` (${LIGHTNING_BOLT})` : ''
   return ` ·${fastModeIndicator} ${pricing}`
@@ -360,7 +398,7 @@ export function isOpus1mMergeEnabled(): boolean {
   if (
     is1mContextDisabled() ||
     isProSubscriber() ||
-    getAPIProvider() !== 'firstParty' ||
+    getAPIProviderForModelFamily('opus') !== 'firstParty' ||
     !isFirstPartyAnthropicBaseUrl()
   ) {
     return false
diff --git a/src/utils/settings/settings.ts b/src/utils/settings/settings.ts
index f656e6a6ac..25ca4c6c70 100644
--- a/src/utils/settings/settings.ts
+++ b/src/utils/settings/settings.ts
@@ -252,16 +252,10 @@ export function getSettingsRootPathForSource(source: SettingSource): string {
   }
 }
 
-/**
- * Get the user settings filename based on cowork mode.
- * Returns 'cowork_settings.json' when in cowork mode, 'settings.json' otherwise.
- *
- * Priority:
- * 1. Session state (set by CLI flag --cowork)
- * 2. Environment variable CLAUDE_CODE_USE_COWORK_PLUGINS
- * 3. Default: 'settings.json'
- */
 function getUserSettingsFilePath(): string {
+  if (isMixUserSettingsEnabled()) {
+    return 'ccbsettings.json'
+  }
   if (
     getUseCoworkPlugins() ||
     isEnvTruthy(process.env.CLAUDE_CODE_USE_COWORK_PLUGINS)
@@ -271,6 +265,26 @@
   return 'settings.json'
 }
 
+function isMixUserSettingsEnabled(): boolean {
+  if (isEnvTruthy(process.env.CCB_MIX)) return true
+
+  try {
+    const raw = safeParseJSON(
+      readFileSync(
+        join(getSettingsRootPathForSource('userSettings'), 'ccbsettings.json'),
+      ),
+    )
+    return (
+      raw !== null &&
+      typeof raw === 'object' &&
+      (raw as Record<string, unknown>).mix === true
+    )
+  } catch (e) {
+    if (isENOENT(e)) return false
+    return false
+  }
+}
+
 export function getSettingsFilePathForSource(
   source: SettingSource,
 ): string | undefined {
diff --git a/src/utils/settings/types.ts b/src/utils/settings/types.ts
index 430ed25b70..a255566ec3 100644
--- a/src/utils/settings/types.ts
+++ b/src/utils/settings/types.ts
@@ -33,6 +33,21 @@
 export const EnvironmentVariablesSchema = lazySchema(() =>
   z.record(z.string(), z.coerce.string()),
 )
 
+const MixedModelApiProviderSchema = lazySchema(() =>
+  z.enum(['anthropic', 'openai', 'gemini', 'grok']),
+)
+
+const MixedModelApiConfigSchema = lazySchema(() =>
+  z.object({
+    provider: MixedModelApiProviderSchema()
+      .optional()
+      .describe('Provider used by this model family'),
+    env: EnvironmentVariablesSchema()
+      .optional()
+      .describe('Environment variables used only by this model family'),
+  }),
+)
+
 /**
  * Schema for permissions section
  */
@@ -372,6 +387,22 @@ export const SettingsSchema = lazySchema(() =>
         'API provider type. "anthropic" uses the Anthropic API (default), "openai" uses the OpenAI Chat Completions API, "gemini" uses the Gemini API, and "grok" uses the xAI Grok API (OpenAI-compatible). ' +
           'When set to "openai", configure OPENAI_API_KEY, OPENAI_BASE_URL, and OPENAI_MODEL. When set to "gemini", configure GEMINI_API_KEY and optional GEMINI_BASE_URL. When set to "grok", configure GROK_API_KEY (or XAI_API_KEY), optional GROK_BASE_URL, GROK_MODEL, and GROK_MODEL_MAP.',
       ),
+    mix: z
+      .boolean()
+      .optional()
+      .describe(
+        'Enable per-model-family API configuration for Opus, Sonnet, and Haiku.',
+      ),
+    mixedModelConfigs: z
+      .object({
+        opus: MixedModelApiConfigSchema().optional(),
+        sonnet: MixedModelApiConfigSchema().optional(),
+        haiku: MixedModelApiConfigSchema().optional(),
+      })
+      .optional()
+      .describe(
+        'Per-model-family provider, URL, key, and model environment settings used when mix mode is enabled.',
+      ),
     model: z
       .string()
       .optional()
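The mix.ts helpers in this patch route requests per model family by overlaying that family's env vars onto `process.env` (`applyMixedModelConfigForModel`), remembering the original values, and restoring them when a non-mix model is selected (`restorePreviousMixedEnv`). The overlay/restore semantics can be illustrated with a minimal standalone sketch — this is illustrative, not the CCB source: the plain `env` map stands in for `process.env`, the helper names are simplified, and the endpoint values are hypothetical.

```typescript
// Sketch of mix mode's env-overlay semantics: applying a family's env
// snapshot records each key's original value before overwriting it, and
// restoring puts every touched key back exactly as it was.
type Env = Record<string, string | undefined>

const env: Env = { ANTHROPIC_BASE_URL: 'https://api.anthropic.com' }
const originals = new Map<string, string | undefined>()
let appliedKeys = new Set<string>()

function applyFamilyEnv(familyEnv: Env): void {
  // Manage every key touched now or by a previously applied family, so
  // stale values from the last family cannot leak into this one.
  const keys = new Set([...appliedKeys, ...Object.keys(familyEnv)])
  for (const key of keys) {
    if (!originals.has(key)) originals.set(key, env[key])
    const value = familyEnv[key]
    if (value === undefined) delete env[key]
    else env[key] = value
  }
  appliedKeys = keys
}

function restoreEnv(): void {
  for (const key of appliedKeys) {
    const original = originals.get(key)
    if (original === undefined) delete env[key]
    else env[key] = original
  }
  appliedKeys = new Set()
}

// Route the Opus family to an OpenAI-compatible endpoint (hypothetical URL),
// masking the shared Anthropic base URL while the overlay is active:
applyFamilyEnv({
  OPENAI_BASE_URL: 'https://example.invalid/v1',
  ANTHROPIC_BASE_URL: undefined,
})
console.log(env.OPENAI_BASE_URL) // → 'https://example.invalid/v1'
console.log(env.ANTHROPIC_BASE_URL) // → undefined

restoreEnv()
console.log(env.ANTHROPIC_BASE_URL) // → 'https://api.anthropic.com'
```

In CCB itself the overlay for a family comes from `mixedModelConfigs.<family>.env` in `ccbsettings.json`, and the real `getKeysToManage` additionally sweeps the fixed `MIXED_PROVIDER_ENV_KEYS` list, so provider keys set outside the overlay are also captured and restored.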