feat: add MiniMax as LLM provider with Guardian threat protection #1

Open
octo-patch wants to merge 1 commit into OraclesTech:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

Add MiniMax as a first-class provider integration for Guardian SDK. MiniMax offers powerful LLM models (M2.7, M2.5) through an OpenAI-compatible API at https://api.minimax.io/v1. This provider wraps MiniMax-configured OpenAI clients with the same multi-layer threat detection pipeline used for OpenAI and Anthropic.

Changes

  • New provider: ethicore_guardian/providers/minimax_provider.py with MiniMaxProvider, ProtectedMiniMaxClient, ProtectedChat, ProtectedCompletions, and a create_protected_minimax_client() convenience factory
  • Auto-detection: Updated get_provider_for_client() in base_provider.py to detect MiniMax clients by checking base_url for 'minimax'
  • Dependencies: Added minimax optional dependency group in pyproject.toml (uses openai>=1.0.0)
  • Documentation: Added MiniMax provider example and install instructions to README
  • Tests: 30 tests in tests/test_minimax.py — 22 unit tests + 5 integration tests + 3 constant tests
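The auto-detection change above can be sketched as a small sketch of what the base_url check might look like. The function and class names mirror those in the PR, but the actual logic in base_provider.py may differ; the fallback order here is an assumption.

```python
# Illustrative sketch of base_url-based provider detection (assumed logic,
# not the SDK's actual implementation in base_provider.py).

def get_provider_for_client(client) -> str:
    """Guess the provider family from the client's configured base_url."""
    base_url = str(getattr(client, "base_url", "") or "").lower()
    if "minimax" in base_url:
        return "minimax"
    if "anthropic" in base_url:
        return "anthropic"
    # Default: treat anything else as a vanilla OpenAI client.
    return "openai"


class FakeClient:
    """Minimal stand-in for an openai.OpenAI client, for demonstration."""
    def __init__(self, base_url: str):
        self.base_url = base_url


print(get_provider_for_client(FakeClient("https://api.minimax.io/v1")))  # → minimax
```

A substring match on base_url keeps detection decoupled from any MiniMax-specific client class, which is why an ordinary OpenAI client pointed at https://api.minimax.io/v1 is enough.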

Supported Models

| Model | Context |
| --- | --- |
| MiniMax-M2.7 | 1M tokens |
| MiniMax-M2.7-highspeed | 1M tokens (fast) |
| MiniMax-M2.5 | 204K tokens |
| MiniMax-M2.5-highspeed | 204K tokens (fast) |

Usage

```python
import openai
from ethicore_guardian import Guardian, GuardianConfig
from ethicore_guardian.providers.minimax_provider import MiniMaxProvider

guardian = Guardian(config=GuardianConfig(api_key="my-app"))

# A standard OpenAI client pointed at MiniMax's OpenAI-compatible endpoint
minimax_client = openai.OpenAI(
    api_key="your-minimax-api-key",
    base_url="https://api.minimax.io/v1",
)

provider = MiniMaxProvider(guardian)
client = provider.wrap_client(minimax_client)

user_input = "Summarize the latest release notes."  # any end-user text

response = client.chat.completions.create(
    model="MiniMax-M2.7",
    messages=[{"role": "user", "content": user_input}],
)
```
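To show how the wrapped client intercepts requests, here is a minimal, self-contained sketch of the proxy pattern the provider classes use. Every name below (StubGuardian, the fake client, the blocking heuristic) is illustrative; the real ProtectedMiniMaxClient, ProtectedChat, and ProtectedCompletions in minimax_provider.py run the SDK's actual multi-layer threat pipeline rather than this one-line check.

```python
# Hedged sketch of the wrapper pattern: a proxy exposes the same
# client.chat.completions.create() surface, but screens each message
# through a (stubbed) Guardian check before forwarding the call.

class StubGuardian:
    def scan(self, text: str) -> None:
        # Stand-in for the threat pipeline: flag an obvious injection probe.
        if "ignore previous instructions" in text.lower():
            raise ValueError("Guardian blocked request")

class ProtectedCompletions:
    def __init__(self, guardian, inner):
        self._guardian = guardian
        self._inner = inner
    def create(self, *, model, messages, **kwargs):
        for message in messages:
            self._guardian.scan(message.get("content", ""))
        return self._inner.create(model=model, messages=messages, **kwargs)

class ProtectedChat:
    def __init__(self, guardian, inner):
        self.completions = ProtectedCompletions(guardian, inner.completions)

class ProtectedMiniMaxClient:
    def __init__(self, guardian, client):
        self.chat = ProtectedChat(guardian, client.chat)

# Tiny fake OpenAI-style client so the sketch runs without network access.
class _FakeCompletions:
    def create(self, *, model, messages, **kwargs):
        return {"model": model, "echo": messages[-1]["content"]}

class _FakeChat:
    completions = _FakeCompletions()

class _FakeClient:
    chat = _FakeChat()

client = ProtectedMiniMaxClient(StubGuardian(), _FakeClient())
print(client.chat.completions.create(
    model="MiniMax-M2.7",
    messages=[{"role": "user", "content": "hi"}],
))
```

Because the proxy preserves the chat.completions.create() call shape, existing OpenAI-style call sites need no changes once the client is wrapped.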

Test plan

  • All 30 unit + integration tests pass with mocked Guardian
  • Verify no regressions in existing OpenAI/Anthropic providers
  • Manual smoke test with real MiniMax API key (optional)
