integrate anthropic as LLM provider #501
DanielDCM212 wants to merge 3 commits into AsyncFuncAI:main
Conversation
Code Review
This pull request integrates Anthropic Claude into the AdalFlow library, introducing the AnthropicClient, updating provider configurations, and extending the WebSocket chat handler. It also includes minor updates to the FastAPI route definitions and Ollama model configurations. Feedback highlights that the acall method should lazily initialize the async client for better performance, the WebSocket handler is missing the intended streaming support for Anthropic, and the regex used for prompt parsing is fragile.
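The lazy-initialization point from the feedback can be sketched as follows. This is an illustrative skeleton, not the PR's actual code: AnthropicClientSketch, its attribute names, and the placeholder object are assumptions; the real client would construct an anthropic.AsyncAnthropic instance at the marked spot.

```python
# Illustrative sketch (not the PR's actual code): defer creating the
# async client until acall first needs it, so constructing the provider
# stays cheap in synchronous code paths.
class AnthropicClientSketch:
    def __init__(self, api_key=None):
        self._api_key = api_key
        self._async_client = None  # created lazily on first async call

    def _get_async_client(self):
        if self._async_client is None:
            # The real client would build
            # anthropic.AsyncAnthropic(api_key=self._api_key) here;
            # a plain object stands in for this sketch.
            self._async_client = object()
        return self._async_client
```

The same client object is reused on every subsequent call, so the cost of client construction is paid at most once.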
response = await model.acall(api_kwargs=api_kwargs, model_type=ModelType.LLM)
text = response.content[0].text if response.content else ""
if text:
    await websocket.send_text(text)
await websocket.close()
The Anthropic implementation in this WebSocket handler is currently non-streaming. It awaits the entire response from model.acall and sends it as a single block. This contradicts the PR description, which mentions 'async response streaming', and results in a poor user experience for long wiki generations compared to the OpenAI and Google providers, which are implemented with streaming in this same file.
To fix this, you should check if stream is enabled in api_kwargs and iterate over the response stream if it is. Note that Anthropic's stream events require specific handling (e.g., checking for content_block_delta or using the text_stream helper).
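The delta-handling part of that suggestion can be sketched as below. The helper name extract_text_deltas is hypothetical, and the events iterable stands in for the SDK's stream; the event shapes follow Anthropic's documented content_block_delta / text_delta types.

```python
# Sketch: pull text fragments out of Anthropic streaming events so the
# handler can forward them as they arrive instead of awaiting the full
# response. `events` stands in for the SDK's stream of event objects.
def extract_text_deltas(events):
    """Yield text fragments from content_block_delta events."""
    for event in events:
        if event.get("type") == "content_block_delta":
            delta = event.get("delta", {})
            if delta.get("type") == "text_delta":
                yield delta.get("text", "")

# In the handler, the SDK's text_stream helper achieves the same thing:
#
#     async for text in stream.text_stream:
#         await websocket.send_text(text)
```

Either approach lets the client render partial output while a long wiki page is still being generated.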
pattern = (
    rf"{system_tag_start}\s*(.*?)\s*{system_tag_end}\s*"
    rf"{user_tag_start}\s*(.*?)\s*{user_tag_end}"
)
match = re.search(pattern, input, re.DOTALL)
if match:
    system_prompt = match.group(1).strip()
    user_content = match.group(2).strip()
The regex for splitting system and user prompts is quite fragile. It expects a very specific sequence of tags and whitespace. If the input string contains the system tag but doesn't match this exact pattern (e.g., if there is text before <START_OF_SYSTEM_PROMPT> or if the user tags are missing), the match will fail. Consequently, system_prompt will remain None and the entire input (including the tags) will be sent as the user message. Consider using a more flexible parsing approach to extract the system and user parts independently.
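One flexible approach is to match the system and user blocks with two independent regexes, so a missing or out-of-position block degrades gracefully. This is a sketch: the user-prompt tag names and the split_prompt helper are assumptions, not the project's actual identifiers; the system tags follow the <START_OF_SYSTEM_PROMPT> convention mentioned above.

```python
import re

# Tag names are assumptions for illustration; only the system tags are
# taken from the review comment.
SYSTEM_START, SYSTEM_END = "<START_OF_SYSTEM_PROMPT>", "<END_OF_SYSTEM_PROMPT>"
USER_START, USER_END = "<START_OF_USER_PROMPT>", "<END_OF_USER_PROMPT>"

def split_prompt(text):
    """Extract system and user parts independently, with graceful fallback."""
    system_prompt, user_content = None, text
    sys_match = re.search(
        rf"{re.escape(SYSTEM_START)}\s*(.*?)\s*{re.escape(SYSTEM_END)}",
        text, re.DOTALL,
    )
    if sys_match:
        system_prompt = sys_match.group(1).strip()
    user_match = re.search(
        rf"{re.escape(USER_START)}\s*(.*?)\s*{re.escape(USER_END)}",
        text, re.DOTALL,
    )
    if user_match:
        user_content = user_match.group(1).strip()
    elif sys_match:
        # No user tags: treat everything outside the system block as user text,
        # so the tags never leak into the user message.
        user_content = (text[:sys_match.start()] + text[sys_match.end():]).strip()
    return system_prompt, user_content
```

With this shape, leading text before the system tag or a missing user block no longer causes the whole tagged input to be sent as the user message.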
Add Anthropic Claude as an LLM Provider
Motivation
DeepWiki already supports multiple LLM providers (Google Gemini, OpenAI, OpenRouter, Ollama, AWS Bedrock, Azure OpenAI, DashScope). Anthropic's Claude models are among the most
capable and widely used LLMs available today, and a significant portion of users prefer them for wiki generation and code analysis tasks. This PR adds first-class Anthropic
support so users can select Claude models from the provider dropdown without any workarounds.
What Changed
New: api/anthropic_client.py
An AnthropicClient that implements the AdalFlow ModelClient interface:
Updated: api/config.py and api/config/generator.json
Updated: api/websocket_wiki.py
Dependency: api/pyproject.toml
New: tests/unit/test_anthropic_provider.py
20 unit tests covering:
How to Use
Set your API key and select the provider in the UI:
export ANTHROPIC_API_KEY=sk-ant-...
Then choose Anthropic from the provider dropdown and pick a Claude model (claude-sonnet-4-6 is the default).
Testing
python -m pytest tests/unit/test_anthropic_provider.py -v
All 20 tests pass without requiring a live API key (network calls are mocked).