IMPORTANT: the ADK docs state: "It is important to set the provider ollama_chat instead of ollama. Using ollama will result in unexpected behaviors such as infinite tool call loops and ignoring previous context." Also bump to the latest ADK version.
…t aware: the OpenAI mode faked towards Ollama allows multimodal chats. The normal `ollama_chat` provider gives direct access to Ollama, but has issues with images / multimodal input. The `ollama` provider does not keep any context and might loop; see the ADK docs. More info also at google/adk-python#49 and https://google.github.io/adk-docs/agents/models/#using-ollama_chat-provider
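For reference, a minimal sketch of the `ollama_chat` setup as the ADK docs describe it. This is a configuration fragment, not a definitive implementation: it assumes the `google-adk` package with LiteLLM support and a local Ollama instance, and the model name `llama3.2` plus the agent name/instruction are placeholders.

```python
import os

# Assumes the google-adk package and a local Ollama server; not runnable
# without both. Follows the ADK docs' ollama_chat provider example.
from google.adk.agents import Agent
from google.adk.models.lite_llm import LiteLlm

os.environ["OLLAMA_API_BASE"] = "http://localhost:11434"  # default Ollama port

root_agent = Agent(
    name="local_agent",  # placeholder name
    # The ollama_chat/ prefix keeps previous context and avoids the tool-call
    # loops the docs warn about for plain ollama/ -- but it is the mode
    # reported in this thread to break on image / multimodal input.
    model=LiteLlm(model="ollama_chat/llama3.2"),
    instruction="You are a helpful assistant.",
)
```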
This PR should provide some options for multi-modal abilities.
We cannot directly use the ollama_chat endpoint as it apparently causes problems with image upload, etc.
It works great for chat though.
One workaround was found by @App0lyon (thank you very much for this): calling ollama directly (without chat).
This works, but potentially loses previous context, as explicitly stated in the ADK docs: https://google.github.io/adk-docs/agents/models/#using-ollama_chat-provider
Hence, there is the workaround from google/adk-python#49.
By faking the OpenAI API, we might get the best of both worlds while still running on Ollama.
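As a sketch of what the OpenAI-faking route boils down to: Ollama exposes an OpenAI-compatible `/v1/chat/completions` endpoint, so any OpenAI-style client (including LiteLLM's `openai/` provider prefix with `api_base` pointed at Ollama) can talk to it. The helper below is hypothetical and only constructs the request with the stdlib, to show the shape of the call; the model name is whatever `ollama list` reports, and the API key is a dummy since Ollama ignores it.

```python
import json
import urllib.request

def build_openai_chat_request(model, messages,
                              base_url="http://localhost:11434/v1"):
    """Build (but do not send) a request against Ollama's
    OpenAI-compatible chat endpoint.

    Hypothetical helper for illustration: base_url/chat/completions is the
    OpenAI API shape that Ollama mirrors; `model` is a local Ollama model.
    """
    payload = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer ollama",  # dummy key; Ollama ignores it
        },
    )

req = build_openai_chat_request(
    "llama3.2", [{"role": "user", "content": "hello"}]
)
```

Sending `req` with `urllib.request.urlopen` (given a running Ollama) would return an OpenAI-shaped completion, which is exactly what lets the rest of the stack believe it is talking to OpenAI.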