[Bug Fix]: KeyError: 'name' when using function calling with Ollama models - Gemma3 or Llama3.2 #9966
Conversation
…move debug prints
…move debug prints and comment
Here is the full code repo showing the above fix working with external third-party tools such as ADK (function calling with ollama/gemma3:27B, ollama/llama3.2:latest, and other models): https://github.com/arjunprabhulal/adk-gemma3-function-calling
Done

On Tue, Apr 22, 2025 at 5:58 PM CLAassistant wrote:
> [image: CLA assistant check]
> <https://cla-assistant.io/BerriAI/litellm?pullRequest=9966>
> Thank you for your submission! We really appreciate it. Like many open
> source projects, we ask that you sign our Contributor License Agreement
> <https://cla-assistant.io/BerriAI/litellm?pullRequest=9966> before we can
> accept your contribution.
> You have signed the CLA already but the status is still pending? Let us
> recheck <https://cla-assistant.io/check/BerriAI/litellm?pullRequest=9966> it.
Hey, are there any updates on when this is going to be merged?

Any update on this PR? I think it's related to the open issue ollama/ollama#9941.
This pull request has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.



Issue
fix(ollama): Handle non-tool-call JSON response when format=json
The issue occurs when using "from google.adk.models.lite_llm import LiteLlm" with llama3.2 and gemma3:27b.
Error
Changes
This PR addresses a KeyError: 'name' that occurs in litellm/llms/ollama/completion/transformation.py when using certain Ollama models (e.g., ollama/llama3.2:latest, ollama/gemma3:27b) with tool calling enabled (format="json").
Problem:
When LiteLLM expects a tool call response, it requests format="json" from Ollama. However, some models return a valid JSON string in the response field, but the structure of this JSON does not match the expected tool call format ({"name": ..., "arguments": ...}). Instead, it might contain other structures (e.g., Schema.org JSON as seen with llama3.2). When transformation.py parses this JSON and attempts to access function_call["name"], it results in a KeyError.
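The guard described above can be sketched as follows. This is an illustrative sketch, not LiteLLM's actual transformation code: the function name and the returned message shape are assumptions modeled on the OpenAI-style message format. The key idea is to verify the parsed JSON actually has the tool-call shape before indexing into it.

```python
import json


def parse_ollama_json_response(response_content: str) -> dict:
    """Sketch: convert an Ollama format="json" response into an
    OpenAI-style assistant message, falling back to plain content
    when the JSON is not a tool call."""
    try:
        function_call = json.loads(response_content)
    except json.JSONDecodeError:
        # Not JSON at all: return the raw text as ordinary content.
        return {"role": "assistant", "content": response_content}

    # Only treat the JSON as a tool call if it has the expected
    # {"name": ..., "arguments": ...} shape.
    if (
        isinstance(function_call, dict)
        and "name" in function_call
        and "arguments" in function_call
    ):
        return {
            "role": "assistant",
            "content": None,
            "tool_calls": [
                {
                    "type": "function",
                    "function": {
                        "name": function_call["name"],
                        "arguments": json.dumps(function_call["arguments"]),
                    },
                }
            ],
        }

    # Otherwise (e.g. Schema.org JSON as returned by llama3.2),
    # pass the JSON through as plain content instead of raising
    # KeyError: 'name'.
    return {"role": "assistant", "content": response_content}
```

A well-formed tool call is converted as before, while a JSON object that merely happens to contain a "name" key (but no "arguments") is returned as plain content rather than crashing the transformation.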
