Adding new llms #266
Conversation
Review by Korbit AI
Korbit automatically attempts to detect when you fix issues in new commits.
| Category | Issue | Status |
|---|---|---|
|  | Incorrect `None` value for `max_new_tokens` |  |
Files scanned
| File Path | Reviewed |
|---|---|
| src/agentlab/llm/llm_configs.py | ✅ |
| src/agentlab/llm/chat_api.py | ✅ |
| "openai/o3-2025-04-16": OpenAIModelArgs( | ||
| model_name="o3-2025-04-16", | ||
| max_total_tokens=200_000, | ||
| max_input_tokens=200_000, | ||
| max_new_tokens=None, | ||
| temperature=1, | ||
| vision_support=True, | ||
| ), |
Incorrect `None` value for `max_new_tokens`
What is the issue?
Setting `max_new_tokens=None` could cause issues with token generation limits. The PR's intent mentions using `NOT_GIVEN` for `None` values, but this isn't implemented here.
Why this matters
Without proper token limits, the model might generate responses that are too long or fail to respect API constraints, potentially causing API errors or unexpected behavior.
Suggested change
Replace `None` with `NOT_GIVEN` to properly handle the absence of a token limit:
"openai/o3-2025-04-16": OpenAIModelArgs(
model_name="o3-2025-04-16",
max_total_tokens=200_000,
max_input_tokens=200_000,
max_new_tokens=NOT_GIVEN,
temperature=1,
vision_support=True,
),Provide feedback to improve future suggestions
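For context, a minimal sketch of why `NOT_GIVEN` differs from `None`, assuming the v1 `openai` Python SDK (the model name and message below are placeholders, not taken from this PR): `NOT_GIVEN` tells the SDK to omit the parameter from the request entirely, while an explicit `None` is still serialized and may be rejected by the API.

```python
# Minimal sketch; assumes the v1 openai Python SDK. Model name and message
# are placeholders for illustration only.
from openai import OpenAI, NOT_GIVEN

client = OpenAI()
response = client.chat.completions.create(
    model="o3-2025-04-16",
    messages=[{"role": "user", "content": "Hello"}],
    # NOT_GIVEN omits max_tokens from the request, letting the API fall back
    # to its own default limit instead of receiving an explicit null.
    max_tokens=NOT_GIVEN,
)
```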
Description by Korbit AI
What change is being made?
Add new language model configurations to `llm_configs.py` and update `chat_api.py` to support these configurations, including handling of the `max_tokens` parameter using `NOT_GIVEN` from OpenAI.

Why are these changes being made?
This update introduces new models such as OpenAI's "o3-2025-04-16" and Anthropic's "claude-3-7-sonnet-20250219" to enhance the agent's capabilities with a diverse selection of large language models. Adjustments to `chat_api.py` provide a more flexible API interface, particularly in managing token limits, thus supporting greater configurability and accommodating new model parameters effectively.
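As an illustration of the token-limit handling described above, here is a hypothetical sketch of the kind of mapping `chat_api.py` could perform (the actual function names and structure in AgentLab may differ):

```python
# Hypothetical sketch of translating a missing token limit into NOT_GIVEN;
# not the actual AgentLab implementation.
from openai import OpenAI, NOT_GIVEN


def chat_completion(model_name, messages, max_new_tokens=None, temperature=1):
    client = OpenAI()
    return client.chat.completions.create(
        model=model_name,
        messages=messages,
        # None means "no configured limit": omit the parameter rather than
        # sending an explicit null, which some endpoints reject.
        max_tokens=max_new_tokens if max_new_tokens is not None else NOT_GIVEN,
        temperature=temperature,
    )
```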