
Conversation

@recursix (Collaborator) commented Jul 18, 2025

Description by Korbit AI

What change is being made?

Add new language model configurations to llm_configs.py and update chat_api.py to support these configurations, including handling of the max_tokens parameter using NOT_GIVEN from OpenAI.

Why are these changes being made?

This update adds new models such as OpenAI's "o3-2025-04-16" and Anthropic's "claude-3-7-sonnet-20250219", broadening the selection of large language models available to agents. The adjustments to chat_api.py make the API interface more flexible, particularly around token limits, so the new models' parameter requirements can be accommodated.
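As a rough sketch of the NOT_GIVEN handling (assuming the standard openai Python client; the function name and signature below are illustrative, not the actual chat_api.py code):

from openai import OpenAI, NOT_GIVEN

client = OpenAI()

def chat_completion(model_name, messages, max_new_tokens=None, temperature=1):
    # NOT_GIVEN tells the OpenAI SDK to omit the field from the request entirely,
    # whereas passing None would send an explicit value that some models reject.
    max_tokens = max_new_tokens if max_new_tokens is not None else NOT_GIVEN
    response = client.chat.completions.create(
        model=model_name,
        messages=messages,
        max_tokens=max_tokens,
        temperature=temperature,
    )
    return response.choices[0].message.content

Any provider-specific wrapper can follow the same pattern: substitute the SDK's sentinel whenever max_new_tokens is unset.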


@recursix changed the title New experiments Jul 18, 2025
@korbit-ai (bot) left a comment


Review by Korbit AI

Korbit automatically attempts to detect when you fix issues in new commits.
Category: Functionality
Issue: Incorrect None value for max_new_tokens

Files scanned:
src/agentlab/llm/llm_configs.py
src/agentlab/llm/chat_api.py


Comment on lines +43 to +50
"openai/o3-2025-04-16": OpenAIModelArgs(
model_name="o3-2025-04-16",
max_total_tokens=200_000,
max_input_tokens=200_000,
max_new_tokens=None,
temperature=1,
vision_support=True,
),


Incorrect None value for max_new_tokens (category: Functionality)

What is the issue?

Setting max_new_tokens=None could cause issues with token generation limits. The PR description mentions using NOT_GIVEN to handle absent token limits, but that isn't implemented here.

Why this matters

Without proper token limits, the model might generate responses that are too long or fail to respect API constraints, potentially causing API errors or unexpected behavior.

Suggested change

Replace None with NOT_GIVEN to properly handle the absence of a token limit:

"openai/o3-2025-04-16": OpenAIModelArgs(
    model_name="o3-2025-04-16",
    max_total_tokens=200_000,
    max_input_tokens=200_000,
    max_new_tokens=NOT_GIVEN,
    temperature=1,
    vision_support=True,
),
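
This works because NOT_GIVEN is the OpenAI SDK's sentinel for "omit this field from the request", so the API never receives an explicit max_tokens; passing None instead may be forwarded as a literal value and rejected, depending on how the client serializes it.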


@amanjaiswal73892 merged commit f090b5c into main Jul 18, 2025
7 checks passed
@amanjaiswal73892 deleted the new_experiments branch July 18, 2025 14:27

Labels: None yet
3 participants