Summary

  • Add user-friendly ImportError messages for the Ollama, Anthropic, and OpenAI model clients when their optional dependencies are not installed
  • Follow the pattern already used by LlamaCppChatCompletionClient

Why are these changes needed?

When users try to use a model client without its required optional dependency installed, they get a raw ImportError/ModuleNotFoundError that offers no guidance on how to fix the issue.

This change wraps imports in try/except blocks to provide clear, actionable error messages like:

ImportError: Dependencies for Ollama not found. Please install the ollama package: pip install autogen-ext[ollama]
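
Concretely, this is a minimal sketch of the guarded-import pattern (the module name and message wording are illustrative; the exact text lives in each client module):

try:
    import ollama  # optional dependency for the Ollama client
except ImportError as e:
    raise ImportError(
        "Dependencies for Ollama not found. "
        "Please install the ollama package: pip install autogen-ext[ollama]"
    ) from e

Chaining with `from e` preserves the original traceback, so the underlying import failure stays visible alongside the friendly message.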

Related issue number

Closes #4605

Checks

🤖 Generated with Claude Code
