Conversation

@manascb1344

feat: add extra_headers support for LLM and Embedding providers

Description

This PR implements extra_headers support across multiple LLM and Embedding providers, enabling observability tools (such as Helicone), custom proxies, and additional authentication requirements.

Key changes include:

  • Added extra_headers field to BaseLlmConfig and BaseEmbedderConfig.
  • Implemented header propagation in OpenAI, Anthropic, Groq, Together, DeepSeek, xAI, Azure OpenAI, LM Studio, and vLLM.
  • Updated factory.py to support extra_headers in configuration generation.
  • Added unit tests for OpenAI to ensure custom headers are correctly passed to the provider clients.
  • Updated documentation with extra_headers parameter and usage example for Helicone integration.
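The intended flow can be sketched as follows. This is a stand-alone illustration of the propagation the PR describes (config field -> client default_headers); the class and field names mirror the PR description, but a stub client is used here instead of the real mem0 or openai classes, so treat the exact shapes as assumptions:

```python
# Sketch of how extra_headers flows from a config into a provider client.
# StubOpenAIClient stands in for openai.OpenAI; the Helicone header below
# is an illustrative example, not a working key.
from dataclasses import dataclass
from typing import Dict, Optional


@dataclass
class BaseLlmConfig:
    model: str = "gpt-4o-mini"
    api_key: Optional[str] = None
    extra_headers: Optional[Dict[str, str]] = None  # field added by this PR


class StubOpenAIClient:
    """Stand-in for openai.OpenAI: records the headers it was given."""

    def __init__(self, api_key=None, default_headers=None):
        self.api_key = api_key
        self.default_headers = default_headers


class OpenAILLM:
    def __init__(self, config: BaseLlmConfig):
        self.config = config
        # Per the PR description, extra_headers is forwarded to the
        # client as default_headers.
        self.client = StubOpenAIClient(
            api_key=config.api_key,
            default_headers=config.extra_headers,
        )


config = BaseLlmConfig(
    api_key="test-key",
    extra_headers={"Helicone-Auth": "Bearer <HELICONE_API_KEY>"},
)
llm = OpenAILLM(config)
print(llm.client.default_headers)
```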

Fixes #3851

Type of change

  • New feature (non-breaking change which adds functionality)
  • Documentation update

How Has This Been Tested?

  • Unit Test: Verified OpenAI client initialization with extra_headers in tests/llms/test_openai.py.
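A test of that shape can be sketched as below. This is a stand-alone approximation using a stub client rather than patching the real openai package; the actual assertions in tests/llms/test_openai.py may differ:

```python
# Sketch of the two test cases described above: extra_headers is passed
# through to the client, and the None case is handled cleanly.
class FakeClient:
    def __init__(self, api_key=None, default_headers=None):
        self.default_headers = default_headers


class Config:
    def __init__(self, api_key, extra_headers=None):
        self.api_key = api_key
        self.extra_headers = extra_headers


def make_client(config):
    # Mirrors the PR: extra_headers forwarded as default_headers.
    return FakeClient(api_key=config.api_key, default_headers=config.extra_headers)


def test_extra_headers_passed():
    headers = {"Helicone-Auth": "Bearer test"}
    client = make_client(Config("test-key", extra_headers=headers))
    assert client.default_headers == headers


def test_extra_headers_none():
    client = make_client(Config("test-key"))
    assert client.default_headers is None


test_extra_headers_passed()
test_extra_headers_none()
print("ok")
```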

Checklist:

  • My code follows the style guidelines of this project
  • I have performed a self-review of my own code
  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes
  • Any dependent changes have been merged and published in downstream modules
  • I have checked my code and corrected any misspellings

Maintainer Checklist

- Add extra_headers parameter to BaseLlmConfig and OpenAIConfig
- Update OpenAILLM to pass extra_headers as default_headers to client
- Add extra_headers parameter to BaseEmbedderConfig
- Update OpenAIEmbedding to pass extra_headers to client
- Add unit tests for extra_headers functionality

Partially closes mem0ai#3851
- DeepSeek: pass extra_headers to OpenAI client
- Anthropic: pass extra_headers to Anthropic client
- Groq: pass extra_headers to Groq client
- Together: pass extra_headers to Together client
- xAI: pass extra_headers to OpenAI client

Closes mem0ai#3851
…nfigurations

- Add extra_headers parameter to AzureOpenAIConfig, LMStudioConfig, and VllmConfig
- Update LLM implementations to use extra_headers in OpenAI client initialization
- Update factory.py to include extra_headers when creating configs
- Fix duplicate elif statement in lmstudio.py
- Add extra_headers parameter to LLM config params table
- Add extra_headers parameter to Embedder config params table
- Add usage example showing Helicone observability integration
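The factory change described above (later summarized in the review as "using getattr for backwards compatibility") can be sketched as follows. Names are illustrative, not mem0's actual factory code: the point is that getattr with a None default lets older config objects that predate the extra_headers field keep working:

```python
# Backwards-compatible factory pattern: configs without the new
# extra_headers attribute fall back to None and produce no header kwarg.
class OldConfig:  # a config class predating the new field
    model = "gpt-4o-mini"


class NewConfig:
    model = "gpt-4o-mini"
    extra_headers = {"X-Custom": "1"}


def build_client_kwargs(config):
    kwargs = {"model": config.model}
    # getattr with a default keeps old configs working unchanged.
    extra_headers = getattr(config, "extra_headers", None)
    if extra_headers:
        kwargs["default_headers"] = extra_headers
    return kwargs


print(build_client_kwargs(OldConfig()))  # no default_headers key
print(build_client_kwargs(NewConfig()))
```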
Copilot AI review requested due to automatic review settings December 24, 2025 07:46
@CLAassistant
Copy link

CLAassistant commented Dec 24, 2025

CLA assistant check
All committers have signed the CLA.


Copilot AI left a comment


Pull request overview

This PR adds support for extra_headers across multiple LLM and Embedding providers to enable integration with observability tools (like Helicone), custom proxies, and additional authentication requirements.

Key changes:

  • Added extra_headers field to base configuration classes (BaseLlmConfig and BaseEmbedderConfig)
  • Implemented header propagation in 9 LLM providers (OpenAI, Anthropic, Groq, Together, DeepSeek, xAI, Azure OpenAI, LM Studio, and vLLM)
  • Updated factory pattern to support extra_headers during configuration conversion
  • Added comprehensive unit tests for OpenAI provider
  • Updated documentation with usage examples

Reviewed changes

Copilot reviewed 20 out of 20 changed files in this pull request and generated 3 comments.

Summary per file:

  • mem0/configs/llms/base.py: Added extra_headers parameter to the base LLM configuration class with type hints and documentation
  • mem0/configs/llms/openai.py: Added extra_headers parameter to the OpenAI config with type hints and docstring
  • mem0/configs/llms/lmstudio.py: Added extra_headers parameter to the LM Studio config with type hints
  • mem0/configs/llms/vllm.py: Added extra_headers parameter to the vLLM config with type hints
  • mem0/configs/llms/azure.py: Added extra_headers parameter to the Azure OpenAI config
  • mem0/configs/embeddings/base.py: Added extra_headers parameter to the base embedder configuration with documentation
  • mem0/llms/openai.py: Implemented extra_headers propagation to the OpenAI client via the default_headers parameter
  • mem0/llms/anthropic.py: Implemented extra_headers propagation to the Anthropic client
  • mem0/llms/groq.py: Implemented extra_headers propagation to the Groq client
  • mem0/llms/together.py: Implemented extra_headers propagation to the Together client
  • mem0/llms/deepseek.py: Implemented extra_headers propagation to the DeepSeek client
  • mem0/llms/xai.py: Implemented extra_headers propagation to the xAI client
  • mem0/llms/azure_openai.py: Implemented header-merging logic to combine extra_headers with azure_kwargs.default_headers
  • mem0/llms/lmstudio.py: Implemented extra_headers propagation to the LM Studio client
  • mem0/llms/vllm.py: Implemented extra_headers propagation to the vLLM client
  • mem0/embeddings/openai.py: Implemented extra_headers propagation to the OpenAI embedding client
  • mem0/utils/factory.py: Updated factory to handle extra_headers during config conversion using getattr for backwards compatibility; improved code formatting for the sentence_transformer reranker
  • tests/llms/test_openai.py: Added two test cases verifying extra_headers is correctly passed to the OpenAI client and that the None case is handled
  • docs/components/llms/config.mdx: Added a documentation section with a Helicone usage example for extra_headers; added the parameter to the master list
  • docs/components/embedders/config.mdx: Added the extra_headers parameter to the master list for OpenAI embeddings
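The Azure OpenAI change is the one case that merges rather than simply forwards: extra_headers is combined with any default_headers already present in azure_kwargs. A minimal sketch follows; the field names and the assumption that extra_headers wins on a key collision come from the review summary, not verified mem0 code:

```python
# Sketch of the header-merging described for mem0/llms/azure_openai.py.
# Assumption: on a key collision, extra_headers overrides the value from
# azure_kwargs.default_headers.
def merge_headers(azure_default_headers, extra_headers):
    merged = dict(azure_default_headers or {})
    merged.update(extra_headers or {})
    return merged or None


merged = merge_headers(
    {"X-Azure-Env": "prod"},
    {"Helicone-Auth": "Bearer <KEY>", "X-Azure-Env": "staging"},
)
print(merged)
```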


api_key="test-key",
extra_headers=custom_headers
)
llm = OpenAILLM(config)

Copilot AI Dec 24, 2025


Variable llm is not used.

Suggested change
llm = OpenAILLM(config)
OpenAILLM(config)
model="gpt-4.1-nano-2025-04-14",
api_key="test-key"
)
llm = OpenAILLM(config)

Copilot AI Dec 24, 2025


Variable llm is not used.

Suggested change
llm = OpenAILLM(config)
OpenAILLM(config)
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>