Add extra_headers support for observability and custom proxy headers #3852
Conversation
- Add extra_headers parameter to BaseLlmConfig and OpenAIConfig
- Update OpenAILLM to pass extra_headers as default_headers to client
- Add extra_headers parameter to BaseEmbedderConfig
- Update OpenAIEmbedding to pass extra_headers to client
- Add unit tests for extra_headers functionality

Partially closes mem0ai#3851
- DeepSeek: pass extra_headers to OpenAI client
- Anthropic: pass extra_headers to Anthropic client
- Groq: pass extra_headers to Groq client
- Together: pass extra_headers to Together client
- xAI: pass extra_headers to OpenAI client

Closes mem0ai#3851
…nfigurations

- Add extra_headers parameter to AzureOpenAIConfig, LMStudioConfig, and VllmConfig
- Update LLM implementations to use extra_headers in OpenAI client initialization
- Update factory.py to include extra_headers when creating configs
- Fix duplicate elif statement in lmstudio.py
- Add extra_headers parameter to LLM config params table
- Add extra_headers parameter to Embedder config params table
- Add usage example showing Helicone observability integration
Pull request overview
This PR adds support for extra_headers across multiple LLM and Embedding providers to enable integration with observability tools (like Helicone), custom proxies, and additional authentication requirements.
Key changes:
- Added `extra_headers` field to base configuration classes (`BaseLlmConfig` and `BaseEmbedderConfig`)
- Implemented header propagation in 9 LLM providers (OpenAI, Anthropic, Groq, Together, DeepSeek, xAI, Azure OpenAI, LM Studio, and vLLM)
- Updated factory pattern to support `extra_headers` during configuration conversion
- Added comprehensive unit tests for OpenAI provider
- Updated documentation with usage examples
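The propagation pattern the PR describes can be sketched roughly as follows. This is an illustrative stand-in, not mem0's actual classes: the real `BaseLlmConfig` carries many more fields, and the helper function here is hypothetical.

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class LlmConfig:
    # Simplified stand-in for a base LLM config with the new field;
    # extra_headers defaults to None so existing configs are unaffected.
    api_key: str
    extra_headers: Optional[Dict[str, str]] = None

def build_client_kwargs(config: LlmConfig) -> dict:
    # Forward the headers to the SDK client only when provided, mapping
    # extra_headers onto the client's default_headers parameter.
    kwargs = {"api_key": config.api_key}
    if config.extra_headers:
        kwargs["default_headers"] = config.extra_headers
    return kwargs

cfg = LlmConfig(api_key="sk-test", extra_headers={"Helicone-Auth": "Bearer hk-test"})
print(build_client_kwargs(cfg))
```

Because the field defaults to `None` and is only forwarded when set, configs written before this change construct the client exactly as before.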
Reviewed changes
Copilot reviewed 20 out of 20 changed files in this pull request and generated 3 comments.
Summary per file:
| File | Description |
|---|---|
| mem0/configs/llms/base.py | Added extra_headers parameter to base LLM configuration class with proper type hints and documentation |
| mem0/configs/llms/openai.py | Added extra_headers parameter to OpenAI config with type hints and docstring |
| mem0/configs/llms/lmstudio.py | Added extra_headers parameter to LM Studio config with type hints |
| mem0/configs/llms/vllm.py | Added extra_headers parameter to vLLM config with type hints |
| mem0/configs/llms/azure.py | Added extra_headers parameter to Azure OpenAI config |
| mem0/configs/embeddings/base.py | Added extra_headers parameter to base embedder configuration with documentation |
| mem0/llms/openai.py | Implemented extra_headers propagation to OpenAI client via default_headers parameter |
| mem0/llms/anthropic.py | Implemented extra_headers propagation to Anthropic client |
| mem0/llms/groq.py | Implemented extra_headers propagation to Groq client |
| mem0/llms/together.py | Implemented extra_headers propagation to Together client |
| mem0/llms/deepseek.py | Implemented extra_headers propagation to DeepSeek client |
| mem0/llms/xai.py | Implemented extra_headers propagation to xAI client |
| mem0/llms/azure_openai.py | Implemented header merging logic for Azure OpenAI to combine extra_headers with azure_kwargs.default_headers |
| mem0/llms/lmstudio.py | Implemented extra_headers propagation to LM Studio client |
| mem0/llms/vllm.py | Implemented extra_headers propagation to vLLM client |
| mem0/embeddings/openai.py | Implemented extra_headers propagation to OpenAI embedding client |
| mem0/utils/factory.py | Updated factory to handle extra_headers during config conversion using getattr for backwards compatibility; improved code formatting for sentence_transformer reranker |
| tests/llms/test_openai.py | Added two test cases to verify extra_headers are correctly passed to OpenAI client and handle None case |
| docs/components/llms/config.mdx | Added documentation section with usage example for extra_headers with Helicone; added parameter to master list |
| docs/components/embedders/config.mdx | Added extra_headers parameter to master list for OpenAI embeddings |
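The header-merging behavior noted above for `mem0/llms/azure_openai.py` (combining `extra_headers` with `azure_kwargs.default_headers`) can be sketched as below. The precedence shown, with `extra_headers` winning on key collisions, is an assumption; the authoritative order is whatever the PR's merge logic implements.

```python
from typing import Dict, Optional

def merge_default_headers(
    azure_default_headers: Optional[Dict[str, str]],
    extra_headers: Optional[Dict[str, str]],
) -> Optional[Dict[str, str]]:
    # Combine headers from azure_kwargs.default_headers with the new
    # extra_headers field. Returning None when both are empty keeps the
    # client constructor call identical to the pre-PR behavior.
    if not azure_default_headers and not extra_headers:
        return None
    merged = dict(azure_default_headers or {})
    # Assumed precedence: extra_headers overrides azure_kwargs on collision.
    merged.update(extra_headers or {})
    return merged

print(merge_default_headers({"X-Env": "prod"}, {"Helicone-Auth": "Bearer hk"}))
```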
```python
    api_key="test-key",
    extra_headers=custom_headers
)
llm = OpenAILLM(config)
```
Copilot AI, Dec 24, 2025:
Variable llm is not used.
Suggested change:

```diff
- llm = OpenAILLM(config)
+ OpenAILLM(config)
```
```python
    model="gpt-4.1-nano-2025-04-14",
    api_key="test-key"
)
llm = OpenAILLM(config)
```
Copilot AI, Dec 24, 2025:
Variable llm is not used.
Suggested change:

```diff
- llm = OpenAILLM(config)
+ OpenAILLM(config)
```
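The two test cases described in the review summary (headers passed through, and the `None` case) can be sketched without mem0 installed by using a minimal stub that mirrors how the LLM wrapper forwards `extra_headers`. The class and fixture names here are assumptions; the real tests live in `tests/llms/test_openai.py`.

```python
class FakeOpenAIClient:
    # Stand-in for openai.OpenAI that just records its constructor kwargs.
    def __init__(self, api_key=None, default_headers=None):
        self.api_key = api_key
        self.default_headers = default_headers

class OpenAILLMStub:
    # Minimal stand-in mirroring how the wrapper forwards extra_headers
    # to the client's default_headers parameter.
    def __init__(self, config):
        kwargs = {"api_key": config["api_key"]}
        if config.get("extra_headers"):
            kwargs["default_headers"] = config["extra_headers"]
        self.client = FakeOpenAIClient(**kwargs)

def test_extra_headers_passed():
    headers = {"Helicone-Auth": "Bearer hk-test"}
    llm = OpenAILLMStub({"api_key": "test-key", "extra_headers": headers})
    assert llm.client.default_headers == headers

def test_extra_headers_none():
    llm = OpenAILLMStub({"api_key": "test-key"})
    assert llm.client.default_headers is None

test_extra_headers_passed()
test_extra_headers_none()
print("ok")
```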
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
feat: add extra_headers support for LLM and Embedding providers
Description
This PR implements support for `extra_headers` across multiple LLM and Embedding providers to support observability tools (like Helicone), custom proxies, and additional authentication requirements.

Key changes include:
- Added `extra_headers` field to `BaseLlmConfig` and `BaseEmbedderConfig`.
- Implemented header propagation in OpenAI, Anthropic, Groq, Together, DeepSeek, xAI, Azure OpenAI, LM Studio, and vLLM.
- Updated `factory.py` to support `extra_headers` in configuration generation.
- Documented the `extra_headers` parameter and added a usage example for Helicone integration.

Fixes #3851
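The `factory.py` update has to cope with config objects created before the field existed. Per the review's file summary, this is done with `getattr` for backwards compatibility; a rough sketch of that pattern (function and class names here are hypothetical):

```python
class LegacyConfig:
    # An older config object that predates the extra_headers field.
    def __init__(self, api_key):
        self.api_key = api_key

def config_to_kwargs(config) -> dict:
    # getattr with a None default lets old config objects that never
    # defined extra_headers pass through the factory unchanged, while
    # newer configs get their headers carried over.
    kwargs = {"api_key": config.api_key}
    extra_headers = getattr(config, "extra_headers", None)
    if extra_headers:
        kwargs["extra_headers"] = extra_headers
    return kwargs

print(config_to_kwargs(LegacyConfig("sk-old")))
```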
Type of change
How Has This Been Tested?
- Added unit tests covering `OpenAI` client initialization with `extra_headers` in `tests/llms/test_openai.py`.

Checklist:
Maintainer Checklist