docs: Add Hugging Face Inference Providers documentation #4611
Adds documentation for using Hugging Face Inference Providers with Aider.

Key additions:
- Setup instructions using an OpenAI-compatible client
- Model discovery guidance (Hub browsing, model cards, API docs)
- Provider selection for users with existing credits/preferences
- Links to the Inference Providers documentation

Benefits for Aider users:
- Access to multiple providers (Cerebras, Groq, Together, etc.) via one API
- Zero markup pricing
- Automatic routing with optional provider selection
- OpenAI-compatible integration
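A minimal sketch of the setup flow this PR documents, using the router URL from the diff; the token value below is a placeholder, and the commented `aider` invocation uses the example model from this PR:

```shell
# Point Aider's OpenAI-compatible client at the Hugging Face router
# (URL taken from this PR's diff; the token value is a placeholder).
export OPENAI_API_BASE=https://router.huggingface.co/v1
export OPENAI_API_KEY=hf_xxxxxxxx  # your Hugging Face access token

# Then launch Aider against a Hub model, e.g.:
# aider --model openai/MiniMaxAI/MiniMax-M2
echo "$OPENAI_API_BASE"
```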
thanks @davanstrien!
```
# Mac/Linux:
export OPENAI_API_BASE=https://router.huggingface.co/v1
```
I think we can use `HUGGINGFACE_API_KEY`: this is the API key used by the Hugging Face Inference Providers integration in LiteLLM, and if I'm not mistaken, Aider uses LiteLLM under the hood.
Same for `OPENAI_API_BASE` -> `HUGGINGFACE_API_BASE`.
Testing this locally, both `OPENAI_API_BASE` and `HUGGINGFACE_API_BASE` seem to work. `HUGGINGFACE_API_BASE` is probably a bit more intuitive, though.
HF_TOKEN should work too!
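Putting the comments above together, the LiteLLM-style environment setup the reviewers are suggesting would look roughly like this (variable names as proposed in this thread; the token value is a placeholder):

```shell
# LiteLLM-style variable names suggested in this review thread
# (placeholder token value; HF_TOKEN reportedly works in place of
# HUGGINGFACE_API_KEY).
export HUGGINGFACE_API_BASE=https://router.huggingface.co/v1
export HF_TOKEN=hf_xxxxxxxx
```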
```
aider --model openai/<model-name>

# Using MiniMaxAI/MiniMax-M2:
aider --model openai/MiniMaxAI/MiniMax-M2
```
If Aider uses LiteLLM, we should use the `huggingface/` prefix:
```diff
-aider --model openai/<model-name>
-# Using MiniMaxAI/MiniMax-M2:
-aider --model openai/MiniMaxAI/MiniMax-M2
+aider --model huggingface/<model-name>
+# Using MiniMaxAI/MiniMax-M2:
+aider --model huggingface/MiniMaxAI/MiniMax-M2
```