@davanstrien

Adds documentation for using Hugging Face Inference Providers with Aider.

Key additions:

- Setup instructions using OpenAI-compatible client
- Model discovery guidance (Hub browsing, model cards, API docs)
- Provider selection for users with existing credits/preferences
- Links to Inference Providers documentation

Benefits for Aider users:
- Access to multiple providers (Cerebras, Groq, Together, etc.) via one API
- Zero markup pricing
- Automatic routing with optional provider selection
- OpenAI-compatible integration
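
The setup described above can be sketched roughly as follows (the token value is a placeholder; the model name is the example used later in this review):

```shell
# Point the OpenAI-compatible client at the Inference Providers router
# (Mac/Linux; placeholder token value):
export OPENAI_API_BASE=https://router.huggingface.co/v1
export OPENAI_API_KEY=hf_xxx   # your Hugging Face access token

# Then launch Aider against a Hub model, e.g.:
# aider --model openai/MiniMaxAI/MiniMax-M2
```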
@CLAassistant

CLAassistant commented Oct 29, 2025

CLA assistant check
All committers have signed the CLA.


@hanouticelina hanouticelina left a comment


thanks @davanstrien!


```
# Mac/Linux:
export OPENAI_API_BASE=https://router.huggingface.co/v1
```


I think we can use `HUGGINGFACE_API_KEY`; this is the API key variable used by the Hugging Face Inference Providers integration in LiteLLM, and if I'm not mistaken, Aider uses LiteLLM under the hood.


Same for the base URL: `OPENAI_API_BASE` -> `HUGGINGFACE_API_BASE`.
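
A sketch of what the LiteLLM-style variables suggested in this thread would look like (token value is a placeholder):

```shell
# LiteLLM's Hugging Face integration reads these instead of the OPENAI_* pair:
export HUGGINGFACE_API_KEY=hf_xxx
export HUGGINGFACE_API_BASE=https://router.huggingface.co/v1
```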

Author


Testing this locally, both `OPENAI_API_BASE` and `HUGGINGFACE_API_BASE` seem to work. `HUGGINGFACE_API_BASE` is probably a bit more intuitive, though.


`HF_TOKEN` should work too!
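
For reference, `HF_TOKEN` is the token variable conventionally used across Hugging Face tooling, so the key setup would reduce to a single familiar export (placeholder value):

```shell
# Reuse the standard Hugging Face token variable instead of an API-key variable:
export HF_TOKEN=hf_xxx
```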

Comment on lines +37 to +40
```
aider --model openai/<model-name>

# Using MiniMaxAI/MiniMax-M2:
aider --model openai/MiniMaxAI/MiniMax-M2
```

If Aider uses LiteLLM, we should use the `huggingface` prefix:

Suggested change

```diff
-aider --model openai/<model-name>
-
-# Using MiniMaxAI/MiniMax-M2:
-aider --model openai/MiniMaxAI/MiniMax-M2
+aider --model huggingface/<model-name>
+
+# Using MiniMaxAI/MiniMax-M2:
+aider --model huggingface/MiniMaxAI/MiniMax-M2
```