Hey everyone! 👋

I built LOCAL-CLI because I wanted Claude Code's powerful coding agent experience, without paying for API calls.
## The Problem
Claude Code is amazing, but it requires a paid API. What if you could get the same experience with:
- 🏠 Your local models (Ollama, LM Studio)
- ⚡ Self-hosted inference (vLLM, TGI, LocalAI)
- 🏢 Your company's internal LLM server
- ☁️ Any OpenAI-compatible endpoint (Groq, Together AI, OpenRouter, etc.)
## The Solution: LOCAL-CLI
One CLI tool that works with ANY OpenAI-compatible API.
If your LLM server speaks the OpenAI chat completions format (`/v1/chat/completions`), LOCAL-CLI just works. No vendor lock-in. No monthly bills. Your models, your rules.
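For the curious, this is roughly what that wire format looks like. A minimal sketch, assuming a local Ollama server (which exposes an OpenAI-compatible API on port 11434 by default) and a placeholder model name; point it at whatever server and model you actually run:

```bash
# Hedged example: any server that accepts this request shape should work.
# The base URL and model name below are placeholders for your own setup.
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "llama3.1",
        "messages": [{"role": "user", "content": "Say hello in one line."}]
      }'
```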
## Tested With

| Provider | Status |
| --- | --- |
| Ollama | ✅ |
| vLLM | ✅ |
| LM Studio | ✅ |
| LocalAI | ✅ |
| Groq | ✅ |
| Together AI | ✅ |
| OpenRouter | ✅ |
| Azure OpenAI | ✅ |
| Any OpenAI-compatible server | ✅ |
## Features

- 🔒 Supervised Mode — Approve file changes before they happen
- 📋 Plan & Execute — Auto TODO lists with step-by-step execution
## Demo
demo.mp4
## Quick Start

```bash
git clone https://github.com/A2G-Dev-Space/Local-CLI.git
cd Local-CLI
npm install && npm run build
node dist/cli.js
```
First run launches the endpoint setup wizard — just paste your LLM URL and you're ready.
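Not sure what URL to paste? These are the usual defaults for common local servers (your host and port may differ):

```bash
# Typical default base URLs (adjust to your own setup):
#   Ollama     http://localhost:11434/v1
#   LM Studio  http://localhost:1234/v1
#   vLLM       http://localhost:8000/v1
# Quick sanity check before running the wizard (assuming Ollama here):
curl http://localhost:11434/v1/models
```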
## GitHub
⭐ https://github.com/A2G-Dev-Space/Local-CLI
Free, open-source, MIT licensed.
Feedback and contributions welcome!