Alex Markevich’s Post

You can now run an AI coding agent locally for $0. No API bills. No token limits. No sending your code to the cloud.

A lot of engineers still assume AI development tools require paid APIs. That is no longer true. You can run a powerful coding agent entirely on your machine in about 5 minutes.

Here is the simple setup:

#1) Install Ollama
Download: https://ollama.com

#2) Pull a coding model
Example: ollama pull qwen2.5-coder
This gives you a strong open-source model optimized for programming.

#3) Install Claude Code CLI
curl -fsSL https://lnkd.in/ghXwt2z9 | bash
or
npm install -g @anthropic-ai/claude-code

#4) Point Claude Code to your local model
export ANTHROPIC_AUTH_TOKEN=ollama
export ANTHROPIC_BASE_URL=http://localhost:11434

#5) Run the agent
claude --model qwen2.5-coder

Now the agent writes code, edits files, runs commands, and reasons about your repo. Locally. No token cost. No cloud dependency.

What this means for engineers:
1) AI development tools are becoming infrastructure, not SaaS
2) Private codebases can be explored safely
3) Autonomous coding agents are becoming accessible to everyone

The barrier to experimenting with agentic engineering workflows just dropped dramatically. Engineers who start experimenting now will understand where development is going next.

#AI #AgenticAI #SoftwareEngineering #Automation #LLM #DeveloperTools
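Steps #4 and #5 above can be sketched as a single setup script. This is a hedged sketch, not the author's exact setup: it assumes Ollama serves on its default port 11434, that the token value is a dummy the local endpoint ignores (both stated in the post), and it adds a simple guard so you get a readable message instead of a failure if Ollama is not installed yet.

```shell
#!/bin/sh
# Point Claude Code at the local Ollama endpoint (values from the post).
export ANTHROPIC_AUTH_TOKEN=ollama                 # dummy token; local endpoint ignores auth
export ANTHROPIC_BASE_URL=http://localhost:11434   # Ollama's default port

# Guard: only suggest launching if the ollama binary is actually on PATH.
if command -v ollama >/dev/null 2>&1; then
  echo "Ready. Run: claude --model qwen2.5-coder"
else
  echo "Ollama not found. Install it first: https://ollama.com"
fi
```

Running the script in a shell you keep open (or sourcing it with `. setup.sh`) leaves the two environment variables set for the `claude` command that follows.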

James Parker

Sightsource, LLC · 47 followers

2w

What do you need in terms of compute? GPU specs? If it's achingly slow, does it matter?

I am curious if you got the desired result directly. I have done this too and ran into numerous issues with the CLI printing the tool call instead of generating the source I asked for. How did you set up the tool execution with Claude? Or did you use another feature flag to avoid the issue I hit?

Mohamed Elsawy

QESTIT Group · 453 followers

1w

This changes everything for privacy-conscious development teams. Testing it now.


Alex Markevich 👍 Claude itself suggested another variant to me based on yours (for my tasks and my laptop's specs): "Let me check how the Claude Code + Ollama combination actually works right now. Yes, it works, and it's even simpler now than in that post. Here is the up-to-date instruction for your model:

Quick path (Ollama 0.15+)
Ollama now natively supports the Anthropic Messages API, and Claude Code connects to any Ollama model directly. The simplest way:

# 1. Make sure Ollama is up to date (v0.15+ required)
ollama --version
# 2. Pull the model
ollama pull qwen3-coder:30b
# 3. Install Claude Code (if you haven't yet)
npm install -g @anthropic-ai/claude-code
# 4. Launch with a single command
ollama launch claude --model qwen3-coder:30b

That's it. Ollama starts the model itself and connects Claude Code to it. For an NDA project this is the ideal combination: all the code stays local, and no data goes to the cloud."

Alex Becz

Filevine · 13K followers

2w

Just saw this on YT shorts > that is a great option! ollama + qwen + claude == FREE 😊

John Bondinsky

QASolver · 1K followers

2w

Nicely done
