The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but 100% free.
Updated Aug 7, 2025 · TypeScript
Chat with your PDF using a local LLM via the Ollama client.
A modern web interface for [Ollama](https://ollama.ai/), with DeepSeek support coming in the next version.
BeautifyOllama is an open-source web interface that enhances your local Ollama AI model interactions with a beautiful, modern design. Built with cutting-edge web technologies, it provides a seamless chat experience with stunning visual effects and enterprise-grade functionality.
Ollama Client – Chat with Local LLMs Inside Your Browser. A lightweight, privacy-first Chrome extension for chatting with locally hosted Ollama LLMs like LLaMA 2, Mistral, and CodeLLaMA. Supports streaming, stop/regenerate, and easy model switching — all without cloud APIs or data leaks.
Converse with Ollama models running on your local machine
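Clients like these typically reach a local Ollama instance through its HTTP API on port 11434, which streams newline-delimited JSON fragments back to the caller. A minimal sketch (assuming a default Ollama install; the simulated stream below stands in for a live server) of building a request for the `/api/generate` endpoint and assembling the streamed reply:

```python
import json

# Default endpoint of a local Ollama install (assumption: standard port).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": True}

def collect_stream(lines):
    """Ollama streams newline-delimited JSON objects, each carrying a
    'response' text fragment, until one arrives with 'done': true."""
    parts = []
    for line in lines:
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

# Simulated stream so the sketch runs without a server:
sample = [
    '{"response": "Hello", "done": false}',
    '{"response": ", world", "done": true}',
]
print(collect_stream(sample))  # -> Hello, world
```

In a real client, `build_request(...)` would be POSTed to `OLLAMA_URL` and `collect_stream` fed the response lines; stop/regenerate features work by aborting or reissuing that request.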
A VS Code extension that lets users select their locally downloaded Ollama models and use them as personal coding agents.
BrainDrive Plugin for managing your Ollama servers and models