Dive is an open-source MCP Host Desktop Application that seamlessly integrates with any LLMs supporting function calling capabilities. ✨
Updated Dec 31, 2025 - TypeScript
LLMX: the easiest third-party local LLM UI for the web!
Fully-featured, beautiful web interface for vLLM - built with NextJS.
A minimal interface for AI Companion that runs entirely in your browser.
c4 GenAI Suite
🔬 Experiment is an experiment is an experiment is an experiment is an experiment is an experiment
Intentflow is a YAML-based UX flow engine that lets you define, trigger, and optimize user journeys in your frontend. It supports dynamic flags, conditional components (modals, tooltips, banners), and optional LLM logic for adaptive rendering.
A modern, feature-rich web interface built with Next.js and shadcn/ui for interacting with local Ollama large language models.
Run AI conversations locally with Ollama, LM Studio, or Amazon Bedrock.
A modern, multi-tenant platform for building and managing AI Agents
AI generates a landing page with a waiting list in one click, SEO friendly.
Fully customizable React + MUI AI chat components: Message bubbles, typing indicator, code blocks, model selector, token counter, and more. Ready for your own LLM.