A beautiful, multi-provider AI chat application built with Tauri, React, and TypeScript.
- 🤖 Multi-Provider Support: Connect to Ollama (local), OpenRouter, OpenAI, Anthropic, or Google Gemini
- 🧠 Muradian Auto: Intelligent AI routing that automatically selects the best model for your question
- 💬 Split View: Chat with two AI models side by side
- 🎨 Dark/Light Mode: Automatic theme switching based on system preferences
- ⚡ Fast & Lightweight: Native desktop app powered by Tauri (~10 MB)
- 🔒 Privacy First: All local Ollama chats stay on your machine
- 📝 Markdown Support: Beautiful rendering of code blocks, lists, and formatting
- 📥 Auto Model Download: Download Ollama models with one click
Muradian Auto is our smart provider that automatically detects the type of question you're asking and routes it to the most appropriate AI model. No more switching models manually!
```mermaid
flowchart TD
    Start[User sends message] --> Check{Muradian Auto<br/>provider?}
    Check -->|No| Standard[Use selected provider]
    Check -->|Yes| Analyze[Analyze conversation<br/>every 2-4 messages]
    Analyze --> Intent{Detected<br/>Intent}
    Intent -->|CODING| Code[qwen-coder<br/>Programming expert]
    Intent -->|MATH| Math[deepseek-r1<br/>Mathematical reasoning]
    Intent -->|SCIENCE| Science[gemini-pro<br/>Scientific analysis]
    Intent -->|CREATIVE| Creative[llama-3.3<br/>Creative writing]
    Intent -->|ANALYSIS| Analysis[gemini-pro<br/>Analytical thinking]
    Intent -->|GENERAL| General[Local/gemini-flash<br/>Fast responses]
    Code --> Response[Generate response]
    Math --> Response
    Science --> Response
    Creative --> Response
    Analysis --> Response
    General --> Response
    Standard --> Response
    Response --> End[Display to user]
```
| Category | Triggers | AI Model | Best For |
|---|---|---|---|
| 🖥️ CODING | Programming, algorithms, debugging | qwen-2.5-coder-32b | Writing code, fixing bugs, API usage |
| 🔢 MATH | Equations, calculations, proofs | deepseek-r1 | Mathematical reasoning, step-by-step solutions |
| 🔬 SCIENCE | Physics, chemistry, biology | gemini-2.0-pro | Scientific explanations, research questions |
| ✍️ CREATIVE | Writing, storytelling, poetry | llama-3.3-70b | Creative content, narratives, art |
| 📊 ANALYSIS | Research, comparisons, evaluation | gemini-2.0-pro | Critical thinking, data analysis |
| 💬 GENERAL | Casual chat, everyday questions | Local + gemini-flash | Quick responses, general conversation |
Simply select "Muradian Auto" as your provider and ask any question:
- ✅ "How do I implement a binary search in Python?" → CODING mode
- ✅ "What's the derivative of x³?" → MATH mode
- ✅ "Explain photosynthesis" → SCIENCE mode
- ✅ "Write a short story about space" → CREATIVE mode
- ✅ "Compare React vs Vue" → ANALYSIS mode
- ✅ "Hi, how are you?" → GENERAL mode
The system analyzes your conversation every 2-4 messages and automatically switches to the optimal model!
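The routing described above can be sketched in TypeScript. This is a hypothetical simplification using keyword heuristics — the actual provider may classify intent differently — but the intent categories and model names mirror the table above:

```typescript
// Hypothetical sketch of Muradian Auto's intent routing.
// The keyword heuristic is illustrative only; model names follow the routing table.
type Intent = "CODING" | "MATH" | "SCIENCE" | "CREATIVE" | "ANALYSIS" | "GENERAL";

const MODEL_FOR_INTENT: Record<Intent, string> = {
  CODING: "qwen-2.5-coder-32b",
  MATH: "deepseek-r1",
  SCIENCE: "gemini-2.0-pro",
  CREATIVE: "llama-3.3-70b",
  ANALYSIS: "gemini-2.0-pro",
  GENERAL: "gemini-flash",
};

// Checked in order; first matching pattern wins.
const KEYWORDS: [Intent, RegExp][] = [
  ["CODING", /\b(code|function|debug|algorithm|python|typescript|api)\b/i],
  ["MATH", /\b(derivative|equation|integral|proof|calculate)\b/i],
  ["SCIENCE", /\b(physics|chemistry|biology|photosynthesis|molecule)\b/i],
  ["CREATIVE", /\b(story|poem|write a|poetry|narrative)\b/i],
  ["ANALYSIS", /\b(compare|versus|vs\.?|evaluate|pros and cons)\b/i],
];

export function detectIntent(message: string): Intent {
  for (const [intent, pattern] of KEYWORDS) {
    if (pattern.test(message)) return intent;
  }
  return "GENERAL"; // fall back to fast general-purpose models
}

export function routeModel(message: string): string {
  return MODEL_FOR_INTENT[detectIntent(message)];
}
```

For example, `routeModel("Compare React vs Vue")` selects the ANALYSIS model, matching the behavior shown in the examples above.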
- Download the latest `.dmg` from Releases
- Open the DMG and drag MOSP Chat to Applications
- First Launch (Important): Since this app isn't notarized, you need to bypass Gatekeeper:

Option 1: Right-click method
- Right-click (or Control-click) MOSP Chat.app
- Click "Open" from the menu
- Click "Open" in the dialog that appears
Option 2: Terminal command

```bash
xattr -cr /Applications/MOSP\ Chat.app
```

To bypass the security warning for the installer (DMG):

```bash
sudo xattr -cr 'MOSP Chat_0.1.0_aarch64.dmg'
```

Then open the app normally.
Option 3: System Settings
- Try to open the app (it will be blocked)
- Go to System Settings → Privacy & Security
- Scroll down and click "Open Anyway" next to MOSP Chat
```bash
# Clone the repository
git clone https://github.com/Mosp-TM/MOSP-Chat.git
cd mosp-chat

# Install dependencies
bun install

# Run in development mode
bun tauri dev

# Build for production
bun tauri build
```

- Install Ollama
- Run Ollama (it starts automatically on macOS)
- Select Ollama in MOSP Chat - models will be detected automatically
- If no models are installed, click "Download deepseek-r1:1.5b" to get started
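Model detection presumably talks to Ollama's documented REST API (`GET http://localhost:11434/api/tags` lists installed models). A minimal sketch, with the parsing split into a pure helper — the exact client code in `src/lib/` may differ:

```typescript
// Sketch of local model detection via Ollama's REST API.
// Endpoint and response shape follow Ollama's documented /api/tags route.
interface OllamaTagsResponse {
  models: { name: string; size: number }[];
}

// Pure helper: extract model names from a tags response.
export function modelNames(resp: OllamaTagsResponse): string[] {
  return resp.models.map((m) => m.name);
}

// Fetch the installed models from a locally running Ollama instance.
export async function listLocalModels(): Promise<string[]> {
  const res = await fetch("http://localhost:11434/api/tags");
  if (!res.ok) throw new Error(`Ollama not reachable: ${res.status}`);
  return modelNames((await res.json()) as OllamaTagsResponse);
}
```

If the returned list is empty, the app can offer the one-click `deepseek-r1:1.5b` download mentioned above.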
- Get an API key from your provider:
- Enter your API key in the provider dialog
- Start chatting!
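Under the hood, a cloud provider request is a matter of attaching your API key as a bearer token. A hypothetical sketch for OpenRouter, which exposes an OpenAI-compatible chat completions endpoint (the request builder is kept pure so it is easy to test; the real client code may differ):

```typescript
// Hypothetical sketch of assembling a chat completion request for OpenRouter.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build the URL and fetch init for a chat completion call.
export function buildChatRequest(model: string, messages: ChatMessage[], apiKey: string) {
  return {
    url: "https://openrouter.ai/api/v1/chat/completions",
    init: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiKey}`, // the key you entered in the provider dialog
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ model, messages }),
    },
  };
}
```

Sending is then just `fetch(req.url, req.init)`; the same shape works for other OpenAI-compatible providers with a different base URL.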
- Frontend: React 19, TypeScript, Tailwind CSS
- UI Components: Radix UI, shadcn/ui
- State Management: Zustand
- Desktop: Tauri 2.0 (Rust)
- Build Tool: Vite
```
mosp-chat/
├── src/                # React frontend
│   ├── components/     # UI components
│   ├── lib/            # API clients (Ollama, OpenRouter, etc.)
│   └── store/          # Zustand state management
├── src-tauri/          # Tauri/Rust backend
│   ├── src/            # Rust source code
│   └── icons/          # App icons
├── public/             # Static assets
└── dist/               # Production build output
```
Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
This project is open source and available under the MIT License.