A lightweight, simplified version of Plandex that orchestrates multiple AI agents to handle complex development tasks. Each agent specializes in a specific role, working together to break down requests, write code, review outputs, and provide architectural guidance.
- Multi-Agent Architecture: Five specialized AI agents working in harmony
- Plug & Play Setup: Simple Ollama integration - just provide an endpoint
- Role-Based Task Distribution: Each agent focuses on what they do best
- TypeScript Implementation: Type-safe, modern JavaScript development
- PDF Export: Built-in markdown-to-PDF conversion capabilities
- Lightweight & Fast: Minimal overhead, maximum efficiency
- Local AI Support: Works with locally hosted models via Ollama
"You are a Planner. Break down user requests into clear, actionable steps."
Analyzes complex requests and creates structured, step-by-step execution plans.
"You are a Coder. Write clean, working code that solves the described tasks."
Implements solutions with clean, maintainable code following best practices.
"You are a Summarizer. Summarize long outputs into concise points."
Distills lengthy outputs into clear, actionable summaries and key insights.
"You are an Architect. Propose high-level design choices and trade-offs."
Provides system design recommendations, architectural patterns, and technical trade-offs.
"You are a Reviewer. Check outputs for errors, improvements, and clarity."
Quality assurance specialist that validates outputs for correctness and suggests improvements.
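The five prompts above are plain system prompts. A minimal sketch of how they could be kept in a `roles.ts`-style map (the `AgentRole` type and `ROLE_PROMPTS` name are illustrative, not necessarily the project's actual identifiers):

```typescript
// Hypothetical role map; the real roles.ts may structure this differently.
type AgentRole = "planner" | "coder" | "summarizer" | "architect" | "reviewer";

const ROLE_PROMPTS: Record<AgentRole, string> = {
  planner: "You are a Planner. Break down user requests into clear, actionable steps.",
  coder: "You are a Coder. Write clean, working code that solves the described tasks.",
  summarizer: "You are a Summarizer. Summarize long outputs into concise points.",
  architect: "You are an Architect. Propose high-level design choices and trade-offs.",
  reviewer: "You are a Reviewer. Check outputs for errors, improvements, and clarity.",
};

// Look up the system prompt for a given role.
function systemPrompt(role: AgentRole): string {
  return ROLE_PROMPTS[role];
}
```

Keeping the prompts in a single typed map means the compiler flags any role that is missing a prompt.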
- Ollama installed and running
- A compatible language model (tested with Qwen2.5:7b)
- Node.js 16+ and npm/yarn
- TypeScript (for development)
# Visit https://ollama.ai/ for installation instructions
# Or use curl (Linux/macOS)
curl -fsSL https://ollama.ai/install.sh | sh

# Pull Qwen2.5 (recommended)
ollama pull qwen2.5:7b
# Or try other models
ollama pull llama2
ollama pull codellama

# Start the Ollama server
ollama serve

Set your Ollama endpoint in the configuration:
# Default Ollama endpoint
export OLLAMA_ENDPOINT="http://localhost:11434"

# Clone the repository
git clone https://github.com/Magnus0969/Plandex-lite.git
cd Plandex-lite
# Install dependencies
npm install
# Build TypeScript
npm run build
# Run the application
npm start
# or
node plandex.js

The system is designed to be plug and play with any Ollama endpoint. Update your config_models.json:
{
"ollama_endpoint": "http://localhost:11434",
"model": "qwen2.5:7b",
"timeout": 300,
"max_tokens": 4096,
"temperature": 0.7
}

While tested primarily with Qwen2.5:7b, Plandex-lite works with various Ollama-compatible models:
- ✅ Qwen2.5:7b (Recommended)
- ✅ Llama 2/3
- ✅ Code Llama
- ✅ Mistral
- ✅ Gemma
- ✅ Any Ollama-compatible model
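A sketch of what a single agent call against the configured endpoint might look like. The endpoint path and request fields (`model`, `system`, `prompt`, `stream`, `options.num_predict`, `options.temperature`) come from Ollama's standard `/api/generate` REST API; the `ModelConfig` shape mirrors config_models.json, and the `callAgent` helper name is illustrative:

```typescript
interface ModelConfig {
  ollama_endpoint: string;
  model: string;
  max_tokens: number;
  temperature: number;
}

// Build the JSON body for Ollama's POST /api/generate endpoint.
function buildGenerateBody(config: ModelConfig, system: string, prompt: string) {
  return {
    model: config.model,
    system,          // role prompt, e.g. the Planner prompt above
    prompt,          // the user's request
    stream: false,   // return one complete response instead of a token stream
    options: {
      num_predict: config.max_tokens,
      temperature: config.temperature,
    },
  };
}

// Hypothetical helper: send the request and return the model's text (Node 18+ fetch).
async function callAgent(config: ModelConfig, system: string, prompt: string): Promise<string> {
  const res = await fetch(`${config.ollama_endpoint}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGenerateBody(config, system, prompt)),
  });
  const data = await res.json();
  return data.response;
}
```

Setting `stream: false` keeps the client simple; switch to streaming if you want to show agent output token by token.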
Request: "Create a todo app with React and Node.js"

- Planner: Breaks down into frontend/backend tasks
- Coder: Implements React components and Express API
- Architect: Suggests project structure and data flow
- Reviewer: Validates code quality and suggests improvements
- Summarizer: Provides implementation summary and next steps
Request: "Refactor this legacy JavaScript code for better performance"

- Planner: Identifies refactoring opportunities
- Architect: Proposes modern patterns and optimizations
- Coder: Implements refactored code
- Reviewer: Compares performance and validates improvements
- Summarizer: Highlights key changes and benefits
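In both examples the agents run as a fixed pipeline, each consuming the previous agent's output. A sketch of how the orchestrator might chain them (the `Agent` type and `orchestrate` function are illustrative; the ordering is inferred from the examples above):

```typescript
// An agent takes text in and produces text out (backed by a model call in practice).
type Agent = (input: string) => Promise<string>;

// Run agents in sequence, feeding each one the previous agent's output.
async function orchestrate(request: string, agents: Agent[]): Promise<string> {
  let output = request;
  for (const agent of agents) {
    output = await agent(output);
  }
  return output;
}

// e.g. orchestrate(request, [planner, coder, reviewer, summarizer])
```

A linear pipeline keeps the control flow easy to follow; more elaborate setups could loop Reviewer feedback back into the Coder.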
plandex-lite/
├── config_models.json   # Model configuration settings
├── core.ts              # Core application logic
├── index.ts             # Main entry point
├── markdown-to-pdf.ts   # PDF generation utilities
├── package-lock.json    # Dependency lock file
├── package.json         # Project dependencies and scripts
├── plandex.js           # Main orchestrator
├── roles.ts             # Agent role definitions and prompts
├── tsconfig.json        # TypeScript configuration
└── types.ts             # TypeScript type definitions
- Define the new role in roles.ts with a specific prompt
- Update type definitions in types.ts
- Implement the agent logic in core.ts
- Update the configuration in config_models.json
- Rebuild with npm run build
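As a concrete, purely hypothetical example of the steps above, adding a "Tester" agent would touch types.ts and roles.ts roughly like this (all identifiers are illustrative; check the actual names in the source):

```typescript
// types.ts — extend the role union with the new role
// (assumes a string-union role type; the real definition may differ)
type AgentRole = "planner" | "coder" | "summarizer" | "architect" | "reviewer" | "tester";

// roles.ts — register the new prompt alongside the existing five
const TESTER_PROMPT =
  "You are a Tester. Write and run tests that verify the Coder's output.";

const ROLE_PROMPTS: Partial<Record<AgentRole, string>> = {
  tester: TESTER_PROMPT,
};
```

After wiring the new role into core.ts and config_models.json, `npm run build` will surface any role that the type system knows about but the prompt map does not.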
- Fork the repository
- Create a feature branch: git checkout -b feature-name
- Commit changes: git commit -am 'Add feature'
- Push to branch: git push origin feature-name
- Submit a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- Inspired by the original Plandex project
- Built with Ollama for local AI model hosting
- Tested extensively with Qwen2.5 models
- Report Issues
- Feature Requests
- Documentation
Ready to orchestrate your development workflow?
Just point Plandex-lite to your Ollama endpoint and watch the agents collaborate to tackle your most complex development challenges!