♾️ Helix is a private GenAI stack for building AI agents with declarative pipelines, knowledge (RAG), API bindings, and first-class testing.

logo

SaaS • Private Deployment • Docs • Discord

HelixML - AI Agents on a Private GenAI Stack

👥 Discord

Deploy AI agents in your own data center or VPC and retain complete data security & control.

HelixML is an enterprise-grade platform for building and deploying AI agents with support for RAG (Retrieval-Augmented Generation), API calling, vision, and multi-provider LLM support. Build and deploy LLM applications by writing a simple helix.yaml configuration file.
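
For illustration, a minimal agent definition in helix.yaml might look like the sketch below. It loosely follows the aispec app format used in Helix examples; treat the apiVersion, kind, and field names as indicative and check the docs for the current schema:

# illustrative helix.yaml sketch; field names may differ from the real schema
apiVersion: app.aispec.org/v1alpha1
kind: AIApp
metadata:
  name: support-agent
spec:
  assistants:
    - name: Support Agent
      model: llama3:instruct
      system_prompt: |
        You are a friendly support agent. Answer questions using
        the attached knowledge and tools.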

Our intelligent GPU scheduler packs models efficiently into available GPU memory and dynamically loads and unloads models based on demand, optimizing resource utilization.

✨ Key Features

🤖 AI Agents

  • Easy-to-use Web UI for agent interaction and management
  • Session-based architecture with pause/resume capabilities
  • Multi-step reasoning with tool orchestration
  • Memory management for context-aware interactions
  • Support for multiple LLM providers (OpenAI, Anthropic, and local models)
AI Agents Interface

🛠️ Skills and Tools

  • REST API integration with OpenAPI schema support
  • MCP (Model Context Protocol) server compatibility
  • GPTScript integration for advanced scripting
  • OAuth token management for secure third-party access
  • Custom tool development with flexible SDK
Skills and Tools
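
Continuing the helix.yaml sketch above, a REST API could be exposed to an assistant as a tool by pointing it at an OpenAPI schema, roughly like this (illustrative only; the names, URL, and exact field names below are placeholders):

      # under spec.assistants[0] in the sketch above; field names are illustrative
      apis:
        - name: jobs-api
          description: Query the internal jobs service
          url: https://jobs.example.com/api/v1
          schema: ./openapi/jobs.yaml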

📚 Knowledge Management

  • Built-in document ingestion (PDFs, Word, text files)
  • Web scraper for automatic content extraction
  • Multiple RAG backends: Typesense, Haystack, PGVector, LlamaIndex
  • Vector embeddings with PGVector for semantic search
  • Vision RAG support for multimodal content
Knowledge Base

Main use cases:

  • Upload and analyze corporate documents
  • Add website documentation URLs to create instant customer support agents
  • Build knowledge bases from multiple sources
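
As a sketch, a web-sourced knowledge base could be declared in helix.yaml along these lines (the field names are illustrative; consult the knowledge docs for the exact structure):

      # under spec.assistants[0] in the sketch above; field names are illustrative
      knowledge:
        - name: product-docs
          source:
            web:
              urls:
                - https://docs.example.com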

🔍 Tracing and Observability

Context is everything. Agents can process tens of thousands of tokens per step, so Helix provides complete visibility into what happens under the hood:

Tracing Interface

Tracing features:

  • View all agent execution steps
  • Inspect requests and responses to LLM providers, third-party APIs, and MCP servers
  • Real-time token usage tracking
  • Pricing and cost analysis
  • Performance metrics and debugging

🚀 Additional Features

  • Multi-tenancy with organization, team, and role-based access control
  • Scheduled tasks and cron jobs
  • Webhook triggers for event-driven workflows
  • Evaluation framework for testing and quality assurance
  • Payment integration with Stripe support
  • Notifications via Slack, Discord, and email
  • Keycloak authentication with OAuth and OIDC support

πŸ—οΈ Architecture

HelixML uses a microservices architecture with the following components:

┌──────────────────────────────────────────────────────────┐
│                     Frontend (React)                     │
│                    Vite + TypeScript                     │
└─────────────────────────────┬────────────────────────────┘
                              │
┌─────────────────────────────▼────────────────────────────┐
│                 API / Control Plane (Go)                 │
│  ┌──────────────┬───────────────┬─────────────────────┐  │
│  │   Agents     │  Knowledge    │  Auth & Sessions    │  │
│  │   Skills     │  RAG Pipeline │  Organizations      │  │
│  │   Tools      │  Vector DB    │  Usage Tracking     │  │
│  └──────────────┴───────────────┴─────────────────────┘  │
└─────────┬───────────────────────────────────┬────────────┘
          │                                   │
┌─────────▼──────────┐              ┌─────────▼──────────┐
│   PostgreSQL       │              │   GPU Runners      │
│   + PGVector       │              │   Model Scheduler  │
└─────────┬──────────┘              └────────────────────┘
          │
┌─────────▼────────────────────────────────────────────────┐
│  Supporting Services: Keycloak, Typesense, Haystack,     │
│  GPTScript Runner, Chrome/Rod, Tika, SearXNG             │
└──────────────────────────────────────────────────────────┘

Four-level agent hierarchy:

  1. Session: Manages agent lifecycle and state
  2. Agent: Coordinates skills and handles LLM interactions
  3. Skills: Group related tools for specific capabilities
  4. Tools: Individual actions (API calls, functions, scripts)

💻 Tech Stack

Backend

  • Go 1.24.0 - Main backend language
  • PostgreSQL + PGVector - Data storage and vector embeddings
  • GORM - ORM for database operations
  • Gorilla Mux - HTTP routing
  • Keycloak - Identity and access management
  • NATS - Message queue
  • Zerolog - Structured logging

Frontend

  • React 18.3.1 - UI framework
  • TypeScript - Type-safe JavaScript
  • Material-UI (MUI) - Component library
  • MobX - State management
  • Vite - Build tool
  • Monaco Editor - Code editing

AI/ML

  • OpenAI SDK - GPT models integration
  • Anthropic SDK - Claude models integration
  • LangChain Go - LLM orchestration
  • GPTScript - Scripting capabilities
  • Typesense / Haystack / LlamaIndex - RAG backends

Infrastructure

  • Docker & Docker Compose - Containerization
  • Kubernetes + Helm - Orchestration
  • Flux - GitOps operator

🚀 Quick Start

Install on Docker

Use our quickstart installer:

curl -sL -O https://get.helixml.tech/install.sh
chmod +x install.sh
sudo ./install.sh

The installer will prompt you before making changes to your system. By default, the dashboard will be available on http://localhost:8080.

To set up a deployment with a DNS name, see ./install.sh --help or read the detailed docs, which also cover easy TLS termination.

Next steps:

Install on Kubernetes

Use our Helm charts for production deployments:
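
A typical install looks roughly like the sketch below; the repository URL, chart, and release names are placeholders, so take the real values from the Helm chart documentation:

# placeholder repo URL and chart name; see the Helm chart docs for the real ones
helm repo add helix https://charts.example.com/helix
helm repo update
helm install helix-controlplane helix/helix-controlplane \
  --namespace helix --create-namespace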

🔧 Configuration

All server configuration is done via environment variables. You can find the complete list of configuration options in api/pkg/config/config.go.

Key environment variables:

  • OPENAI_API_KEY - OpenAI API credentials
  • ANTHROPIC_API_KEY - Anthropic API credentials
  • POSTGRES_* - Database connection settings
  • KEYCLOAK_* - Authentication settings
  • SERVER_URL - Public URL for the deployment
  • RUNNER_* - GPU runner configuration

See the configuration documentation for detailed setup instructions.
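
As an example, a minimal environment file for a Docker Compose deployment might look like this. The values are placeholders, and the specific POSTGRES_*, KEYCLOAK_*, and RUNNER_* variable names below are illustrative; the authoritative list lives in api/pkg/config/config.go:

SERVER_URL=https://helix.example.com
OPENAI_API_KEY=<your-openai-key>        # optional if you only use local runners
ANTHROPIC_API_KEY=<your-anthropic-key>  # optional
POSTGRES_HOST=postgres                  # illustrative POSTGRES_* setting
POSTGRES_USER=helix                     # illustrative POSTGRES_* setting
POSTGRES_PASSWORD=<choose-a-password>   # illustrative POSTGRES_* setting
KEYCLOAK_URL=http://keycloak:8080/auth  # illustrative KEYCLOAK_* setting
RUNNER_TOKEN=<shared-runner-token>      # illustrative RUNNER_* setting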

👨‍💻 Development

For local development, refer to the Helix local development guide.

Prerequisites:

  • Docker Desktop (or Docker + Docker Compose)
  • Go 1.24.0+
  • Node.js 18+
  • Make

Quick development setup:

# Clone the repository
git clone https://github.com/helixml/helix.git
cd helix

# Start supporting services
docker-compose up -d postgres keycloak

# Run the backend
cd api
go run . serve

# Run the frontend (in a new terminal)
cd frontend
npm install
npm run dev

See local-development.md for comprehensive setup instructions.

📖 Documentation

🤝 Contributing

We welcome contributions! Please see our Contributing Guide for details.

By contributing, you confirm that:

  • Your changes will fall under the same license
  • Your changes will be owned by HelixML, Inc.

📄 License

Helix is licensed under a license similar to Docker Desktop's. You can run the source code (in this repo) for free for:

  • Personal Use: Individuals or people personally experimenting
  • Educational Use: Schools and universities
  • Small Business Use: Companies with under $10M annual revenue and fewer than 250 employees

If you fall outside of these terms, please use the Launchpad to purchase a license for large commercial use. Trial licenses are available for experimentation.

You are not allowed to use our code to build a product that competes with us.

Why these license clauses?

  • We generate revenue to support the development of Helix. We are an independent software company.
  • We don't want cloud providers to take our open source code and build a rebranded service on top of it.

If you would like to use some part of this code under a more permissive license, please get in touch.

🆘 Support

🌟 Star History

If you find Helix useful, please consider giving us a star on GitHub!


Built with ❤️ by HelixML, Inc.