# Omniclaw

Build persistent AI agents that run 24/7. Local LLM for routine tasks, cloud API for complex ones. Monitor your projects, track deadlines, write code, deliver standup summaries — all from a Docker container.

## What is this?

Omniclaw is an open-source framework for building persistent AI agents — ones that don't disappear when you close a tab. Your agent runs in Docker, remembers what happened yesterday, checks your Jira boards on a schedule, and messages you on Slack when something needs attention.

It's built on OpenClaw, an open-source AI agent daemon that handles the hard parts: sessions, channels, model routing, and tool execution.

Who it's for: Developers who want an AI assistant that works autonomously — checking boards, monitoring repos, tracking deadlines — without babysitting it.

## Quick Start

```bash
# Clone the repo
git clone https://github.com/engindearing-projects/omniclaw.git
cd omniclaw

# Run the setup wizard (needs Node.js)
node scripts/wizard.mjs

# Start everything
docker compose up -d
```

The wizard walks you through configuring your agent name, API keys, and integrations. After setup, your agent is live at http://localhost:18789.

No Node.js? Run `./scripts/setup.sh` instead — it copies example configs for you to edit manually.

## Features

- **Local-first AI** — Ollama (llama3.1/3.2) handles routine tasks for free; Claude Sonnet kicks in for complex work
- **Jira tracking** — Monitor boards, surface overdue tickets, track critical paths
- **GitHub monitoring** — Watch PRs, CI status, merge conflicts across repos
- **Deadline alerts** — Proactive warnings when timelines slip
- **Daily standups** — Morning briefs delivered to Slack with overnight activity + priorities
- **Autonomous code work** — Create branches, write code, run tests, open PRs (with guardrails)
- **Cron scheduling** — Run checks on exact schedules (8am standup, 2pm follow-up)
- **Heartbeat system** — Periodic background checks every 30 minutes
- **Multi-channel** — Slack, Telegram, CLI
- **Memory system** — Daily logs + long-term memory that persists across sessions
- **MCP bridge** — Expose your agent as MCP tools for use in other AI workflows
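For reference, the schedules mentioned above map onto standard five-field cron expressions (the actual job definitions live under `cron/` — see Architecture below; these lines are illustrative only, not the repo's job format):

```
0 8 * * 1-5     # 8:00 am on weekdays — morning standup
0 14 * * 1-5    # 2:00 pm on weekdays — afternoon follow-up
*/30 * * * *    # every 30 minutes — a heartbeat-style check
```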

## Architecture

Omniclaw runs as two Docker containers:

```
┌─────────────────────────────────┐
│  omniclaw container             │
│  ┌───────────────────────────┐  │
│  │  OpenClaw daemon          │  │
│  │  ├─ Gateway (port 18789)  │  │
│  │  ├─ Agents & Sessions     │  │
│  │  ├─ Cron scheduler        │  │
│  │  └─ Channel connectors    │  │
│  └───────────────────────────┘  │
│  Volumes:                       │
│    config/ → system config      │
│    workspace/ → personality     │
│    cron/ → scheduled jobs       │
│    memory/ → persistent memory  │
└─────────────────────────────────┘
         │
         ▼
┌─────────────────────────────────┐
│  ollama container               │
│  llama3.1 / llama3.2            │
│  (port 11434)                   │
└─────────────────────────────────┘
```

See ARCHITECTURE.md for the full system design.
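The two-container layout corresponds roughly to a Compose file like the one below. This is a hedged sketch to make the diagram concrete — the repo's own `docker-compose.yml` is authoritative, and the container-side paths (`/app/...`) and build settings here are assumptions:

```yaml
services:
  omniclaw:
    build: .
    ports:
      - "18789:18789"              # OpenClaw gateway
    volumes:
      - ./config:/app/config       # system config
      - ./workspace:/app/workspace # agent personality + skills
      - ./cron:/app/cron           # scheduled jobs
      - ./memory:/app/memory       # persistent memory
    depends_on:
      - ollama

  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"              # Ollama API
```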

## Skills

Skills are modular capabilities defined in `workspace/skills/`. Each skill has a `SKILL.md` that documents its workflow, API usage, and guardrails.

| Skill | Description |
| --- | --- |
| `jira-tracker` | Query Jira boards, report status, track blockers |
| `github-manager` | Monitor PRs, CI status, merge conflicts |
| `deadline-alerter` | Track critical paths, alert on timeline slippage |
| `standup-summarizer` | Generate daily morning briefs |
| `code-worker` | Autonomous coding with branch/test/PR workflow |

### Creating Your Own Skill

1. Create `workspace/skills/your-skill/SKILL.md`
2. Add YAML frontmatter with `name`, `description`, and `requirements`
3. Document the workflow, commands, and response format
4. Enable it in `config/openclaw.json` under `skills.entries`

See existing skills for the pattern.
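To make the steps concrete, here is a minimal sketch of what such a file might look like. The skill name (`release-watcher`) and everything below the frontmatter are hypothetical — copy an existing skill for the real pattern, since only `name`, `description`, and `requirements` are named by the steps above:

```markdown
---
name: release-watcher
description: Watch upstream repos and report new releases
requirements:
  - GITHUB_TOKEN
---

# release-watcher

## Workflow
1. List the watched repos from config.
2. Compare each repo's latest release tag against memory.
3. Post a one-line summary to Slack when anything changed.

## Guardrails
Read-only: query releases, never mutate repos.
```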

## Configuration

Omniclaw uses a layered personality system:

| File | Purpose |
| --- | --- |
| `workspace/SOUL.md` | Universal principles — who the agent fundamentally is |
| `workspace/AGENTS.md` | Workspace behavior — how to use memory, tools, channels |
| `config/IDENTITY.md` | Agent-specific identity — name, orgs, boards, guardrails |
| `workspace/IDENTITY.md` | Personality — vibe, avatar, self-description |
| `workspace/USER.md` | Human context — your name, timezone, work style |
| `workspace/MEMORY.md` | Learned knowledge — grows over time |

The `.example` files are templates. The setup wizard generates real configs from them.
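If you skip the wizard, the manual path is just copying each template next to itself. A minimal sketch of that copy step, run here against a throwaway sandbox directory — whether `./scripts/setup.sh` skips existing configs, as this does, is an assumption:

```shell
# Copy each *.example template to its live filename,
# leaving any already-existing config untouched (assumed behavior).
set -eu
dir=$(mktemp -d)                       # demo sandbox instead of workspace/
touch "$dir/SOUL.md.example" "$dir/USER.md.example"

for tpl in "$dir"/*.example; do
  target="${tpl%.example}"             # SOUL.md.example -> SOUL.md
  [ -e "$target" ] || cp "$tpl" "$target"
done
```

After the loop, `SOUL.md` and `USER.md` exist alongside their `.example` templates, ready to edit.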

## Integrations

| Integration | Setup Guide | Required Env Vars |
| --- | --- | --- |
| Jira | Get token from Atlassian API tokens | `JIRA_BASE_URL`, `JIRA_EMAIL`, `JIRA_API_TOKEN` |
| GitHub | Create personal access token with `repo` scope | `GITHUB_TOKEN` |
| Slack | See `workspace/setup/slack-setup.md` | `SLACK_APP_TOKEN`, `SLACK_BOT_TOKEN` |
| Telegram | Create bot via @BotFather | `TELEGRAM_BOT_TOKEN` |
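Collected in one place, those variables typically land in a `.env` file next to `docker-compose.yml`, which Docker Compose reads automatically. Every value below is a placeholder — whether this repo's compose file expects a `.env` or some other mechanism is an assumption:

```
# Jira
JIRA_BASE_URL=https://your-org.atlassian.net
JIRA_EMAIL=you@example.com
JIRA_API_TOKEN=your-token-here

# GitHub
GITHUB_TOKEN=your-token-here

# Slack
SLACK_APP_TOKEN=xapp-your-token
SLACK_BOT_TOKEN=xoxb-your-token

# Telegram
TELEGRAM_BOT_TOKEN=your-token-here
```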

## MCP Bridge

The MCP bridge lets you use your Omniclaw agent as a set of MCP tools from other AI workflows (like Claude Code). It connects via WebSocket to the OpenClaw gateway.

```bash
# Install bridge dependencies
cd mcp-bridge && npm install
```

Then add the server to your MCP config (`.mcp.json`):

```json
{
  "mcpServers": {
    "omniclaw": {
      "command": "node",
      "args": ["./mcp-bridge/index.mjs"]
    }
  }
}
```

## Deploying to a Server

```bash
# From your local machine
./scripts/deploy-to-ubuntu.sh user@your-server

# On the server
cd ~/omniclaw
./scripts/setup.sh
```

For GPU-accelerated Ollama on Ubuntu, uncomment the deploy section in docker-compose.yml.
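For orientation, the standard Compose syntax for reserving NVIDIA GPUs looks like the fragment below — the repo's own commented section is authoritative, so treat this as a sketch of the general mechanism rather than the exact lines to uncomment:

```yaml
  ollama:
    image: ollama/ollama
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
```

This requires the NVIDIA Container Toolkit on the host so Docker can expose the GPU to the container.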

## Contributing

See CONTRIBUTING.md for development setup, skill creation guide, and PR process.

## License

MIT — see LICENSE for details.
