
MCP Any: Configuration-Driven MCP Server

One server, infinite possibilities.

MCP Any revolutionizes how you interact with the Model Context Protocol (MCP). It is not just another MCP proxy or aggregator—it is a powerful Universal Adapter that turns any API into an MCP-compliant server through simple configuration.

Traditional MCP adoption requires running a separate server binary for every tool or service you want to expose. This leads to "binary fatigue," complex local setups, and maintenance nightmares.

MCP Any solves this with a Single Binary approach:

  1. Install once: Run a single mcpany server instance.
  2. Configure everything: Load lightweight YAML/JSON configurations to enable different APIs (REST, gRPC, GraphQL, command-line) as MCP capabilities.
  3. Run anywhere: No need for npx, python, or language-specific runtimes for each tool.

❓ Philosophy: Configuration over Code

We believe you shouldn't have to write and maintain new code just to expose an existing API to your AI assistant.

  • Metamcp / Onemcp vs. MCP Any: While other tools proxy existing MCP servers (the aggregator pattern), MCP Any creates MCP servers from scratch on top of your existing upstream APIs.
  • No More "Sidecar hell": Instead of running 10 different containers for 10 different tools, run 1 mcpany container loaded with 10 config files.
  • Ops Friendly: Centralize authentication, rate limiting, and observability in one robust layer.

Comparison with Traditional MCP Servers

Unlike traditional "Wrapper" MCP servers (like mcp-server-postgres, mcp-server-github, etc.) which are compiled binaries dedicated to a single service, MCP Any is a generic runtime.

| Feature | Traditional MCP Server (e.g., mcp-server-postgres) | MCP Any |
| --- | --- | --- |
| Architecture | Code-Driven Wrapper: wraps internal API calls with MCP annotations. | Config-Driven Adapter: maps existing API endpoints to MCP tools via config. |
| Deployment | 1 Binary per Service: need 10 different binaries for 10 services. | 1 Binary for All: one mcpany binary handles N services. |
| Updates | Recompile & Redistribute: internal API change = new binary release. | Update Config: API change = edit YAML/JSON file & reload. |
| Maintenance | High: manage dependencies/versions for N projects. | Low: upgrade one core server; just swap config files. |
| Extensibility | Write code (TypeScript/Python/Go). | Write YAML/JSON. |

Most "popular" MCP servers today are bespoke binaries. If the upstream API changes, you must wait for the maintainer to update the code, release a new version, and then you must redeploy. With MCP Any, you simply update your configuration file to match the new API signature—zero downtime, zero recompilation.

✨ Key Features

  • Dynamic Config Reloading: Automatically detects changes to configuration files (including atomic saves) and hot-swaps the registry without restarting the server.
  • Dynamic Tool Registration & Auto-Discovery: Automatically discover and register tools from various backend services. For gRPC and OpenAPI, simply provide the server URL or spec URL—MCP Any handles the rest (no manual tool definition required).
  • Multiple Service Types: Supports a wide range of service types, including:
    • gRPC: Register services from .proto files or by using gRPC reflection.
    • OpenAPI: Ingest OpenAPI (Swagger) specifications to expose RESTful APIs as tools.
    • HTTP: Expose any HTTP endpoint as a tool.
    • GraphQL: Expose a GraphQL API as a set of tools, with the ability to customize the selection set for each query.
    • SQL: Connect to SQL databases (Postgres, SQLite, MySQL) and expose safe queries as tools.
    • WebSocket: Connect to WebSocket servers.
    • WebRTC: Connect to WebRTC services.
  • Advanced Service & Safety Policies:
    • Safety: Control which tools are exposed to the AI to limit context (reduce hallucinations) and prevent dangerous actions (e.g., blocking DELETE operations).
    • Performance: Configure Caching and Rate Limiting to optimize performance and protect upstream services.
    • Semantic Caching: Intelligent caching using vector embeddings to serve similar requests from cache, with optional SQLite persistence so the cache survives restarts.
    • Audit Logging: Keep a tamper-evident record of all tool executions in a JSON file or SQLite database (using SHA-256 hash chaining) for compliance and security auditing.
  • Network Topology Visualization: Visualizes your entire MCP ecosystem (Clients, Core, Services, Tools, API Calls) in a 5-level hierarchical interactive graph with real-time QPS and Latency metrics.
  • MCP Any Proxy: Proxy and re-expose tools from another MCP Any instance.
  • MCP Sampling Support: Enables upstream tools to request sampling (LLM generation) from the connected client, fully supported via mcp.Client options.
  • Upstream Authentication: Securely connect to your backend services using:
    • API Keys
    • Bearer Tokens
    • Basic Auth
    • mTLS
  • Unified API: Interact with all registered tools through a single, consistent API based on the Model Context Protocol.
  • Multi-User & Multi-Profile: Securely support multiple users with distinct profiles, each with its own set of enabled services and granular authentication.
  • Advanced Configuration: Customize tool behavior with Merge Strategies and Profile Filtering.
  • Extensible: Designed to be easily extended with new service types and capabilities.
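
As a rough illustration of how several of these features combine in one file, a service entry with safety and performance policies might look like the sketch below. The field names here are assumptions for illustration, not the authoritative schema; see server/docs/reference/configuration.md for the real reference.

```yaml
# Illustrative sketch only -- field names are assumed, not the authoritative schema.
services:
  - name: orders-api
    type: openapi
    spec_url: https://orders.example.com/openapi.json   # hypothetical upstream
    auth:
      bearer_token: ${ORDERS_TOKEN}
    policy:
      deny_tools: ["*delete*"]          # block dangerous operations
      rate_limit:
        requests_per_minute: 60         # protect the upstream service
      cache:
        ttl: 30s                        # briefly cache identical calls
```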

⚡ Quick Start (5 Minutes)

Ready to give your AI access to real-time data? Let's connect a public Weather API to Gemini CLI (or any MCP client) using MCP Any.

1. Prerequisites

  • Go: Ensure you have Go installed (1.23+ recommended).
  • Gemini CLI: If not installed, see the installation guide.

(Prefer building from source? See Getting Started for build instructions.)

2. Configuration

We will use the pre-built wttr.in configuration available in the examples directory: server/examples/popular_services/wttr.in/config.yaml.
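
For orientation, an HTTP service config of this kind has roughly the following shape. This is a simplified sketch with assumed field names; the file in the repository is the authoritative version.

```yaml
# Simplified sketch -- consult the actual config.yaml in the repository
# for the real field names and tool definitions.
services:
  - name: wttrin
    type: http
    base_url: https://wttr.in
    tools:
      - name: get_weather
        method: GET
        path: /{location}               # weather report for a location
      - name: get_moon_phase
        method: GET
        path: /moon                     # ASCII-art moon phase
```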

Quick Start: Weather Service

  1. Run the Server:

    Choose one of the following methods to run the server.

    Option 1: Remote Configuration (Recommended)

    Fastest way to get started. No need to clone the repository.

    docker run -d --rm --name mcpany-server \
      -p 50050:50050 \
      ghcr.io/mcpany/server:dev-latest \
      run --config-path https://raw.githubusercontent.com/mcpany/core/main/server/examples/popular_services/wttr.in/config.yaml

    Option 2: Local Configuration

    Best if you want to modify the configuration or use your own. Requires cloning the repository.

    # Clone the repository
    git clone https://github.com/mcpany/core.git
    cd core
    
    # Run with local config mounted
    docker run -d --rm --name mcpany-server \
      -p 50050:50050 \
      -v $(pwd)/server/examples/popular_services/wttr.in/config.yaml:/config.yaml \
      ghcr.io/mcpany/server:dev-latest \
      run --config-path /config.yaml

    Tip: Need detailed logs? Add the --debug flag to the end of the run command.

  2. Connect Gemini CLI:

    gemini mcp add --transport http --trust mcpany http://localhost:50050
  3. Chat!

    Ask your AI about the weather:

    gemini -m gemini-2.5-flash -p "What is the weather in London?"

    Behind the scenes:

    1. The AI calls the tool (e.g., wttrin_<hash>.get_weather).
    2. mcpany proxies the request to https://wttr.in.
    3. The AI receives the JSON response and answers your question!

Ask about the moon phase:

gemini -m gemini-2.5-flash -p "What is the moon phase?"

Behind the scenes:

  1. The AI calls the get_moon_phase tool.
  2. mcpany proxies the request to https://wttr.in/moon.
  3. The AI receives the ASCII art response and describes it!

For more complex examples, including gRPC, OpenAPI, and authentication, check out server/docs/reference/configuration.md.
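
To give a flavor of those, a gRPC service discovered via reflection and secured with mTLS might be sketched like this. The field names are assumptions; the configuration reference linked above is authoritative.

```yaml
# Illustrative sketch only -- see the configuration reference for real fields.
services:
  - name: inventory
    type: grpc
    address: inventory.internal:9090
    reflection: true                    # auto-discover services via gRPC reflection
    auth:
      mtls:
        cert_file: /certs/client.crt
        key_file: /certs/client.key
```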

💡 More Usage

Once the server is running, you can interact with it using its JSON-RPC API.
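
For example, listing the registered tools is a standard MCP tools/list call. The request body below follows the MCP JSON-RPC format; POST it to the server's HTTP endpoint (assumed here to be http://localhost:50050, as in the Quick Start).

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/list",
  "params": {}
}
```

The response carries a result.tools array describing each tool's name, description, and the JSON Schema for its inputs.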

🛠️ Development Guide

We welcome contributions to MCP Any! This section provides a brief overview of how to set up your development environment. For more detailed information, including code structure, service registration, and debugging tips, please refer to the Developer Guide.

Prerequisites

  • Go: Version 1.23+
  • Docker: For running tests and building images.
  • Make: For running build automation scripts.

Quick Setup

  1. Clone the repository:

    git clone https://github.com/mcpany/core.git
    cd core
  2. Install dependencies and tools: Run the following command to set up your environment (installs protoc, linters, etc.):

    make prepare

Common Commands

  • Build: make build (Binary will be at build/bin/server)
  • Test: make test (Runs unit, integration, and E2E tests)
  • Lint: make lint (Runs golangci-lint and other checks)
  • Generate: make gen (Regenerates code from Protocol Buffers)
  • Clean: make clean (Removes build artifacts)

Running Locally

After building, you can run the server locally:

./build/bin/server run --config-path server/examples/popular_services/wttr.in/config.yaml

Project Structure

The project is organized as follows:

  • server/cmd/: Application entry points.
    • server/: The main MCP Any server binary.
  • server/pkg/: Core library code.
    • app/: Application lifecycle and wiring.
    • config/: Configuration loading and validation.
    • mcpserver/: Core MCP protocol implementation.
    • upstream/: Adapters for upstream services (gRPC, HTTP, OpenAPI, Filesystem, etc.).
  • proto/: Protocol Buffer definitions for configuration and internal APIs.
  • server/examples/: Example configuration files and demo services.
  • server/docs/: Detailed documentation and guides.

Code Standards

We strive for high code quality. Please ensure the following before submitting a PR:

  • Documentation: All public functions, methods, and types must have comments explaining their purpose, parameters, and return values. You can verify coverage with:
    go run server/tools/check_doc.go server/
  • Testing: Add unit tests for new functionality. Run all tests with:
    make test
  • Linting: Ensure the code is linted and formatted correctly:
    make lint

🤝 Contributing

Contributions are welcome! Please feel free to open an issue or submit a pull request.

🗺️ Roadmap

Check out our Roadmap to see what we're working on and what's coming next.

📄 License

This project is licensed under the terms of the LICENSE file.
