One server, infinite possibilities.
MCP Any revolutionizes how you interact with the Model Context Protocol (MCP). It is not just another MCP proxy or aggregator—it is a powerful Universal Adapter that turns any API into an MCP-compliant server through simple configuration.
Traditional MCP adoption requires running a separate server binary for every tool or service you want to expose. This leads to "binary fatigue," complex local setups, and maintenance nightmares.
MCP Any solves this with a Single Binary approach:
- Install once: Run a single `mcpany` server instance.
- Configure everything: Load lightweight YAML/JSON configurations to enable different APIs (REST, gRPC, GraphQL, command-line).
- Run anywhere: No need for `npx`, `python`, or language-specific runtimes for each tool.
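To make "configure everything" concrete, here is a purely illustrative sketch of what such a configuration could look like. The field names below are invented for illustration; the real schema is documented in the Configuration Reference and in the examples directory.

```yaml
# Hypothetical sketch only — field names are illustrative, not the real schema.
# The idea: declare an upstream HTTP API and the tools it should expose.
services:
  - name: weather
    type: http
    base_url: https://wttr.in
    tools:
      - name: get_weather
        description: Fetch the current weather for a city
        method: GET
        path: /{city}?format=j1
        params:
          - name: city
            type: string
            required: true
```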
We believe you shouldn't have to write and maintain new code just to expose an existing API to your AI assistant.
- Metamcp / Onemcp vs. MCP Any: While other tools proxy existing MCP servers (the aggregator pattern), MCP Any creates them from scratch on top of your existing upstream APIs.
- No more "sidecar hell": Instead of running 10 different containers for 10 different tools, run one `mcpany` container loaded with 10 config files.
- Ops friendly: Centralize authentication, rate limiting, and observability in one robust layer.
Unlike traditional "Wrapper" MCP servers (like mcp-server-postgres, mcp-server-github, etc.) which are compiled binaries dedicated to a single service, MCP Any is a generic runtime.
| Feature | Traditional MCP Server (e.g., `mcp-server-postgres`) | MCP Any |
|---|---|---|
| Architecture | Code-Driven Wrapper: Wraps internal API calls with MCP annotations. | Config-Driven Adapter: Maps existing API endpoints to MCP tools via config. |
| Deployment | 1 Binary per Service: Need 10 different binaries for 10 services. | 1 Binary for All: One mcpany binary handles N services. |
| Updates | Recompile & Redistribute: Internal API change = New Binary release. | Update Config: API change = Edit YAML/JSON file & reload. |
| Maintenance | High: Manage dependencies/versions for N projects. | Low: Upgrade one core server; just swap config files. |
| Extensibility | Write code (TypeScript/Python/Go). | Write JSON/YAML. |
Most "popular" MCP servers today are bespoke binaries. If the upstream API changes, you must wait for the maintainer to update the code and release a new version, and then redeploy. With MCP Any, you simply update your configuration file to match the new API signature: zero downtime, zero recompilation.
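For instance, if an upstream service renamed a query parameter, the fix would be a one-line config edit rather than a rebuild. A hypothetical fragment (invented field names, not the real schema):

```yaml
# Hypothetical sketch — illustrative field names, not the real schema.
tools:
  - name: get_weather
    # Upstream renamed ?q= to ?query=; edit and reload, no recompilation.
    path: /current?query={city}   # was: /current?q={city}
```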
- Dynamic Config Reloading: Automatically detects changes to configuration files (including atomic saves) and hot-swaps the registry without restarting the server.
- Dynamic Tool Registration & Auto-Discovery: Automatically discover and register tools from various backend services. For gRPC and OpenAPI, simply provide the server URL or spec URL—MCP Any handles the rest (no manual tool definition required).
- Multiple Service Types: Supports a wide range of service types, including:
  - gRPC: Register services from `.proto` files or via gRPC reflection.
  - OpenAPI: Ingest OpenAPI (Swagger) specifications to expose RESTful APIs as tools.
  - HTTP: Expose any HTTP endpoint as a tool.
  - GraphQL: Expose a GraphQL API as a set of tools, with the ability to customize the selection set for each query.
  - SQL: Connect to SQL databases (Postgres, SQLite, MySQL) and expose safe queries as tools.
  - WebSocket: Connect to WebSocket servers.
  - WebRTC: Connect to WebRTC services.
- Advanced Service & Safety Policies:
  - Safety: Control which tools are exposed to the AI to limit context (reducing hallucinations) and prevent dangerous actions (e.g., blocking `DELETE` operations).
  - Performance: Configure caching and rate limiting to optimize performance and protect upstream services.
- Semantic Caching: Intelligent caching that uses vector embeddings to serve similar requests from cache, with optional SQLite persistence to survive restarts.
- Audit Logging: Keep a tamper-evident record of all tool executions in a JSON file or SQLite database (using SHA-256 hash chaining) for compliance and security auditing.
- Network Topology Visualization: Visualizes your entire MCP ecosystem (Clients, Core, Services, Tools, API Calls) in a 5-level hierarchical interactive graph with real-time QPS and Latency metrics.

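The audit log's tamper evidence comes from SHA-256 hash chaining: each record's hash covers the previous record's hash, so editing any past record invalidates every hash after it. A minimal sketch of the mechanism (illustrative only; not MCP Any's actual log format):

```bash
# Minimal hash-chain sketch (illustrative; not MCP Any's actual log format).
prev=0000000000000000000000000000000000000000000000000000000000000000
for rec in '{"tool":"get_weather","args":{"city":"London"}}' \
           '{"tool":"get_moon_phase","args":{}}'; do
  # Each new hash covers the previous hash plus the record itself,
  # so any edit to an earlier record breaks every later hash.
  prev=$(printf '%s%s' "$prev" "$rec" | sha256sum | awk '{print $1}')
  printf '%s  %s\n' "$prev" "$rec"
done
```

Verifying the chain is the same loop in reverse: recompute each hash and compare against the stored value.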
- MCP Any Proxy: Proxy and re-expose tools from another MCP Any instance.
- MCP Sampling Support: Enables upstream tools to request sampling (LLM generation) from the connected client, fully supported via `mcp.Client` options.
- Upstream Authentication: Securely connect to your backend services using:
  - API Keys
  - Bearer Tokens
  - Basic Auth
  - mTLS
- Unified API: Interact with all registered tools through a single, consistent API based on the Model Context Protocol.
- Multi-User & Multi-Profile: Securely support multiple users with distinct profiles, each with its own set of enabled services and granular authentication.
- Advanced Configuration: Customize tool behavior with Merge Strategies and Profile Filtering.
- Extensible: Designed to be easily extended with new service types and capabilities.
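Upstream credentials, like everything else, are declared in configuration rather than code. A hypothetical fragment for bearer-token authentication (invented field names; see the Configuration Reference for the real schema):

```yaml
# Hypothetical sketch — illustrative field names, not the real schema.
services:
  - name: internal-api
    type: http
    base_url: https://api.internal.example.com
    auth:
      type: bearer
      token_env: INTERNAL_API_TOKEN  # read from the environment, never committed
```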
Ready to give your AI access to real-time data? Let's connect a public Weather API to Gemini CLI (or any MCP client) using MCP Any.
- Go: Ensure you have Go installed (1.23+ recommended) if you plan to build from source.
- Docker: Required for the quickstart below, which runs the pre-built server image.
- Gemini CLI: If not installed, see the installation guide.
(Prefer building from source? See Getting Started for build instructions.)
We will use the pre-built wttr.in configuration available in the examples directory: server/examples/popular_services/wttr.in/config.yaml.
- Run the Server:
Choose one of the following methods to run the server.
Option 1: Remote Configuration (Recommended)
Fastest way to get started. No need to clone the repository.
```bash
docker run -d --rm --name mcpany-server \
  -p 50050:50050 \
  ghcr.io/mcpany/server:dev-latest \
  run --config-path https://raw.githubusercontent.com/mcpany/core/main/server/examples/popular_services/wttr.in/config.yaml
```
Option 2: Local Configuration
Best if you want to modify the configuration or use your own. Requires cloning the repository.
```bash
# Clone the repository
git clone https://github.com/mcpany/core.git
cd core

# Run with the local config mounted
docker run -d --rm --name mcpany-server \
  -p 50050:50050 \
  -v $(pwd)/server/examples/popular_services/wttr.in/config.yaml:/config.yaml \
  ghcr.io/mcpany/server:dev-latest \
  run --config-path /config.yaml
```
Tip: Need detailed logs? Add the `--debug` flag to the end of the `run` command.

- Connect Gemini CLI:
```bash
gemini mcp add --transport http --trust mcpany http://localhost:50050
```
- Chat!

Ask your AI about the weather:
```bash
gemini -m gemini-2.5-flash -p "What is the weather in London?"
```

The AI will:

- Call the tool (e.g., `wttrin_<hash>.get_weather`).
- `mcpany` will proxy the request to `https://wttr.in`.
- The AI receives the JSON response and answers your question!
Ask about the moon phase:
```bash
gemini -m gemini-2.5-flash -p "What is the moon phase?"
```

The AI will:

- Call the `get_moon_phase` tool.
- `mcpany` will proxy the request to `https://wttr.in/moon`.
- The AI receives the ASCII art response and describes it!
For more complex examples, including gRPC, OpenAPI, and authentication, check out server/docs/reference/configuration.md.
Once the server is running, you can interact with it using its JSON-RPC API.
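As a sketch, the `tools/list` method below comes from the MCP specification's JSON-RPC interface, but the exact endpoint path and transport headers of a given deployment are an assumption here — check the server's docs before relying on them:

```bash
# JSON-RPC 2.0 request body for MCP's tools/list method.
req='{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}'
echo "$req"

# Then send it to the server from the quickstart (endpoint path assumed):
#   curl -s -X POST http://localhost:50050 \
#     -H 'Content-Type: application/json' \
#     -d "$req"
```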
- For detailed configuration options, see Configuration Reference.
- For instructions on connecting `mcpany` with your favorite AI coding assistant (Claude Desktop, Cursor, VS Code, JetBrains, Cline), see the Integration Guide.
- For hands-on examples, see the Examples and the Profile Authentication Example.
- For monitoring metrics, see Monitoring.
We welcome contributions to MCP Any! This section provides a brief overview of how to set up your development environment. For more detailed information, including code structure, service registration, and debugging tips, please refer to the Developer Guide.
- Go: Version 1.23+
- Docker: For running tests and building images.
- Make: For running build automation scripts.
- Clone the repository:

```bash
git clone https://github.com/mcpany/core.git
cd core
```

- Install dependencies and tools: Run the following command to set up your environment (installs `protoc`, linters, etc.):

```bash
make prepare
```
- Build: `make build` (binary will be at `build/bin/server`)
- Test: `make test` (runs unit, integration, and E2E tests)
- Lint: `make lint` (runs `golangci-lint` and other checks)
- Generate: `make gen` (regenerates code from Protocol Buffers)
- Clean: `make clean` (removes build artifacts)
After building, you can run the server locally:
```bash
./build/bin/server run --config-path server/examples/popular_services/wttr.in/config.yaml
```

The project is organized as follows:
- `server/cmd/`: Application entry points.
  - `server/`: The main MCP Any server binary.
- `server/pkg/`: Core library code.
  - `app/`: Application lifecycle and wiring.
  - `config/`: Configuration loading and validation.
  - `mcpserver/`: Core MCP protocol implementation.
  - `upstream/`: Adapters for upstream services (gRPC, HTTP, OpenAPI, Filesystem, etc.).
- `proto/`: Protocol Buffer definitions for configuration and internal APIs.
- `server/examples/`: Example configuration files and demo services.
- `server/docs/`: Detailed documentation and guides.
We strive for high code quality. Please ensure the following before submitting a PR:
- Documentation: All public functions, methods, and types must have comments explaining their purpose, parameters, and return values. You can verify coverage with:
```bash
go run server/tools/check_doc.go server/
```
- Testing: Add unit tests for new functionality. Run all tests with `make test`.
- Linting: Ensure the code is linted and formatted correctly with `make lint`.
Contributions are welcome! Please feel free to open an issue or submit a pull request.
Check out our Roadmap to see what we're working on and what's coming next.
This project is licensed under the terms of the LICENSE file.

