A Model Context Protocol (MCP) server that provides OpenBB Workspace documentation to AI assistants through a two-step retrieval workflow.
This server exposes two tools that work together to retrieve relevant OpenBB documentation.

**Tool 1: `identify_openbb_docs_sections`**

- Takes a user query
- Fetches the complete OpenBB documentation table of contents
- Provides it to the LLM with instructions to identify up to 3 relevant section titles
- Returns the raw TOC for intelligent analysis
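A minimal sketch of how this first tool could be defined, assuming the server is built on the official MCP Python SDK's `FastMCP` and fetches the docs with `httpx`; the `DOCS_TOC_URL` constant and the prompt wording are placeholders, not the server's actual values:

```python
# Sketch only: assumes FastMCP from the official MCP Python SDK and httpx.
import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("OpenBB Docs")

DOCS_TOC_URL = "https://example.com/openbb-docs-toc.md"  # hypothetical placeholder


@mcp.tool()
def identify_openbb_docs_sections(query: str) -> str:
    """Return the full documentation table of contents so the LLM can pick
    up to 3 section titles relevant to the user's query."""
    toc = httpx.get(DOCS_TOC_URL, timeout=30).text
    return (
        "User query: " + query + "\n\n"
        "Below is the OpenBB documentation table of contents. "
        "Identify up to 3 section titles that are most relevant to the query.\n\n"
        + toc
    )
```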
**Tool 2: `fetch_openbb_content`**

- Takes the section titles identified in step 1 plus the original user query
- Fetches the full OpenBB documentation
- Extracts only the relevant sections
- Returns the content with OpenBB Copilot-compatible citation format instructions
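A companion sketch of the second tool, under the same assumptions; the heading-based section extraction and the `DOCS_URL` constant are illustrative only, not the server's actual parsing code:

```python
# Sketch only: naive markdown-heading extraction to illustrate the tool's shape.
import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("OpenBB Docs")

DOCS_URL = "https://example.com/openbb-docs-full.md"  # hypothetical placeholder


@mcp.tool()
def fetch_openbb_content(section_titles: list[str], query: str) -> str:
    """Fetch the full documentation and return only the requested sections,
    along with citation-format instructions for the LLM."""
    docs = httpx.get(DOCS_URL, timeout=30).text

    # Keep any markdown section whose heading matches a requested title
    # (case-insensitive). Real extraction logic may differ.
    wanted = {t.lower() for t in section_titles}
    chunks, keep = [], False
    for line in docs.splitlines():
        if line.startswith("#"):
            keep = line.lstrip("# ").strip().lower() in wanted
        if keep:
            chunks.append(line)

    return (
        "Answer the user's query using only the sections below, citing each "
        "section title you rely on.\n\n"
        f"User query: {query}\n\n" + "\n".join(chunks)
    )
```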
The intended workflow:

1. LLM calls `identify_openbb_docs_sections` with the user's question
2. LLM analyzes the TOC and identifies the relevant sections (up to 3)
3. LLM calls `fetch_openbb_content` with those section titles
4. LLM uses the extracted content to answer the user's question with proper citations
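To make the two-step flow concrete, here is a hypothetical client-side walkthrough using the MCP Python SDK's streamable HTTP client against a locally running server (see the usage section below); the tool argument names and the chosen section title are assumptions for illustration:

```python
# Hypothetical walkthrough of the two-step flow from a client's perspective.
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client


async def main() -> None:
    async with streamablehttp_client("http://localhost:8000/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Step 1: hand the user's question to the TOC tool.
            toc = await session.call_tool(
                "identify_openbb_docs_sections",
                {"query": "How do I add a custom backend?"},
            )
            # ...the LLM reads the returned TOC and picks up to 3 section titles...

            # Step 2: fetch only those sections plus citation instructions.
            content = await session.call_tool(
                "fetch_openbb_content",
                {
                    "section_titles": ["Custom Backend"],  # placeholder choice
                    "query": "How do I add a custom backend?",
                },
            )
            print(content)


asyncio.run(main())
```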
Start the server locally:

```bash
python server.py
```

The server will start on port 8000 by default. You can change the port with the `PORT` environment variable:

```bash
PORT=8014 python server.py
```

The MCP endpoint will be available at `http://localhost:8000/mcp`.
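One plausible way the port handling could look inside `server.py`, assuming `FastMCP` with the streamable HTTP transport; this is shown only to illustrate the `PORT` override, not as the actual implementation:

```python
# Sketch only: read PORT from the environment, defaulting to 8000.
import os

from mcp.server.fastmcp import FastMCP

port = int(os.environ.get("PORT", "8000"))
mcp = FastMCP("OpenBB Docs", port=port)

if __name__ == "__main__":
    # The streamable HTTP transport serves the endpoint at /mcp by default.
    mcp.run(transport="streamable-http")
```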
The server is pre-configured with CORS for:

- https://pro.openbb.co
- https://pro.openbb.dev
- http://localhost:1420
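A sketch of how that CORS allowlist might be wired in, assuming the server wraps its streamable HTTP ASGI app with Starlette's `CORSMiddleware` and serves it with `uvicorn`; the actual server may configure this differently:

```python
# Sketch only: wrap the MCP ASGI app with a CORS allowlist.
import uvicorn
from mcp.server.fastmcp import FastMCP
from starlette.middleware.cors import CORSMiddleware

mcp = FastMCP("OpenBB Docs")

app = CORSMiddleware(
    mcp.streamable_http_app(),
    allow_origins=[
        "https://pro.openbb.co",
        "https://pro.openbb.dev",
        "http://localhost:1420",
    ],
    allow_methods=["*"],
    allow_headers=["*"],
)

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8000)
```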
Install requirements:

```bash
pip install -r requirements.txt
```

Requires Python 3.10+.