A visual workflow builder for LangChain applications. This project allows you to build complex LLM applications with a drag-and-drop interface.
- Visual, drag-and-drop interface for LangChain components
- Connect nodes to create custom LLM workflows
- Support for various components:
  - LLMs (OpenAI, etc.)
  - Prompts
  - Chat Input/Output
  - Memory systems (coming soon)
  - Agents (coming soon)
  - Tools (coming soon)
  - Document loaders (coming soon)
  - Vector stores (coming soon)
  - Embeddings (coming soon)
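Conceptually, a flow built on the canvas is a directed graph of these components. A minimal sketch of how such a graph might be represented and ordered for execution (the node types and field names here are illustrative, not this project's actual schema):

```python
# Illustrative only: a flow as a directed graph of component nodes.
# Node types and field names are hypothetical, not this project's schema.

flow = {
    "nodes": [
        {"id": "input-1",  "type": "ChatInput"},
        {"id": "prompt-1", "type": "Prompt", "data": {"template": "Answer briefly: {question}"}},
        {"id": "llm-1",    "type": "OpenAI", "data": {"model": "gpt-3.5-turbo"}},
        {"id": "output-1", "type": "ChatOutput"},
    ],
    # Each edge connects one node's output handle to another node's input handle.
    "edges": [
        {"source": "input-1",  "target": "prompt-1"},
        {"source": "prompt-1", "target": "llm-1"},
        {"source": "llm-1",    "target": "output-1"},
    ],
}

def execution_order(flow):
    """Return node ids in the order they would run (a simple topological sort)."""
    incoming = {n["id"]: 0 for n in flow["nodes"]}
    for e in flow["edges"]:
        incoming[e["target"]] += 1
    ready = [nid for nid, deg in incoming.items() if deg == 0]
    order = []
    while ready:
        nid = ready.pop(0)
        order.append(nid)
        for e in flow["edges"]:
            if e["source"] == nid:
                incoming[e["target"]] -= 1
                if incoming[e["target"]] == 0:
                    ready.append(e["target"])
    return order

print(execution_order(flow))  # a linear flow runs input → prompt → llm → output
```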
```
src/
├── backend/
│   └── LLMcontrols/   # Python backend using FastAPI
└── frontend/          # React TypeScript frontend
```
- Python 3.8+
- Node.js 14+
- npm 6+
- Install the required Python dependencies:

  ```bash
  pip install -r requirements.txt
  ```

- Start the backend server:

  ```bash
  python src/backend/run_server.py
  ```

  The backend server will run at http://localhost:8000.
- Navigate to the frontend directory:

  ```bash
  cd src/frontend
  ```

- Install the required Node.js dependencies:

  ```bash
  npm install
  ```

- Start the frontend development server:

  ```bash
  npm start
  ```

  The frontend will run at http://localhost:3000.
To run the backend:

```bash
python src\backend\run_server.py
```

To run the frontend:

```bash
cd src\frontend && npm start
```
- Open your browser and navigate to http://localhost:3000
- Create a new flow by clicking "New Flow"
- Drag components from the left panel to the canvas
- Connect the components by dragging from one node's output handle to another node's input handle
- Configure each component by clicking on it
- Save your flow by clicking "Save"
- Run your flow by clicking "Run"
A simple chatbot flow might include:
- Chat Input node (to get user input)
- Prompt node (to format the input)
- OpenAI LLM node (to generate a response)
- Chat Output node (to display the response)
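The four-node flow above can be sketched as a plain Python pipeline. The LLM call is stubbed out here, so this shows only the data flow between the nodes, not a real OpenAI call:

```python
# Sketch of the simple chatbot flow as a pipeline of four stages.
# The LLM stage is stubbed; the real OpenAI node would call the API.

def chat_input() -> str:
    # Chat Input node: get user input (hard-coded for this sketch).
    return "What is LangChain?"

def prompt(question: str) -> str:
    # Prompt node: format the input into a full prompt.
    return f"You are a helpful assistant. Answer briefly:\n{question}"

def llm(formatted_prompt: str) -> str:
    # OpenAI LLM node, stubbed: a real node would send the prompt to the model.
    return f"[model response to: {formatted_prompt.splitlines()[-1]}]"

def chat_output(response: str) -> str:
    # Chat Output node: display the response.
    print(response)
    return response

result = chat_output(llm(prompt(chat_input())))
```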
This project is under active development. Contributions are welcome!
MIT License