Author: Aravind V
Version: 1.0.0
A sophisticated command-line coding assistant built in Rust. The agent uses multiple Large Language Models (LLMs) to understand goals, create plans, and execute them by writing code and using a variety of tools.
- Multi-Provider LLM Support: Seamlessly switch between different AI models using a simple command-line flag.
  - `--provider openai`
  - `--provider gemini`
  - `--provider claude`
  - `--provider deepseek`
  - `--provider ollama` (for running local models)
- Intelligent Orchestration: A reasoning agent creates a step-by-step plan for your goal and executes it intelligently.
- Extensible Tool System: The agent can interact with its environment to:
  - Read and write files (`ReadFile`, `WriteFile`).
  - Execute arbitrary shell commands (`RunCommand`).
  - Perform real-time web searches for up-to-date information (`Search`).
  - List directory contents to understand project structure (`ListFiles`).
- Context-Aware Operation: Maintains a history of actions and results to make informed decisions and self-correct.
- Asynchronous & Performant: Built on `tokio` for efficient, non-blocking operations.
- Secure Configuration: Manages API keys and other secrets via a `.env` file, keeping them out of the source code.
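The extensible tool system described above could be modeled as a single trait that every tool implements. The sketch below is an assumption about the design, not the project's actual API: the `Tool` trait and its `run` method are hypothetical names, shown here with a stand-in for the `ListFiles` tool.

```rust
// Hypothetical sketch of an extensible tool system (trait and method
// names are assumptions). Each tool exposes a name the planner can
// reference and a run method that produces a textual observation.
trait Tool {
    fn name(&self) -> &'static str;
    fn run(&self, input: &str) -> Result<String, String>;
}

// Stand-in for the ListFiles tool: list directory contents.
struct ListFiles;

impl Tool for ListFiles {
    fn name(&self) -> &'static str {
        "ListFiles"
    }

    fn run(&self, dir: &str) -> Result<String, String> {
        let entries = std::fs::read_dir(dir).map_err(|e| e.to_string())?;
        let names: Vec<String> = entries
            .filter_map(|e| e.ok())
            .map(|e| e.file_name().to_string_lossy().into_owned())
            .collect();
        Ok(names.join("\n"))
    }
}

fn main() {
    let tool = ListFiles;
    match tool.run(".") {
        Ok(listing) => println!("{listing}"),
        Err(err) => eprintln!("{} failed: {err}", tool.name()),
    }
}
```

Keeping every tool behind one trait lets the orchestrator dispatch on tool name without knowing each implementation.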
- Rust: Install the Rust toolchain from rustup.rs.
- API Keys: You'll need API keys for the LLM services you intend to use.
- Clone the Repository:

  ```sh
  git clone <repository-url>
  cd rust-cli-coding-agent
  ```
- Configure Environment Variables:
  - Copy the example `.env` file:

    ```sh
    cp .env.example .env
    ```
  - Open the `.env` file and add your API keys. Only the keys for the providers you use are required.

    ```sh
    # For OpenAI
    OPENAI_API_KEY="your-openai-api-key"

    # For Google Gemini
    GOOGLE_API_KEY="your-google-api-key"

    # For Anthropic Claude
    ANTHROPIC_API_KEY="your-anthropic-api-key"

    # For DeepSeek
    DEEPSEEK_API_KEY="your-deepseek-api-key"

    # For the Search Tool (using Brave Search API)
    BRAVE_SEARCH_API_KEY="your-brave-search-api-key"

    # For Ollama (if using a custom base URL)
    OLLAMA_BASE_URL="http://localhost:11434"
    ```
- Build the Project:

  ```sh
  cargo build --release
  ```

  The executable will be located at `target/release/cli_coding_agent`.
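The provider-to-key mapping implied by the `.env` example above can be sketched in Rust using only the standard library. `api_key_for` is a hypothetical helper, not the project's actual `config.rs` API; only the environment variable names come from the `.env` example.

```rust
use std::env;

// Hypothetical sketch of provider key lookup. The variable names match
// the .env example above; the selection logic itself is an assumption.
fn api_key_for(provider: &str) -> Result<String, String> {
    let var = match provider {
        "openai" => "OPENAI_API_KEY",
        "gemini" => "GOOGLE_API_KEY",
        "claude" => "ANTHROPIC_API_KEY",
        "deepseek" => "DEEPSEEK_API_KEY",
        // Ollama runs locally and needs no API key.
        "ollama" => return Ok(String::new()),
        other => return Err(format!("unknown provider: {other}")),
    };
    env::var(var).map_err(|_| format!("{var} is not set"))
}

fn main() {
    match api_key_for("openai") {
        Ok(key) => println!("found key ({} bytes)", key.len()),
        Err(e) => eprintln!("{e}"),
    }
}
```

Failing fast with a named missing variable makes misconfiguration obvious at startup instead of at the first API call.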
After building, the agent runs in an interactive mode, prompting you for goals.
Navigate to the project root and run:
```sh
cargo run --
```

Or, if you've added the executable to your PATH (see below):

```sh
cli_coding_agent
```

Once running, you will be prompted to enter your goal:

```
Enter your goal (or 'quit' to exit): Create a Rust function that calculates the factorial of a number and write it to a file named `factorial.rs`.
```
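The interactive prompt loop above can be sketched as follows. This is illustrative only: the real agent hands each goal to its planner rather than echoing it, and the `parse_goal` helper is a hypothetical name.

```rust
use std::io::{self, BufRead, Write};

// Hypothetical sketch of the interactive goal loop (helper name is an
// assumption). Returns None when the session should end.
fn parse_goal(line: &str) -> Option<String> {
    let goal = line.trim();
    if goal.is_empty() || goal == "quit" {
        None // empty input or 'quit' ends the session
    } else {
        Some(goal.to_string())
    }
}

fn main() -> io::Result<()> {
    let stdin = io::stdin();
    loop {
        print!("Enter your goal (or 'quit' to exit): ");
        io::stdout().flush()?;
        let mut line = String::new();
        if stdin.lock().read_line(&mut line)? == 0 {
            break; // EOF: treat like 'quit'
        }
        match parse_goal(&line) {
            // The real agent would dispatch the goal to the orchestrator here.
            Some(goal) => println!("received goal: {goal}"),
            None => break,
        }
    }
    Ok(())
}
```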
You can specify the LLM provider when starting the agent:
```sh
cargo run -- --provider gemini
```

Or, if using the direct executable:

```sh
cli_coding_agent --provider ollama
```

To run `cli_coding_agent` from any directory without specifying its full path, you can add its executable to your system's PATH or create a symbolic link.
For Windows:
- Open Command Prompt as Administrator.
- Navigate to the project root:

  ```bat
  cd rust-cli-coding-agent
  ```
- Run the provided script:

  ```bat
  install_path_windows.bat
  ```
- Important: Restart your command prompt or PowerShell for changes to take effect.
For Linux/macOS:
- Open your terminal.
- Navigate to the project root:

  ```sh
  cd rust-cli-coding-agent
  ```
- Make the script executable:

  ```sh
  chmod +x install_path_linux.sh
  ```
- Run the script with `sudo`:

  ```sh
  sudo ./install_path_linux.sh
  ```
This will allow you to simply type `cli_coding_agent` in any directory to start the interactive agent.
- `main.rs`: Entry point, CLI parsing.
- `orchestrator.rs`: The core reasoning engine that manages the plan and state.
- `llm/`: Module containing all LLM client implementations, unified under the `LLMClient` trait.
- `agents/`: Contains specialized agents (`PlannerAgent`, `CoderAgent`) responsible for specific tasks.
- `tools/`: Defines and implements the tools the agent can use.
- `state.rs`: Manages the application state, including history and context.
- `config.rs`: Handles loading configuration from the `.env` file.
- `error.rs`: Custom error types for robust error handling.
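A trait like the `LLMClient` named above lets one interface hide every provider, so the `--provider` flag can select an implementation at runtime. The sketch below is a set of assumptions, not the project's actual code: the method names, the stub `OllamaClient`, and the `client_for` factory are all hypothetical, and the stub does no networking.

```rust
// Hypothetical sketch of a unified LLM client trait (names are assumptions).
trait LLMClient {
    fn provider(&self) -> &'static str;
    fn complete(&self, prompt: &str) -> Result<String, String>;
}

// Stub provider; a real client would POST the prompt to base_url.
struct OllamaClient {
    base_url: String,
}

impl LLMClient for OllamaClient {
    fn provider(&self) -> &'static str {
        "ollama"
    }

    fn complete(&self, prompt: &str) -> Result<String, String> {
        Ok(format!("[{}] would answer: {prompt}", self.base_url))
    }
}

// The --provider flag can then pick an implementation behind one interface.
fn client_for(provider: &str) -> Result<Box<dyn LLMClient>, String> {
    match provider {
        "ollama" => Ok(Box::new(OllamaClient {
            base_url: "http://localhost:11434".into(),
        })),
        other => Err(format!("unsupported provider: {other}")),
    }
}

fn main() {
    let client = client_for("ollama").unwrap();
    println!("{}", client.complete("hello").unwrap());
}
```

Returning `Box<dyn LLMClient>` keeps the orchestrator independent of any concrete provider; adding a provider means adding one `impl` and one match arm.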