# SmolAgentWithMCP

## Overview
SmolAgentWithMCP is a Python-based AI agent framework that connects multiple Model Context Protocol (MCP) tool servers and orchestrates them with a large language model (LLM). With support for Brave Search and other MCP-compatible tools, it enables powerful, tool-augmented question-answering workflows. The agent leverages smolagents, LiteLLM, and the MCP protocol to offer flexible, extensible, and modern AI tooling.
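For orientation, here is a minimal sketch of how these pieces typically fit together: a Brave Search MCP server exposed to a smolagents `ToolCallingAgent` backed by a LiteLLM model. The model id, the `npx` server command, and the environment handling are illustrative assumptions, not necessarily what `smolagentwithmcp.py` does.

```python
# Minimal sketch: one MCP tool server (Brave Search) wired into a tool-calling agent.
# Model id and server command are illustrative; the project's own script may differ.
import os

from dotenv import load_dotenv
from mcp import StdioServerParameters
from smolagents import LiteLLMModel, ToolCallingAgent, ToolCollection

load_dotenv()  # pulls BRAVE_API_KEY / OPENAI_API_KEY from .env into os.environ

# Launch the Brave Search MCP server over stdio (requires Node.js / npx).
brave_server = StdioServerParameters(
    command="npx",
    args=["-y", "@modelcontextprotocol/server-brave-search"],
    env={**os.environ},  # pass BRAVE_API_KEY (and PATH) through to the server process
)

model = LiteLLMModel(model_id="gpt-4o-mini")  # any LiteLLM-supported model id works

with ToolCollection.from_mcp(brave_server, trust_remote_code=True) as tool_collection:
    agent = ToolCallingAgent(tools=[*tool_collection.tools], model=model)
    print(agent.run("What are the latest developments in the MCP protocol?"))
```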
## Use Case
This project is ideal for developers and researchers who want to:
- Integrate multiple tool servers (e.g., Brave Search, custom MCP crawlers) into a single AI agent.
- Enable conversational AI agents to use external tools for enhanced answering capabilities.
- Prototype and deploy advanced AI workflows with minimal setup.
- Extend the agent with custom tools using the MCP protocol.
Example Scenarios:
- An assistant that answers queries using live web search data and custom knowledge bases.
- A research agent that leverages both public APIs and private datasets/tools.
- An automation bot that can interact with external services via MCP-enabled tools.
## Features
- Multi-tool orchestration: Connects to multiple MCP tool servers (see the sketch after this list).
- Tool-calling agent: Uses the LLM to decide when and how to call tools.
- Configurable via `.env`: Supports multiple API keys and environment settings.
- Async workflow: Efficient asynchronous tool management.
- Extensible: Easily add new MCP tools or change LLM models.
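As a rough illustration of the multi-tool orchestration feature, the sketch below combines tools from two MCP servers into a single agent. The custom server command (`my_custom_mcp_server.py`) is a hypothetical placeholder, not a file in this repository.

```python
# Sketch: orchestrating two MCP tool servers in one agent.
import os
from contextlib import ExitStack

from dotenv import load_dotenv
from mcp import StdioServerParameters
from smolagents import LiteLLMModel, ToolCallingAgent, ToolCollection

load_dotenv()

servers = [
    # Brave Search MCP server (needs BRAVE_API_KEY in the environment).
    StdioServerParameters(
        command="npx",
        args=["-y", "@modelcontextprotocol/server-brave-search"],
        env={**os.environ},
    ),
    # Hypothetical custom MCP server; replace with your own tool server.
    StdioServerParameters(command="python", args=["my_custom_mcp_server.py"]),
]

with ExitStack() as stack:
    # Open every tool collection and keep all servers alive for the agent's lifetime.
    collections = [
        stack.enter_context(ToolCollection.from_mcp(s, trust_remote_code=True))
        for s in servers
    ]
    all_tools = [tool for c in collections for tool in c.tools]
    agent = ToolCallingAgent(tools=all_tools, model=LiteLLMModel(model_id="gpt-4o-mini"))
    print(agent.run("Compare what the public web and my private index say about MCP."))
```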
## Project Structure
- smolagentwithmcp.py # Main agent source code
- requirements.txt # Python dependencies
- .env.example # Example env file
- README.md # Project documentation
## Setup & Installation

1. Clone the repository:

   ```bash
   git clone https://github.com/ashishpatel26/SmolAgentWithMCP.git
   cd SmolAgentWithMCP
   ```

2. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

3. Configure environment variables: copy `.env.example` to `.env` and fill in your API keys.

   ```bash
   cp .env.example .env
   ```

4. Run the agent:

   ```bash
   python smolagentwithmcp.py
   ```
## Code Workflow Diagram

```mermaid
flowchart TD
    subgraph "User Interaction"
        U([User]) -->|Query| AGENT[SmolAgentWithMCP]
    end
    AGENT -->|Loads| MCP1["MCP Tool Server 1: Brave Search"]
    AGENT -->|Loads| MCP2["MCP Tool Server 2: Custom"]
    AGENT -->|Initializes| LLM[LiteLLM Model]
    U -->|Input| AGENT
    AGENT -->|Decides tool usage| TOOLCALL[ToolCallingAgent]
    TOOLCALL -->|Calls| MCP1
    TOOLCALL -->|Calls| MCP2
    MCP1 -->|Returns Data| TOOLCALL
    MCP2 -->|Returns Data| TOOLCALL
    TOOLCALL -->|Generates Answer| AGENT
    AGENT -->|Output| U
```
## How It Works

- Initialization: Loads the MCP tool servers (such as Brave Search and custom crawlers) and configures the LLM.
- User Query: Accepts user input via the terminal.
- Tool Orchestration: Determines which tools to call for the query.
- Execution: Calls the tools, gathers their results, and generates a final response with the LLM.
- Async Management: Handles multiple tool collections and steps asynchronously.
- Output: Returns the answer to the user interactively (a simplified sketch of this query loop follows the list).
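The project's script may structure this differently, but a simplified version of the interactive query loop could look like the following, assuming an `agent` built as in the earlier sketches.

```python
# Sketch of the interactive query loop; `agent` is a pre-built ToolCallingAgent.
def chat_loop(agent) -> None:
    """Read queries from the terminal until the user types 'exit' or 'quit'."""
    while True:
        query = input("You: ").strip()
        if query.lower() in {"exit", "quit"}:
            break
        # The agent decides which MCP tools to call, then returns a final answer.
        answer = agent.run(query)
        print(f"Agent: {answer}")
```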
## Configuration

Edit `.env` to set your API keys:

```
BRAVE_API_KEY=your_brave_search_api_key
OPENAI_API_KEY=your_openai_api_key
```
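At runtime these keys are typically loaded with python-dotenv (listed in `requirements.txt`). A minimal sketch, assuming the exact variable names above:

```python
# Load API keys from .env at startup.
import os
from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory

brave_key = os.getenv("BRAVE_API_KEY")
openai_key = os.getenv("OPENAI_API_KEY")
if not brave_key or not openai_key:
    raise RuntimeError("Missing BRAVE_API_KEY or OPENAI_API_KEY in .env")
```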
## Requirements

See `requirements.txt`:

```text
smolagents
python-dotenv
mcp
litellm
```
## License
MIT License. See LICENSE for details.
## Contributing
Contributions welcome! Please open issues or PRs for feature requests, bug fixes, or enhancements.