MCP Agent Orchestrator
Local Agentic AI Framework using Model Context Protocol (MCP), LangGraph, and Ollama
Created 3/15/2026
🤖 mcp-agent-orchestrator: Local Agentic AI via Model Context Protocol
mcp-agent-orchestrator is a production-ready starter template for building local-first Agentic AI systems. It demonstrates how to connect a FastMCP (Python) Server to a LangGraph Orchestrator using Ollama as the local LLM host.
🚀 Key Features
- Standardized MCP Integration: Uses the Model Context Protocol to decouple tool logic from LLM logic.
- Stateful Orchestration: Built with LangGraph for robust, multi-turn agent conversations and error handling.
- Privacy-First: 100% local execution using Ollama (Llama 3.1/Mistral)—no API keys required.
- Modern UI: A clean Streamlit interface for real-time interaction with your MCP tools.
- FastMCP Framework: Simplified Python tool definition with automatic schema generation.
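FastMCP's "automatic schema generation" means tool input schemas are derived from ordinary Python type hints rather than written by hand. The snippet below is a rough stdlib-only sketch of that idea (not FastMCP's actual implementation): it builds a minimal JSON-Schema-like tool description from a function's signature.

```python
import inspect
import json
from typing import get_type_hints

# Illustrative mapping from Python annotations to JSON Schema types.
PY_TO_JSON = {int: "integer", float: "number", str: "string", bool: "boolean"}

def tool_schema(fn):
    """Build a minimal JSON-Schema-like tool description from type hints."""
    hints = get_type_hints(fn)
    params = inspect.signature(fn).parameters
    props = {name: {"type": PY_TO_JSON[hints[name]]} for name in params}
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "inputSchema": {
            "type": "object",
            "properties": props,
            "required": list(params),
        },
    }

def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b

print(json.dumps(tool_schema(add), indent=2))
```

With FastMCP itself you would simply decorate `add` with `@mcp.tool` and the server exposes an equivalent schema to the client.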
🏗️ Architecture Overview
- MCP Host (Ollama): Provides the reasoning engine (LLM).
- MCP Server (FastMCP): Defines Python tools (e.g., SQLite, File System, Web Search).
- MCP Client: Manages the `stdio` transport and subprocess communication.
- Orchestrator (LangGraph): A stateful ReAct agent that decides when to call MCP tools.
- Interface (Streamlit): The user-facing web application.
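The `stdio` transport mentioned above works by spawning the MCP server as a subprocess and exchanging JSON-RPC messages over its stdin/stdout. Here is a minimal stdlib sketch of that pattern; the echo "server" is a hypothetical stand-in, not a real MCP server implementing the full protocol:

```python
import json
import subprocess
import sys

# Toy stand-in server: reads newline-delimited JSON-RPC requests on
# stdin and echoes the params back as the result.
ECHO_SERVER = (
    "import sys, json\n"
    "for line in sys.stdin:\n"
    "    req = json.loads(line)\n"
    "    resp = {'jsonrpc': '2.0', 'id': req['id'], 'result': req['params']}\n"
    "    print(json.dumps(resp), flush=True)\n"
)

def call(proc, method, params, req_id=1):
    """Send one JSON-RPC request over stdio and read one response line."""
    req = {"jsonrpc": "2.0", "id": req_id, "method": method, "params": params}
    proc.stdin.write(json.dumps(req) + "\n")
    proc.stdin.flush()
    return json.loads(proc.stdout.readline())

proc = subprocess.Popen(
    [sys.executable, "-c", ECHO_SERVER],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
)
resp = call(proc, "tools/call", {"name": "add", "arguments": {"a": 1, "b": 2}})
proc.stdin.close()
proc.wait()
print(resp)
```

In this repo the MCP Client layer plays the role of `call`, while LangGraph decides *when* to make such a call based on the agent's state.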
🛠️ Quick Start
Prerequisites
- Ollama installed, with a model pulled: `ollama run llama3.1`
- Python 3.10 or higher
Installation
- Clone the repository:
```shell
git clone https://github.com/NxtGenCodeBase/mcp-agent-orchestrator.git
cd mcp-agent-orchestrator
```
Quick Setup
Install the package (if required):
```shell
uvx mcp-agent-orchestrator
```
Cursor configuration (mcp.json)
```json
{
  "mcpServers": {
    "nxtgencodebase-mcp-agent-orchestrator": {
      "command": "uvx",
      "args": ["mcp-agent-orchestrator"]
    }
  }
}
```
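A malformed `mcp.json` is a common source of silent failures in editor integrations. A quick sanity check, assuming the configuration shown above, is to confirm the file parses and that every entry under `mcpServers` declares a launch command:

```python
import json

# The Cursor configuration from above, inlined for the check.
CONFIG = """
{
  "mcpServers": {
    "nxtgencodebase-mcp-agent-orchestrator": {
      "command": "uvx",
      "args": ["mcp-agent-orchestrator"]
    }
  }
}
"""

cfg = json.loads(CONFIG)
for name, server in cfg["mcpServers"].items():
    # Every server entry needs a command for the client to spawn it.
    assert "command" in server, f"{name}: missing launch command"
    print(name, "->", server["command"], *server.get("args", []))
```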