# MCP LLM Router
A Model Context Protocol (MCP) server for routing LLM requests across multiple providers and connecting to other MCP servers.
## Features

- **Multi-Provider LLM Routing**: Route requests to OpenAI, OpenRouter, DeepInfra, and other OpenAI-compatible APIs
- **Session Management**: Track agent sessions with goals, constraints, and event logging
- **MCP Server Orchestration**: Connect to and orchestrate multiple MCP servers
- **Cross-Server Tool Calling**: Call tools across different MCP servers
- **Universal MCP Compatibility**: Works with any MCP-compatible client (not tied to specific IDEs)
## Installation

1. Clone or navigate to this directory:

   ```bash
   cd ~/mcp-llm-router
   ```

2. Create and activate a virtual environment:

   ```bash
   python3 -m venv .venv
   source .venv/bin/activate
   ```

3. Install dependencies:

   ```bash
   pip install -U pip
   pip install "fastmcp<3" openai httpx mcp
   ```
## Configuration

### MCP Server Configuration (mcp-config.json)

```json
{
  "mcpServers": {
    "llm-router": {
      "command": "python",
      "args": ["-m", "mcp_llm_router.server"],
      "env": {
        "OPENAI_API_KEY": "your-openai-key",
        "DEEPINFRA_API_KEY": "your-deepinfra-key",
        "OPENROUTER_API_KEY": "your-openrouter-key"
      }
    },
    "other-server": {
      "command": "python",
      "args": ["-m", "other_mcp_server"],
      "env": {}
    }
  }
}
```
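For reference, a runner only needs to read this file and exec the configured command with a merged environment. A minimal sketch of that idea (illustrative only; not the actual `mcp_server_runner.py` implementation):

```python
# Sketch of what a config-driven runner could look like.
import json
import os
import subprocess
import sys


def run_server(name: str, config_path: str = "mcp-config.json") -> None:
    with open(config_path) as f:
        servers = json.load(f)["mcpServers"]
    spec = servers[name]  # KeyError if the server is not configured

    # Per-server env entries override the inherited process environment.
    env = {**os.environ, **spec.get("env", {})}

    # MCP stdio servers speak JSON-RPC over stdin/stdout, so the runner
    # just launches the configured command and lets the client drive it.
    subprocess.run([spec["command"], *spec.get("args", [])], env=env, check=True)


if __name__ == "__main__":
    run_server(sys.argv[1] if len(sys.argv) > 1 else "llm-router")
```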
### Environment Variables

Set API keys in your shell profile (e.g. `~/.bashrc`) or in the `env` block of the config above:

```bash
export OPENAI_API_KEY="sk-proj-..."
export DEEPINFRA_API_KEY="..."
export OPENROUTER_API_KEY="sk-or-..."
```
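The server looks up whichever variable a request names via `api_key_env`. A minimal sketch of that lookup (`resolve_api_key` is a hypothetical helper; the server's actual code may differ):

```python
import os


def resolve_api_key(api_key_env: str = "OPENAI_API_KEY") -> str:
    """Fetch an API key named by api_key_env, failing loudly if unset."""
    key = os.environ.get(api_key_env)
    if not key:
        raise RuntimeError(
            f"{api_key_env} is not set; export it or add it to the "
            "server's env block in mcp-config.json"
        )
    return key
```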
## Usage

### Running MCP Servers

#### Using the Server Runner

```bash
# List configured servers
python mcp_server_runner.py list

# Run a specific server
python mcp_server_runner.py run llm-router
```
#### Using the Server Manager

```bash
# Add a new server
python mcp_manager.py add my-server python -m my_mcp_server

# List servers
python mcp_manager.py list

# Test server connection
python mcp_manager.py test llm-router

# Remove a server
python mcp_manager.py remove my-server
```
### Connecting to MCP Servers

#### Using the MCP Client

```bash
# List tools on a server
python mcp_client.py list-tools llm-router

# Call a tool on a server
python mcp_client.py call-tool llm-router start_session '{"goal": "Test session"}'
```
#### Using the Server Manager for Cross-Server Operations

```bash
# Call a tool across all configured servers
python mcp_manager.py call start_session '{"goal": "Test all servers"}'
```
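A fan-out like this can be built directly on the MCP Python SDK: open a stdio session to each configured server and invoke the tool on each in turn. A minimal sketch (illustrative; the real `mcp_manager.py` may differ):

```python
import asyncio
import json

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def call_everywhere(tool: str, args: dict,
                          config_path: str = "mcp-config.json") -> dict:
    """Call one tool on every server listed in mcp-config.json."""
    with open(config_path) as f:
        servers = json.load(f)["mcpServers"]

    results = {}
    for name, spec in servers.items():
        params = StdioServerParameters(
            command=spec["command"],
            args=spec.get("args", []),
            env=spec.get("env", {}),
        )
        async with stdio_client(params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                results[name] = await session.call_tool(tool, args)
    return results


if __name__ == "__main__":
    print(asyncio.run(call_everywhere("start_session",
                                      {"goal": "Test all servers"})))
```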
## MCP Tools Available

### Session Management

- `start_session(goal, constraints, context, metadata)` - Start a new agent session
- `log_event(session_id, kind, message, details)` - Log events to a session
- `get_session_context(session_id)` - Retrieve full session data
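These three tools are typically used together. A hedged sketch, assuming an initialized `mcp.ClientSession` named `session` (see the client example below) and assuming `start_session` returns the new id as text content (the payload shape depends on this server):

```python
async def demo_session(session):
    started = await session.call_tool("start_session", {
        "goal": "Refactor the auth module",
        "constraints": "No new dependencies",
    })
    session_id = started.content[0].text  # assumed payload shape

    await session.call_tool("log_event", {
        "session_id": session_id,
        "kind": "info",
        "message": "Scanned the module for dead code",
    })

    # Retrieve the goal, constraints, and all logged events.
    return await session.call_tool("get_session_context",
                                   {"session_id": session_id})
```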
### LLM Routing

- `agent_llm_request(session_id, prompt, model, base_url, api_key_env, ...)` - Route a request to an LLM provider
### MCP Server Orchestration

- `connect_mcp_server(server_name, command, args, env)` - Configure a connection to another MCP server
- `list_mcp_servers()` - List configured MCP server connections
- `call_mcp_tool(server_name, tool_name, arguments)` - Call tools on other MCP servers
- `list_mcp_tools(server_name)` - List tools available on another MCP server
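A sketch of chaining these tools to reach a second server through the router; the `files` server, its `some_file_server` module, and its `read_file` tool are all hypothetical, and `session` is assumed to be an initialized `ClientSession`:

```python
async def orchestrate(session):
    # Register a second MCP server with the router.
    await session.call_tool("connect_mcp_server", {
        "server_name": "files",                  # hypothetical server
        "command": "python",
        "args": ["-m", "some_file_server"],      # hypothetical module
    })

    # Discover what the second server offers.
    tools = await session.call_tool("list_mcp_tools", {"server_name": "files"})

    # Call one of its tools through the router.
    return await session.call_tool("call_mcp_tool", {
        "server_name": "files",
        "tool_name": "read_file",                # assumes the target has it
        "arguments": {"path": "README.md"},
    })
```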
## Integration with MCP Clients

### Any MCP-Compatible Client

The server works with any client that supports the MCP protocol:

```json
{
  "mcpServers": {
    "llm-router": {
      "command": "python",
      "args": ["-m", "mcp_llm_router.server"],
      "env": {
        "OPENAI_API_KEY": "your-key"
      }
    }
  }
}
```
### Example: Claude Desktop

Add to your Claude Desktop MCP configuration:

```json
{
  "mcpServers": {
    "llm-router": {
      "command": "python",
      "args": ["-m", "mcp_llm_router.server"],
      "env": {
        "OPENAI_API_KEY": "sk-...",
        "DEEPINFRA_API_KEY": "..."
      }
    }
  }
}
```
### Example: Custom MCP Client

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main():
    server_params = StdioServerParameters(
        command="python",
        args=["-m", "mcp_llm_router.server"],
        env={"OPENAI_API_KEY": "your-key"},
    )
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Start a session
            result = await session.call_tool("start_session", {
                "goal": "Test the MCP server"
            })
            print("Session started:", result)


if __name__ == "__main__":
    asyncio.run(main())
```
## Provider Configuration

### OpenAI

```json
{
  "base_url": null,
  "api_key_env": "OPENAI_API_KEY"
}
```

A `null` base URL uses the OpenAI default endpoint.
### OpenRouter

```json
{
  "base_url": "https://openrouter.ai/api/v1",
  "api_key_env": "OPENROUTER_API_KEY"
}
```
### DeepInfra

```json
{
  "base_url": "https://api.deepinfra.com/v1/openai",
  "api_key_env": "DEEPINFRA_API_KEY"
}
```
## CLI Tool

The `opencode` command provides direct CLI access:

```bash
# Basic usage
opencode run "What is Python?"

# Use a specific provider
opencode run "Explain Docker" --provider deepinfra --model meta-llama/Meta-Llama-3.1-70B-Instruct
```
## Development

### Running the Server Directly

```bash
cd ~/mcp-llm-router
source .venv/bin/activate
python -m mcp_llm_router.server
```
### Testing

```bash
# Test server startup
timeout 5 python -m mcp_llm_router.server

# Test the CLI
opencode run "Hello world"

# Test the MCP client
python mcp_client.py list-tools llm-router
```
## Architecture

```
┌─────────────────┐      ┌──────────────────┐
│   MCP Client    │ ◄──► │  LLM Router MCP  │
│ (Claude, etc.)  │      │      Server      │
└─────────────────┘      └────────┬─────────┘
                                  │
                 ┌────────────────┴───────────────────┐
                 ▼                                    ▼
        ┌──────────────────┐               ┌──────────────────┐
        │  LLM Providers   │               │ Other MCP Servers│
        │  • OpenAI        │               │  • File system   │
        │  • OpenRouter    │               │  • Database      │
        │  • DeepInfra     │               │  • APIs          │
        └──────────────────┘               └──────────────────┘
```
## CLI Examples

```bash
# Basic usage with OpenAI (default)
opencode run "Explain quantum computing"

# Use a specific provider
opencode run "Write a Python function" --provider openrouter --model anthropic/claude-3-opus

# Use DeepInfra
opencode run "Summarize this text" --provider deepinfra --model meta-llama/Llama-3.1-70B-Instruct
```

Available providers:

- `openai` (default) - Uses `OPENAI_API_KEY`
- `openrouter` - Uses `OPENROUTER_API_KEY`
- `deepinfra` - Uses `DEEPINFRA_API_KEY`
## MCP Tool Reference

When used as an MCP server, the following tools are available:
### start_session

Start a new agent session with a goal and constraints.

```json
{
  "goal": "Implement user authentication",
  "constraints": "Use JWT tokens, no external dependencies",
  "context": "FastAPI application"
}
```
### log_event

Log events during an agent session (`info`, `error`, `warning`, `success`).

```json
{
  "session_id": "uuid-here",
  "kind": "error",
  "message": "Build failed",
  "details": {"exit_code": 1}
}
```
### agent_llm_request

Make a request to an LLM provider within a session. `base_url` is optional; omit it to use the provider's default endpoint.

```json
{
  "session_id": "uuid-here",
  "prompt": "How do I fix this error?",
  "model": "gpt-4",
  "base_url": "https://openrouter.ai/api/v1",
  "api_key_env": "OPENROUTER_API_KEY"
}
```
### get_session_context

Retrieve full session history and events.

```json
{
  "session_id": "uuid-here"
}
```
## Example Agent Workflow

1. **Start a session**: call `start_session` with `goal="Build a REST API for task management"`
2. **Work on the task**: create files, run commands, etc.
3. **Log progress**: call `log_event` with `kind="info"`, `message="Created database schema"`
4. **When stuck**: call `agent_llm_request` with `prompt="How do I handle authentication?"`
5. **Review context**: call `get_session_context` to see the full history

A sketch of this workflow as a standalone script follows.
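This is a minimal sketch, assuming `start_session` returns the new session id as text content (the exact payload shape may differ) and that API keys are already exported in the environment:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main():
    params = StdioServerParameters(
        command="python", args=["-m", "mcp_llm_router.server"]
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # 1. Start the session and keep its id for later calls.
            started = await session.call_tool("start_session", {
                "goal": "Build a REST API for task management"
            })
            session_id = started.content[0].text  # assumed payload shape

            # 2. The actual work (files, commands) happens outside this script.

            # 3. Log progress as it happens.
            await session.call_tool("log_event", {
                "session_id": session_id,
                "kind": "info",
                "message": "Created database schema",
            })

            # 4. Ask an LLM for help when stuck.
            answer = await session.call_tool("agent_llm_request", {
                "session_id": session_id,
                "prompt": "How do I handle authentication?",
            })

            # 5. Review the full session history.
            history = await session.call_tool("get_session_context", {
                "session_id": session_id
            })
            print(answer, history)


asyncio.run(main())
```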
## License

MIT License - see LICENSE file for details.