mcp-server · by @R2D2-fwks · created 7/17/2025
# MCP Server for LLM Orchestration

A Model Context Protocol (MCP) server built in Erlang that orchestrates requests to different LLMs (ChatGPT, Gemini, Claude, etc.) using OTP principles.
## Features
- OTP-based architecture for robust, fault-tolerant operation
- Support for multiple LLM providers (OpenAI, Google Gemini, Anthropic Claude)
- Dynamic model registration and management
- RESTful API for chat completions and model listing
- Automatic failover and load balancing between models
## Architecture

The application follows OTP design principles, with the following components:
- `mcp_app`: Main application module
- `mcp_sup`: Top-level supervisor
- `mcp_model_sup`: Dynamic supervisor for LLM connections
- `mcp_orchestrator`: Central orchestrator for routing requests
- `mcp_model`: `gen_server` implementation for each LLM type
- HTTP API handlers for external communication
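The supervision layout above can be sketched as a minimal top-level supervisor. This is an illustrative sketch only: the restart strategy, intensity values, and exact child specs are assumptions, not the project's actual implementation.

```erlang
%% Sketch of the top-level supervisor (mcp_sup).
%% Strategy and child specs are assumptions for illustration.
-module(mcp_sup).
-behaviour(supervisor).
-export([start_link/0, init/1]).

start_link() ->
    supervisor:start_link({local, ?MODULE}, ?MODULE, []).

init([]) ->
    SupFlags = #{strategy => one_for_one, intensity => 5, period => 10},
    Children = [
        %% Central orchestrator that routes chat requests to models
        #{id => mcp_orchestrator,
          start => {mcp_orchestrator, start_link, []}},
        %% Dynamic supervisor owning one mcp_model worker per LLM
        #{id => mcp_model_sup,
          start => {mcp_model_sup, start_link, []},
          type => supervisor}
    ],
    {ok, {SupFlags, Children}}.
```

With `one_for_one`, a crashed LLM worker restarts without disturbing the orchestrator, which is the fault-tolerance property the Features section describes.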
## Setup

### Prerequisites
- Erlang/OTP 24 or later
- Rebar3
### Building

```shell
cd mcp_server
rebar3 compile
```
### Running

```shell
rebar3 shell
```
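Inside the rebar3 shell, the application can also be started explicitly if it does not start automatically. The OTP application name `mcp_server` here is an assumption based on the directory name:

```erlang
%% In the rebar3 shell; the application name is an assumption.
application:ensure_all_started(mcp_server).
```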
## API Endpoints
- `POST /api/v1/chat`: Send a chat request to an LLM
- `GET /api/v1/models`: List all available models
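A chat request and response might be shaped as follows. The README does not document the schema, so every field name here (`model`, `messages`, `role`, `content`, `response`) is an assumption based on common chat-completion APIs. Request body for `POST /api/v1/chat`:

```json
{
  "model": "claude",
  "messages": [
    {"role": "user", "content": "Hello"}
  ]
}
```

A hypothetical response:

```json
{
  "model": "claude",
  "response": "Hi! How can I help?"
}
```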
## Configuration
LLM API keys and other configuration should be provided through environment variables or a config file.
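As one possible shape for a config file, keys could live in a standard `sys.config`. The application name (`mcp_server`) and key names below are illustrative assumptions, since the README does not document the configuration schema:

```erlang
%% config/sys.config -- application and key names are illustrative
%% assumptions; real values come from your provider accounts.
[
  {mcp_server, [
    {openai_api_key,    "<OPENAI_API_KEY>"},
    {gemini_api_key,    "<GEMINI_API_KEY>"},
    {anthropic_api_key, "<ANTHROPIC_API_KEY>"}
  ]}
].
```

Alternatively, the server could read the variables directly at startup with `os:getenv("OPENAI_API_KEY")`, which avoids committing secrets to a config file.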
## License
MIT
## Quick Setup

The package is not published to a registry, so clone the repository directly:

```shell
git clone https://github.com/R2D2-fwks/mcp-server
```
Then follow the build and run steps above to compile and start the server, installing any additional dependencies it requires.
### Cursor configuration (`mcp.json`)

```json
{
  "mcpServers": {
    "r2d2-fwks-mcp-server": {
      "command": "git",
      "args": [
        "clone",
        "https://github.com/R2D2-fwks/mcp-server"
      ]
    }
  }
}
```