LLM SSE MCP Demo 2025
This project demonstrates the integration between LLM clients and MCP (Model Context Protocol) servers using Server-Sent Events (SSE) for real-time communication. It consists of two Spring Boot applications that showcase tool calling capabilities with various LLM models.
Project Structure
1. SSE MCP Server Demo (sse-mcp-server-demo)
A Spring Boot application that serves as an MCP server, exposing mathematical and date/time tools via SSE endpoints.
Features:
- Math Tools: Addition and multiplication operations
- DateTime Tools: Current time retrieval and alarm setting
- SSE Support: Real-time communication via Server-Sent Events
- MCP Protocol: Implements Model Context Protocol for tool discovery and execution
Available Tools:
- sumNumbers(int, int) - Adds two numbers
- multiplyNumbers(int, int) - Multiplies two numbers
- getCurrentDateTime() - Gets the current date/time in the user's timezone
- setAlarm(String) - Sets an alarm for a specified ISO-8601 time
Port: 8080 (default)
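The tool list above can be sketched in plain Java. Note this is an illustrative standalone class, not the project's source: in the real server these operations are registered as Spring AI tools and invoked over MCP rather than called directly.

```java
import java.time.LocalDateTime;
import java.time.ZonedDateTime;

// Illustrative plain-Java versions of the four tools. The actual server
// exposes these via Spring AI's MCP support; here they are ordinary methods.
public class ToolSketch {
    // sumNumbers(int, int): adds two numbers
    public static int sumNumbers(int a, int b) { return a + b; }

    // multiplyNumbers(int, int): multiplies two numbers
    public static int multiplyNumbers(int a, int b) { return a * b; }

    // getCurrentDateTime(): current date/time in the server's zone
    public static String getCurrentDateTime() {
        return ZonedDateTime.now().toString();
    }

    // setAlarm(String): validates an ISO-8601 timestamp and confirms
    public static String setAlarm(String isoTime) {
        LocalDateTime t = LocalDateTime.parse(isoTime); // throws on bad input
        return "Alarm set for " + t;
    }

    public static void main(String[] args) {
        System.out.println(sumNumbers(2, 3));       // 5
        System.out.println(multiplyNumbers(4, 6));  // 24
        System.out.println(setAlarm("2025-01-01T10:00:00"));
    }
}
```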
2. LLM MCP Client Demo (llm-mcp-client-demo)
A Spring Boot web application that connects to the MCP server and provides an interactive chat interface with multiple LLM providers.
Features:
- Multiple LLM Support: Anthropic Claude, OpenAI GPT, Google Gemini, Ollama
- Interactive Web Chat: Real-time chat interface using Thymeleaf
- MCP Integration: Connects to SSE MCP server for tool calling
- Tool Discovery: Automatically discovers and uses available tools from MCP server
Supported LLM Models:
- Anthropic Claude: claude-sonnet-4-20250514
- OpenAI: o4-mini-2025-04-16
- Google Gemini: gemini-2.5-flash-preview-05-20
- Ollama: qwen3:8b (local)
Port: 9090
Prerequisites
- Java 17 or higher
- Gradle
- API keys for cloud LLM providers (optional)
- Ollama installed locally (for local models)
Environment Variables
For the LLM client, set the following environment variables for cloud providers:
export ANTHROPIC_API_KEY=your_anthropic_key
export OPENAI_API_KEY=your_openai_key
export GOOGLE_PROJECT_ID=your_google_project_id
export GOOGLE_ZONE=your_google_zone
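Since cloud providers are optional, you may want to sanity-check which keys are actually present before starting the client. A small helper sketch (not part of the project; requires bash for the `${!var}` indirection):

```shell
# Report which of the given environment variables are unset
# (illustrative helper, not part of the project).
check_env() {
  missing=""
  for var in "$@"; do
    # indirect expansion ${!var} requires bash
    if [ -z "${!var}" ]; then
      missing="$missing $var"
    fi
  done
  echo "missing:$missing"
}

# Example: with only ANTHROPIC_API_KEY set, the others are reported
export ANTHROPIC_API_KEY=dummy
unset OPENAI_API_KEY GOOGLE_PROJECT_ID GOOGLE_ZONE
check_env ANTHROPIC_API_KEY OPENAI_API_KEY GOOGLE_PROJECT_ID GOOGLE_ZONE
# -> missing: OPENAI_API_KEY GOOGLE_PROJECT_ID GOOGLE_ZONE
```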
Quick Start
1. Start the MCP Server
cd sse-mcp-server-demo
./gradlew bootRun
2. Start the LLM Client
cd llm-mcp-client-demo
./gradlew bootRun
3. Access the Chat Interface
Open your browser and navigate to: http://localhost:9090
Usage
- Start both services in the order specified above
- Open the web interface at http://localhost:9090
- Send messages that require mathematical operations or time queries
- Watch the LLM automatically discover and use the available tools
Example queries:
- "What's 5 times 4 plus 7?"
- "What time is it?"
- "Set an alarm for 2025-01-01T10:00:00"
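For the first query, the model is expected to chain two tool calls: multiplyNumbers(5, 4) and then sumNumbers(20, 7). A tiny sketch of that sequence (plain Java mimicking the tool results; the real calls go through MCP):

```java
// Simulates the tool-call sequence an LLM would make for
// "What's 5 times 4 plus 7?" -- first multiply, then add.
public class ToolChain {
    public static int multiplyNumbers(int a, int b) { return a * b; }
    public static int sumNumbers(int a, int b) { return a + b; }

    public static void main(String[] args) {
        int product = multiplyNumbers(5, 4); // first tool call -> 20
        int total = sumNumbers(product, 7);  // second tool call -> 27
        System.out.println(total);           // 27
    }
}
```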
Technology Stack
- Spring Boot 3.x
- Spring AI - LLM integration framework
- Model Context Protocol (MCP) - Tool discovery and execution
- Server-Sent Events (SSE) - Real-time communication
- Thymeleaf - Web templating
- Gradle - Build system
Configuration
Both applications use YAML configuration files:
- MCP server configuration in sse-mcp-server-demo/src/main/resources/application.yml
- LLM client configuration in llm-mcp-client-demo/src/main/resources/application.yml
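As an illustration only, the client-side MCP connection is typically configured along these lines. The exact property names depend on the Spring AI version in use, so treat these keys as assumptions and verify them against the project's actual application.yml:

```yaml
# Hypothetical sketch of an SSE MCP client connection -- check key names
# against the Spring AI version and the project's real application.yml.
spring:
  ai:
    mcp:
      client:
        sse:
          connections:
            math-server:
              url: http://localhost:8080
```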
Development
The project demonstrates:
- MCP Protocol Implementation using Spring AI
- SSE-based Real-time Communication
- Multi-provider LLM Integration
- Tool Calling and Discovery
- Interactive Web Interfaces
This demo serves as a foundation for building more complex LLM-powered applications with external tool capabilities.