MCP Server Demo

A demonstration project showcasing Model Context Protocol (MCP) servers with LangChain integration. This project includes two MCP servers (Math and Weather) and a client that uses LangChain agents to interact with them.

Overview

This project demonstrates how to:

  • Create MCP servers using FastMCP
  • Connect multiple MCP servers to a LangChain agent
  • Use the agent to perform calculations and retrieve information using MCP tools

Features

  • Math Server: Provides mathematical operations (add, multiply)
  • Weather Server: Provides weather information for cities
  • LangChain Agent: Intelligent agent that can use both servers' tools to answer questions

Prerequisites

  • Python 3.12 or higher
  • OpenAI API key (for the LangChain agent)

Installation

  1. Clone the repository:
git clone https://github.com/Sheesikram/MCP_SERVER.git
cd MCP_SERVER
  2. Create a virtual environment:
python -m venv venv
  3. Activate the virtual environment:

Windows (PowerShell):

.\venv\Scripts\Activate.ps1

Windows (CMD):

venv\Scripts\activate.bat

Linux/Mac:

source venv/bin/activate
  4. Install dependencies:
pip install -r requirements.txt
  5. Create a .env file in the project root:
OPENAI_API_KEY=your_openai_api_key_here
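
To confirm the key is picked up at runtime, a small optional check using python-dotenv (already in the dependency list) can look like this; the assertion message is illustrative only:

# Optional sanity check that the API key loads (python-dotenv reads the .env file)
import os
from dotenv import load_dotenv

load_dotenv()  # searches for .env starting from the current working directory
assert os.getenv("OPENAI_API_KEY"), "OPENAI_API_KEY not found; check your .env file"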

Project Structure

MCP_SERVER/
├── mathserver.py      # Math MCP server (stdio transport)
├── weather.py         # Weather MCP server (streamable-http transport)
├── mcp-client.py      # LangChain client that uses both servers
├── requirements.txt   # Python dependencies
├── .gitignore        # Git ignore file
└── README.md         # This file

Usage

Running the Weather Server

In a separate terminal, start the weather server:

python weather.py

The server will be available at http://localhost:8000/mcp using the streamable-http transport.

Running the Math Server

The math server runs automatically when the client connects to it (stdio transport). No separate command needed.

Running the Client

With the weather server running, execute the client:

python mcp-client.py

The client will:

  1. Connect to both MCP servers
  2. Load available tools
  3. Create a LangChain agent
  4. Process queries using the available tools
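
As a rough sketch of that flow, assuming the MultiServerMCPClient API from langchain-mcp-adapters (recent versions, where get_tools() is awaited) and create_react_agent from langgraph; the actual mcp-client.py may be structured differently:

# Rough sketch of the client flow (assumed structure, not a copy of mcp-client.py)
import asyncio
from dotenv import load_dotenv
from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent

load_dotenv()  # OPENAI_API_KEY from .env

async def main():
    # 1. Connect to both MCP servers
    client = MultiServerMCPClient({
        "math": {"command": "python", "args": ["mathserver.py"], "transport": "stdio"},
        "weather": {"url": "http://localhost:8000/mcp", "transport": "streamable_http"},
    })
    # 2. Load available tools
    tools = await client.get_tools()
    # 3. Create a LangChain agent
    agent = create_react_agent(ChatOpenAI(model="gpt-4o", temperature=0), tools)
    # 4. Process a query using the available tools
    result = await agent.ainvoke(
        {"messages": [{"role": "user", "content": "what's (3 + 5) x 12?"}]}
    )
    print(result["messages"][-1].content)

if __name__ == "__main__":
    asyncio.run(main())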

Example Queries

The client can handle questions like:

  • "what's (3 + 5) x 12?" - Uses math tools
  • "what's the weather in faisalabad?" - Uses weather tools

MCP Servers

Math Server (mathserver.py)

Provides mathematical operations:

  • add(a: int, b: int) -> int: Add two numbers
  • multiply(a: int, b: int) -> int: Multiply two numbers

Transport: stdio
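
For reference, a minimal sketch of what a stdio FastMCP server with these two tools can look like (the server name and tool bodies here are assumptions, not a copy of mathserver.py):

# Sketch of a stdio FastMCP math server (assumed layout)
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Math")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

@mcp.tool()
def multiply(a: int, b: int) -> int:
    """Multiply two numbers."""
    return a * b

if __name__ == "__main__":
    # stdio transport: the client spawns this process and talks over stdin/stdout
    mcp.run(transport="stdio")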

Weather Server (weather.py)

Provides weather information:

  • get_weather(city: str) -> str: Get weather for a city

Transport: streamable-http (runs on port 8000)
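
A similar minimal sketch for a streamable-http FastMCP server (the tool body is a placeholder; the actual weather.py may differ):

# Sketch of a streamable-http FastMCP weather server (assumed layout)
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Weather")

@mcp.tool()
def get_weather(city: str) -> str:
    """Get weather for a city."""
    return f"It is always sunny in {city}"  # placeholder; a real server would query a weather API

if __name__ == "__main__":
    # streamable-http transport: served at http://localhost:8000/mcp by default
    mcp.run(transport="streamable-http")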

Configuration

MCP Client Configuration

The client (mcp-client.py) is configured to connect to:

  • Math server: stdio transport (spawns python mathserver.py)
  • Weather server: streamable-http transport at http://localhost:8000/mcp

Model Configuration

The client uses OpenAI's gpt-4o model. You can change this in mcp-client.py:

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o", temperature=0)

Dependencies

  • langchain - LangChain framework
  • langchain-core - Core LangChain components
  • langchain-community - Community integrations
  • langchain_openai - OpenAI integration
  • langchain-mcp-adapters - MCP adapters for LangChain
  • langgraph - LangGraph for agent workflows
  • mcp - Model Context Protocol
  • python-dotenv - Environment variable management

Troubleshooting

Weather server not connecting

Make sure the weather server is running before starting the client:

python weather.py

Transport errors

  • The math server connection uses the "stdio" transport
  • In the client configuration, the weather server connection uses "streamable_http" (note the underscore); the server itself is started with the "streamable-http" transport (hyphen)

OpenAI API errors

  • Ensure your .env file contains a valid OPENAI_API_KEY
  • Check that you have sufficient API credits

License

MIT License - see LICENSE file for details.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

Author

Shees Ikram
