# 🧠 Ask ChatGPT - MCP Server (Stdio)

This is a Model Context Protocol (MCP) stdio server that forwards prompts to OpenAI's ChatGPT (GPT-4o). It is designed to run inside LangGraph-based assistants and enables advanced summarization, analysis, and reasoning by accessing an external LLM.

## 📌 What It Does

This server exposes a single tool:

```json
{
  "name": "ask_chatgpt",
  "description": "Sends the provided text ('content') to an external ChatGPT (gpt-4o) model for advanced reasoning or summarization.",
  "parameters": {
    "type": "object",
    "properties": {
      "content": {
        "type": "string",
        "description": "The text to analyze, summarize, compare, or reason about."
      }
    },
    "required": ["content"]
  }
}
```
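
For example, an assistant that wants a comparison would issue a `tools/call` request like the following (the same simplified envelope used in the manual test below; the prompt text is only illustrative):

```json
{
  "method": "tools/call",
  "params": {
    "name": "ask_chatgpt",
    "arguments": {
      "content": "Compare OSPF and EIGRP for a small campus network."
    }
  }
}
```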

Use this when your assistant needs to:

- Summarize long documents
- Analyze configuration files
- Compare options
- Perform advanced natural language reasoning

## 🐳 Docker Usage

Build and run the container:

```bash
docker build -t ask-chatgpt-mcp .

docker run -e OPENAI_API_KEY=your-openai-key -i ask-chatgpt-mcp
```
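
The repository ships its own Dockerfile; if you want a rough idea of what such an image involves, a minimal sketch might look like this (the base image and layout are assumptions, not taken from the actual Dockerfile):

```dockerfile
# Minimal sketch only; the real Dockerfile in this repo may differ.
FROM python:3.11-slim
WORKDIR /app
# Runtime dependencies listed in the Dependencies section below
RUN pip install --no-cache-dir openai requests python-dotenv
COPY server.py .
# Stdio transport: the MCP client talks to the container over stdin/stdout (hence `docker run -i`)
ENTRYPOINT ["python3", "server.py"]
```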

## 🧪 Manual Test

Test the server locally using a one-shot request:

```bash
echo '{"method":"tools/call","params":{"name":"ask_chatgpt","arguments":{"content":"Summarize this config..."}}}' | \
  OPENAI_API_KEY=your-openai-key python3 server.py --oneshot
```
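
The exact response envelope depends on how server.py formats its output, but for a successful call you should see a single JSON object on stdout carrying the model's text, roughly along these lines:

```json
{
  "content": [
    { "type": "text", "text": "This configuration defines ..." }
  ]
}
```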

## 🧩 LangGraph Integration

To connect this MCP server to your LangGraph pipeline, configure it like this:

```python
("chatgpt-mcp", ["python3", "server.py", "--oneshot"], "tools/discover", "tools/call")
```

βš™οΈ MCP Server Config Example

Here's how to configure the server using an mcpServers JSON config:

```json
{
  "mcpServers": {
    "chatgpt": {
      "command": "python3",
      "args": [
        "server.py",
        "--oneshot"
      ],
      "env": {
        "OPENAI_API_KEY": "<YOUR_OPENAI_API_KEY>"
      }
    }
  }
}
```

πŸ” Explanation

"command": Runs the script with Python

"args": Enables one-shot stdin/stdout mode

"env": Injects your OpenAI key securely

## 🌍 Environment Setup

Create a `.env` file (auto-loaded with python-dotenv) or export the key manually:

```
OPENAI_API_KEY=your-openai-key
```

Or:

```bash
export OPENAI_API_KEY=your-openai-key
```
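
Inside server.py, picking up the key with python-dotenv presumably amounts to something like this (a sketch, not the actual code):

```python
import os
from dotenv import load_dotenv

load_dotenv()  # reads .env from the working directory, if present
api_key = os.environ["OPENAI_API_KEY"]  # fails early if the key is missing
```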

## 📦 Dependencies

Installed during the Docker build:

- openai
- requests
- python-dotenv
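
If you run the server outside Docker, install the same packages manually:

```bash
pip install openai requests python-dotenv
```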

πŸ“ Project Structure

```text
.
├── Dockerfile        # Docker build for the MCP server
├── server.py         # Main stdio server implementation
└── README.md         # You're reading it!
```
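
server.py carries the whole protocol. As a rough illustration only (not the actual implementation), a one-shot stdio handler built on the OpenAI Python SDK could be structured like this:

```python
import json
import os
import sys

from dotenv import load_dotenv
from openai import OpenAI

load_dotenv()
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

TOOLS = [{
    "name": "ask_chatgpt",
    "description": "Sends the provided text ('content') to gpt-4o for advanced reasoning or summarization.",
    "parameters": {
        "type": "object",
        "properties": {"content": {"type": "string"}},
        "required": ["content"],
    },
}]

def handle(request: dict) -> dict:
    """Dispatch a single MCP-style request."""
    method = request.get("method")
    if method == "tools/discover":
        return {"tools": TOOLS}
    if method == "tools/call":
        content = request["params"]["arguments"]["content"]
        completion = client.chat.completions.create(
            model="gpt-4o",
            messages=[{"role": "user", "content": content}],
        )
        return {"content": [{"type": "text", "text": completion.choices[0].message.content}]}
    return {"error": f"Unknown method: {method}"}

if __name__ == "__main__":
    # --oneshot mode: read one JSON request from stdin, write one JSON response to stdout
    request = json.loads(sys.stdin.read())
    print(json.dumps(handle(request)))
```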

πŸ” Security Notes

- Never commit `.env` files or API keys.
- Store secrets in secure environment variables or secret managers.

## Quick Setup

Installation guide for this server.

Install the package (if required):

```bash
uvx chatGPT_MCP
```

Cursor configuration (`mcp.json`):

```json
{
  "mcpServers": {
    "automateyournetwork-chatgpt-mcp": {
      "command": "uvx",
      "args": [
        "chatGPT_MCP"
      ]
    }
  }
}
```