
Autonomous MCP GitHub Agent

An autonomous, Agentic AI application that manages your GitHub account through natural language.

Built using the cutting-edge Model Context Protocol (MCP), this project connects an ultra-fast LLM (LLaMA-3 via Groq) to the GitHub API. Instead of just chatting, the AI acts as an autonomous agent: it reasons, remembers conversation context, and executes complex tool calls (like creating repositories, managing files, and searching issues) directly on your GitHub account.

Demo (Note: Replace with your actual GIF path)

Key Features

  • Agentic Reasoning: The AI decides which tools to use and when to use them based on your natural language prompt.
  • Contextual Memory: The agent remembers previous steps in the conversation (e.g., "Create a repo called X... now add a file to that repo").
  • Model Context Protocol (MCP): Utilizes the official @modelcontextprotocol/server-github via local stdio to securely bridge the LLM and GitHub without exposing your token to the internet.
  • Advanced Prompt Engineering: Implements strict XML-based tool calling rules and edge-case handling (e.g., automatically creating .gitkeep files since Git doesn't support empty directories).
  • Interactive UI: A sleek, real-time chat interface built with Streamlit, featuring an action-tracking dropdown to show the AI's internal reasoning and tool execution steps.

Tech Stack

  • AI & LLM: LLaMA-3.3-70B-versatile via Groq API (using the OpenAI Python SDK)
  • Architecture: Agentic AI, Tool Calling, Model Context Protocol (MCP)
  • Backend: Python (asyncio, regex parsing)
  • Frontend: Streamlit
  • Environment: Node.js (npx to run the MCP server)
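
The stdio transport used here works by launching the server as a local child process and exchanging newline-delimited JSON-RPC 2.0 messages over its stdin/stdout, which is why the GitHub token never has to cross the network to a third party. A minimal sketch of the messages involved (the npx command matches the package named in Key Features; method names follow the public MCP specification, but treat the protocol version string as illustrative):

```python
import json

# Command an MCP client would use to launch the GitHub server on demand
# via npx, with no global install required.
SERVER_COMMAND = ["npx", "-y", "@modelcontextprotocol/server-github"]

def make_request(request_id: int, method: str, params: dict) -> bytes:
    """Encode a JSON-RPC 2.0 request as one line, as sent over stdio."""
    message = {"jsonrpc": "2.0", "id": request_id,
               "method": method, "params": params}
    return (json.dumps(message) + "\n").encode()

# First message of the MCP handshake: ask the server to initialize.
init = make_request(1, "initialize", {
    "protocolVersion": "2024-11-05",  # illustrative version string
    "capabilities": {},
    "clientInfo": {"name": "github-agent", "version": "0.1.0"},
})

# After initialization, the client requests the tool schemas that are
# later handed to the LLM:
list_tools = make_request(2, "tools/list", {})
```

In the real client these bytes are written to the child process's stdin, and responses are read back line by line from its stdout.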

Prerequisites

Before running this project, ensure you have the following installed on your machine:

  1. Python 3.10+
  2. Node.js & npm (Required to run the MCP server dynamically via npx)
  3. A Groq API Key (Free at console.groq.com)
  4. A GitHub Personal Access Token (PAT) with repo (Read & Write) permissions.
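
Before continuing, you can quickly confirm the toolchain is in place (a convenience check, not part of the project itself):

```shell
# Verify the toolchain required by this project is installed.
python --version 2>/dev/null || python3 --version   # needs 3.10+
node --version || echo "Node.js missing: install it from nodejs.org"
npx --version  || echo "npx missing: it ships with npm 5.2 and later"
```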

Installation & Setup

1. Clone the repository

git clone https://github.com/BouchraBenGhazala/MCP-Github-Agent.git
cd MCP-Github-Agent

2. Create and activate a virtual environment (Recommended)

python -m venv env
# On Windows:
env\Scripts\activate
# On macOS/Linux:
source env/bin/activate

3. Install Python dependencies

pip install -r requirements.txt

4. Configure environment variables

Create a .env file in the root directory of the project and add your API keys:

GITHUB_PERSONAL_ACCESS_TOKEN=ghp_YOUR_GITHUB_TOKEN_HERE
GROQ_API_KEY=gsk_YOUR_GROQ_API_KEY_HERE

(Make sure to add .env to your .gitignore file so you don't accidentally publish your secrets!)
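
At startup the application needs these values in its process environment. The common choice is the python-dotenv package (`load_dotenv()`), but the idea fits in a few lines of stdlib Python. The variable names below match the .env file above; the loader itself is an illustrative sketch:

```python
import os
from pathlib import Path

def load_env(path: str = ".env") -> None:
    """Minimal .env loader: KEY=VALUE lines, '#' comments ignored."""
    env_file = Path(path)
    if not env_file.exists():
        return
    for line in env_file.read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        # Real environment variables take precedence over .env entries.
        os.environ.setdefault(key.strip(), value.strip())
```

Failing fast on a missing key (rather than letting the Groq or GitHub client raise a confusing authentication error later) is a good habit regardless of which loader you use.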


Usage

Start the Streamlit application by running the following command in your terminal:

streamlit run app.py

This will open the web interface in your default browser.

Example Prompts to try:

  • "What are my latest repositories?"
  • "Create a new private repository called 'mcp-test-project'."
  • "Add a new file named README.md to the 'mcp-test-project' repository saying 'Hello World'."
  • "Search for the 3 latest issues in the langchain-ai/langchain repository."

How it works (Under the hood)

  1. User Input: You ask the agent a question via the Streamlit UI.
  2. System Prompt & Schema: The Python client fetches the available tools from the local Node.js MCP Server and passes them to LLaMA-3 as a JSON schema.
  3. LLM Reasoning: The model analyzes the request and generates an XML-formatted <tool_call> block containing the exact tool name and required arguments.
  4. Execution: Python parses the XML, triggers the local MCP server via asynchronous pipes (stdio), and the server securely communicates with the GitHub API.
  5. Analysis & Response: The raw JSON result from GitHub is fed back into the LLM, which translates it into a clean, human-readable response displayed in the chat.
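
Steps 3 and 4 hinge on pulling the tool call out of the model's free-form reply. The exact tag layout is defined by this project's system prompt, so the format below is an assumption; the regex-based extraction mirrors the "Regex parsing" item in the tech stack, and the tool name is taken from the GitHub MCP server's tool list:

```python
import json
import re

# Assumed format (set by the system prompt): the model embeds
# <tool_call>{"name": ..., "arguments": {...}}</tool_call> in its reply.
TOOL_CALL_RE = re.compile(r"<tool_call>\s*(\{.*?\})\s*</tool_call>", re.DOTALL)

def extract_tool_call(reply: str):
    """Return (tool_name, arguments) if the reply contains a tool call."""
    match = TOOL_CALL_RE.search(reply)
    if match is None:
        return None  # plain chat answer, nothing to execute
    payload = json.loads(match.group(1))
    return payload["name"], payload["arguments"]

reply = (
    "I'll create that repository now.\n"
    '<tool_call>{"name": "create_repository", '
    '"arguments": {"name": "mcp-test-project", "private": true}}</tool_call>'
)
call = extract_tool_call(reply)
```

The parsed name and arguments are then forwarded to the MCP server over the stdio pipes, and the JSON result comes back for step 5.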