
Robust async Python REPL MCP with variable persistence, package installation, background workers, stdin input, and more

Created 3/15/2026

mcp-async-repl

Async Python REPL + Shell execution for MCP. Persistent state, background jobs, and interactive input() bridging — in a single server.

The only MCP server that combines a crash-isolated persistent Python worker, true async shell execution, stdin interaction, and job lifecycle management in one package.

What makes this different

Most Python REPL MCP servers run exec() directly in the server process and block on long-running code. This one doesn't.

  • Crash-isolated Python worker — runs in a separate subprocess with a JSON protocol over stdin/stdout. If your code segfaults, the MCP server stays alive.
  • True async from the start — shell and Python execution that runs longer than ~2s moves to the background and returns a job ID. Poll with the status tools; the server never blocks.
  • input() bridging — user code can call input("prompt") and the server surfaces a waiting_input status. Feed text via send_python_input() to unblock it. No other REPL MCP has this.
  • Persistent state — variables survive across execute_python() calls until you restart the worker.
  • Shell + REPL in one server — async shell commands with stdin support alongside the persistent Python worker. No need for two separate MCP servers.
  • Auto venv via uv — creates and manages a .venv automatically. No Docker required.
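The "return a job ID after 2 seconds" pattern can be sketched in a few lines of plain Python. This is an illustrative toy, not the server's actual internals; the names `run_async`, `JOBS`, and `check_status` are made up for the example.

```python
import threading
import time
import uuid

JOBS = {}  # job_id -> job record (illustrative in-memory registry)

def run_async(fn, timeout=2.0):
    """Run fn in a background thread. Return its result if it finishes
    within `timeout` seconds; otherwise hand back a job ID to poll."""
    job_id = str(uuid.uuid4())
    job = {"status": "running", "result": None}
    JOBS[job_id] = job

    def target():
        job["result"] = fn()
        job["status"] = "completed"

    thread = threading.Thread(target=target, daemon=True)
    thread.start()
    thread.join(timeout)  # wait up to `timeout`, then give up and background
    if job["status"] == "completed":
        return {"status": "completed", "result": job["result"]}
    return {"status": "running", "job_id": job_id}

def check_status(job_id):
    return JOBS[job_id]["status"]

fast = run_async(lambda: 1 + 1)          # finishes well under 2 s
slow = run_async(lambda: time.sleep(3))  # backgrounds with a job ID
print(fast["status"], slow["status"])    # fast completed, slow still running
```

The real server applies the same idea to both the shell and the Python worker, so the caller always gets a response within roughly two seconds.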

Install

# Run directly with uvx (no install needed)
uvx mcp-async-repl

# Or install from PyPI
pip install mcp-async-repl

Configuration

Add to your MCP client config:

{
  "mcpServers": {
    "async-repl": {
      "command": "uvx",
      "args": ["mcp-async-repl"]
    }
  }
}

With a custom venv path:

{
  "mcpServers": {
    "async-repl": {
      "command": "uvx",
      "args": ["mcp-async-repl"],
      "env": {
        "REPL_VENV_DIR": "/path/to/your/venv"
      }
    }
  }
}

Claude Desktop

Add to claude_desktop_config.json:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json

Claude Code

claude mcp add async-repl uvx mcp-async-repl

Cursor

Add to .cursor/mcp.json or ~/.cursor/mcp.json.

Tools (9 total)

Python REPL

| Tool | Description |
|------|-------------|
| execute_python | Execute code in the persistent worker. Backgrounds and returns a job ID if it runs longer than 2s. Supports input(). |
| send_python_input | Feed text to a pending input() call in the worker. |
| check_python_status | Poll status, stdout, stderr, and input prompts for the current job. |
| list_variables | Inspect all variables in the persistent session. |
| restart_python_worker | Kill and restart the worker. Clears all state. |

Async Shell

| Tool | Description |
|------|-------------|
| execute_command_async | Run any shell command in the background. Returns a job ID. |
| send_command_input | Send stdin text to a running shell job (e.g. y\n, passwords). |
| check_job_status | Poll status and output of a background shell job. |
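A minimal sketch of how these three shell tools can fit together, assuming an asyncio subprocess per job tracked under a job ID. The function names mirror the tools above, but the bodies are illustrative, not the server's code.

```python
import asyncio
import itertools

_ids = itertools.count(1)
JOBS = {}  # job_id -> asyncio subprocess (illustrative registry)

async def execute_command_async(cmd):
    # Each command becomes a background subprocess with piped stdin/stdout
    proc = await asyncio.create_subprocess_shell(
        cmd,
        stdin=asyncio.subprocess.PIPE,
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE,
    )
    job_id = f"job-{next(_ids)}"
    JOBS[job_id] = proc
    return job_id

async def send_command_input(job_id, text):
    # Write to the job's stdin pipe (e.g. "y\n" for a confirm prompt)
    proc = JOBS[job_id]
    proc.stdin.write(text.encode())
    await proc.stdin.drain()
    proc.stdin.close()

async def check_job_status(job_id):
    # Non-blocking poll: report running, or collect output once finished
    proc = JOBS[job_id]
    if proc.returncode is None:
        return {"status": "running"}
    out = await proc.stdout.read()
    return {"status": "completed", "stdout": out.decode()}

async def main():
    # `cat` echoes whatever it receives on stdin, then exits on EOF
    job = await execute_command_async("cat")
    await send_command_input(job, "hello\n")
    await asyncio.sleep(0.5)
    print(await check_job_status(job))

asyncio.run(main())
```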

Package Management

| Tool | Description |
|------|-------------|
| install_package_async | Install a pip package via uv in the background. |

Example session

Full 8-step test run, executed by an LLM agent via Mux-Swarm:

> Test the Python REPL MCP by running these steps in order:
>
> 1. Execute: x = 42; print(f"x is {x}")
> 2. Execute: y = x * 2; print(f"y is {y}")
> 3. List variables
> 4. Execute: name = input("Enter your name: ")
> 5. Send python input: Jonathan
> 6. Execute: print(f"Hello {name}, x={x}")
> 7. Execute: import time; [print(f"tick {i}") for i in range(5) if not time.sleep(1)]
> 8. Restart the python worker, then list variables

── MuxAgent ────────────────────────────────────────────────────────────

## Step 1: Execute initial code with state and stdout
✅ Step 1 passed — State persisted, stdout captured: `x is 42`

## Step 2: Verify x persisted from step 1
✅ Step 2 passed — `x` persisted correctly: `y is 84`

## Step 3: List variables
✅ Step 3 passed — Variables `x` and `y` are present

## Step 4: Execute code with input() — should return waiting_input status
✅ Step 4 passed — Correctly returned `waiting_input` status with prompt: `'Enter your name: '`

## Step 5: Send python input to unblock
✅ Step 5 passed — Input delivered, status changed to `completed`

## Step 6: Verify name persisted alongside x and y
✅ Step 6 passed — All variables persisted: `Hello Jonathan, x=42`

## Step 7: Execute long-running code — should timeout to background
✅ Step 7a passed — Timed out to background as expected.
✅ Step 7b passed — Background job completed with all ticks: `tick 0` through `tick 4`

## Step 8: Restart python worker, then list variables
✅ Step 8 passed — Worker restarted, variables cleared (empty list returned)

───────────────────────────────────────────────────────────────────────

Summary: All Tests Passed ✅

| Step | Test                                      | Result                            |
|------|-------------------------------------------|-----------------------------------|
| 1    | State persistence + stdout capture        | ✅ `x is 42`                       |
| 2    | Variable persistence across calls         | ✅ `y is 84` (x=42 persisted)      |
| 3    | List variables                            | ✅ `x`, `y` found                  |
| 4    | input() returns waiting_input             | ✅ Status: `waiting_input`          |
| 5    | Send input unblocks                       | ✅ Status: `completed`              |
| 6    | All variables persist together            | ✅ `Hello Jonathan, x=42`           |
| 7    | Long-running → background + status check  | ✅ All ticks captured               |
| 8    | Restart clears state                      | ✅ Empty variable list              |

Architecture

┌─────────────────────────────────────────────────────┐
│                   MCP Server                        │
│               (mcp-async-repl)                      │
│                                                     │
│  ┌──────────────────┐   ┌─────────────────────────┐ │
│  │  Shell Jobs      │   │  Python Worker Manager  │ │
│  │                  │   │                         │ │
│  │  execute_command │   │  ┌───────────────────┐  │ │
│  │  send_input      │   │  │ Worker Subprocess │  │ │
│  │  check_status    │   │  │                   │  │ │
│  │                  │   │  │  Main Thread:     │  │ │
│  │  Each command    │   │  │   stdin loop      │  │ │
│  │  runs as an      │   │  │   (never blocks)  │  │ │
│  │  async subprocess│   │  │                   │  │ │
│  │  with job ID     │   │  │  Exec Thread:     │  │ │
│  │                  │   │  │   runs exec()     │  │ │
│  │                  │   │  │   in daemon       │  │ │
│  │                  │   │  │   thread          │  │ │
│  │                  │   │  │                   │  │ │
│  │                  │   │  │  Input Queue:     │  │ │
│  │                  │   │  │   bridges input() │  │ │
│  │                  │   │  │   calls to host   │  │ │
│  └──────────────────┘   │  └───────────────────┘  │ │
│                         └─────────────────────────┘ │
└─────────────────────────────────────────────────────┘

The Python worker runs in a separate subprocess communicating via JSON over stdin/stdout. Inside the worker:

  • Main thread reads commands from stdin and dispatches them. Never blocks.
  • Exec thread runs exec() in a daemon thread so input() calls don't starve the command loop.
  • Input queue bridges send_python_input() from the host to builtins.input() inside user code. When code calls input("prompt"), the worker emits an input_request message and blocks on the queue. The host pushes text onto the queue to unblock it.
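The queue-based input() bridge can be demonstrated in isolation. This is a toy reduction of the mechanism described above, with `messages` standing in for the JSON messages a real worker would emit over stdout; the variable names are made up for the example.

```python
import builtins
import queue
import threading

input_queue = queue.Queue()
messages = []  # stand-in for JSON messages emitted to the host over stdout

def bridged_input(prompt=""):
    # Emit an input_request "message", then block until the host answers
    messages.append({"type": "input_request", "prompt": prompt})
    return input_queue.get()

builtins.input = bridged_input  # user code keeps calling plain input()

result = {}

def user_code():
    # Runs in a daemon thread so the command loop is never starved
    result["name"] = input("Enter your name: ")

exec_thread = threading.Thread(target=user_code, daemon=True)
exec_thread.start()

# Host side: sees the input_request and pushes text to unblock the worker
input_queue.put("Jonathan")
exec_thread.join()
print(result["name"])  # prints: Jonathan
```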

This means:

  • Long-running code doesn't block the MCP server
  • input() works without a PTY
  • A crash in user code kills the worker, not the server
  • The worker can be restarted cleanly at any time
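Crash isolation follows directly from the subprocess boundary, and a toy version is easy to demonstrate. This is not the server's code, just a sketch of the principle: a hard crash in the child kills only the child, and the host can start a fresh one.

```python
import subprocess
import sys

def start_worker(code):
    # Hypothetical helper: launch a "worker" as a separate Python process
    return subprocess.Popen([sys.executable, "-c", code])

# User code that hard-crashes the interpreter (bypasses try/except)
worker = start_worker("import os; os._exit(139)")
worker.wait()
print("worker exited with", worker.returncode, "- host still alive")

# The host simply restarts a clean worker; all previous state is gone
worker = start_worker("print('fresh worker')")
worker.wait()
```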

Environment variables

| Variable | Default | Description |
|----------|---------|-------------|
| REPL_VENV_DIR | .venv | Path to the virtual environment. Created automatically via uv if it doesn't exist. |

How it compares

| Feature | mcp-async-repl | mcp-python (hdresearch) | mcp-python-repl (soufiane-aazizi) | mcp-background-job (dylan-gluck) |
|---------|:-:|:-:|:-:|:-:|
| Persistent Python state | ✅ | ✅ | ✅ | ❌ |
| Async (non-blocking) execution | ✅ | ❌ | ❌ | ✅ |
| Background job polling | ✅ | ❌ | ❌ | ✅ |
| input() bridging | ✅ | ❌ | ❌ | ❌ |
| Shell command execution | ✅ | ❌ | ❌ | ✅ |
| Shell stdin interaction | ✅ | ❌ | ❌ | ✅ |
| Crash-isolated worker | ✅ | ❌ | ❌ | N/A |
| Variable introspection | ✅ | ✅ | ✅ | ❌ |
| Package install tool | ✅ | ✅ | ❌ | ❌ |
| Auto venv (no Docker) | ✅ | ❌ | ✅ | ✅ |

Requirements

  • Python >= 3.10
  • uv (for venv creation and package installs)
  • mcp (FastMCP)

License

Apache License 2.0

Author

Jonathan Bankston (@jnotsknab)

Built for Mux-Swarm — a CLI-native multi-agent runtime.


Cursor configuration (mcp.json)

{
  "mcpServers": {
    "jnotsknab-mcp-async-repl": {
      "command": "uvx",
      "args": ["mcp-async-repl"]
    }
  }
}