Local Memory MCP

Local-first personal RAG memory system for AI assistants via MCP. Stores text-first chunks with lightweight metadata, supports versioned updates and retrieval, and runs fully self-hosted with user-controlled data. Designed for practical context continuity, not rigid schemas or SaaS workflows.

Created 3/12/2026

Local Memory MCP v1

What Is This?

Local Memory MCP v1 is a local-first personal RAG memory system for AI assistants.
It stores text chunks plus lightweight metadata in a local ChromaDB, then exposes MCP tools so a new LLM session can quickly recover user context.

This project is built for technical users who want to self-host and control their own data. It is not a SaaS product.

AIX Philosophy

AIX (AI eXperience) means designing for how LLMs actually consume context:

  • Prefer clear text chunks over rigid document schemas.
  • Keep metadata minimal but useful: timestamps, confidence, supersedes links, deprecation flags.
  • Preserve history with version chains instead of destructive overwrites.
  • Return warning-rich tool responses so the model can self-correct write behavior.

The goal is practical retrieval quality and reliable AI behavior, not perfect human taxonomies.
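The metadata bullet above can be pictured as a minimal chunk record. This is an illustrative sketch only; the field names here (confidence, supersedes, deprecated) mirror the bullets, not the server's actual schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class MemoryChunk:
    """One text-first memory chunk with lightweight metadata (illustrative)."""
    text: str
    created_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )
    confidence: float = 1.0           # how much the writer trusts this fact
    supersedes: Optional[str] = None  # id of the chunk this one replaces
    deprecated: bool = False          # hidden from default retrieval

chunk = MemoryChunk(text="Weekday focus block is 6:30-9:00 AM.")
```

The point is how little metadata is needed: a timestamp, a trust signal, a version link, and a visibility flag, with the text itself carrying the meaning.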

How It Works

[Assistant via MCP Client]
            |
            v
[run_mcp_v1_stdio.py | run_mcp_v1_http_sse.py]
            |
            v
      [src/mcp_server_v1.py]
        /          |          \
       v           v           v
[src/vector_store.py] [src/reconciliation.py] [src/health_monitor.py]
       |                   |
       v                   v
 [Local ChromaDB]   [Reconciliation Log Collection]

Write path:

  1. store or update writes a chunk.
  2. Reconciliation checks for overlap/conflict signals.
  3. The system returns warnings/self-heal hints when writes look risky.

Read path:

  1. search runs semantic retrieval.
  2. Ranking blends similarity with lightweight lexical/recency signals.
  3. Deprecated chunks stay hidden by default unless explicitly requested.
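As a rough sketch of step 2 of the read path, here is one way a similarity/recency blend could look. The decay half-life and weight are assumptions for illustration, not the heuristic the server actually uses:

```python
import math
import time

def blended_score(similarity: float, updated_at: float, now: float,
                  half_life_days: float = 30.0,
                  recency_weight: float = 0.2) -> float:
    """Blend semantic similarity with an exponential recency decay."""
    age_days = max(0.0, (now - updated_at) / 86400.0)
    # Recency halves every `half_life_days`: 1.0 when fresh, 0.5 at one half-life.
    recency = math.exp(-math.log(2) * age_days / half_life_days)
    return (1 - recency_weight) * similarity + recency_weight * recency

now = time.time()
fresh = blended_score(0.80, now, now)                # just updated
stale = blended_score(0.80, now - 90 * 86400, now)   # 90 days old
```

With these assumed constants, an equally similar chunk that is 90 days old scores noticeably lower than a fresh one, which is the behavior the read path describes.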

Features

Current v1 capabilities:

  • MCP tools for store, search, update, delete, get_chunk, get_evolution_chain.
  • Versioned updates (strategy="version") with supersedes chains.
  • Soft delete by default (history retained), optional hard delete.
  • Heuristic reconciliation and conflict logging.
  • Warning-first write responses with structured warnings[] and self-heal fields.
  • Health checks for oversized chunks and unresolved conflicts.
  • Local backup/restore commands for the persisted vector DB.
  • Stdio transport and SSE transport for MCP clients.
  • Optional auth modes for SSE: none (local-only), bearer, or oauth.
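To illustrate the supersedes chains behind get_evolution_chain, here is a hedged sketch of walking a version chain over plain dicts; the real tool operates on stored chunks, not this toy structure:

```python
def evolution_chain(chunks: dict, chunk_id: str) -> list:
    """Follow supersedes links from a chunk back to its oldest version."""
    chain = []
    current = chunk_id
    seen = set()
    while current is not None and current not in seen:
        seen.add(current)  # guard against accidental cycles
        chain.append(current)
        current = chunks[current].get("supersedes")
    return chain

# Three versions of the same fact, newest first:
chunks = {
    "c3": {"text": "Focus block is 7:00-9:30 AM.", "supersedes": "c2"},
    "c2": {"text": "Focus block is 6:30-9:00 AM.", "supersedes": "c1"},
    "c1": {"text": "Focus block is 6:00-8:00 AM.", "supersedes": None},
}
```

Because updates with strategy="version" append rather than overwrite, the full history stays recoverable by walking the chain.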

Quickstart

Use one of the two paths below. If you are unsure, choose Path A (Docker).

Path A (Recommended): Docker

Best for most users. This avoids local Python/venv setup.

Prerequisite:

  • Docker Desktop (or Docker Engine) is installed and running.
  1. Clone the repo:
git clone <your-repo-url>
cd local-memory-mcp
  2. Start the service:
docker compose up --build -d
  3. Verify endpoints:
  • http://localhost:8000/mcp
  • http://localhost:8000/sse
  • http://localhost:8000/messages/
  • http://localhost:8000/health
  4. Stop when finished:
docker compose down

For config mounts, volumes, and stdio-in-Docker details, see docs/docker.md.
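If you want to script the endpoint check from step 3, a minimal sketch follows. It assumes the /health endpoint returns JSON with a status field; the actual payload may differ:

```python
import json
from urllib.request import urlopen

def is_healthy(payload: dict) -> bool:
    """Interpret an assumed {'status': 'ok'} health payload."""
    return payload.get("status") == "ok"

def check_server(url: str = "http://localhost:8000/health") -> bool:
    """Fetch the health endpoint; returns False if the server is unreachable."""
    try:
        with urlopen(url, timeout=5) as resp:
            return is_healthy(json.loads(resp.read()))
    except OSError:
        return False
```

Running check_server() after docker compose up gives a quick yes/no before wiring up an MCP client.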

Path B: Local Python Install

Use this when you want direct local Python control and stdio-first desktop workflows.

Prerequisites:

  • Python 3.11+
  • pip
  • Windows PowerShell or a POSIX shell
  1. Clone and install dependencies:
git clone <your-repo-url>
cd local-memory-mcp
python -m venv .venv

Windows:

.\.venv\Scripts\Activate.ps1
pip install -r requirements.txt

macOS/Linux:

source .venv/bin/activate
pip install -r requirements.txt
  2. Ensure the embedding model is available locally (one-time):
python -c "from sentence_transformers import SentenceTransformer; SentenceTransformer('all-MiniLM-L6-v2')"
  3. Optional: create a local config override (kept out of git):

Windows PowerShell:

Copy-Item config.example.json config.json

macOS/Linux:

cp config.example.json config.json

If you skip this, built-in defaults are used (local-first, MCP_AUTH_MODE=none).
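As a rough idea of what a local override might contain; the keys below are illustrative only, so consult config.example.json for the real schema:

```json
{
  "persist_directory": "./chroma_db",
  "MCP_AUTH_MODE": "none"
}
```

Only the ./chroma_db persist location and MCP_AUTH_MODE=none default are documented above; any other settings should be copied from config.example.json rather than guessed.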

  4. Run a direct local verification (no MCP client required yet):
python examples/try_local.py

This performs one write and one retrieval, and creates ./chroma_db automatically on first write.

  5. Run MCP over stdio (recommended starting point for local runtime):
python run_mcp_v1_stdio.py
  6. Optional: run the SSE server:
python run_mcp_v1_http_sse.py
  7. Optional: expose SSE through an external relay workspace:
  • Keep relay scripts/config in a separate folder outside this repository.
  • Point the relay to http://localhost:8000.

Example Workflow

1. Store memory

tool: store
input: {
  "text": "Example user context: weekday focus block is 6:30-9:00 AM as the current default schedule."
}

2. Retrieve memory

tool: search
input: {
  "query": "current deep work schedule",
  "top_k": 5
}

3. Bootstrap a new LLM session

Use a focused retrieval pass, then summarize:

search("current work schedule and constraints")
search("active priorities this month")
search("current preferences and hard boundaries")

Then synthesize only active, non-deprecated chunks into a short session brief for the new model instance.
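The three-search bootstrap above could be scripted roughly like this. Here, search is any callable standing in for the MCP search tool, and the hit fields (id, text, deprecated) are assumed shapes for illustration:

```python
def session_brief(search, queries, top_k=5):
    """Run several focused searches and keep only active, non-deprecated chunks."""
    seen, lines = set(), []
    for q in queries:
        for hit in search(q, top_k=top_k):
            if hit.get("deprecated") or hit["id"] in seen:
                continue  # skip deprecated and duplicate chunks
            seen.add(hit["id"])
            lines.append(f"- {hit['text']}")
    return "Session brief:\n" + "\n".join(lines)

# A stub client standing in for the real MCP search tool:
def fake_search(query, top_k=5):
    return [
        {"id": "c1", "text": "Focus block: 6:30-9:00 AM.", "deprecated": False},
        {"id": "c0", "text": "Old schedule.", "deprecated": True},
    ]
```

Deduplicating by chunk id keeps the brief short even when the focused queries overlap in what they retrieve.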

More sample chunks and retrieval flows are in examples/.

Local Data And Git Hygiene

  • Real memory data is stored in ./chroma_db by default and is generated locally at runtime.
  • Local DB and backup folders are ignored by git (chroma_db/, backups/, and common DB file extensions).
  • config.json is treated as local machine config and is ignored by git.
  • Keep commit-safe templates in config.example.json and .env.example.

Privacy And Deployment

  • Default posture is local-first and user-controlled.
  • Data is stored in local ChromaDB files under the configured persist directory.
  • The server itself does not require a cloud backend.
  • Optional remote access is available through user-managed tunneling.
  • Do not commit real secrets. Use local config/env values for auth credentials.

Open Source Status

This is an early but usable v1 release.

  • Stable enough for personal self-hosted workflows.
  • APIs and internal heuristics may still change between minor versions.
  • Some rough edges are documented in docs/limitations.md.

Build Provenance

This v1 release was developed with an AI-assisted coding workflow. Product direction, constraints, and final implementation decisions were led by Patrick Tobey.

Documentation

Contributing

See CONTRIBUTING.md.

License

MIT. See LICENSE.

Quick Setup
Installation guide for this server

Install Package (if required)

uvx local-memory-mcp

Cursor configuration (mcp.json)

{
  "mcpServers": {
    "ptobey-local-memory-mcp": {
      "command": "uvx",
      "args": ["local-memory-mcp"]
    }
  }
}