Better Mem0 MCP

MCP server by n24q02m. Created 1/2/2026; updated about 1 month ago.
better-mem0-mcp

[!CAUTION] PROJECT DISCONTINUED — The replacement solution is being developed at mnemo-mcp.

Self-hosted MCP Server for AI memory with PostgreSQL (pgvector).



Discontinuation Notice

The better-mem0-mcp project has been discontinued as of February 2026.

The replacement solution is being developed at mnemo-mcp — a lightweight, self-hosted MCP server for persistent AI memory.


Legacy Documentation

[!NOTE] The content below is kept for reference. The final version on PyPI/Docker still works but will not receive any new updates.

Features

  • Self-hosted PostgreSQL - Your data stays with you (Neon/Supabase free tier supported)
  • Graph Memory - SQL-based relationship tracking alongside vector memory
  • Multi-provider LLM - Gemini, OpenAI, Anthropic, Groq, DeepSeek, Mistral
  • Fallback chains - Multi-key per provider + multi-model fallback
  • Zero manual setup - Just DATABASE_URL + API_KEYS

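The fallback behaviour described above can be sketched roughly as follows — a hypothetical `try_models` helper (illustrative only, not the server's actual code) that walks a model chain and returns the first successful completion:

```python
# Illustrative sketch of a model fallback chain.
# Hypothetical helper — not the actual better-mem0-mcp implementation.

def try_models(models, call):
    """Try each model in order; return (model, result) for the first success."""
    last_error = None
    for model in models:
        try:
            return model, call(model)
        except Exception as exc:  # e.g. rate limit, auth failure
            last_error = exc      # remember why this model failed
    raise RuntimeError(f"all models failed: {last_error}")

# Example: the first model is unavailable, the second succeeds.
def fake_call(model):
    if model == "gemini/gemini-3-flash-preview":
        raise TimeoutError("rate limited")
    return f"response from {model}"

model, result = try_models(
    ["gemini/gemini-3-flash-preview", "openai/gpt-4o-mini"],
    fake_call,
)
print(model, "->", result)
```

The same pattern applies per key within a provider: exhaust each key before moving on to the next model in the chain.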
Quick Start (Legacy)

1. Get prerequisites: a PostgreSQL database with the pgvector extension (a Neon or Supabase free tier works) and at least one LLM API key

2. Add to mcp.json

uvx (Recommended)

{
  "mcpServers": {
    "better-mem0": {
      "command": "uvx",
      "args": ["better-mem0-mcp@latest"],
      "env": {
        "DATABASE_URL": "postgresql://user:pass@xxx.neon.tech/neondb?sslmode=require",
        "API_KEYS": "GOOGLE_API_KEY:AIza..."
      }
    }
  }
}

Docker

{
  "mcpServers": {
    "better-mem0": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "-e", "DATABASE_URL", "-e", "API_KEYS", "n24q02m/better-mem0-mcp:latest"],
      "env": {
        "DATABASE_URL": "postgresql://...",
        "API_KEYS": "GOOGLE_API_KEY:AIza..."
      }
    }
  }
}

3. Done!

Ask your AI: "Remember that I prefer dark mode and use FastAPI"

Configuration (Legacy)

| Variable | Required | Description |
|----------|----------|-------------|
| DATABASE_URL | Yes | PostgreSQL with pgvector extension |
| API_KEYS | Yes | ENV_VAR:key pairs, comma-separated |
| LLM_MODELS | No | Model fallback chain |
| EMBEDDER_MODELS | No | Embedding model chain |
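A startup check for the two required variables might look like this minimal sketch (assumed logic, not the server's actual validation code):

```python
# Hypothetical startup check for required configuration (illustrative only).

REQUIRED = ("DATABASE_URL", "API_KEYS")

def check_config(env):
    """Return the names of required variables missing from an env mapping."""
    return [name for name in REQUIRED if not env.get(name)]

# DATABASE_URL is set, API_KEYS is not:
missing = check_config({"DATABASE_URL": "postgresql://user:pass@host/db"})
print("missing:", missing)
```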

Supported LiteLLM Providers

Use environment variable names from LiteLLM docs: GOOGLE_API_KEY, OPENAI_API_KEY, ANTHROPIC_API_KEY, GROQ_API_KEY, etc.

Single provider:

API_KEYS=GOOGLE_API_KEY:AIza...

Multi-key with fallback:

API_KEYS=GOOGLE_API_KEY:AIza-1,GOOGLE_API_KEY:AIza-2,OPENAI_API_KEY:sk-xxx
LLM_MODELS=gemini/gemini-3-flash-preview,openai/gpt-4o-mini
EMBEDDER_MODELS=gemini/gemini-embedding-001,openai/text-embedding-3-small
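Given the format above (comma-separated `ENV_VAR:key` pairs, with the same provider allowed more than once), parsing might look like this hypothetical sketch — the real parser in better-mem0-mcp may differ:

```python
# Parse an API_KEYS string into per-provider key lists (illustrative sketch).

def parse_api_keys(raw):
    keys = {}
    for pair in raw.split(","):
        env_var, _, key = pair.strip().partition(":")
        keys.setdefault(env_var, []).append(key)  # multiple keys per provider
    return keys

# Placeholder keys, not real credentials:
parsed = parse_api_keys(
    "GOOGLE_API_KEY:key-1,GOOGLE_API_KEY:key-2,OPENAI_API_KEY:sk-demo"
)
print(parsed)
```

Keys listed first for a provider would be tried first, which is what makes the multi-key fallback order deterministic.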

Defaults

| Setting | Default |
|---------|---------|
| LLM_MODELS | gemini/gemini-3-flash-preview |
| EMBEDDER_MODELS | gemini/gemini-embedding-001 |

Tools (Legacy)

| Tool | Description |
|------|-------------|
| memory | Memory operations: add, search, list, delete |
| help | Get full documentation for tools |

Usage Examples

{"action": "add", "content": "I prefer TypeScript over JavaScript"}
{"action": "search", "query": "programming preferences"}
{"action": "list"}
{"action": "delete", "memory_id": "abc123"}

Build from Source (Legacy)

git clone https://github.com/n24q02m/better-mem0-mcp
cd better-mem0-mcp

# Setup (requires mise: https://mise.jdx.dev/)
mise run setup

# Run
uv run better-mem0-mcp

Requirements: Python 3.13+


License

MIT - See LICENSE
