# better-mem0-mcp
> [!CAUTION]
> **PROJECT DISCONTINUED.** The replacement solution is being developed at **mnemo-mcp**.
Self-hosted MCP Server for AI memory with PostgreSQL (pgvector).
## Discontinuation Notice
The better-mem0-mcp project has been discontinued as of February 2026.
The replacement solution is being developed at mnemo-mcp — a lightweight, self-hosted MCP server for persistent AI memory.
## Legacy Documentation
> [!NOTE]
> The content below is kept for reference. The final release on PyPI/Docker still works but will not receive any new updates.
## Features
- **Self-hosted PostgreSQL**: your data stays with you (Neon/Supabase free tiers supported)
- **Graph Memory**: SQL-based relationship tracking alongside vector memory
- **Multi-provider LLM**: Gemini, OpenAI, Anthropic, Groq, DeepSeek, Mistral
- **Fallback chains**: multiple keys per provider plus multi-model fallback
- **Zero manual setup**: just `DATABASE_URL` + `API_KEYS`
## Quick Start (Legacy)
### 1. Get Prerequisites
- **Database**: Neon or Supabase (free tier works)
- **API key**: any supported provider (Google AI Studio is free)
### 2. Add to `mcp.json`
#### uvx (Recommended)
```json
{
  "mcpServers": {
    "better-mem0": {
      "command": "uvx",
      "args": ["better-mem0-mcp@latest"],
      "env": {
        "DATABASE_URL": "postgresql://user:pass@xxx.neon.tech/neondb?sslmode=require",
        "API_KEYS": "GOOGLE_API_KEY:AIza..."
      }
    }
  }
}
```
#### Docker
```json
{
  "mcpServers": {
    "better-mem0": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "-e", "DATABASE_URL", "-e", "API_KEYS", "n24q02m/better-mem0-mcp:latest"],
      "env": {
        "DATABASE_URL": "postgresql://...",
        "API_KEYS": "GOOGLE_API_KEY:AIza..."
      }
    }
  }
}
```
### 3. Done!
Ask your AI: "Remember that I prefer dark mode and use FastAPI"
## Configuration (Legacy)
| Variable | Required | Description |
|----------|----------|-------------|
| `DATABASE_URL` | Yes | PostgreSQL connection string; the database must have the pgvector extension |
| `API_KEYS` | Yes | Comma-separated `ENV_VAR:key` pairs |
| `LLM_MODELS` | No | Comma-separated model fallback chain |
| `EMBEDDER_MODELS` | No | Comma-separated embedding model fallback chain |
### Supported LiteLLM Providers
Use the environment variable names from the LiteLLM docs: `GOOGLE_API_KEY`, `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, `GROQ_API_KEY`, etc.
Single provider:

```env
API_KEYS=GOOGLE_API_KEY:AIza...
```

Multiple keys with fallback:

```env
API_KEYS=GOOGLE_API_KEY:AIza-1,GOOGLE_API_KEY:AIza-2,OPENAI_API_KEY:sk-xxx
LLM_MODELS=gemini/gemini-3-flash-preview,openai/gpt-4o-mini
EMBEDDER_MODELS=gemini/gemini-embedding-001,openai/text-embedding-3-small
```
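The `API_KEYS` format can be read as a per-provider key list, with key order giving the fallback chain. A minimal parsing sketch (the function name and error handling are illustrative, not the project's actual implementation):

```python
def parse_api_keys(api_keys: str) -> dict[str, list[str]]:
    """Parse comma-separated ENV_VAR:key pairs into per-provider key lists.

    Keys for the same provider keep their order, which is the
    multi-key fallback chain described above.
    """
    providers: dict[str, list[str]] = {}
    for pair in api_keys.split(","):
        # Split on the first colon only, so keys containing ":" survive.
        env_var, _, key = pair.strip().partition(":")
        if not env_var or not key:
            raise ValueError(f"Malformed API_KEYS entry: {pair!r}")
        providers.setdefault(env_var, []).append(key)
    return providers


chain = parse_api_keys("GOOGLE_API_KEY:AIza-1,GOOGLE_API_KEY:AIza-2,OPENAI_API_KEY:sk-xxx")
# chain == {"GOOGLE_API_KEY": ["AIza-1", "AIza-2"], "OPENAI_API_KEY": ["sk-xxx"]}
```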
### Defaults
| Setting | Default |
|---------|---------|
| `LLM_MODELS` | `gemini/gemini-3-flash-preview` |
| `EMBEDDER_MODELS` | `gemini/gemini-embedding-001` |
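The optional model variables fall back to the defaults above when unset. A sketch of that lookup (the helper name is illustrative; only the variable names and default values come from the tables above):

```python
import os

# Defaults from the table above.
DEFAULTS = {
    "LLM_MODELS": "gemini/gemini-3-flash-preview",
    "EMBEDDER_MODELS": "gemini/gemini-embedding-001",
}


def model_chain(var: str) -> list[str]:
    """Return the comma-separated model chain for `var`,
    using the default when the env var is unset or empty."""
    raw = os.environ.get(var) or DEFAULTS[var]
    return [m.strip() for m in raw.split(",") if m.strip()]
```

For example, with `LLM_MODELS` unset, `model_chain("LLM_MODELS")` yields the single default model.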
## Tools (Legacy)
| Tool | Description |
|------|-------------|
| `memory` | Memory operations: add, search, list, delete |
| `help` | Get full documentation for the tools |
### Usage Examples
```json
{"action": "add", "content": "I prefer TypeScript over JavaScript"}
{"action": "search", "query": "programming preferences"}
{"action": "list"}
{"action": "delete", "memory_id": "abc123"}
```
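The payloads above follow a simple shape: `add` needs `content`, `search` needs `query`, `delete` needs `memory_id`, and `list` takes no extra field. A small validator sketch (names are illustrative, not the server's actual schema code):

```python
# Required extra field per action, per the examples above.
REQUIRED = {"add": "content", "search": "query", "list": None, "delete": "memory_id"}


def validate_payload(payload: dict) -> None:
    """Raise ValueError if a memory-tool payload is malformed."""
    action = payload.get("action")
    if action not in REQUIRED:
        raise ValueError(f"Unknown action: {action!r}")
    field = REQUIRED[action]
    if field is not None and not payload.get(field):
        raise ValueError(f"Action {action!r} requires {field!r}")


validate_payload({"action": "search", "query": "programming preferences"})  # OK
```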
## Build from Source (Legacy)
```shell
git clone https://github.com/n24q02m/better-mem0-mcp
cd better-mem0-mcp

# Set up the environment (requires mise: https://mise.jdx.dev/)
mise run setup

# Run the server
uv run better-mem0-mcp
```
**Requirements**: Python 3.13+
## License
MIT. See the LICENSE file for details.