Bi-Temporal Knowledge Graph MCP Server
A production-ready MCP (Model Context Protocol) server that combines a sophisticated bi-temporal knowledge graph with dynamic automation tool generation. Save facts with full temporal tracking, extract entities using AI, and generate custom automation tools on-the-fly from database configurations.
🎯 Build intelligent AI agents with persistent memory that understands time and context
Architecture
This server uses a modular architecture:
- main.py - The main orchestrator that initializes FastMCP, registers core memory tools, and manages the complete server lifecycle
- memory.py - Bi-temporal Graphiti memory implementation with FalkorDB for knowledge graph storage
- tools.py - Container for automation tools with webhook execution utilities
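The core of this layout is a tool registry that the orchestrator dispatches against. A minimal sketch of that registration/dispatch pattern, using a hypothetical stand-in decorator rather than FastMCP's actual API so it runs with no dependencies:

```python
from typing import Callable, Dict

# Stand-in for FastMCP's @mcp.tool() decorator -- illustrative only.
TOOLS: Dict[str, Callable] = {}

def tool(fn: Callable) -> Callable:
    """Register a function as a named tool in the registry."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def add_fact(entity: str, attribute: str, value: str) -> str:
    return f"Stored: {entity}.{attribute} = {value}"

# The server would dispatch incoming MCP tool calls by name:
result = TOOLS["add_fact"]("Bob", "employer", "Google")
```

In the real server, FastMCP owns this registry; the sketch only shows why registering tools by name in main.py lets memory.py and tools.py stay independent.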
⭐ Star This Repo
If you find this project useful, please give it a star! It helps others discover the project and motivates continued development.

🔗 Links
- 🎁 Get Started - Ready in 5 minutes
- 🎥 Video Tutorial - Watch how to set it up
- ❓ FAQs - Common questions answered
- 🐛 Report Bugs - Found an issue?
- 🆕 Request Features - Have an idea?
Resources
- 💬 Community - High Ticket AI Builders community
- 📚 Full Documentation - Complete guide
- 🚀 Deployment Guide - Deploy anywhere
- 🧪 Examples - Interactive scenarios
📑 Table of Contents
- Features
- How It Works
- Screenshots
- Video Tutorial
- Quick Start
- Creating Automation Tools
- Use Cases
- FAQ
- Changelog
- Support
- License
✨ Features
🧠 Bi-Temporal Knowledge Graph
- Smart Memory: Automatically tracks when facts were created AND when they became true in reality
- Conflict Resolution: When you move locations or change jobs, old facts are automatically invalidated
- Time Travel Queries: Ask "Where did John live in March 2024?" and get accurate historical answers
- Session Tracking: Maintains context across conversations with automatic cleanup
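The bi-temporal model above can be sketched in a few lines: each fact carries a `created_at` (when the system learned it) plus a `valid_at`/`invalid_at` pair (when it held in reality). Field names follow the storage example later in this README; the dataclass itself is illustrative, not the server's actual schema.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class Fact:
    subject: str
    predicate: str
    obj: str
    created_at: datetime           # when the system recorded the fact
    valid_at: datetime             # when it became true in reality
    invalid_at: Optional[datetime] = None  # None means "still true"

def facts_at(facts: List[Fact], t: datetime) -> List[Fact]:
    """Return the facts that were true in reality at time t."""
    return [f for f in facts
            if f.valid_at <= t and (f.invalid_at is None or t < f.invalid_at)]

utc = timezone.utc
recorded = datetime(2024, 6, 1, tzinfo=utc)
history = [
    Fact("John", "lives_in", "NYC", recorded,
         valid_at=datetime(2023, 1, 1, tzinfo=utc),
         invalid_at=datetime(2024, 4, 1, tzinfo=utc)),
    Fact("John", "lives_in", "SF", recorded,
         valid_at=datetime(2024, 4, 1, tzinfo=utc)),
]
march_2024 = facts_at(history, datetime(2024, 3, 15, tzinfo=utc))
```

This is exactly what makes "Where did John live in March 2024?" answerable: the March query returns NYC even though the current fact is SF.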
🤖 AI-Powered Entity Extraction
- Natural Language Understanding: Just tell it in plain English - "Alice moved to San Francisco and started working at Google"
- Automatic Relationship Discovery: AI extracts entities and relationships without manual input
- OpenAI Integration: Uses GPT-4 for intelligent entity extraction
- Graceful Degradation: Works without AI - just add facts manually
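Graceful degradation hinges on treating the model's reply as untrusted input. A hedged sketch of that step, assuming the model is prompted to reply with JSON triples (the response format here is an assumption for illustration, not the project's actual contract):

```python
import json

def parse_triples(model_output):
    """Parse (subject, predicate, object) triples from a model reply.
    Returns [] when the reply is missing or malformed, so callers can
    fall back to adding facts manually."""
    if not model_output:
        return []
    try:
        data = json.loads(model_output)
        return [(t["subject"], t["predicate"], t["object"]) for t in data]
    except (json.JSONDecodeError, KeyError, TypeError):
        return []

reply = '[{"subject": "Alice", "predicate": "works_at", "object": "Google"}]'
triples = parse_triples(reply)
```

Any exception path collapses to an empty list, which is what lets the server keep working with no OpenAI key configured.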
🛠️ Dynamic Tool Generator
- Flexible Configuration: Define webhook configurations easily
- Auto-Generate Code: Automatically creates Python functions from your configs
- Single & Multi-Webhook: Execute one webhook or fire multiple in parallel
- Hot Reload: New tools available instantly without restarting
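The idea behind auto-generating code from configs can be shown with a small closure: required `template_fields` become validated keyword arguments. The real generator in tools.py may differ; the names here are illustrative.

```python
def make_tool(name, template_fields):
    """Build a callable tool from a webhook config (illustrative sketch)."""
    required = [f for f, spec in template_fields.items() if spec.get("required")]

    def tool(**kwargs):
        missing = [f for f in required if f not in kwargs]
        if missing:
            raise ValueError(f"{name}: missing required fields {missing}")
        # A real tool would POST kwargs to the configured webhook URL here.
        return {"tool": name, "payload": kwargs}

    tool.__name__ = name.lower().replace(" ", "_")
    return tool

send_email = make_tool("Send Email", {
    "to": {"type": "str", "required": True},
    "subject": {"type": "str", "required": True},
})
result = send_email(to="a@b.co", subject="hi")
```

Because tools are just values produced from configs at call time, registering a freshly generated one is what makes hot reload possible without a restart.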
🚀 Production Ready
- Docker Support: Complete docker-compose setup included
- Replit Optimized: Built specifically for Replit Autoscale environments
- Resource Management: Automatic session cleanup and connection pooling
- Health Checks: Built-in monitoring and status endpoints
- 100% Privacy-Friendly: Your data stays in your database
🎬 How It Works
┌───────────────────────────────────────────────────┐
│ 1. Natural Language Input                         │
│    "Bob moved to NYC and joined Google as a PM"   │
└─────────────────────────┬─────────────────────────┘
                          │
                          ▼
┌───────────────────────────────────────────────────┐
│ 2. AI Entity Extraction (OpenAI)                  │
│    • Bob -> lives in -> NYC                       │
│    • Bob -> works at -> Google                    │
│    • Bob -> has role -> PM                        │
└─────────────────────────┬─────────────────────────┘
                          │
                          ▼
┌───────────────────────────────────────────────────┐
│ 3. Bi-Temporal Storage (FalkorDB)                 │
│    • Fact: Bob works at Google                    │
│    • created_at: 2024-12-19T10:00:00Z             │
│    • valid_at: 2024-12-19T10:00:00Z               │
│    • invalid_at: null (still true)                │
└─────────────────────────┬─────────────────────────┘
                          │
                          ▼
┌───────────────────────────────────────────────────┐
│ 4. Query Anytime                                  │
│    • "Where does Bob work now?" → Google          │
│    • "What was Bob's job history?" → All past jobs│
│    • "Where did Bob live in 2023?" → Historical   │
└───────────────────────────────────────────────────┘
📸 Screenshots
Memory in Action

AI Entity Extraction

Dynamic Tool Generation

Temporal Queries

🎥 Video Tutorial
Watch the complete setup and usage guide:
Topics covered:
- Installation & setup (0:00)
- Adding your first facts (2:30)
- Using AI entity extraction (5:15)
- Creating automation tools (8:45)
- Temporal queries (12:20)
- Deployment to production (15:00)
🚀 Quick Start
Option 1: Docker Compose (Recommended)
# 1. Download and extract
wget https://github.com/YOUR_USERNAME/bitemporal-mcp-server/archive/main.zip
unzip main.zip
cd bitemporal-mcp-server-main
# 2. Configure
echo "OPENAI_API_KEY=sk-your-key" > .env
# 3. Start everything (FalkorDB + MCP Server)
docker-compose up -d
# 4. Verify it's running
curl http://localhost:8080/health
That's it! 🎉 Your server is now running at http://localhost:8080/sse
Option 2: Python (Local Development)
# 1. Install dependencies
pip install -r requirements.txt
# 2. Configure
cp .env.example .env
# Edit .env with your settings
# 3. Start FalkorDB (Docker)
docker run -d -p 6379:6379 falkordb/falkordb:latest
# 4. Run the server
python main.py
Option 3: One-Click Deploy
🛠️ Creating Automation Tools
Overview
The tool generator reads webhook configurations and automatically creates MCP tools. Here's how:
Step 1: Define Your Webhook in Automation Engine OS

- Go to Automation Engine OS
- Create a new webhook configuration
- Define fields and parameters
- Save your configuration
Step 2: Generate the MCP Tool
# Via MCP protocol or directly in Python
await generate_tool_from_db(
    user_id="your_user_id",
    item_name="Slack Notification",
    item_type="single"  # or "multi" for multiple webhooks
)
Step 3: Use Your New Tool
# Your tool is now available!
await slack_notification(
    message="Deployment completed!",
    channel="#devops"
)
Example: Single Webhook Tool
Database Configuration:
{
  "name": "Send Email",
  "url": "https://api.example.com/send-email",
  "template_fields": {
    "to": {"type": "str", "required": true},
    "subject": {"type": "str", "required": true},
    "body": {"type": "str", "required": true}
  }
}
Generated Tool:
@mcp.tool()
async def send_email(to: str, subject: str, body: str):
    """Send an email via webhook."""
    # Automatically generated code
Example: Multi-Webhook Tool (Parallel Execution)
Database Configuration:
{
  "name": "Broadcast Alert",
  "webhooks": [
    {"url": "https://hooks.slack.com/...", "data": {"message": "..."}},
    {"url": "https://discord.com/api/webhooks/...", "data": {"content": "..."}},
    {"url": "https://api.email.com/send", "data": {"subject": "..."}}
  ]
}
Result: All three webhooks fire simultaneously using asyncio.gather!
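The parallel dispatch can be sketched as follows. The HTTP POST is replaced by a stub coroutine (and the URLs are placeholders) so the example runs self-contained; a real implementation would use an async HTTP client such as aiohttp.

```python
import asyncio

async def fire_webhook(url, data):
    await asyncio.sleep(0)  # stand-in for the actual network call
    return {"url": url, "status": 200}

async def broadcast(webhooks):
    """Fire every configured webhook concurrently with asyncio.gather."""
    return await asyncio.gather(
        *(fire_webhook(w["url"], w["data"]) for w in webhooks)
    )

results = asyncio.run(broadcast([
    {"url": "https://hooks.slack.com/T000/B000", "data": {"message": "hi"}},
    {"url": "https://discord.com/api/webhooks/000", "data": {"content": "hi"}},
    {"url": "https://api.email.com/send", "data": {"subject": "hi"}},
]))
```

`asyncio.gather` preserves input order in its results, so each response lines up with the webhook config that produced it.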
💡 Use Cases
Personal Knowledge Management
Track your life events, relationships, and locations with full history:
await add_message(
    "I met Sarah at the tech conference. She works at OpenAI.",
    session_id="my_life"
)
# Later: "Where did I meet Sarah?" → "At the tech conference"
Customer Relationship Management
Monitor customer interactions with automatic conflict resolution:
await add_fact("CustomerA", "status", "premium")
# Automatically invalidates previous "status" facts
# Query history: "What was CustomerA's status in January?"
AI Agent Memory
Give your AI agents persistent, queryable memory:
# Agent learns from conversation
await add_message(
    "User prefers morning meetings and uses Slack",
    session_id="agent_123"
)
# Agent recalls later: "What are the user's preferences?"
Workflow Automation
Combine knowledge with actions:
# When a fact changes, trigger automation
if customer_upgraded_to_premium:
    await notify_sales_team(customer_name=name)
    await update_crm(customer_id=id, tier="premium")
❓ Frequently Asked Questions
Q: Does this require OpenAI?
A: No! OpenAI is optional for AI entity extraction. You can add facts manually without it.
Q: Can I use this with Claude Desktop?
A: Yes! Add the server URL to your claude_desktop_config.json:
{
  "mcpServers": {
    "knowledge-graph": {
      "url": "http://localhost:8080/sse"
    }
  }
}
Q: How do I query historical data?
A: Use the query_at_time tool:
await query_at_time(
    timestamp="2024-01-15T00:00:00Z",
    entity_name="John"
)
Q: Can I deploy this to production?
A: Absolutely! See DEPLOYMENT.md for guides on:
- Replit Autoscale
- Railway
- Render
- Fly.io
- Docker
- VPS
Q: How does fact invalidation work?
A: When you add a fact about location or employment, the system automatically finds previous facts of the same type and marks them as invalid_at: current_time. Your query results only show current facts unless you specifically request historical data.
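The invalidation rule above can be sketched in a few lines: a new fact for the same (entity, attribute) pair stamps `invalid_at` on any still-open fact before the replacement is stored. Facts are plain dicts here for brevity; the server's actual storage is the FalkorDB graph.

```python
from datetime import datetime, timezone

def add_fact(store, entity, attribute, value):
    """Add a fact, invalidating any open fact of the same type first."""
    now = datetime.now(timezone.utc)
    for fact in store:
        if (fact["entity"] == entity and fact["attribute"] == attribute
                and fact["invalid_at"] is None):
            fact["invalid_at"] = now  # supersede the old fact
    store.append({"entity": entity, "attribute": attribute, "value": value,
                  "valid_at": now, "invalid_at": None})

def current_facts(store):
    """Default query view: only facts not yet invalidated."""
    return [f for f in store if f["invalid_at"] is None]

store = []
add_fact(store, "CustomerA", "status", "trial")
add_fact(store, "CustomerA", "status", "premium")
```

Note that the superseded fact is never deleted; it stays queryable for historical questions like "What was CustomerA's status in January?".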
Q: Can I create bulk download tools?
A: Yes! Create a multi-webhook template with multiple endpoints, and the tool generator will create a function that fires all webhooks in parallel.
Q: Is my data secure?
A: Yes! Everything runs in your infrastructure. No data is sent anywhere except:
- OpenAI (only if you use entity extraction)
- Your configured webhooks (only when you call them)
Q: How much does it cost to run?
A: Free for self-hosting! Only costs:
- FalkorDB hosting (free tier available)
- OpenAI API usage (optional, ~$0.001 per extraction)
📋 Changelog
[1.0.0] - 2024-12-19
Added
- ✅ Full bi-temporal tracking (created_at, valid_at, invalid_at, expired_at)
- ✅ Smart conflict resolution for location and employment changes
- ✅ Session-aware episodic memory with 30-minute TTL
- ✅ OpenAI-powered entity extraction from natural language
- ✅ Dynamic tool generator for automation workflows
- ✅ Single webhook tool template
- ✅ Multi-webhook parallel execution template
- ✅ Docker and Docker Compose support
- ✅ Replit Autoscale optimization
- ✅ Background cleanup manager
- ✅ Comprehensive documentation and examples
Supported Features
| Feature | Status | Notes |
|---------|--------|-------|
| Bi-Temporal Tracking | ✅ | Full implementation |
| AI Entity Extraction | ✅ | OpenAI GPT-4 |
| Smart Invalidation | ✅ | Location, employment, relationships |
| Session Management | ✅ | Auto-cleanup after 30 min |
| Dynamic Tools | ✅ | Single & multi-webhook |
| Parallel Webhooks | ✅ | asyncio.gather |
| Docker Support | ✅ | Complete stack included |
| Health Checks | ✅ | Built-in monitoring |
🆘 Support
Need Help?
- Check Documentation: Start with QUICKSTART.md
- Join Community: High Ticket AI Builders - Free access!
- Watch Tutorial: Video Guide
- Report Bugs: GitHub Issues
Creating Tools in Automation Engine OS

Need help setting up automation tools? Join our community for:
- 📹 Video tutorials
- 🤝 1-on-1 support
- 💡 Example configurations
- 🎓 Best practices
👉 Access the tool and community for free
🤝 Contributing
Contributions are welcome! Areas for improvement:
- 🔍 Additional temporal query operators
- 🧠 Enhanced entity extraction prompts
- 🔧 More webhook authentication methods
- 📊 Performance optimizations
- 🌐 Additional deployment platforms
- 📖 More examples and tutorials
To contribute:
1. Fork the repository
2. Create your feature branch (git checkout -b feature/AmazingFeature)
3. Commit your changes (git commit -m 'Add some AmazingFeature')
4. Push to the branch (git push origin feature/AmazingFeature)
5. Open a Pull Request
📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
TL;DR: You can use this commercially, modify it, distribute it. Just keep the license notice.
🙏 Acknowledgments
- Built with FastMCP
- Powered by FalkorDB
- AI features via OpenAI
- Inspired by the High Ticket AI Builders community
⭐ Star History
📞 Connect
- 💬 Community: High Ticket AI Builders
- 📅 Want this implemented for your business? Book a Meeting
Built with ❤️ for the High Ticket AI Builders ecosystem
If this project helps you, please consider giving it a ⭐!