Memory-MCP
An intelligent memory management server implementing the Model Context Protocol (MCP) for AI applications and personal knowledge systems.
Overview
Memory-MCP is a sophisticated memory-oriented knowledge graph management system designed to store, organize, and retrieve memories, facts, instructions, discussions, and conclusions. It implements a temporally-aware knowledge graph that can be queried and extended by AI models and applications.
Key Features
- Intelligent Memory Storage: Stores memories as nodes in a knowledge graph with rich metadata
- Temporal Awareness: Prioritizes memories based on recency of access and relevance
- Background AI Processing: "Dreamer" mode that discovers connections and creates summaries
- Simple Client Interface: Intuitive commands for storing and retrieving memories
- Flexible Transport: Supports both local (stdin/stdout) and remote (HTTP/SSE) communication
- Knowledge Graph: Dynamic relationship discovery between memories
- Admin Interface: Web-based Gradio interface for system management
- MCP over HTTP: FastMCP server with SSE support for web integration
- Docker Ready: Easy deployment with Docker and Docker Compose
Use Cases
- Personal AI assistants with long-term memory
- Organizational knowledge management
- Research and note-taking systems
- Context-aware chatbots
- Documentation and insight discovery
- Project tracking and workflow management
Prerequisites & System Requirements
System Requirements
- Python: 3.8 or higher
- RAM: Minimum 512MB, recommended 2GB+
- Storage: 100MB for application, additional space for database
- OS: Linux, macOS, Windows (WSL2 recommended for Windows)
Required Tools
- Git (for source installation)
- Python pip (package manager)
- Docker (optional, for containerized deployment)
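A quick way to confirm these prerequisites are in place is a short Python check. This is a minimal sketch: it only verifies the interpreter version and whether the tools are on PATH, nothing project-specific.

```python
import shutil
import sys

# Memory-MCP requires Python 3.8 or newer
ok = sys.version_info >= (3, 8)
print(f"Python {sys.version_info.major}.{sys.version_info.minor}: "
      + ("OK" if ok else "too old, 3.8+ required"))

# git and pip are required; docker is only needed for containerized deployment
for tool, required in [("git", True), ("pip", True), ("docker", False)]:
    found = shutil.which(tool) is not None
    label = "OK" if found else ("MISSING (required)" if required else "not found (optional)")
    print(f"{tool}: {label}")
```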
Complete Setup & Installation Guide
Method 1: Quick Installation (Recommended)
# 1. Clone the repository
git clone https://github.com/raymondclowe/Memory-MCP.git
cd Memory-MCP
# 2. Create Python virtual environment (recommended)
python -m venv memory-mcp-env
source memory-mcp-env/bin/activate # On Windows: memory-mcp-env\Scripts\activate
# 3. Install dependencies
pip install -r requirements.txt
# 4. Create configuration file
python server.py --create-env
cp .env.example .env
# 5. Test installation
python -c "import memory_core; print('Installation successful!')"
Method 2: Development Installation
# 1. Clone repository
git clone https://github.com/raymondclowe/Memory-MCP.git
cd Memory-MCP
# 2. Install in development mode with all dependencies
pip install -e .
pip install -r requirements.txt
# 3. Install development tools
pip install black mypy pytest pytest-asyncio
# 4. Run tests to verify installation
python test_memory_suite.py
Method 3: Docker Installation (Production Ready)
# 1. Clone repository
git clone https://github.com/raymondclowe/Memory-MCP.git
cd Memory-MCP
# 2. Create data directory
mkdir -p data
# 3. Configure environment
cp .env.example .env
# Edit .env file with your settings
# 4. Deploy with Docker Compose
docker-compose up -d
# 5. Verify deployment
docker-compose logs memory-mcp
curl http://localhost:8080/health
Configuration Setup
Step 1: Environment Configuration
Create and customize your .env file:
# Copy example configuration
cp .env.example .env
# Edit configuration
nano .env # or use your preferred editor
Step 2: Basic Configuration Options
# Server Configuration
MEMORY_HOST=0.0.0.0 # Host to bind to (0.0.0.0 for all interfaces)
MEMORY_PORT=8080 # HTTP server port
MEMORY_LOG_LEVEL=INFO # Logging level (DEBUG, INFO, WARNING, ERROR)
# Database Configuration
MEMORY_DB_PATH=memory_graph.db # SQLite database file path
# Gradio Admin Interface
MEMORY_GRADIO_HOST=0.0.0.0 # Admin interface host
MEMORY_GRADIO_PORT=7860 # Admin interface port
MEMORY_GRADIO_SHARE=false # Enable public sharing (security risk)
# Background Processing
MEMORY_DREAMER_ENABLED=true # Enable AI background processing
MEMORY_DREAMER_INTERVAL=300 # Seconds between processing cycles
Step 3: Optional AI Configuration
For enhanced AI features, configure OpenAI integration:
# AI Configuration (Optional)
MEMORY_AI_PROVIDER=openai
MEMORY_AI_API_KEY=sk-your-openai-api-key-here
MEMORY_AI_MODEL=gpt-3.5-turbo
MEMORY_AI_EMBEDDING_MODEL=text-embedding-ada-002
Step 4: Advanced Configuration
# Authentication (Production Security)
MEMORY_AUTH_ENABLED=true
MEMORY_API_KEYS=secret-key-1,secret-key-2,secret-key-3
# Performance Tuning
MEMORY_CACHE_SIZE=1000 # Memory cache size
MEMORY_QUERY_TIMEOUT=30 # Query timeout in seconds
MEMORY_MAX_CONNECTIONS=10 # Max concurrent connections
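The variables above can be read in Python with sensible fallbacks. This is a hedged sketch: the variable names follow the examples above, but the defaults shown here are illustrative, not necessarily the server's actual defaults.

```python
import os

def env_int(name: str, default: int) -> int:
    """Read an integer environment variable, falling back to a default."""
    try:
        return int(os.getenv(name, default))
    except ValueError:
        return default

config = {
    "host": os.getenv("MEMORY_HOST", "0.0.0.0"),
    "port": env_int("MEMORY_PORT", 8080),
    "db_path": os.getenv("MEMORY_DB_PATH", "memory_graph.db"),
    "cache_size": env_int("MEMORY_CACHE_SIZE", 1000),
    "auth_enabled": os.getenv("MEMORY_AUTH_ENABLED", "false").lower() == "true",
    # Comma-separated keys, empty entries dropped
    "api_keys": [k for k in os.getenv("MEMORY_API_KEYS", "").split(",") if k],
}
print(config)
```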
Deployment Modes
Mode 1: MCP Server (Standard Protocol)
For integration with MCP-compatible clients:
# Start MCP server (stdio communication)
python server.py --mcp
# Or use default mode
python server.py
Use Case: Integration with Claude Desktop, other MCP clients
Mode 2: HTTP Server (Web API)
For web applications and HTTP clients:
# Start FastMCP HTTP server
python server.py --rest
# Server available at: http://localhost:8080/mcp
Use Case: Web applications, REST API integration, browser-based clients
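A quick way to exercise the HTTP server from Python is to poll the /health endpoint used elsewhere in this guide. A stdlib-only sketch, assuming the server is running on the default port:

```python
import json
import urllib.error
import urllib.request
from typing import Optional

def check_health(base_url: str = "http://localhost:8080") -> Optional[dict]:
    """Return the parsed /health response, or None if the server is unreachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/health", timeout=5) as resp:
            return json.loads(resp.read())
    except (urllib.error.URLError, ValueError):
        return None

print(check_health() or "server unreachable")
```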
Mode 3: Admin Interface
For web-based management and monitoring:
# Start Gradio admin interface
python server.py --admin
# Access at: http://localhost:7860
Use Case: System administration, data visualization, manual memory management
Mode 4: All Services
Run everything together:
# Start all services
python server.py --all
# Available endpoints:
# - MCP over HTTP: http://localhost:8080/mcp
# - Admin Interface: http://localhost:7860
Use Case: Development, testing, single-server deployment
Installation Verification
Step 1: Basic Functionality Test
# Test core memory functions
python -c "
import asyncio
from memory_core import MemoryCore

async def test():
    core = MemoryCore()
    health = await core.get_health_status()
    print(f'Health Status: {health}')
    # Store test memory
    memory_id = await core.store_memory('Test memory', {'test': True})
    print(f'Stored memory: {memory_id}')
    # Query memories
    results = await core.query_memories('test')
    print(f'Found {len(results)} memories')
    print('Core functionality working!')

asyncio.run(test())
"
Step 2: Run Comprehensive Test Suite
# Run full test suite
python test_memory_suite.py
# Expected output:
# PASS: Basic Memory Storage
# PASS: Memory Retrieval by ID
# PASS: Context-based Search
# PASS: Content Search
# PASS: Priority-based Ordering
# PASS: Memory Update and Access Tracking
# PASS: Exhaustive Search
# PASS: Health Status Monitoring
Step 3: Test Each Service Mode
# Test MCP server (run in background, then test)
python server.py --mcp &
MCP_PID=$!
sleep 2
kill $MCP_PID
# Test HTTP server
python server.py --rest &
REST_PID=$!
sleep 2
curl http://localhost:8080/health
kill $REST_PID
# Test admin interface
python server.py --admin &
ADMIN_PID=$!
sleep 2
curl http://localhost:7860/
kill $ADMIN_PID
echo "All services tested successfully!"
Docker Deployment Guide
Quick Docker Deployment
# 1. Prepare environment
mkdir -p data
cp .env.example .env
# 2. Start services
docker-compose up -d
# 3. Check status
docker-compose ps
docker-compose logs memory-mcp
Custom Docker Configuration
# Build custom image
docker build -t my-memory-mcp .
# Run with custom settings
docker run -d \
--name memory-mcp \
-p 8080:8080 \
-p 7860:7860 \
-v $(pwd)/data:/app/data \
-v $(pwd)/.env:/app/.env:ro \
-e MEMORY_DB_PATH=/app/data/memory_graph.db \
my-memory-mcp
# Monitor logs
docker logs -f memory-mcp
Production Docker Setup
# 1. Create production environment file
cat > .env.production << EOF
MEMORY_HOST=0.0.0.0
MEMORY_PORT=8080
MEMORY_LOG_LEVEL=WARNING
MEMORY_DB_PATH=/app/data/memory_graph.db
MEMORY_GRADIO_HOST=0.0.0.0
MEMORY_GRADIO_PORT=7860
MEMORY_GRADIO_SHARE=false
MEMORY_DREAMER_ENABLED=true
MEMORY_DREAMER_INTERVAL=300
MEMORY_AUTH_ENABLED=true
MEMORY_API_KEYS=your-secure-api-key-here
EOF
# 2. Deploy with production settings
docker-compose -f docker-compose.yml --env-file .env.production up -d
# 3. Set up monitoring
docker-compose exec memory-mcp python -c "
import asyncio
from memory_core import MemoryCore
print(asyncio.run(MemoryCore('/app/data/memory_graph.db').get_health_status()))
"
Troubleshooting Guide
Common Installation Issues
Issue: ModuleNotFoundError: No module named 'memory_core'
# Solution: Install dependencies and check Python path
pip install -r requirements.txt
python -c "import sys; print(sys.path)"
export PYTHONPATH="${PYTHONPATH}:$(pwd)"
Issue: Permission denied when creating database
# Solution: Check directory permissions
mkdir -p data
chmod 755 data
ls -la data/
Issue: Port already in use error
# Solution: Find and kill process using port
lsof -i :8080
kill -9 <PID>
# Or use different port
export MEMORY_PORT=8081
Docker Issues
Issue: Container fails to start
# Check logs
docker-compose logs memory-mcp
# Common fixes:
docker-compose down
docker system prune -f
docker-compose up -d --force-recreate
Issue: Database connection errors
# Verify database path and permissions
docker-compose exec memory-mcp ls -la /app/data/
docker-compose exec memory-mcp python -c "
import sqlite3
conn = sqlite3.connect('/app/data/memory_graph.db')
print('Database accessible!')
conn.close()
"
Performance Issues
Issue: Slow query performance
# Check database size and optimize
python -c "
import asyncio
from memory_core import MemoryCore

async def optimize():
    core = MemoryCore()
    await core.db.execute('VACUUM;')
    await core.db.execute('ANALYZE;')
    print('Database optimized!')

asyncio.run(optimize())
"
Issue: High memory usage
# Adjust cache settings in .env
MEMORY_CACHE_SIZE=500
MEMORY_MAX_CONNECTIONS=5
# Restart service
docker-compose restart memory-mcp
Health Monitoring & Maintenance
Health Check Commands
# Basic health check
python -c "
import asyncio
from memory_core import MemoryCore
print(asyncio.run(MemoryCore().get_health_status()))
"
# Detailed system status
curl http://localhost:8080/health
# Docker health check
docker-compose exec memory-mcp python -c "
import asyncio
from memory_core import MemoryCore
status = asyncio.run(MemoryCore('/app/data/memory_graph.db').get_health_status())
print(f'Status: {status}')
"
Database Maintenance
# Backup database
cp data/memory_graph.db data/memory_graph_backup_$(date +%Y%m%d).db
# Database statistics
python -c "
import asyncio
from memory_core import MemoryCore

async def stats():
    core = MemoryCore()
    async with core.db.execute('SELECT COUNT(*) FROM memory_nodes') as cursor:
        count = await cursor.fetchone()
        print(f'Total memories: {count[0]}')

asyncio.run(stats())
"
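The cp backup above is only safe while the server is stopped; SQLite's online backup API can snapshot a live database. A minimal sketch, assuming the default database path from the configuration section:

```python
import sqlite3
from datetime import datetime

def backup_database(src_path: str = "memory_graph.db") -> str:
    """Snapshot a (possibly live) SQLite database via the online backup API."""
    dest_path = f"{src_path}.backup_{datetime.now():%Y%m%d}"
    src = sqlite3.connect(src_path)
    dest = sqlite3.connect(dest_path)
    try:
        # Produces a consistent copy even if another process is writing
        src.backup(dest)
    finally:
        src.close()
        dest.close()
    return dest_path
```

Call `print(backup_database())` to write and report a dated backup file next to the database.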
Usage Examples
Example 1: Basic Memory Operations
# Start server
python server.py --all &
# Test via Python
python -c "
import asyncio
from memory_core import MemoryCore

async def demo():
    core = MemoryCore()
    # Store user preference
    await core.store_memory(
        'User prefers dark mode UI',
        {'preference': 'ui', 'setting': 'dark_mode', 'value': True}
    )
    # Store project information
    await core.store_memory(
        'Working on Python automation project',
        {'project': 'automation', 'language': 'Python', 'status': 'active'}
    )
    # Query memories
    prefs = await core.query_memories('preference dark mode')
    projects = await core.query_memories('Python project')
    print(f'Found {len(prefs)} preferences')
    print(f'Found {len(projects)} projects')

asyncio.run(demo())
"
Example 2: AI Chatbot Amnesia Recovery
# System prompt for AI chatbots
SYSTEM_PROMPT = '''
You have amnesia and remember nothing from previous conversations.
Use the memory-mcp tools to recover your memories and context about
this user, their preferences, ongoing projects, and conversation history.
Start each session by querying your memory system to rebuild context.
Available tools:
- query_memories: Search for relevant memories
- store_memory: Save new information
- get_knowledge_overview: Get general overview
'''
# Recovery script
async def recover_context(user_id: str):
    core = MemoryCore()
    # Recover user preferences
    preferences = await core.search_by_context({'user_id': user_id, 'type': 'preference'})
    # Recover ongoing projects
    projects = await core.search_by_context({'user_id': user_id, 'status': 'active'})
    # Recover recent conversations
    recent = await core.query_memories(f'user:{user_id}', limit=10)
    return {
        'preferences': preferences,
        'active_projects': projects,
        'recent_context': recent
    }
Quick Start
Installation
# Clone the repository
git clone https://github.com/raymondclowe/Memory-MCP.git
cd Memory-MCP
# Install dependencies
pip install -r requirements.txt
# Create sample configuration (optional)
python server.py --create-env
Basic Usage
1. MCP Server (Default)
# Run MCP server for stdio communication
python server.py --mcp
# or simply
python server.py
2. FastMCP HTTP Server
# Run FastMCP HTTP server (MCP over HTTP with SSE)
python server.py --rest
# Server runs at:
# http://localhost:8080/mcp
3. Admin Interface
# Run Gradio admin interface
python server.py --admin
# Access web interface at:
# http://localhost:7860
4. All Services
# Run both FastMCP HTTP and admin interface
python server.py --all
Docker Deployment
# Using Docker Compose (recommended)
docker-compose up -d
# Or build and run manually
docker build -t memory-mcp .
docker run -p 8080:8080 -p 7860:7860 -v ./data:/app/data memory-mcp
Architecture
The system consists of three main components:
- Client Interface Layer: Handles user commands and responses
- Memory Management Core: Manages the knowledge graph and memory operations
- Background Processor (Dreamer): Discovers relationships and creates summaries
┌─────────────────┐     ┌─────────────────┐     ┌─────────────────┐
│   Client API    │────▶│   Memory Core   │────▶│  Graph Storage  │
│                 │     │                 │     │                 │
└─────────────────┘     └─────────────────┘     └─────────────────┘
                                │
                                ▼
                        ┌─────────────────┐
                        │   Dreamer AI    │
                        │  (Background)   │
                        └─────────────────┘
Core Commands
- Store Memory: I should remember (content) [context: (metadata)]
- Query Memories: What do I remember about (query)?
- Knowledge Overview: What do I know about (topic)?
- Recall Memory: Recall memory (memory_id)
- Deep Search: Search extensively for (query)
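Commands like these could be mapped onto the server's tools with a small parser. A hedged sketch: the patterns below are illustrative, not the server's actual grammar.

```python
import re
from typing import Optional, Tuple

def parse_command(text: str) -> Optional[Tuple[str, dict]]:
    """Map a natural-language memory command to a (tool_name, arguments) pair."""
    patterns = [
        # Optional trailing "[context: ...]" clause, brackets allowed but not required
        (r"^I should remember (?P<content>.+?)(?:\s+\[?context:\s*(?P<context>.+?)\]?)?$",
         "store_memory"),
        (r"^What do I remember about (?P<query>.+?)\??$", "query_memories"),
        (r"^What do I know about (?P<topic>.+?)\??$", "get_knowledge_overview"),
        (r"^Recall memory (?P<memory_id>\S+)$", "recall_memory"),
        (r"^Search extensively for (?P<query>.+)$", "exhaustive_search"),
    ]
    for pattern, tool in patterns:
        m = re.match(pattern, text.strip(), re.IGNORECASE)
        if m:
            args = {k: v for k, v in m.groupdict().items() if v is not None}
            return tool, args
    return None

print(parse_command("What do I remember about Python projects?"))
```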
API Reference
MCP Tools & Resources
The MCP server provides these tools:
- store_memory - Store a new memory with optional context
- query_memories - Search for memories based on content or context
- recall_memory - Retrieve a specific memory by ID
- get_knowledge_overview - Get an overview of stored knowledge
- exhaustive_search - Perform comprehensive search across all memories
And these MCP resources:
- memory://health - System health status
- memory://overview - Knowledge base overview
- memory://memory/{id} - Individual memory by ID
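On the wire, MCP messages are JSON-RPC 2.0. A tools/call request for store_memory looks roughly like the sketch below; the argument names follow the usage examples in this README and may differ from the server's actual schema.

```python
import json

def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build an MCP tools/call request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

msg = make_tool_call(1, "store_memory", {
    "content": "User prefers dark mode UI",
    "context": {"preference": "ui"},
})
print(msg)
```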
MCP Prompts
The server also provides context-generation prompts:
- memory_context_prompt - Generate contextual information for topics
- summarize_knowledge_prompt - Create knowledge base summaries
FastMCP HTTP Access
When running in HTTP mode, the server provides MCP protocol over HTTP with SSE support:
# MCP server endpoint
http://localhost:8080/mcp
# The server implements the full MCP protocol including:
# - Tools for memory operations
# - Resources for data access
# - Prompts for context generation
# - Server-Sent Events for real-time communication
Configuration
Configure via environment variables or .env file:
# Server Configuration
MEMORY_HOST=0.0.0.0
MEMORY_PORT=8080
MEMORY_LOG_LEVEL=INFO
# Database Configuration
MEMORY_DB_PATH=memory_graph.db
# AI Configuration (optional)
MEMORY_AI_PROVIDER=openai
MEMORY_AI_API_KEY=your-openai-api-key
MEMORY_AI_MODEL=gpt-3.5-turbo
# Gradio Admin Interface
MEMORY_GRADIO_HOST=0.0.0.0
MEMORY_GRADIO_PORT=7860
# Background Processing
MEMORY_DREAMER_ENABLED=true
MEMORY_DREAMER_INTERVAL=300
Examples
See example_usage.py for comprehensive examples:
python example_usage.py
Basic Memory Operations
import asyncio
from memory_core import MemoryCore

async def main():
    # Initialize
    memory_core = MemoryCore("my_memory.db")
    # Store a memory
    memory_id = await memory_core.store_memory(
        "Important project deadline is next Friday",
        {"project": "Alpha", "type": "deadline", "urgency": "high"}
    )
    # Search memories
    memories = await memory_core.query_memories("project deadline")
    # Recall specific memory
    memory = await memory_core.recall_memory(memory_id)

asyncio.run(main())
Technology Stack
- Backend: Python with FastMCP and asyncio
- Database: SQLite (with support for other databases)
- AI/ML: Sentence transformers for embeddings, OpenAI integration
- Admin Interface: Gradio for web-based management
- Protocol: Model Context Protocol (MCP) with FastMCP framework
- Transport: HTTP with Server-Sent Events (SSE) support
- Deployment: Docker and Docker Compose ready
Development
Running Tests
# Test core functionality
python memory_core.py
# Test MCP server
python mcp_server.py
# Test FastMCP HTTP server
python rest_api.py --test
# Test admin interface
python gradio_admin.py --test
# Test Dreamer AI
python dreamer_ai.py
Code Structure
Memory-MCP/
├── server.py            # Main server entry point
├── memory_core.py       # Core memory management
├── mcp_server.py        # MCP protocol implementation
├── rest_api.py          # FastMCP HTTP server
├── gradio_admin.py      # Web admin interface
├── dreamer_ai.py        # Background AI processing
├── config.py            # Configuration management
├── example_usage.py     # Usage examples
├── requirements.txt     # Python dependencies
├── Dockerfile           # Docker configuration
├── docker-compose.yml   # Docker Compose setup
└── SPECIFICATION.md     # Detailed technical specification
Documentation
- SPECIFICATION.md - Detailed technical specification
- FastMCP Documentation - FastMCP framework documentation
- example_usage.py - Comprehensive usage examples
Docker Deployment
Using Docker Compose (Recommended)
# Start all services
docker-compose up -d
# View logs
docker-compose logs -f
# Stop services
docker-compose down
Manual Docker
# Build image
docker build -t memory-mcp .
# Run with data persistence
docker run -d \
-p 8080:8080 \
-p 7860:7860 \
-v ./data:/app/data \
-e MEMORY_DB_PATH=/app/data/memory_graph.db \
memory-mcp
Health Monitoring
Check system health:
# Via MCP tool (if using MCP client)
# The server provides get_knowledge_overview tool for health monitoring
# Via command line
python -c "import asyncio; from memory_core import MemoryCore; print(asyncio.run(MemoryCore().get_health_status()))"
Contributing
- Fork the repository
- Create a feature branch (git checkout -b feature/amazing-feature)
- Commit your changes (git commit -m 'Add amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
License
See the LICENSE file for details.
Support
- GitHub Issues - Bug reports and feature requests
- Discussions - Questions and community
- SPECIFICATION.md - Technical details
Status: Production Ready
Version: 1.0.0
Last Updated: December 2024
Built with ❤️ using the Model Context Protocol