Local Mem0 MCP Server
A fully self-hosted Model Context Protocol (MCP) server that integrates Mem0 for persistent memory capabilities. Enables AI assistants like Claude to store and retrieve contextual information across conversations.
Features
- Persistent Memory: Store and retrieve memories across conversations
- Fully Self-Hosted: No external APIs or cloud dependencies
- Containerized: Complete Docker deployment with one command
- Easy Installation: Single-script setup for Windows, Mac, and Linux
- Local AI Models: Uses Ollama with phi3:mini and nomic-embed-text
- Vector Storage: PostgreSQL with pgvector for efficient memory search
- MCP Compatible: Works with Claude Desktop and other MCP-capable AI tools
Quick Start
Prerequisites
- Docker Desktop installed and running
- Claude Desktop (for testing)
Installation
Windows:

```
git clone https://github.com/Synapse-OS/local-mem0-mcp.git
cd local-mem0-mcp
install.bat
```

Mac/Linux:

```
git clone https://github.com/Synapse-OS/local-mem0-mcp.git
cd local-mem0-mcp
chmod +x install.sh
./install.sh
```
The installation will:
- Build the MCP server container
- Start PostgreSQL and Ollama services
- Download AI models (~2.5GB total)
- Configure Claude Desktop integration
- Test the installation
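If you prefer to run these steps by hand, or the script fails partway through, the manual equivalent looks roughly like the following. This is a sketch, not the script's actual contents, and the Ollama container name `mem0-ollama` is an assumption; check `docker ps` for the real names.

```bash
# Build and start the MCP server, PostgreSQL, and Ollama services
docker-compose -f docker-compose.local.yml up -d --build

# Pull the local models (~2.5GB total); "mem0-ollama" is an assumed container name
docker exec mem0-ollama ollama pull phi3:mini
docker exec mem0-ollama ollama pull nomic-embed-text
```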
Testing
After installation and configuration:
- Restart Claude Desktop completely (close and reopen)
- Verify the MCP server: type `/mcp`; it should list `mem0-local` as available
- Test memory storage: "Remember that I'm testing the MCP memory system today"
- Test memory retrieval: "What do you remember about me?"
- Verify persistence: restart Claude Desktop and ask again; memories should persist
Troubleshooting MCP Connection:
- If `/mcp` shows no servers, check the configuration file path and JSON syntax
- Ensure the Docker containers are running: `docker ps`
- Check the MCP server logs: `docker logs mem0-mcp-server`
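If the containers are up but `/mcp` still shows nothing, you can bypass Claude Desktop and talk to the server directly over STDIO. The sketch below sends the standard MCP handshake followed by a `tools/list` request; a healthy server should answer with the memory operations listed later in this README.

```bash
# Handshake with the server over STDIO and ask it to enumerate its tools
printf '%s\n' \
  '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.1"}}}' \
  '{"jsonrpc":"2.0","method":"notifications/initialized"}' \
  '{"jsonrpc":"2.0","id":2,"method":"tools/list"}' \
  | docker exec -i mem0-mcp-server python /app/src/server.py
```

If this prints a tool list but Claude Desktop still sees nothing, the problem is almost certainly in the Claude configuration file rather than the server.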
Architecture
```
┌─────────────────┐     ┌──────────────────┐     ┌─────────────────┐
│     Claude      │     │    MCP Server    │     │   PostgreSQL    │
│     Desktop     │◄───►│    (FastMCP)     │◄───►│   + pgvector    │
└─────────────────┘     └──────────────────┘     └─────────────────┘
                                 │
                                 ▼
                        ┌──────────────────┐
                        │      Ollama      │
                        │   phi3:mini +    │
                        │ nomic-embed-text │
                        └──────────────────┘
```
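In this layout, Claude Desktop spawns the MCP server over STDIO via `docker exec`, the server delegates memory extraction and embedding to the local Ollama models, and the resulting vectors are stored and searched in PostgreSQL through pgvector.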
Configuration
Claude Desktop MCP Configuration
After installation, configure Claude Desktop to use the MCP server:
- Windows: edit `%APPDATA%\Claude\claude_desktop_config.json`
- Mac: edit `~/Library/Application Support/Claude/claude_desktop_config.json`
- Linux: edit `~/.config/Claude/claude_desktop_config.json`
Add this configuration:
```json
{
  "mcpServers": {
    "mem0-local": {
      "command": "docker",
      "args": [
        "exec", "-i", "mem0-mcp-server",
        "python", "/app/src/server.py"
      ]
    }
  }
}
```
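One caveat: `docker exec` attaches to an already-running container, so the compose stack must be up before Claude Desktop launches or the server will never appear. You can confirm the exact command from this config in a terminal first:

```bash
# Same command Claude Desktop runs; it should start and wait silently
# for JSON-RPC input on stdin (press Ctrl+C to exit)
docker exec -i mem0-mcp-server python /app/src/server.py
```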
System Configuration
The system is configured for local operation by default:
- MCP Server: Runs in Docker container with STDIO transport
- Database: PostgreSQL with pgvector on port 5432
- AI Models: Local Ollama instance on port 11434
- Memory Storage: User-isolated memories with vector embeddings
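To confirm these defaults are in effect, you can probe each service directly. The commands below are a sketch: the Postgres container name and user are assumptions, so adjust them to whatever `docker-compose.local.yml` actually declares.

```bash
# Ollama: the response should list phi3:mini and nomic-embed-text
curl -s http://localhost:11434/api/tags

# PostgreSQL: the vector extension should be installed
# ("mem0-postgres" and the postgres user are assumed names)
docker exec mem0-postgres psql -U postgres -c '\dx vector'
```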
Available Memory Operations
- add_memory: Store new memories
- search_memories: Find relevant memories by query
- get_all_memories: Retrieve all memories for a user
- update_memory: Modify existing memories
- delete_memory: Remove specific memories
- delete_all_memories: Clear all memories for a user
- get_memory_stats: Get memory statistics
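Each operation is exposed as an MCP tool, so a client invokes it with an ordinary `tools/call` request after the handshake. The argument names below (`text`, `user_id`) are illustrative guesses rather than the server's actual schema; run the `tools/list` smoke test from the Testing section to see the real parameters.

```bash
# Store a memory by calling the add_memory tool directly over STDIO
printf '%s\n' \
  '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"demo","version":"0.1"}}}' \
  '{"jsonrpc":"2.0","method":"notifications/initialized"}' \
  '{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"add_memory","arguments":{"text":"I am testing the MCP memory system","user_id":"demo"}}}' \
  | docker exec -i mem0-mcp-server python /app/src/server.py
```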
Troubleshooting
Check services:

```bash
docker ps
```

View logs:

```bash
docker-compose -f docker-compose.local.yml logs
```

Restart services:

```bash
docker-compose -f docker-compose.local.yml restart
```

Clean restart:

```bash
docker-compose -f docker-compose.local.yml down -v
# Run the install script again afterwards
```
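Be aware that the `-v` flag removes the stack's volumes as well: assuming the compose file keeps Postgres data and downloaded Ollama models in named volumes, a clean restart deletes all stored memories and re-downloads the models on the next install.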
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
This project is licensed under the MIT License; see the LICENSE file for details.