
Local Mem0 MCP Server

A fully self-hosted Model Context Protocol (MCP) server that integrates Mem0 for persistent memory capabilities. Enables AI assistants like Claude to store and retrieve contextual information across conversations.

✨ Features

  • 🧠 Persistent Memory: Store and retrieve memories across conversations
  • 🔒 Fully Self-Hosted: No external APIs or cloud dependencies
  • 🐳 Containerized: Complete Docker deployment with one command
  • 🚀 Easy Installation: Single script setup for Windows, Mac, and Linux
  • 🤖 Local AI Models: Uses Ollama with phi3:mini and nomic-embed-text
  • 📊 Vector Storage: PostgreSQL with pgvector for efficient memory search
  • 🔌 MCP Compatible: Works with Claude Desktop and other MCP-capable AI tools
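Conceptually, the vector search behind memory retrieval works by embedding each memory (here with nomic-embed-text) and ranking stored vectors by distance to the query embedding. pgvector exposes this as distance operators such as `<=>` (cosine distance); a minimal pure-Python illustration of what that operator computes:

```python
import math

def cosine_distance(a, b):
    """1 - cosine similarity: the quantity pgvector's <=> operator
    ranks by. Smaller distance means a more relevant memory."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / norm

# Identical vectors are at distance 0; orthogonal ones at distance 1.
print(cosine_distance([1.0, 0.0], [1.0, 0.0]))
print(cosine_distance([1.0, 0.0], [0.0, 1.0]))
```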

🚀 Quick Start

Prerequisites

  • Docker with Docker Compose
  • Git
  • About 2.5 GB of free disk space for the AI models

Installation

Windows:

git clone https://github.com/Synapse-OS/local-mem0-mcp.git
cd local-mem0-mcp
install.bat

Mac/Linux:

git clone https://github.com/Synapse-OS/local-mem0-mcp.git
cd local-mem0-mcp
chmod +x install.sh
./install.sh

The installation will:

  1. Build the MCP server container
  2. Start PostgreSQL and Ollama services
  3. Download AI models (~2.5GB total)
  4. Configure Claude Desktop integration
  5. Test the installation

Testing

After installation and configuration:

  1. Restart Claude Desktop completely (close and reopen)
  2. Verify the MCP server: type /mcp; it should list mem0-local as available
  3. Test memory storage: "Remember that I'm testing the MCP memory system today"
  4. Test memory retrieval: "What do you remember about me?"
  5. Verify persistence: Restart Claude Desktop and ask again - memories should persist

Troubleshooting MCP Connection:

  • If /mcp shows no servers, check the configuration file path and JSON syntax
  • Ensure Docker containers are running: docker ps
  • Check MCP server logs: docker logs mem0-mcp-server
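A broken config file is the usual cause of an empty /mcp listing. A small sketch for checking the config before restarting Claude Desktop (the helper name is ours, not part of the project; paste in your file's contents or read them from the path for your OS listed in the Configuration section):

```python
import json

def check_mcp_config(text):
    """Given the contents of claude_desktop_config.json, report
    whether the mem0-local server entry looks usable."""
    try:
        config = json.loads(text)
    except json.JSONDecodeError as e:
        return f"invalid JSON: {e}"
    entry = config.get("mcpServers", {}).get("mem0-local")
    if entry is None:
        return "mem0-local is not configured under mcpServers"
    if "command" not in entry:
        return "mem0-local entry has no 'command'"
    return "ok"

good = ('{"mcpServers": {"mem0-local": {"command": "docker", "args": '
        '["exec", "-i", "mem0-mcp-server", "python", "/app/src/server.py"]}}}')
print(check_mcp_config(good))  # ok
```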

šŸ—ļø Architecture

┌─────────────────┐    ┌──────────────────┐    ┌─────────────────┐
│   Claude        │    │   MCP Server     │    │   PostgreSQL    │
│   Desktop       │◄──►│   (FastMCP)      │◄──►│   + pgvector    │
└─────────────────┘    └──────────────────┘    └─────────────────┘
                                │
                                ▼
                       ┌──────────────────┐
                       │     Ollama       │
                       │   phi3:mini +    │
                       │ nomic-embed-text │
                       └──────────────────┘

🔧 Configuration

Claude Desktop MCP Configuration

After installation, configure Claude Desktop to use the MCP server:

Windows: Edit %APPDATA%\Claude\claude_desktop_config.json

Mac: Edit ~/Library/Application Support/Claude/claude_desktop_config.json

Linux: Edit ~/.config/Claude/claude_desktop_config.json

Add this configuration:

{
  "mcpServers": {
    "mem0-local": {
      "command": "docker",
      "args": [
        "exec", "-i", "mem0-mcp-server",
        "python", "/app/src/server.py"
      ]
    }
  }
}
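With this entry in place, Claude Desktop runs the `docker exec` command and exchanges MCP messages (JSON-RPC 2.0) over the container's stdin/stdout. For reference, the handshake begins with an initialize request shaped roughly like the sketch below; the envelope and method come from the MCP specification, while the clientInfo values are illustrative:

```python
import json

# First message Claude Desktop sends over the STDIO pipe.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "claude-desktop", "version": "1.0"},
    },
}

# Serialized to a single line, as it travels on the wire.
wire = json.dumps(initialize)
print(wire)
```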

System Configuration

The system is configured for local operation by default:

  • MCP Server: Runs in Docker container with STDIO transport
  • Database: PostgreSQL with pgvector on port 5432
  • AI Models: Local Ollama instance on port 11434
  • Memory Storage: User-isolated memories with vector embeddings
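Those three pieces are wired together by docker-compose.local.yml. As a sketch of its shape only (of the names below, only the mem0-mcp-server container name is confirmed by this README; see the file shipped in the repository for the real definitions):

```yaml
services:
  postgres:            # PostgreSQL + pgvector on port 5432
    image: pgvector/pgvector:pg16
    ports: ["5432:5432"]
    volumes: ["pgdata:/var/lib/postgresql/data"]
  ollama:              # local model inference on port 11434
    image: ollama/ollama
    ports: ["11434:11434"]
  mcp-server:          # the FastMCP server, driven over STDIO
    build: .
    container_name: mem0-mcp-server
    depends_on: [postgres, ollama]
volumes:
  pgdata:
```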

📋 Available Memory Operations

  • add_memory: Store new memories
  • search_memories: Find relevant memories by query
  • get_all_memories: Retrieve all memories for a user
  • update_memory: Modify existing memories
  • delete_memory: Remove specific memories
  • delete_all_memories: Clear all memories for a user
  • get_memory_stats: Get memory statistics
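Each operation is exposed as an MCP tool, invoked with a tools/call request. The sketch below builds such requests; the tool names come from the list above, but the argument names (text, query, user_id) are assumptions, so query the server with tools/list for the schemas it actually advertises:

```python
import json

def tool_call(request_id, name, arguments):
    """Build a JSON-RPC 2.0 tools/call request for one memory operation."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

# Hypothetical argument shapes for two of the operations.
add = tool_call(1, "add_memory",
                {"text": "I'm testing the MCP memory system", "user_id": "default"})
search = tool_call(2, "search_memories",
                   {"query": "What am I testing?", "user_id": "default"})
print(json.dumps(add))
```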

šŸ” Troubleshooting

Check services:

docker ps

View logs:

docker-compose -f docker-compose.local.yml logs

Restart services:

docker-compose -f docker-compose.local.yml restart

Clean restart:

docker-compose -f docker-compose.local.yml down -v
# Run install script again

šŸ¤ Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

šŸ™ Acknowledgments

  • Mem0 - Memory management framework
  • FastMCP - MCP server implementation
  • Ollama - Local AI model inference
  • pgvector - Vector similarity search