
Graphiti MCP Server 🧠


🌟 A powerful knowledge graph server for AI agents, built with Neo4j and integrated with Model Context Protocol (MCP).

🚀 Features

  • 🔄 Dynamic knowledge graph management with Neo4j
  • 🤖 Seamless integration with OpenAI models
  • 🔌 MCP (Model Context Protocol) support
  • 🐳 Docker-ready deployment
  • 🎯 Custom entity extraction capabilities
  • 🔍 Advanced semantic search functionality

🛠️ Installation

Prerequisites

  • Docker and Docker Compose
  • Python 3.10 or higher
  • OpenAI API key
  • Minimum 4GB RAM (recommended 8GB)
  • 2GB free disk space

Quick Start 🚀

  1. Clone the repository:
git clone https://github.com/gifflet/graphiti-mcp-server.git
cd graphiti-mcp-server
  2. Set up environment variables:
cp .env.sample .env
  3. Edit .env with your configuration:
# Required for LLM operations
OPENAI_API_KEY=your_openai_api_key_here
MODEL_NAME=gpt-4.1-mini

# Optional: Custom OpenAI endpoint (e.g., for proxies)
# OPENAI_BASE_URL=https://api.openai.com/v1

# Neo4j Configuration (defaults work with Docker)
NEO4J_URI=bolt://neo4j:7687
NEO4J_USER=neo4j
NEO4J_PASSWORD=demodemo
  4. Start the services:
docker compose up -d
  5. Verify installation:
# Check if services are running
docker compose ps

# Check logs
docker compose logs graphiti-mcp

Alternative: Environment Variables

You can run with environment variables directly:

OPENAI_API_KEY=your_key MODEL_NAME=gpt-4.1-mini docker compose up

🔧 Configuration

Service Ports 🌐

| Service | Port | Purpose |
|---------|------|---------|
| Neo4j Browser | 7474 | Web interface for graph visualization |
| Neo4j Bolt | 7687 | Database connection |
| Graphiti MCP | 8000 | MCP server endpoint |
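To confirm each service is reachable from the host, a quick TCP check can be scripted in Python (a sketch; `port_open` is a hypothetical helper, not part of this repository):

```python
import socket

def port_open(host, port, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical usage against the ports above:
# port_open("localhost", 7474)  # Neo4j Browser
# port_open("localhost", 7687)  # Neo4j Bolt
# port_open("localhost", 8000)  # Graphiti MCP
```

A True result only shows the port accepts connections; it does not validate the service itself.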

Environment Variables 🔧

OpenAI Configuration
| Variable | Required | Default | Description |
|----------|----------|---------|-------------|
| OPENAI_API_KEY | ✅ | - | Your OpenAI API key |
| OPENAI_BASE_URL | ❌ | - | Custom OpenAI API endpoint (consumed by OpenAI SDK) |
| MODEL_NAME | ❌ | gpt-4.1-mini | Main LLM model to use |
| SMALL_MODEL_NAME | ❌ | gpt-4.1-nano | Small LLM model for lighter tasks |
| LLM_TEMPERATURE | ❌ | 0.0 | LLM temperature (0.0-2.0) |
| EMBEDDER_MODEL_NAME | ❌ | text-embedding-3-small | Embedding model |

Neo4j Configuration

| Variable | Required | Default | Description |
|----------|----------|---------|-------------|
| NEO4J_URI | ❌ | bolt://neo4j:7687 | Neo4j connection URI |
| NEO4J_USER | ❌ | neo4j | Neo4j username |
| NEO4J_PASSWORD | ❌ | demodemo | Neo4j password |

Server Configuration

| Variable | Required | Default | Description |
|----------|----------|---------|-------------|
| MCP_SERVER_HOST | ❌ | - | MCP server host binding |
| SEMAPHORE_LIMIT | ❌ | 10 | Concurrent operation limit for LLM calls |

Azure OpenAI Configuration (Optional)

For Azure OpenAI deployments, use these environment variables instead of the standard OpenAI configuration:

| Variable | Required | Default | Description |
|----------|----------|---------|-------------|
| AZURE_OPENAI_ENDPOINT | ✅* | - | Azure OpenAI endpoint URL |
| AZURE_OPENAI_API_VERSION | ✅* | - | Azure OpenAI API version |
| AZURE_OPENAI_DEPLOYMENT_NAME | ✅* | - | Azure OpenAI deployment name |
| AZURE_OPENAI_USE_MANAGED_IDENTITY | ❌ | false | Use Azure managed identity for auth |
| AZURE_OPENAI_EMBEDDING_ENDPOINT | ❌ | - | Separate endpoint for embeddings |
| AZURE_OPENAI_EMBEDDING_API_VERSION | ❌ | - | API version for embeddings |
| AZURE_OPENAI_EMBEDDING_DEPLOYMENT_NAME | ❌ | - | Deployment name for embeddings |
| AZURE_OPENAI_EMBEDDING_API_KEY | ❌ | - | Separate API key for embeddings |

* Required when using Azure OpenAI

Notes:

  • OPENAI_BASE_URL is consumed directly by the OpenAI Python SDK, useful for proxy configurations or custom endpoints
  • SEMAPHORE_LIMIT controls concurrent LLM API calls - decrease if you encounter rate limits, increase for higher throughput
  • Azure configuration is an alternative to standard OpenAI - don't mix both configurations

Neo4j Settings 🗄️

Default configuration for Neo4j:

  • Username: neo4j
  • Password: demodemo
  • URI: bolt://neo4j:7687 (within Docker network)
  • Memory settings optimized for development
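Resolving these settings with their documented fallbacks can be sketched as follows (`neo4j_settings` is a hypothetical helper, not part of the server code):

```python
import os

def neo4j_settings(env=None):
    """Resolve Neo4j connection settings, falling back to the documented defaults."""
    env = os.environ if env is None else env
    return {
        "uri": env.get("NEO4J_URI", "bolt://neo4j:7687"),
        "user": env.get("NEO4J_USER", "neo4j"),
        "password": env.get("NEO4J_PASSWORD", "demodemo"),
    }
```

Note the default URI uses the `neo4j` hostname, which only resolves inside the Docker network; from the host, use `bolt://localhost:7687`.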

Docker Environment Variables 🐳

Variables can be passed inline to docker compose instead of using a .env file (see the Quick Start alternative above).

For Azure OpenAI:

AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com \
AZURE_OPENAI_API_VERSION=2024-02-01 \
AZURE_OPENAI_DEPLOYMENT_NAME=your-deployment \
OPENAI_API_KEY=your_key \
docker compose up

🔌 Integration

Cursor IDE Integration 🖥️

  1. Configure Cursor MCP settings:
{
  "mcpServers": {
    "Graphiti": {
      "command": "uv",
      "args": ["run", "graphiti_mcp_server.py"],
      "env": {
        "OPENAI_API_KEY": "your_key_here"
      }
    }
  }
}
  2. For Docker-based setup:
{
  "mcpServers": {
    "Graphiti": {
      "url": "http://localhost:8000/sse"
    }
  }
}
  3. Add Graphiti rules to Cursor's User Rules (see graphiti_cursor_rules.mdc)
  4. Start an agent session in Cursor

Other MCP Clients

The server supports standard MCP transports:

  • SSE (Server-Sent Events): http://localhost:8000/sse
  • WebSocket: ws://localhost:8000/ws
  • Stdio: Direct process communication
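Over SSE, events arrive as `event:`/`data:` lines terminated by a blank line. A minimal parser sketch of that wire format (not an official MCP client; real integrations should use an MCP SDK):

```python
def parse_sse(lines):
    """Parse an iterable of SSE lines into event dicts (minimal subset: 'event' and 'data')."""
    event = {}
    for line in lines:
        line = line.rstrip("\n")
        if not line:                      # blank line terminates an event
            if event:
                yield event
                event = {}
        elif line.startswith("event:"):
            event["event"] = line[len("event:"):].strip()
        elif line.startswith("data:"):
            event.setdefault("data", "")
            event["data"] += line[len("data:"):].strip()
    if event:                             # flush a trailing event with no blank line
        yield event
```

This ignores SSE fields like `id:` and `retry:`; it is only meant to show the framing a client sees on http://localhost:8000/sse.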

💻 Development

Local Development Setup

  1. Install dependencies:
# Using uv (recommended)
curl -LsSf https://astral.sh/uv/install.sh | sh
uv sync

# Or using pip
pip install -r requirements.txt
  2. Start Neo4j locally:
docker run -d \
  --name neo4j-dev \
  -p 7474:7474 -p 7687:7687 \
  -e NEO4J_AUTH=neo4j/demodemo \
  neo4j:5.26.0
  3. Run the server:
# Set environment variables
export OPENAI_API_KEY=your_key
export NEO4J_URI=bolt://localhost:7687

# Run with stdio transport
uv run graphiti_mcp_server.py

# Or with SSE transport
uv run graphiti_mcp_server.py --transport sse --use-custom-entities

Testing

# Run basic connectivity test
curl http://localhost:8000/health

# Test MCP endpoint
curl http://localhost:8000/sse
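For scripted checks, the health endpoint can be polled until the stack is up (a sketch; `wait_for_http` is a hypothetical helper and assumes the `/health` endpoint shown above):

```python
import time
import urllib.error
import urllib.request

def wait_for_http(url, timeout=30.0, interval=0.5):
    """Poll `url` until it answers HTTP 200 or `timeout` seconds elapse."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=interval) as resp:
                if resp.status == 200:
                    return True
        except (urllib.error.URLError, OSError):
            pass                      # server not up yet; retry
        time.sleep(interval)
    return False

# Hypothetical usage after `docker compose up -d`:
# wait_for_http("http://localhost:8000/health")
```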

šŸ” Troubleshooting

Common Issues

🐳 Docker Issues
# Clean up and restart
docker compose down -v
docker compose up --build

# Check disk space
docker system df

Logs and Debugging

# View all logs
docker compose logs -f

# View specific service logs
docker compose logs -f graphiti-mcp
docker compose logs -f neo4j

# Enable debug logging (docker compose up has no -e flag; pass the variable inline)
LOG_LEVEL=DEBUG docker compose up

Performance Issues

  • Memory: Increase Neo4j heap size in docker-compose.yml
  • Storage: Monitor Neo4j data volume usage
  • Network: Check for firewall blocking ports 7474, 7687, 8000
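For the memory point, heap size might be raised with a compose fragment like the following (illustrative only; adjust the service name and values to your docker-compose.yml, and note that Neo4j 5 maps config keys to env vars with dots becoming `_` and underscores becoming `__`):

```yaml
# Hypothetical docker-compose.yml override for Neo4j memory
services:
  neo4j:
    environment:
      - NEO4J_server_memory_heap_initial__size=1G
      - NEO4J_server_memory_heap_max__size=2G
      - NEO4J_server_memory_pagecache_size=1G
```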

šŸ—ļø Architecture

┌─────────────────┐    ┌──────────────────┐    ┌─────────────────┐
│   MCP Client    │    │  Graphiti MCP    │    │     Neo4j       │
│   (Cursor)      │◄──►│     Server       │◄──►│   Database      │
│                 │    │   (Port 8000)    │    │  (Port 7687)    │
└─────────────────┘    └──────────────────┘    └─────────────────┘
                                │
                                ▼
                       ┌──────────────────┐
                       │   OpenAI API     │
                       │   (LLM Client)   │
                       └──────────────────┘

Components

  • Neo4j Database: Graph storage and querying
  • Graphiti MCP Server: API layer and LLM operations
  • OpenAI Integration: Entity extraction and semantic processing
  • MCP Protocol: Standardized AI agent communication

šŸ¤ Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

šŸ“ License

This project is licensed under the MIT License - see the LICENSE file for details.

šŸ™ Acknowledgments


Need help? Open an issue or check our troubleshooting guide above.