
šŸ” MCP Server - Vector Search

Python Neo4j FastMCP uv

A blazing-fast Model Context Protocol (MCP) server built with FastMCP that combines Neo4j's graph database capabilities with vector search over embeddings. It enables semantic search across your knowledge graph, letting you discover contextually relevant information through natural language queries.

šŸ—ļø Architecture

ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”    ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”    ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”
│   MCP Client    │◄──►│   Vector Search  │◄──►│      Neo4j      │
│   (Claude AI)   │    │      Server      │    │     Database    │
ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜    ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜    ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜
                                │
                                ā–¼
                       ā”Œā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”
                       │    Embeddings    │
                       ā””ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”˜
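
In code terms, the Vector Search Server box is a FastMCP application that exposes a single tool. Below is a minimal skeleton of that shape, assuming the standalone fastmcp package; the tool name matches the one documented later, and the body here is only a placeholder (the actual flow is sketched in the Tool section):

from fastmcp import FastMCP

mcp = FastMCP("mcp-server-vector-search")

@mcp.tool()
def vector_search_neo4j(prompt: str) -> list:
    """Semantic vector search over the Neo4j knowledge graph."""
    # Placeholder: embed the prompt, then query the Neo4j vector index
    return []

if __name__ == "__main__":
    mcp.run()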

šŸš€ Quick Start

Prerequisites

  • Python 3.8+
  • uv
  • Neo4j Database (v5.0+) with APOC plugin
  • OpenAI API Key

Installation with uv

  1. Install uv (if not already installed)

    # On macOS and Linux
    curl -LsSf https://astral.sh/uv/install.sh | sh
    
    # On Windows
    powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
    
  2. Clone and setup the project

    git clone https://github.com/omarguzmanm/mcp-server-vector-search.git
    cd mcp-server-vector-search
    
    # Create virtual environment and install dependencies
    uv venv
    uv pip install fastmcp neo4j openai python-dotenv sentence-transformers pydantic
    
  3. Environment Configuration

    # Create .env file
    cp .env.example .env
    

    Edit .env with your configuration values (a quick connection check is sketched after these setup steps):

    NEO4J_URI=bolt://localhost:7687
    NEO4J_USERNAME=neo4j
    NEO4J_PASSWORD=your_neo4j_password
    NEO4J_DATABASE=neo4j
    OPENAI_API_KEY=your_openai_api_key
    
  4. Neo4j Vector Index Setup

    // Create vector index for 1536-dimensional OpenAI embeddings
    // Run this manually if the index does not already exist
    CREATE VECTOR INDEX embeddableIndex FOR (n:Document) ON (n.embedding)
    OPTIONS {indexConfig: {
      `vector.dimensions`: 1536,
      `vector.similarity_function`: 'cosine'
    }}
    
  5. Launch the Server

    # Activate virtual environment
    source .venv/bin/activate  # On Linux/macOS
    # or
    .venv\Scripts\activate     # On Windows
    
    # Start the FastMCP server
    python main.py
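
Optionally, you can sanity-check the values from steps 3 and 4 before launching. The script below is not part of the project, just a quick check that uses the libraries installed above and assumes the embeddableIndex name from step 4:

import os
from dotenv import load_dotenv
from neo4j import GraphDatabase

load_dotenv()  # pick up the .env created in step 3

# Verify Neo4j connectivity with the configured credentials
driver = GraphDatabase.driver(
    os.getenv("NEO4J_URI", "bolt://localhost:7687"),
    auth=(os.getenv("NEO4J_USERNAME", "neo4j"), os.getenv("NEO4J_PASSWORD")),
)
driver.verify_connectivity()
print("Neo4j connection OK")

# Confirm the vector index from step 4 exists
with driver.session(database=os.getenv("NEO4J_DATABASE", "neo4j")) as session:
    names = [record["name"] for record in session.run("SHOW INDEXES YIELD name")]
    print("embeddableIndex present:", "embeddableIndex" in names)
driver.close()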
    

šŸ› ļø Tool

The server exposes a single, powerful tool optimized for vector search:

šŸ” Vector Search
vector_search_neo4j(
    prompt="Find documents about machine learning and neural networks"
)

What it does:

  • Converts your natural language query into a 1536-dimensional vector using OpenAI
  • Searches your Neo4j vector index for the most semantically similar nodes
  • Returns ranked results with similarity scores
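
In rough terms, the flow behind the tool looks like the sketch below. It is illustrative rather than the project's actual implementation, and it assumes the text-embedding-ada-002 model (1536 dimensions), the embeddableIndex created earlier, and a text property on the matched nodes:

import os
from neo4j import GraphDatabase
from openai import OpenAI

def vector_search_neo4j(prompt: str, top_k: int = 5) -> list:
    # 1. Convert the prompt into a 1536-dimensional embedding
    client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))
    embedding = client.embeddings.create(
        model="text-embedding-ada-002", input=prompt
    ).data[0].embedding

    # 2. Query the Neo4j vector index for the most similar nodes
    driver = GraphDatabase.driver(
        os.getenv("NEO4J_URI", "bolt://localhost:7687"),
        auth=(os.getenv("NEO4J_USERNAME", "neo4j"), os.getenv("NEO4J_PASSWORD")),
    )
    with driver.session(database=os.getenv("NEO4J_DATABASE", "neo4j")) as session:
        records = session.run(
            "CALL db.index.vector.queryNodes('embeddableIndex', $k, $embedding) "
            "YIELD node, score "
            "RETURN node.text AS text, score ORDER BY score DESC",
            k=top_k, embedding=embedding,
        )
        results = [record.data() for record in records]
    driver.close()
    return results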

āš™ļø Configuration

Environment Variables

  • NEO4J_URI: Neo4j connection URI (required; default: bolt://localhost:7687)
  • NEO4J_USERNAME: Neo4j username (required; default: neo4j)
  • NEO4J_PASSWORD: Neo4j password (required; default: password)
  • NEO4J_DATABASE: Neo4j database name (required; default: neo4j)
  • OPENAI_API_KEY: OpenAI API key (optional; without it, the local all-MiniLM-L6-v2 model is used for embeddings)
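
The last entry implies a fallback: with no OpenAI key, embeddings are presumably produced locally via sentence-transformers. A sketch of that behavior (the OpenAI model name is an assumption; note that all-MiniLM-L6-v2 produces 384-dimensional vectors, so the vector index dimensions must match whichever model is actually in use):

import os
from openai import OpenAI
from sentence_transformers import SentenceTransformer

def embed(text: str) -> list:
    api_key = os.getenv("OPENAI_API_KEY")
    if api_key:
        # 1536-dimensional OpenAI embedding
        client = OpenAI(api_key=api_key)
        return client.embeddings.create(
            model="text-embedding-ada-002", input=text
        ).data[0].embedding
    # Local fallback: 384-dimensional all-MiniLM-L6-v2 embedding
    return SentenceTransformer("all-MiniLM-L6-v2").encode(text).tolist()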

Neo4j Requirements

  1. APOC Plugin: Essential for advanced graph operations
  2. Vector Index: Must support 1536 dimensions for OpenAI embeddings
  3. Node Structure: Nodes should have embedding properties as vectors
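
For requirement 3, each searchable node needs a vector stored on its embedding property, matching the Document label and 1536 dimensions assumed by the index above. A hedged example of loading one such node (the text property, node content, and embedding model are illustrative, not the project's ingestion code):

import os
from neo4j import GraphDatabase
from openai import OpenAI

text = "Graph databases store data as nodes and relationships."
embedding = OpenAI().embeddings.create(  # reads OPENAI_API_KEY from the environment
    model="text-embedding-ada-002", input=text
).data[0].embedding

driver = GraphDatabase.driver(
    os.getenv("NEO4J_URI", "bolt://localhost:7687"),
    auth=(os.getenv("NEO4J_USERNAME", "neo4j"), os.getenv("NEO4J_PASSWORD")),
)
with driver.session(database=os.getenv("NEO4J_DATABASE", "neo4j")) as session:
    session.run(
        "MERGE (d:Document {text: $text}) SET d.embedding = $embedding",
        text=text, embedding=embedding,
    )
driver.close()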

Performance Optimization

  • uv Benefits: 10-100x faster dependency resolution compared to pip
  • FastMCP Advantages: Minimal overhead, optimized for MCP protocol
  • Connection Pooling: Automatic Neo4j connection management
  • Async Operations: Non-blocking I/O for maximum throughput
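
The last two bullets translate to code roughly like the sketch below: a single long-lived driver provides the connection pool, and the async driver keeps query I/O from blocking the event loop. This is an illustrative helper under those assumptions, not the project's code, and it expects a precomputed embedding vector:

import os
from neo4j import AsyncGraphDatabase

# One long-lived driver; sessions borrow pooled connections from it
driver = AsyncGraphDatabase.driver(
    os.getenv("NEO4J_URI", "bolt://localhost:7687"),
    auth=(os.getenv("NEO4J_USERNAME", "neo4j"), os.getenv("NEO4J_PASSWORD")),
)

async def query_top_k(embedding: list, top_k: int = 5) -> list:
    # Awaiting the query keeps the event loop free for other MCP requests
    async with driver.session(database=os.getenv("NEO4J_DATABASE", "neo4j")) as session:
        result = await session.run(
            "CALL db.index.vector.queryNodes('embeddableIndex', $k, $e) "
            "YIELD node, score RETURN node.text AS text, score",
            k=top_k, e=embedding,
        )
        return [record.data() async for record in result]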

šŸ¤ Integration with Claude Desktop

MCP Configuration

Add to your Claude Desktop MCP settings:

{
  "mcpServers": {
    "mcp-neo4j-vector-search": {
      "command": "uv",
      "args": [
        "run",
        "--with",
        "fastmcp",
        "--with",
        "neo4j",
        "--with",
        "openai",
        "--with",
        "pydantic",
        "--with",
        "python-dotenv",
        "--with",
        "sentence-transformers",
        "path\\to\\your\\server.py"
      ],
      "env": {
        "NEO4J_URI": "bolt://localhost:7687",
        "NEO4J_USERNAME": "neo4j",
        "NEO4J_PASSWORD": "your_password",
        "NEO4J_DATABASE": "neo4j",
        "OPENAI_API_KEY": "your_api_key"
      }
    }
  }
}
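
Point the last args entry at the absolute path of the server script in your clone (the Quick Start above launches main.py), and restart Claude Desktop after saving the settings; it typically only picks up new MCP servers on startup.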

šŸ› Troubleshooting

Common Issues

  1. "Module not found" errors

    # Reinstall dependencies with uv
    uv pip install --force-reinstall fastmcp neo4j openai
    
  2. "Vector index not found"

    // Check existing indexes
    SHOW INDEXES
    
    // Create if missing
    CREATE VECTOR INDEX embeddableIndex FOR (n:Document) ON (n.embedding)
    OPTIONS {indexConfig: {`vector.dimensions`: 1536, `vector.similarity_function`: 'cosine'}}
    
  3. OpenAI API errors

    # Verify the API key is set (reads the .env file first)
    uv run python -c "from dotenv import load_dotenv; import os; load_dotenv(); print('API key is set!' if os.getenv('OPENAI_API_KEY') else 'API key missing!')"
    

šŸ¤ Contributing

  1. Fork the repository
  2. Create a feature branch: git checkout -b feature/amazing-feature
  3. Install development dependencies: uv pip install -e ".[dev]"
  4. Make your changes and add tests
  5. Commit: git commit -m 'Add amazing feature'
  6. Push: git push origin feature/amazing-feature
  7. Open a Pull Request

šŸ“„ License

This project is licensed under the MIT License - see the LICENSE file for details.

šŸ™ Acknowledgments

  • FastMCP - For the incredible MCP framework
  • uv - For blazing-fast Python package management
  • Neo4j - For powerful graph database capabilities
  • OpenAI - For state-of-the-art embedding models
  • Model Context Protocol - For the protocol specification

šŸš€ Made with ā¤ļø for the AI and Graph Database community

ā¬†ļø Back to Top