
🗄️ Lancelot-MCP

License: MIT · Docker

A production-ready, containerized RAG (Retrieval-Augmented Generation) service that enables Claude to search and interact with your PDF documents using advanced AI processing and vector search. Built as an MCP (Model Context Protocol) server for seamless Claude Desktop integration.

Built on the foundation of lance-mcp by Alex Komyagin

✨ Features

  • 🧠 Complete RAG Pipeline: Document ingestion, vector embeddings, semantic search, and context retrieval
  • 🔍 Hybrid AI Processing: Combines Google Gemini for public documents and local Ollama for private documents
  • 🏗️ Container-First: Zero local dependencies, fully containerized deployment
  • 🔒 Privacy-Preserving: Private documents stay local with Ollama processing
  • 🏠 Local Vector Search: LanceDB index stored locally - no data transferred to the cloud when using local LLMs
  • ⚔ Production-Ready: Health checks, error handling, and retry mechanisms
  • 📊 Semantic Search: LanceDB-powered vector similarity search and document retrieval
  • 🌐 HTTP Transport: Simple integration with Claude Desktop via Server-Sent Events

🚀 Quick Start

Prerequisites

  • Docker and Docker Compose
  • Google Gemini API key (free tier available)
  • Claude Desktop app

1. Setup Environment

# Clone the repository
git clone https://github.com/chrismichael555/lancelot-mcp.git
cd lancelot-mcp

# Copy environment template
cp .env.example .env

# Edit .env with your Gemini API key
# Get your API key from: https://ai.google.dev/
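After editing, the `.env` file might look like the sketch below. The key names come from the Environment Variables list in the Configuration section; the API key value is a placeholder, and the other two lines just restate the documented defaults.

```shell
# .env — GEMINI_API_KEY is required; the other values show the documented defaults
GEMINI_API_KEY=your-gemini-api-key-here
LANCEDB_PATH=./data
MCP_PORT=3000
```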

2. Add Your Documents

# Create document directories
mkdir -p pdfs/your-product private-pdfs

# Add your PDFs
# Public docs → pdfs/your-product/
# Private docs → private-pdfs/

3. Start Services

# Build and start all services
npm run build
npm run server

# Process your documents
npm run process-docs

4. Configure Claude Desktop

Add this to your Claude Desktop configuration file:

macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json

{
  "mcpServers": {
    "lancelot-mcp": {
      "command": "curl",
      "args": [
        "-N",
        "-H",
        "Accept: text/event-stream",
        "http://localhost:3000/mcp"
      ],
      "env": {}
    }
  }
}
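A malformed config file can silently break Claude Desktop's MCP integration, so it is worth sanity-checking the JSON before saving it. One way, assuming `python3` is on your PATH, is to pipe the snippet through the stdlib `json.tool` module:

```shell
# Validate that the config snippet parses as JSON; prints "config OK" on success.
cat <<'EOF' | python3 -m json.tool > /dev/null && echo "config OK"
{
  "mcpServers": {
    "lancelot-mcp": {
      "command": "curl",
      "args": ["-N", "-H", "Accept: text/event-stream", "http://localhost:3000/mcp"],
      "env": {}
    }
  }
}
EOF
```

Restart Claude Desktop after editing the file so it picks up the new server.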

📐 Architecture

Lancelot-MCP uses a microservices architecture with three main components:

┌─────────────────┐    ┌─────────────────┐    ┌─────────────┐
│   MCP Server    │    │ Gemini Service  │    │   Ollama    │
│   (Node.js)     │◄──►│   (Python)      │    │  (Models)   │
│   Port: 3000    │    │   Port: 5000    │    │ Port: 11434 │
└─────────────────┘    └─────────────────┘    └─────────────┘
  • MCP Server: Handles Claude Desktop connections and vector search
  • Gemini Service: Processes public documents with Google's AI
  • Ollama: Processes private documents locally with open-source models

💡 Usage Examples

Once configured, you can ask Claude questions like:

"What documents do we have about kubernetes?"
"Summarize the main security considerations from our documentation"
"Find information about API authentication in our guides"

📚 Available Scripts

npm run server        # Start MCP server
npm run process-docs  # Process documents with AI
npm run build         # Build all Docker images
npm run stop          # Stop all services
npm run logs          # View service logs
npm run health        # Check service health

🔧 Configuration

Environment Variables

  • GEMINI_API_KEY - Your Google Gemini API key (required)
  • LANCEDB_PATH - Database storage path (default: ./data)
  • MCP_PORT - MCP server port (default: 3000)
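Only `GEMINI_API_KEY` has to be set explicitly. As a sketch of how the documented defaults could be applied in a startup script (the parameter-expansion fallback idiom here is illustrative, not necessarily how the services implement it):

```shell
# Fall back to the documented defaults when a variable is unset.
: "${LANCEDB_PATH:=./data}"
: "${MCP_PORT:=3000}"
echo "LanceDB path: ${LANCEDB_PATH}, MCP port: ${MCP_PORT}"
```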

Document Organization

  • pdfs/product-name/ - Public documents processed with Gemini
  • private-pdfs/ - Private documents processed locally with Ollama

🛠️ Development

# View detailed logs
docker-compose logs -f mcp-server
docker-compose logs -f gemini-processor

# Health checks
curl http://localhost:3000/health
curl http://localhost:5000/health
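Containers can take a few seconds to become healthy after startup, so a one-shot `curl` may fail spuriously. A small retry wrapper (a sketch, not one of the project's npm scripts) makes the check more robust:

```shell
# Retry a command up to 5 times, pausing 2s between attempts.
retry() {
  for attempt in 1 2 3 4 5; do
    "$@" && return 0
    sleep 2
  done
  echo "still failing after 5 attempts: $*" >&2
  return 1
}

retry curl -fsS http://localhost:3000/health || echo "MCP server not reachable yet"
```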

🔌 Integration Options

Lancelot-MCP works with Claude Desktop, Claude Code, local LLMs, and any application that can make HTTP requests.

See the integration documentation for:

  • Claude Code workspace setup
  • Local LLM integration (Ollama, LM Studio, Open WebUI)
  • HTTP API endpoints and examples
  • Connectivity test scripts

📖 Documentation

🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🙏 Acknowledgments

This project is built upon the excellent foundation of lance-mcp by Alex Komyagin (alex@adiom.io). The original project provided the core MCP integration and LanceDB vector search capabilities that make Lancelot-MCP possible.

Key contributions from lance-mcp:

  • MCP protocol integration with Claude Desktop
  • LanceDB vector store implementation
  • Core document search tools and operations
  • TypeScript architecture and tooling

The Lancelot-MCP project extends this foundation with:

  • Hybrid Gemini + Ollama processing pipeline
  • Production containerization
  • Enhanced error handling and retry mechanisms
  • Simplified deployment and configuration