# Lancelot-MCP
A production-ready, containerized RAG (Retrieval-Augmented Generation) service that enables Claude to search and interact with your PDF documents using advanced AI processing and vector search. Built as an MCP (Model Context Protocol) server for seamless Claude Desktop integration.
Built on the foundation of lance-mcp by Alex Komyagin
## Features

- **Complete RAG Pipeline**: Document ingestion, vector embeddings, semantic search, and context retrieval
- **Hybrid AI Processing**: Combines Google Gemini for public docs and local Ollama for private documents
- **Container-First**: Zero local dependencies, fully containerized deployment
- **Privacy-Preserving**: Private documents stay local with Ollama processing
- **Local Vector Search**: LanceDB index stored locally; no data is sent to the cloud when using local LLMs
- **Production-Ready**: Health checks, error handling, and retry mechanisms
- **Semantic Search**: LanceDB-powered vector similarity search and document retrieval
- **HTTP Transport**: Simple Claude Desktop integration via Server-Sent Events
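The semantic-search step at the heart of the pipeline ranks stored embeddings by similarity to a query embedding. The project uses LanceDB for this; the sketch below is a pure-Python stand-in with toy vectors, purely to illustrate the idea (the document names and 3-dimensional "embeddings" are invented):

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def search(index, query_vec, k=2):
    # Rank stored (doc_id, vector) pairs by similarity to the query.
    scored = [(doc_id, cosine(vec, query_vec)) for doc_id, vec in index]
    return sorted(scored, key=lambda s: s[1], reverse=True)[:k]

# Toy 3-dimensional "embeddings" standing in for real model output.
index = [
    ("kubernetes-guide.pdf", [0.9, 0.1, 0.0]),
    ("auth-api.pdf",         [0.1, 0.9, 0.2]),
    ("release-notes.pdf",    [0.2, 0.2, 0.9]),
]
top = search(index, [1.0, 0.0, 0.1], k=1)
```

A real vector store does the same ranking over high-dimensional embeddings with an approximate-nearest-neighbor index instead of a linear scan.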
## Quick Start

### Prerequisites
- Docker and Docker Compose
- Google Gemini API key (free tier available)
- Claude Desktop app
### 1. Set Up Environment

```bash
# Clone the repository
git clone https://github.com/chrismichael555/lancelot-mcp.git
cd lancelot-mcp

# Copy the environment template
cp .env.example .env

# Edit .env with your Gemini API key
# Get your API key from: https://ai.google.dev/
```
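A minimal `.env` might look like the fragment below. The variable names come from the Environment Variables section; the key value is a placeholder you replace with your own:

```
GEMINI_API_KEY=your-gemini-api-key-here
LANCEDB_PATH=./data
MCP_PORT=3000
```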
### 2. Add Your Documents

```bash
# Create document directories
mkdir -p pdfs/your-product private-pdfs

# Add your PDFs
# Public docs  → pdfs/your-product/
# Private docs → private-pdfs/
```
### 3. Start Services

```bash
# Build and start all services
npm run build
npm run server

# Process your documents
npm run process-docs
```
### 4. Configure Claude Desktop

Add this to your Claude Desktop configuration file:

- **macOS**: `~/Library/Application Support/Claude/claude_desktop_config.json`
- **Windows**: `%APPDATA%/Claude/claude_desktop_config.json`
```json
{
  "mcpServers": {
    "lancelot-mcp": {
      "command": "curl",
      "args": [
        "-N",
        "-H",
        "Accept: text/event-stream",
        "http://localhost:3000/mcp"
      ],
      "env": {}
    }
  }
}
```
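The config above has Claude Desktop hold open a long-lived Server-Sent Events stream with `curl`. Per the SSE format, messages arrive as text frames of `event:` and `data:` lines terminated by a blank line. A minimal client-side frame parser (a hypothetical helper for illustration, not part of the project) could look like:

```python
import json

def parse_sse_frame(frame: str):
    """Parse one Server-Sent Events frame into (event, data).

    Per the SSE spec, the event name defaults to "message" and
    multiple data: lines are joined with newlines.
    """
    event, data_lines = "message", []
    for line in frame.splitlines():
        if line.startswith("event:"):
            event = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data_lines.append(line[len("data:"):].strip())
    return event, "\n".join(data_lines)

# A JSON-RPC payload of the general shape MCP servers emit over SSE.
frame = 'event: message\ndata: {"jsonrpc": "2.0", "id": 1, "result": {}}\n\n'
name, payload = parse_sse_frame(frame)
body = json.loads(payload)
```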
## Architecture
Lancelot-MCP uses a microservices architecture with three main components:
```
┌─────────────────┐     ┌─────────────────┐     ┌──────────────┐
│   MCP Server    │     │ Gemini Service  │     │    Ollama    │
│   (Node.js)     │────►│    (Python)     │     │   (Models)   │
│   Port: 3000    │     │   Port: 5000    │     │ Port: 11434  │
└─────────────────┘     └─────────────────┘     └──────────────┘
```
- **MCP Server**: Handles Claude Desktop connections and vector search
- **Gemini Service**: Processes public documents with Google's AI
- **Ollama**: Processes private documents locally with open-source models
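The actual Compose file ships with the repository; a sketch consistent with the service names used in the log commands below (`mcp-server`, `gemini-processor`, `ollama`) and the ports in the diagram might look like this. The `build` contexts and the `ollama/ollama` image tag are assumptions, not taken from the project:

```yaml
services:
  mcp-server:
    build: ./mcp-server        # assumed build context
    ports:
      - "3000:3000"
    environment:
      - GEMINI_API_KEY=${GEMINI_API_KEY}
    depends_on:
      - gemini-processor
      - ollama

  gemini-processor:
    build: ./gemini-service    # assumed build context
    ports:
      - "5000:5000"
    environment:
      - GEMINI_API_KEY=${GEMINI_API_KEY}

  ollama:
    image: ollama/ollama       # assumed image
    ports:
      - "11434:11434"
```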
## Usage Examples

Once configured, you can ask Claude questions like:

- "What documents do we have about Kubernetes?"
- "Summarize the main security considerations from our documentation"
- "Find information about API authentication in our guides"
## Available Scripts

```bash
npm run server        # Start MCP server
npm run process-docs  # Process documents with AI
npm run build         # Build all Docker images
npm run stop          # Stop all services
npm run logs          # View service logs
npm run health        # Check service health
```
## Configuration

### Environment Variables

- `GEMINI_API_KEY`: Your Google Gemini API key (required)
- `LANCEDB_PATH`: Database storage path (default: `./data`)
- `MCP_PORT`: MCP server port (default: `3000`)
### Document Organization

- `pdfs/product-name/`: Public documents, processed with Gemini
- `private-pdfs/`: Private documents, processed locally with Ollama
## Development

```bash
# View detailed logs
docker-compose logs -f mcp-server
docker-compose logs -f gemini-processor

# Health checks
curl http://localhost:3000/health
curl http://localhost:5000/health
```
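The features list mentions retry mechanisms, and health endpoints like the ones above are the natural thing to poll while dependent services start up. A generic retry helper in that spirit (illustrative only, not the project's actual code) can be sketched as:

```python
import time

def retry(check, attempts=5, delay=0.01):
    # Call `check` until it returns truthy or attempts run out.
    # In practice `check` would be an HTTP probe of a /health endpoint.
    for _ in range(attempts):
        if check():
            return True
        time.sleep(delay)
    return False

# Simulate a service that becomes healthy on the third probe.
state = {"probes": 0}

def fake_health_check():
    state["probes"] += 1
    return state["probes"] >= 3

ok = retry(fake_health_check)
```

A real probe would wrap `curl` or an HTTP client call with a timeout; the retry loop itself stays the same.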
## Integration Options
Lancelot-MCP works with Claude Desktop, Claude Code, local LLMs, and any application that can make HTTP requests.
See the integration documentation for:
- Claude Code workspace setup
- Local LLM integration (Ollama, LM Studio, Open WebUI)
- HTTP API endpoints and examples
- Connectivity test scripts
## Documentation
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
## License

This project is licensed under the MIT License; see the `LICENSE` file for details.
## Acknowledgments
This project is built upon the excellent foundation of lance-mcp by Alex Komyagin (alex@adiom.io). The original project provided the core MCP integration and LanceDB vector search capabilities that make Lancelot-MCP possible.
Key contributions from lance-mcp:
- MCP protocol integration with Claude Desktop
- LanceDB vector store implementation
- Core document search tools and operations
- TypeScript architecture and tooling
The Lancelot-MCP project extends this foundation with:
- Hybrid Gemini + Ollama processing pipeline
- Production containerization
- Enhanced error handling and retry mechanisms
- Simplified deployment and configuration