Search MCP is a local-first Model Context Protocol server that enhances AI assistants with semantic search capabilities for codebases.


Search MCP 🔍

Make your AI 58x smarter about your code.

# Navigate to your project, then run:
npx @liraz-sbz/search-mcp setup

Your AI assistant searches your entire codebase semantically. No API keys. No cloud. 100% local.


Works with: Claude Desktop • Claude Code • Cursor • Windsurf • Antigravity


Why Search MCP?

| Without Search MCP | With Search MCP |
| --- | --- |
| Copy-paste files manually | AI finds code automatically |
| ~488,000 tokens per query | ~8,400 tokens per query |
| "Context limit exceeded" | Always fits |
| Multiple tool calls | Single semantic search |
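
The 58x figure follows directly from the two token counts above; a quick sanity check in shell (integer division, so the ratio rounds down):

```shell
# ~488,000 tokens per query with manual context vs ~8,400 with semantic search
echo $((488000 / 8400))   # prints 58
```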


What Does This Do?

Search MCP makes your AI assistant smarter about your code. Instead of you copying and pasting files into the chat, the AI can automatically search your project and find exactly what it needs.

Before Search MCP:

You: "How does login work?"
AI: "I don't have access to your code. Please paste the relevant files."
You: *pastes 5 files*
AI: "Now I can help..."

After Search MCP:

You: "How does login work?"
AI: *automatically searches your code*
AI: "Based on src/auth/login.ts, here's how login works..."

Features

  • Just Works - No setup, no API keys, no accounts
  • Private - Your code never leaves your computer
  • Always Current - Automatically updates when you save files
  • Safe - Never indexes passwords, secrets, or junk files
  • Secure - Built-in protections against path traversal, symlink attacks, and resource exhaustion
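
As a sketch of what the path-traversal protection above means in practice: canonicalize every candidate path and reject any that resolves outside the project root. This is an illustrative POSIX-shell guard (with a hardcoded example root), not search-mcp's actual implementation:

```shell
#!/bin/sh
# Illustrative guard, not search-mcp's actual code: canonicalize a path
# and reject it if it resolves outside the project root.
root="/tmp/myproject"
candidate="$root/../../etc/passwd"
resolved="$(realpath -m "$candidate")"  # GNU -m: resolve even if the path doesn't exist
case "$resolved" in
  "$root"/*) echo "allowed: $resolved" ;;
  *)         echo "blocked: $resolved" ;;
esac
```

Because `realpath` also resolves symlinks in existing path components, the same check covers the symlink-attack case mentioned above.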

Quick Start

Prerequisites: Node.js 18+

Option 1: Interactive Setup (Recommended)

Navigate to your project folder and run:

npx @liraz-sbz/search-mcp setup

This interactive wizard will:

  • Confirm you're in the correct project folder
  • Auto-detect and configure your AI assistants (Claude Desktop, Claude Code, Cursor, Windsurf)
  • Offer to index your project immediately with progress bars

All CLI commands are covered in the Standalone CLI section below.

Option 2: Quick Setup (One-liner)

npx --yes @liraz-sbz/search-mcp@latest --setup

Configures your AI assistants automatically. You'll need to index separately via your AI assistant.

Option 3: Manual Configuration

Add to your MCP config file:

{
  "mcpServers": {
    "search": {
      "command": "npx",
      "args": ["-y", "@liraz-sbz/search-mcp"]
    }
  }
}

Config file locations:

  • Claude Desktop (macOS): `~/Library/Application Support/Claude/claude_desktop_config.json`
  • Claude Desktop (Windows): `%APPDATA%\Claude\claude_desktop_config.json`
  • Claude Code: no config file needed; run `claude mcp add search -- npx @liraz-sbz/search-mcp`

A full installation guide for each client is listed under Documentation below.

After Setup

  1. Restart your AI assistant
  2. Verify connection: Type /mcp and check that "search" is listed
  3. Start searching: Ask "How does login work?"

That's it!


Standalone CLI

Search MCP also works as a standalone CLI tool - no MCP client required:

# Index your project
npx @liraz-sbz/search-mcp index

# Search directly from terminal
npx @liraz-sbz/search-mcp search "authentication logic"

# Check index status
npx @liraz-sbz/search-mcp status

Features:

  • Progress bars and colored output
  • --json flag for scripting
  • Works independently of AI assistants

Perfect for quick searches, debugging, or CI/CD integration.
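
For CI/CD scripting, `--json` gives you machine-readable output to post-process. The JSON shape below is an assumption for illustration (check your installed version's actual `--json` output); the extraction step itself is plain POSIX shell:

```shell
#!/bin/sh
# Hypothetical --json output; the 'results'/'file' field names are
# illustrative, not a documented schema.
json='{"results":[{"file":"src/auth/login.ts","score":0.91}]}'

# Extract the first file path with sed (jq is tidier if available)
echo "$json" | sed 's/.*"file":"\([^"]*\)".*/\1/'   # prints src/auth/login.ts
```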


What Can You Ask?

Once set up, just talk naturally:

  • "How does user registration work?"
  • "Find all files related to payments"
  • "What's the database schema?"
  • "Show me where errors are handled"
  • "What files import the Logger class?"
  • "Search the docs for API rate limits"

See the usage guide for more examples.


Performance

| Metric | Value |
| --- | --- |
| Efficiency vs grep | 58x fewer tokens |
| Search speed | ~400ms (with GPU acceleration) |
| Tokens per query | ~8,400 |
| Codebase tested | 306 files, 4,231 chunks |

Semantic search returns focused code chunks instead of entire files. Your AI stays under context limits even on large codebases.


Configuration

Config is auto-generated when you first index a project:

  • macOS/Linux: ~/.mcp/search/indexes/<project-hash>/config.json
  • Windows: %USERPROFILE%\.mcp\search\indexes\<project-hash>\config.json

Finding your config file: Ask your AI assistant "Where is my config file?" or "Show me my search config" - it will use the get_config tool to return the exact path.

Key options:

| Option | Default | Description |
| --- | --- | --- |
| `indexingStrategy` | `"realtime"` | One of `"realtime"`, `"lazy"`, or `"git"` |
| `include` | `["**/*"]` | Files to index |
| `exclude` | `[]` | Files to skip |
| `indexDocs` | `true` | Index `.md` and `.txt` files separately |

Indexing Strategies:

| Strategy | Best for |
| --- | --- |
| `realtime` | Small projects; instant freshness |
| `lazy` | Large projects; index only when searching |
| `git` | Only search committed code |
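
Putting the options together, a config for a large monorepo might look like the sketch below (illustrative values, and it assumes these keys sit at the top level of `config.json`):

```json
{
  "indexingStrategy": "lazy",
  "include": ["src/**/*", "docs/**/*"],
  "exclude": ["**/node_modules/**", "**/*.min.js"],
  "indexDocs": true
}
```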

For the full set of configuration options and indexing-strategy details, see the configuration guide.


FAQ

Does my code leave my computer? Never. All processing happens locally. No cloud, no API calls, no tracking.

How big can my codebase be? Tested on projects with 1000+ files. Indexing takes ~1 minute for most projects.

What languages are supported? Any text-based code or documentation. The semantic search understands concepts across all languages.

How do I update the index? File changes are detected automatically. Use reindex_project for a full rebuild.


Documentation

| Guide | Description |
| --- | --- |
| Installation | Detailed installation for all clients |
| CLI | Standalone command-line interface |
| Configuration | Full config reference + indexing strategies |
| Tools | Complete tool documentation |
| Usage | Use cases & best practices |
| Troubleshooting | Common issues & solutions |
| Roadmap | Planned features |
| Changelog | Version history |
| Contributing | How to contribute |

For Developers

Architecture

┌─────────────────────────────────────────────────────────────┐
│                      MCP CLIENT                             │
│  (Claude Desktop, Claude Code, Cursor, Windsurf, etc.)      │
└─────────────────────────┬───────────────────────────────────┘
                          │ MCP Protocol (stdio)
                          ▼
┌─────────────────────────────────────────────────────────────┐
│                  SEARCH MCP SERVER                          │
│  ┌───────────┐ ┌───────────┐ ┌───────────┐ ┌─────────────┐ │
│  │create_    │ │search_code│ │search_by_ │ │get_index_   │ │
│  │index      │ │           │ │path       │ │status       │ │
│  └───────────┘ └───────────┘ └───────────┘ └─────────────┘ │
│  ┌───────────┐ ┌───────────┐ ┌───────────┐                 │
│  │reindex_   │ │reindex_   │ │delete_    │                 │
│  │project    │ │file       │ │index      │                 │
│  └───────────┘ └───────────┘ └───────────┘                 │
└─────────────────────────┬───────────────────────────────────┘
                          │
        ┌─────────────────┼─────────────────┐
        ▼                 ▼                 ▼
┌───────────────┐ ┌───────────────┐ ┌───────────────┐
│   Chunking    │ │   Embedding   │ │   LanceDB     │
│   Engine      │ │   (BGE)       │ │   (Local)     │
└───────────────┘ └───────────────┘ └───────────────┘

MCP Tools

| Tool | Description | Confirmation |
| --- | --- | --- |
| `create_index` | Create a search index for the current project | Yes |
| `search_code` | Semantic search for relevant code chunks | No |
| `search_docs` | Semantic search for documentation files | No |
| `search_by_path` | Find files by name/glob pattern | No |
| `get_index_status` | Show index statistics and paths | No |
| `get_config` | Get config file path and contents | No |
| `get_file_summary` | Extract symbols and complexity metrics from a file | No |
| `reindex_project` | Rebuild the entire index | Yes |
| `reindex_file` | Re-index a single file | No |
| `delete_index` | Remove the project index | Yes |
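
Under the MCP protocol, a client invokes these tools as JSON-RPC `tools/call` requests over stdio. A sketch of a `search_code` call follows; the `query` argument name is an assumption for illustration, and the tool's actual input schema may differ:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_code",
    "arguments": { "query": "authentication logic" }
  }
}
```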

Technical Details

| Property | Value |
| --- | --- |
| Embedding models | Code: `Xenova/bge-small-en-v1.5` (384d); Docs: `Xenova/bge-base-en-v1.5` (768d) |
| Code chunk size | ~1000 tokens |
| Doc chunk size | ~2000 tokens |
| Search latency | < 200ms |
| Storage | `~/.mcp/search/indexes/` (macOS/Linux) or `%USERPROFILE%\.mcp\search\indexes\` (Windows) |

GPU Acceleration

Search MCP automatically uses GPU acceleration when available for faster indexing:

| Platform | GPU support | Notes |
| --- | --- | --- |
| Windows | DirectML | Automatic GPU acceleration on all modern GPUs (NVIDIA, AMD, Intel) |
| macOS | CPU only | CoreML not available in Node.js bindings |
| Linux | CPU only | CUDA requires a separate package (not included) |

GPU Compatibility (Windows):

  • NVIDIA: GeForce GTX 1000+, RTX series, Quadro
  • AMD: RX 400+, Radeon Pro
  • Intel: Arc, UHD/Iris integrated graphics

GPU acceleration is automatic - no configuration needed. The system detects available hardware and selects the best option. Check get_index_status to see which compute device is being used.

Hybrid GPU Laptops (NVIDIA + Intel/AMD integrated):

On laptops with both discrete and integrated GPUs, Search MCP defaults to CPU to avoid DirectML selecting the wrong GPU. To enable GPU acceleration:

  1. Open Windows Settings → System → Display → Graphics
  2. Click Add an app and select your IDE (VS Code, Cursor, etc.) or terminal
  3. Click on the app → Options → Select High performance
  4. Set environment variable: FORCE_DML=1

This tells Windows to use your discrete GPU (NVIDIA/AMD) for that application.

For full technical documentation, see the developer guides listed under Documentation above.


Updating & Uninstalling

Updating

If using npx in your config (recommended): Updates are automatic - you always get the latest version.

If installed globally:

npm install -g @liraz-sbz/search-mcp

Uninstalling

1. Remove from your AI assistant:

  • Claude Code: claude mcp remove search
  • Other clients: Delete the search entry from your MCP config file

2. Uninstall the package (if installed globally):

npm uninstall -g @liraz-sbz/search-mcp

3. (Optional) Remove index data:

  • macOS/Linux: rm -rf ~/.mcp/search
  • Windows: rmdir /s /q %USERPROFILE%\.mcp\search

Troubleshooting

Common issues:

| Issue | Solution |
| --- | --- |
| "Index not found" | Say "Index this project" to create the index |
| MCP connection issues | Run `npx --yes @liraz-sbz/search-mcp@latest --setup` to reconfigure |
| Search results seem wrong | Run `reindex_project` to rebuild the index |
| Changes not detected | Run `reindex_file` on the affected file |

CLI commands:

npx @liraz-sbz/search-mcp index                # Create/update index
npx @liraz-sbz/search-mcp search "query"       # Search code
npx @liraz-sbz/search-mcp status               # Show index info
npx @liraz-sbz/search-mcp --setup              # Configure MCP clients

See the CLI guide for all commands and options.

Debug mode: Set DEBUG=1 or SEARCH_MCP_DEBUG=1 environment variable for verbose logging.

For all error codes and solutions, see the troubleshooting guide.


Privacy & License

Your code stays on your computer. Nothing is uploaded anywhere. No accounts, no API keys, no tracking.

MIT License - see the LICENSE file for details.


Getting Help

  • GitHub Issues - Report bugs or request features
  • Documentation - Full guides and references