peteretelej/largefile
Largefile MCP Server
MCP server that helps your AI assistant work with large files that exceed context limits.
This MCP server enables AI assistants to navigate, search, and edit files of any size without loading entire content into memory. It provides targeted access to specific lines, patterns, and sections while maintaining file integrity using research-backed search/replace editing instead of error-prone line-based operations. Perfect for working with large codebases, generated files, logs, and datasets that would otherwise be inaccessible due to context window limitations.
MCP Tools
The server provides 4 core tools that work together for progressive file exploration:
get_overview
- Get file structure with Tree-sitter semantic analysis, line counts, and suggested search patterns

search_content
- Find patterns with fuzzy matching, context lines, and semantic information

read_content
- Read targeted content by line number or pattern with semantic chunking (complete functions/classes)

edit_content
- Primary editing via search/replace blocks with automatic backups and preview mode
Quick Start
Prerequisite: Install uv (an extremely fast Python package manager), which provides the uvx command.
Add to your MCP configuration:
{
  "mcpServers": {
    "largefile": {
      "command": "uvx",
      "args": ["--from", "largefile", "largefile-mcp"]
    }
  }
}
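If you manage MCP configuration files programmatically, the entry above can be generated with a short script. This is a sketch; the helper name and the assumption that you merge into an existing config dict are illustrative, not part of the server itself.

```python
import json

def add_largefile_server(config: dict) -> dict:
    """Add the largefile server entry to an MCP configuration dict."""
    servers = config.setdefault("mcpServers", {})
    servers["largefile"] = {
        "command": "uvx",
        "args": ["--from", "largefile", "largefile-mcp"],
    }
    return config

# Produces exactly the JSON block shown above.
print(json.dumps(add_largefile_server({}), indent=2))
```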
Usage Examples
Analyzing Large Code Files
AI Question: "Can you analyze this large Django models file and tell me about the class structure and any potential issues? It's a large file so use largefile."
AI Assistant workflow:
- Get file overview to understand structure
- Search for classes and their methods
- Look for code issues like TODOs or long functions
# AI gets file structure
overview = get_overview("/path/to/django-models.py")
# Returns: 2,847 lines, 15 classes, semantic outline with Tree-sitter
# AI searches for all class definitions
classes = search_content("/path/to/django-models.py", "class ", max_results=20)
# Returns: Model classes with line numbers and context
# AI examines specific class implementation
model_code = read_content("/path/to/django-models.py", "class User", mode="semantic")
# Returns: Complete class definition with all methods
Working with Documentation
AI Question: "Find all the installation methods mentioned in this README file and update the pip install to use uv instead."
AI Assistant workflow:
- Search for installation patterns
- Read the installation section
- Replace pip commands with uv equivalents
# AI finds installation instructions
install_sections = search_content("/path/to/readme.md", "install", fuzzy=True, context_lines=3)
# AI reads the installation section
install_content = read_content("/path/to/readme.md", "## Installation", mode="semantic")
# AI replaces pip with uv
edit_result = edit_content(
"/path/to/readme.md",
search_text="pip install anthropic",
replace_text="uv add anthropic",
preview=True
)
Debugging Large Log Files
AI Question: "Check this production log file for any critical errors in the last few thousand lines and show me the context around them. Use largefile mcp."
AI Assistant workflow:
- Get log file overview
- Search for error patterns
- Read context around critical issues
# AI gets log file overview
overview = get_overview("/path/to/production.log")
# Returns: 150,000 lines, 2.1GB file size
# AI searches for critical errors
errors = search_content("/path/to/production.log", "CRITICAL|ERROR", fuzzy=True, max_results=10)
# AI examines context around each error
for error in errors:
context = read_content("/path/to/production.log", error.line_number, mode="lines")
# Shows surrounding log entries for debugging
Refactoring Code
AI Question: "I need to rename the function process_data to transform_data throughout this large codebase file. Can you help me do this safely?"
AI Assistant workflow:
- Find all occurrences of the function
- Preview changes to ensure accuracy
- Apply changes with automatic backup
# AI finds all usages
usages = search_content("/path/to/codebase.py", "process_data", fuzzy=False, max_results=50)
# AI previews the changes
preview = edit_content(
"/path/to/codebase.py",
search_text="process_data",
replace_text="transform_data",
preview=True
)
# AI applies changes after confirmation
result = edit_content(
"/path/to/codebase.py",
search_text="process_data",
replace_text="transform_data",
preview=False
)
# Creates automatic backup before changes
Exploring API Documentation
AI Question: "What are all the available methods in this large API documentation file and can you show me examples of authentication?"
AI Assistant workflow:
- Get document structure overview
- Search for method definitions and auth patterns
- Extract relevant code examples
# AI analyzes document structure
overview = get_overview("/path/to/api-docs.md")
# Returns: Section outline, headings, suggested search patterns
# AI finds API methods
methods = search_content("/path/to/api-docs.md", "###", max_results=30)
# Returns: All method headings with context
# AI searches for authentication examples
auth_examples = search_content("/path/to/api-docs.md", "auth", fuzzy=True, context_lines=5)
# AI reads complete authentication section
auth_section = read_content("/path/to/api-docs.md", "## Authentication", mode="semantic")
File Size Handling
- Small files (<50MB): Memory loading with Tree-sitter AST caching
- Medium files (50-500MB): Memory-mapped access
- Large files (>500MB): Streaming processing
- Long lines (>1000 chars): Automatic truncation for display
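The tiers above amount to size-based dispatch. A minimal sketch of how such a decision could look (the thresholds mirror the defaults above; the function name is hypothetical and this is not the server's actual implementation):

```python
import os

MEMORY_THRESHOLD_MB = 50   # below this: load file into memory
MMAP_THRESHOLD_MB = 500    # below this: memory-map; at or above: stream

def choose_strategy(path: str) -> str:
    """Pick a file-access strategy based on size, mirroring the tiers above."""
    size_mb = os.path.getsize(path) / (1024 * 1024)
    if size_mb < MEMORY_THRESHOLD_MB:
        return "memory"
    if size_mb < MMAP_THRESHOLD_MB:
        return "mmap"
    return "stream"
```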
Supported Languages
Tree-sitter semantic analysis for:
- Python (.py)
- JavaScript/JSX (.js, .jsx)
- TypeScript/TSX (.ts, .tsx)
- Rust (.rs)
- Go (.go)
Files without Tree-sitter support use text-based analysis with graceful degradation.
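Extension-based dispatch with a plain-text fallback could look like this (a sketch: the mapping mirrors the list above, but the function name and return values are assumptions, not the server's API):

```python
from pathlib import Path

# Extensions with Tree-sitter semantic analysis, per the list above.
TREE_SITTER_LANGUAGES = {
    ".py": "python",
    ".js": "javascript", ".jsx": "javascript",
    ".ts": "typescript", ".tsx": "typescript",
    ".rs": "rust",
    ".go": "go",
}

def analysis_mode(path: str) -> str:
    """Return the semantic language, or 'text' for graceful degradation."""
    return TREE_SITTER_LANGUAGES.get(Path(path).suffix.lower(), "text")
```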
Configuration
Configure via environment variables:
# File processing thresholds
LARGEFILE_MEMORY_THRESHOLD_MB=50 # Memory loading limit
LARGEFILE_MMAP_THRESHOLD_MB=500 # Memory mapping limit
# Search settings
LARGEFILE_FUZZY_THRESHOLD=0.8 # Fuzzy match sensitivity
LARGEFILE_MAX_SEARCH_RESULTS=20 # Result limit
LARGEFILE_CONTEXT_LINES=2 # Context window
# Performance
LARGEFILE_ENABLE_TREE_SITTER=true # Semantic features
LARGEFILE_BACKUP_DIR=".largefile_backups" # Backup location
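Environment-variable settings like these are conventionally read with a default when unset. A sketch of that pattern (the helper name is hypothetical; only the variable names and defaults come from the table above):

```python
import os

def env_float(name: str, default: float) -> float:
    """Read a numeric setting from the environment, falling back to a default."""
    raw = os.environ.get(name)
    return float(raw) if raw is not None else default

fuzzy_threshold = env_float("LARGEFILE_FUZZY_THRESHOLD", 0.8)
memory_threshold_mb = env_float("LARGEFILE_MEMORY_THRESHOLD_MB", 50)
```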
Key Features
- Search/replace primary: Eliminates LLM line number errors
- Fuzzy matching: Handles whitespace and formatting variations
- Atomic operations: File integrity with automatic backups
- Semantic awareness: Tree-sitter integration for code structure
- Memory efficient: Handles files of any size without context limits
- Error recovery: Graceful degradation with clear error messages
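The "atomic operations with automatic backups" feature follows a well-known pattern: copy the original aside, write the new content to a temporary file, then rename it over the original so readers never observe a partial write. A generic sketch of that pattern (not the server's actual code; the function name and backup layout are assumptions):

```python
import os
import shutil
import tempfile

def atomic_replace(path: str, new_text: str,
                   backup_dir: str = ".largefile_backups") -> None:
    """Back up the original file, then atomically swap in the new content."""
    os.makedirs(backup_dir, exist_ok=True)
    shutil.copy2(path, os.path.join(backup_dir, os.path.basename(path)))
    # Write to a temp file in the same directory so the rename stays atomic.
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    with os.fdopen(fd, "w") as f:
        f.write(new_text)
    os.replace(tmp, path)  # atomic on POSIX
```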
Documentation
- Detailed tool documentation
- Environment variables and tuning
- Real-world usage examples and workflows
- Architecture and implementation details