# Local MCP Server
A universal Model Context Protocol (MCP) server that automatically discovers and exposes local tools to AI assistants through a modern web interface and pluggable host adapter system.
## Features

- Directory-Based Auto-Discovery: Automatically detects tools organized in individual folders under `tools/`
- Universal Host Support: Works with Claude Desktop, Generic MCP clients, and Google Gemini CLI through pluggable adapters
- Intelligent Discovery Pipeline: Auto-generates configurations for new tools with dual-config system
- Visual Management: Modern React-based web interface for tool configuration and monitoring
- Host Adapter Architecture: Pluggable system supporting different MCP communication protocols (a rough sketch of the idea follows this list)
- Dependency Isolation: Each tool gets its own virtual environment with automatic dependency management
- Secure Execution: Sandboxed script execution with timeout protection and structured result handling
- Real-time Monitoring: Live server status, execution monitoring, and configuration management
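
The adapter layer itself isn't documented in detail in this README, so purely as an illustration: each host integration typically implements a small interface that maps the server's internal tool metadata and results into the shapes its host expects. The class and method names below are hypothetical, not the project's actual API.

```python
from abc import ABC, abstractmethod
from typing import Any, Dict


class HostAdapter(ABC):
    """Hypothetical base class for host adapters (illustrative only)."""

    # Identifier used to select the adapter, e.g. --host=claude-desktop
    name: str = "generic"

    @abstractmethod
    def format_tool_schema(self, tool_config: Dict[str, Any]) -> Dict[str, Any]:
        """Translate an internal tool config into the host's tool schema."""

    @abstractmethod
    def format_result(self, output: str) -> Dict[str, Any]:
        """Wrap a script's output in the structure the host expects."""


class ClaudeDesktopAdapter(HostAdapter):
    name = "claude-desktop"

    def format_tool_schema(self, tool_config: Dict[str, Any]) -> Dict[str, Any]:
        # MCP tools are described by a name, a description, and a JSON Schema
        # for their input parameters.
        return {
            "name": tool_config["name"],
            "description": tool_config.get("description", ""),
            "inputSchema": tool_config.get("input_schema", {"type": "object"}),
        }

    def format_result(self, output: str) -> Dict[str, Any]:
        # MCP tool results are returned as a list of content blocks.
        return {"content": [{"type": "text", "text": output}]}
```

In this picture, passing `--host=claude-desktop` or `--host=generic` at startup would simply select which adapter the server uses.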
## Quick Start

### Prerequisites

- Python 3.8+
- Node.js 16+ (for web interface)
- `uv` package manager (installation guide)
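
If `uv` is not installed yet, `pip install uv` works on most systems; Astral also provides a standalone installer.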
### Installation

1. Clone the repository

   ```bash
   git clone https://github.com/yourusername/local-mcp-server.git
   cd local-mcp-server
   ```

2. Start the development environment

   ```bash
   # This will set up both server and web interface
   ./scripts/setup.sh
   ```

3. Add your first tool

   - Create a folder in `tools/` with your tool name (e.g., `tools/my-tool/`)
   - Add your script as `run.py`, `run.sh`, or `run` (entry point); a minimal example is sketched below
   - Run `python server/discover_tools.py` to auto-generate configuration
   - Open http://localhost:3000 to configure it via the web interface
   - The tool will automatically appear in your MCP-compatible AI assistant
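
The exact argument-passing and result conventions between the server and a tool's entry point are defined by the discovery and execution pipeline rather than spelled out here. Purely as a hypothetical sketch, a `tools/my-tool/run.py` that takes a command-line argument and prints a JSON result could look like this:

```python
#!/usr/bin/env python3
"""Hypothetical tools/my-tool/run.py entry point (illustrative only).

This sketch simply assumes command-line arguments in and JSON on stdout;
the real contract is defined by the server's execution pipeline.
"""
import json
import sys


def main() -> int:
    # Treat the first argument as the text to process.
    text = sys.argv[1] if len(sys.argv) > 1 else ""
    result = {
        "input": text,
        "length": len(text),
        "uppercase": text.upper(),
    }
    # Emit a structured result on stdout for the server to capture.
    print(json.dumps(result))
    return 0


if __name__ == "__main__":
    sys.exit(main())
```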
### Configure with Your MCP Host

#### Claude Desktop (Default)

```json
{
  "mcpServers": {
    "local-tools": {
      "command": "/path/to/local-mcp-server/server/start_server.sh",
      "args": ["--host=claude-desktop"],
      "cwd": "/path/to/local-mcp-server/server"
    }
  }
}
```
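
This snippet goes in Claude Desktop's `claude_desktop_config.json` (on macOS, `~/Library/Application Support/Claude/claude_desktop_config.json`); restart Claude Desktop after editing it so the new server is picked up.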
#### Generic MCP Client

```bash
# Start server for any MCP-compatible client
./server/start_server.sh --host=generic
```
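
For a quick smoke test without a full AI assistant, any MCP stdio client can launch the server and list its tools. The sketch below assumes the official `mcp` Python SDK is installed (`pip install mcp`); the tool names it prints depend on what you have under `tools/`.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the server the same way the generic-host command above does.
server_params = StdioServerParameters(
    command="/path/to/local-mcp-server/server/start_server.sh",
    args=["--host=generic"],
)


async def main() -> None:
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # List whatever tools the server discovered under tools/.
            tools = await session.list_tools()
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")
            # To invoke one, use session.call_tool(name, arguments={...})
            # with a name taken from the listing above.


asyncio.run(main())
```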
#### Google Gemini CLI

```bash
# Start server for Google Gemini CLI
./server/start_server.sh --host=google-gemini-cli
```
See the `docs/` and `examples/` directories for detailed configuration instructions and supported features.
## Project Structure

```
local-mcp-server/
├── server/                     # MCP server core
│   ├── src/local_mcp/          # Main server code
│   │   ├── adapters/           # Host adapter system
│   │   ├── config.py           # Configuration management
│   │   ├── discovery.py        # Tool discovery system
│   │   └── server.py           # Main MCP server
│   ├── config/                 # Server configurations
│   │   ├── tools/              # Individual tool configs
│   │   ├── tools.json          # Compiled tool config
│   │   └── config_templates/   # Host-specific templates
│   ├── discover_tools.py       # Discovery tool utility
│   ├── build_tools.py          # Config compilation utility
│   └── start_server.sh         # Server startup script
├── web-interface/              # Web management interface
│   ├── backend/                # FastAPI backend
│   ├── frontend/               # React frontend
│   └── start_dev.sh            # Development server
├── tools/                      # Directory-based tools (each in own folder)
│   ├── demo-features/          # Sample: Advanced features showcase
│   ├── file-ops/               # Sample: File operations
│   ├── http-client/            # Sample: HTTP utilities
│   ├── system-info/            # Sample: System information
│   └── text-utils/             # Sample: Text processing
├── docs/                       # Documentation
└── examples/                   # Host-specific configuration examples
```
## Sample Tools
The project includes several sample tools to help you get started:
- System Info: Get system information and metrics
- File Operations: File reading, writing, and listing utilities
- Text Processing: Text manipulation and analysis tools
- HTTP Client: Make HTTP requests and API calls
- Demo Features: Showcase advanced configuration features
## Documentation
## Contributing

We welcome contributions! Please see our contributing guidelines for details.
## License

This project is licensed under the MIT License; see the LICENSE file for details.