Local MCP Hub
A portable MCP (Model Context Protocol) hub that runs locally on development machines while connecting to a remote Ollama server. Provides AI coding assistance with access to local code analysis and documentation tools.
Architecture
        Dev Machine                      Remote Server
┌───────────────────────────┐      ┌───────────────────┐
│  Local MCP Hub            │      │  Ollama Server    │
│  ├── Serena (Code)        │─────▶│  qwen2.5:latest   │
│  ├── Context7 (Docs)      │      │  Port 11434       │
│  └── OpenAI API (3002)    │      └───────────────────┘
└───────────────────────────┘
             ▲
             │
┌───────────────────────────┐
│  Continue Extension       │
│  VS Code                  │
└───────────────────────────┘
Features
- 🔧 Local Code Analysis: 18 Serena tools for semantic code understanding
- 📚 Documentation Search: 2 Context7 tools for up-to-date library docs
- 🌐 Remote AI Processing: Connects to a remote Ollama server for model inference
- 🔌 Continue Integration: OpenAI-compatible API for the VS Code Continue extension
- 📦 Auto-Installer: One-command setup that downloads and configures all dependencies
- 🖥️ Cross-Platform: Works on Mac, Windows, and Linux
- ⚡ Portable: Easy deployment across multiple development machines
Available Tools
Serena (18 tools)
list_dir, find_file, symbol_overview, find_symbol, get_symbol_definition,
list_symbols_in_file, find_references, replace_symbol_body, search_for_pattern,
read_file_content, get_workspace_overview, search_symbols_in_workspace,
get_class_hierarchy, find_implementations, get_function_calls,
analyze_dependencies, find_similar_code, extract_interfaces
Context7 (2 tools)
resolve-library-id, get-library-docs
Quick Start
Prerequisites
- Node.js (v18+)
- Python 3.8+
- Git
- Remote Ollama server running qwen2.5:latest
Installation
1. Clone the repository:

   git clone <repository-url>
   cd local-mcp-hub

2. Run the installer for your platform:

   Linux:   ./install.sh
   macOS:   ./install-mac.sh
   Windows: install.bat

3. Configure the Ollama server connection. Edit config.json and update the Ollama host to match your server:

   {
     "ollama": {
       "host": "http://YOUR_OLLAMA_SERVER:11434",
       "model": "qwen2.5:latest"
     }
   }

4. Start the hub:

   npm start

5. Configure the Continue extension in VS Code:

   models:
     - name: "Local MCP Hub + Qwen2.5"
       provider: openai
       model: "qwen2.5:latest"
       apiBase: "http://localhost:3002/v1"
       apiKey: "dummy-key"
Configuration
Environment Variables
- PORT: Override the default port (3002). Example: PORT=3003 npm start

Config File (config.json)
{
"ollama": {
"host": "http://10.0.0.24:11434",
"model": "qwen2.5:latest"
},
"hub": {
"port": 3002,
"log_level": "info",
"cors_origins": ["*"]
},
"mcps": {
"enabled": ["serena", "context7"]
}
}
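As a quick sanity check before starting the hub, the config file can be validated against the shape shown above. This is a minimal sketch: the field names come only from this README's example, and loadHubConfig is a hypothetical helper, not part of the hub's source.

```typescript
import * as fs from "fs";

// Shape of config.json as shown in this README; the hub's actual
// config may accept additional fields.
interface HubConfig {
  ollama: { host: string; model: string };
  hub: { port: number; log_level?: string; cors_origins?: string[] };
  mcps?: { enabled: string[] };
}

// Hypothetical helper: load config.json and fail fast if the
// fields the hub needs are missing or malformed.
function loadHubConfig(path: string): HubConfig {
  const config = JSON.parse(fs.readFileSync(path, "utf8"));
  if (!config?.ollama?.host || !config?.ollama?.model) {
    throw new Error(`${path}: missing ollama.host or ollama.model`);
  }
  if (typeof config?.hub?.port !== "number") {
    throw new Error(`${path}: hub.port must be a number`);
  }
  return config as HubConfig;
}
```

Running this once after editing config.json catches typos (for example, a missing ollama.host) before they surface as confusing connection errors at startup.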
Usage Examples
Ask Continue these questions to see the MCP tools in action:
- "Can you analyze the LocalMCPHub class in my codebase? Show me its methods."
- "What files are in my src directory and how is the project structured?"
- "How does the sendToOllama method work and what error handling does it have?"
- "Find documentation for the latest React hooks API" (uses Context7)
API Endpoints
- GET /health - Health check
- GET /v1/models - Available models
- POST /v1/chat/completions - OpenAI-compatible chat endpoint
- GET /v1/tools - List available MCP tools
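The chat endpoint can be smoke-tested without the Continue extension by assembling a request by hand. The payload below follows the standard OpenAI chat-completions format that the hub advertises; buildChatRequest is an illustrative helper, not part of the hub.

```typescript
// Illustrative helper: assemble a request for the hub's
// OpenAI-compatible endpoint (POST /v1/chat/completions).
function buildChatRequest(
  prompt: string,
  baseUrl: string = "http://localhost:3002"
): { url: string; init: { method: string; headers: Record<string, string>; body: string } } {
  return {
    url: `${baseUrl}/v1/chat/completions`,
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        // "dummy-key" matches the Continue config shown above.
        Authorization: "Bearer dummy-key",
      },
      body: JSON.stringify({
        model: "qwen2.5:latest",
        messages: [{ role: "user", content: prompt }],
      }),
    },
  };
}

// With the hub running:
//   const { url, init } = buildChatRequest("List the files in src/");
//   const res = await fetch(url, init);
//   const data = await res.json();
```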
Development
Project Structure
local-mcp-hub/
├── src/
│   └── hub.ts            # Main hub implementation
├── mcps/                 # Auto-downloaded MCPs
│   ├── serena/           # Code analysis toolkit
│   └── context7/         # Documentation search
├── install.sh            # Auto-installer
├── config.json           # Configuration
└── continue-config.yaml  # Continue extension template
Running in Development
npm run dev # Start with ts-node
npm run build # Build TypeScript
npm test # Run tests
Logs
Check local-mcp-hub.log for detailed logs, including:
- MCP tool usage
- Ollama communication
- Continue extension requests
Troubleshooting
Port Already in Use
PORT=3003 npm start
Continue Extension Not Connecting
- Ensure the hub is running: curl http://localhost:3002/health
- Check that the Continue config uses the correct port
- Reload the Continue extension after changing its config
MCP Tools Not Working
- Check that the mcps/ directory exists and contains serena and context7
- Run ./install.sh again to re-download them
- Verify the Python virtual environment exists: mcps/serena/.venv/
Ollama Connection Issues
- Update config.json with the correct Ollama server address:

  {
    "ollama": {
      "host": "http://YOUR_OLLAMA_SERVER:11434",
      "model": "qwen2.5:latest"
    }
  }

- Test the connection directly: curl http://your-server:11434/api/tags
- Ensure the qwen2.5:latest model is installed on the server
- Check that firewall settings allow access to port 11434
Contributing
- Fork the repository
- Create a feature branch
- Make changes
- Test on multiple platforms
- Submit a pull request
License
MIT License - see LICENSE file for details.