MCP Local AI Server (macOS) + Gradio UI
A local Model Context Protocol (MCP) server that connects Ollama-hosted LLMs to your macOS machine to perform secure filesystem operations, automation, and VSCode and Git tasks. Includes a Gradio UI to start/stop services and chat.
Features
- Filesystem tools: read_file, write_file, list_files (path-guarded)
- macOS automation: AppleScript, notifications, app control
- VSCode integration: open file, create project, install extensions
- Git operations: init, add, commit, status, clone
- Learning system: tracks successful task workflows for suggestions
- Gradio UI: Start/Stop All, health/status, chat, direct MCP tools
Requirements
- macOS 10.15+
- Python 3.9+
- Ollama installed and at least one model (e.g., qwen2.5:0.5b)
- (Optional) Visual Studio Code
Install
```shell
# Clone your repo
git clone <your-repo-url>
cd mcp-local-ai-server

# Python env
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
```
Start services (CLI)
```shell
# Start server (gunicorn)
gunicorn -w 2 -t 180 -b 127.0.0.1:5001 main:app > server.log 2>&1 & echo $! > gunicorn.pid

# Ensure Ollama is running
pgrep -x ollama >/dev/null || (ollama serve >/dev/null 2>&1 &)

# Health check
curl -s http://127.0.0.1:5001/api/mcp/health | cat
```
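Since gunicorn takes a moment to bind the port, scripts that start the server and immediately hit the health endpoint can race it. A minimal polling sketch (the `wait_for_health` helper is illustrative, not part of the server):

```python
import time
import urllib.error
import urllib.request


def wait_for_health(url, tries=10, delay=1.0):
    """Poll the health endpoint until it returns HTTP 200, or give up."""
    for _ in range(tries):
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                if resp.status == 200:
                    return True
        except (urllib.error.URLError, OSError):
            pass  # server not up yet; retry after a short delay
        time.sleep(delay)
    return False
```

Call it with the health URL from above, e.g. `wait_for_health("http://127.0.0.1:5001/api/mcp/health")`, before issuing any tool requests.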
Gradio UI
```shell
# launch on 127.0.0.1:7861
export GRADIO_SERVER_PORT=7861
python ui.py
```
Open http://127.0.0.1:7861 and:
- Set model (e.g., qwen2.5:0.5b) and Allowed Paths
- Click Start All
- Use Status/Health, List Directory (direct MCP), and Chat
Environment variables
- FLASK_SECRET_KEY: Flask secret key (recommended)
- ALLOWED_PATHS: Comma-separated list of directories the filesystem tools may access. Example: export ALLOWED_PATHS="/Volumes/externalmac,$HOME,$PWD"
- GRADIO_SERVER_PORT: Gradio UI port (default 7860)
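For illustration, the comma-separated ALLOWED_PATHS value can be split and normalized like this (a sketch; `parse_allowed_paths` is a hypothetical name, not necessarily how the server implements it):

```python
import os


def parse_allowed_paths(value):
    """Split a comma-separated ALLOWED_PATHS value into absolute paths,
    dropping empty entries and surrounding whitespace."""
    return [os.path.abspath(p.strip()) for p in value.split(",") if p.strip()]
```

Note that shell variables such as `$HOME` in the export example are expanded by the shell before the server sees the value.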
API Quickstart
```shell
# list tools
curl -s http://127.0.0.1:5001/api/mcp/tools | jq .

# list files in a directory (must be in ALLOWED_PATHS)
curl -s -X POST http://127.0.0.1:5001/api/mcp \
  -H 'Content-Type: application/json' \
  -d '{
    "jsonrpc":"2.0","id":"1","method":"tools/call",
    "params":{"name":"list_files","arguments":{"path":"/Volumes/externalmac"}}
  }' | jq .

# chat
curl -s -X POST http://127.0.0.1:5001/api/chat/start \
  -H 'Content-Type: application/json' \
  -d '{"prompt":"Say hello and list available tools","enable_learning":true}' | jq .
```
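The same tools/call request can be issued from Python. This sketch mirrors the JSON-RPC 2.0 payload in the curl example; the helper names (`build_tools_call`, `call_mcp_tool`) are illustrative, not part of the server's API:

```python
import json
import urllib.request


def build_tools_call(tool, arguments, req_id="1"):
    """Build a JSON-RPC 2.0 tools/call payload matching the curl examples."""
    return {
        "jsonrpc": "2.0",
        "id": req_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }


def call_mcp_tool(base_url, tool, arguments):
    """POST the payload to /api/mcp and return the decoded JSON response."""
    req = urllib.request.Request(
        base_url + "/api/mcp",
        data=json.dumps(build_tools_call(tool, arguments)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)
```

Example: `call_mcp_tool("http://127.0.0.1:5001", "list_files", {"path": "/Volumes/externalmac"})` against a running server.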
Security
- Filesystem access is restricted to ALLOWED_PATHS (whitelist)
- Potentially destructive actions should be reviewed
- Logs are written to server.log (do not commit)
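A whitelist check of this kind typically resolves symlinks and `..` segments before comparing against the allowed roots, so that paths like `/tmp/../etc/passwd` cannot escape the sandbox. A minimal sketch (illustrative; not the server's actual implementation):

```python
import os


def is_path_allowed(path, allowed_paths):
    """Return True if `path` resolves inside one of the whitelisted roots."""
    real = os.path.realpath(path)  # collapse symlinks and ".." segments
    for root in allowed_paths:
        root_real = os.path.realpath(root)
        # Match the root itself or anything strictly beneath it.
        if real == root_real or real.startswith(root_real + os.sep):
            return True
    return False
```

Comparing with a trailing separator matters: a naive `startswith("/tmp")` would wrongly admit `/tmpfoo`.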
Development
```shell
# stop services
kill $(cat gunicorn.pid) 2>/dev/null || true
pkill -x ollama 2>/dev/null || true

# logs
tail -n 120 server.log | cat
```
Packaging (optional)
```shell
pip install pyinstaller
pyinstaller --onefile --name mcp-local-server --hidden-import flask_cors main.py
```
License
MIT