thealexkul/lean-mcp
Lean MCP - Local Services Integration
Minimal MCP server for bridging AI assistants (Claude Desktop, LM Studio) to local API services.
Quick Start
Option 1: Docker (Recommended)
# Build
docker build -t lean-mcp .
# Run with stdio (for Claude Desktop)
docker run -it lean-mcp
# Run with SSE (for LM Studio)
docker run -p 8888:8888 -e MCP_TRANSPORT=sse lean-mcp
Option 2: Manual
# Install dependencies
pip install -r requirements.txt
# Run with stdio (Claude Desktop)
python server.py
# Run with SSE (LM Studio)
MCP_TRANSPORT=sse python server.py
LLM Integration
Claude Desktop
Add to %APPDATA%\Claude\claude_desktop_config.json (Windows) or ~/Library/Application Support/Claude/claude_desktop_config.json (macOS):
{
  "mcpServers": {
    "lean-mcp": {
      "command": "python",
      "args": ["C:\\path\\to\\server.py"],
      "env": {
        "MODEL_SERVICE_URL": "http://localhost:8000"
      }
    }
  }
}
Or with Docker:
{
  "mcpServers": {
    "lean-mcp": {
      "command": "docker",
      "args": ["run", "-i", "lean-mcp"]
    }
  }
}
LM Studio
Configure in Settings → Developer → MCP Servers:
{
  "mcpServers": {
    "lean-mcp": {
      "url": "http://127.0.0.1:8888/sse"
    }
  }
}
Start server with:
MCP_TRANSPORT=sse python server.py
Available Tools
Model Service Tools
- get_available_models() - List available models from model service
- get_model_details(model_id) - Get details for specific model
- check_model_service_health() - Check model service status
iDRAC/Specific API Tools (Formatted JSON Strings)
- get_chassis() - Get chassis information (hardware, power state, status)
- get_system_info() - Get system details (model, CPU, memory, BIOS)
- get_thermal_info() - Get temperature and fan information
- get_power_info() - Get power supply and consumption data
Generic REST API Tools
- call_api_endpoint(endpoint, method) - Call any REST API endpoint
- call_api_with_body(endpoint, body, method) - Call API with JSON body
- check_api_health() - Check if API service is available
All tools return formatted strings optimized for LM Studio consumption.
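For illustration, here is a minimal sketch of what a string-returning tool such as get_available_models could look like. The /models path, the response shape, and the use of requests are assumptions, not code from this repo (the real implementation lives in tools/models.py):

import os
import requests

MODEL_SERVICE_URL = os.environ.get("MODEL_SERVICE_URL", "http://localhost:8000")

def register_tools(mcp_instance):
    @mcp_instance.tool()
    def get_available_models() -> str:
        """List available models from the model service."""
        try:
            # Endpoint path and JSON shape are assumptions about your service
            response = requests.get(f"{MODEL_SERVICE_URL}/models", timeout=30)
            response.raise_for_status()
            models = response.json()
            return "Available models:\n" + "\n".join(f"- {m}" for m in models)
        except requests.RequestException as exc:
            # Hand the failure back as text so the assistant can read it
            return f"Model service error: {exc}"

Returning a plain string instead of a dict keeps the output directly readable in LM Studio's chat, which is the pattern the Adding New Tools section below recommends.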
Adding New Tools
- Create a new file in tools/ (e.g., tools/database.py)
- Define tools with the registration function pattern:
For dictionary responses:
def register_tools(mcp_instance):
    @mcp_instance.tool()
    def your_tool_name(param: str) -> dict:
        """Tool description for AI"""
        # Your implementation
        return {"result": "data"}
For string responses (easier for LM Studio to display):
def register_tools(mcp_instance):
    @mcp_instance.tool()
    def your_tool_name(param: str) -> str:
        """Tool description for AI"""
        # Your implementation - return a formatted string
        return f"Result: {data}"
- Import and register in server.py:
from tools import database
database.register_tools(mcp)
See tools/api_example.py for complete examples of REST API tools that return strings.
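Putting these steps together, a hypothetical tools/database.py might look like the following. The check_database tool, its /health endpoint, and the DATABASE_URL variable are illustrative inventions, not code from this repo:

import os
import requests

DATABASE_URL = os.environ.get("DATABASE_URL", "http://localhost:5000")

def register_tools(mcp_instance):
    @mcp_instance.tool()
    def check_database() -> str:
        """Check whether the database service is reachable."""
        try:
            response = requests.get(f"{DATABASE_URL}/health", timeout=5)
            return f"Database status: HTTP {response.status_code}"
        except requests.RequestException as exc:
            return f"Database unreachable: {exc}"

With that file in place, the two server.py lines above complete the registration.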
Configuration
Environment variables:
| Variable | Default | Description |
|---|---|---|
| MCP_TRANSPORT | stdio | Transport mode: stdio or sse |
| MCP_PORT | 8888 | Port for SSE transport |
| MCP_HOST | 0.0.0.0 | Host binding for SSE transport |
| MODEL_SERVICE_URL | http://localhost:8000 | Model service API URL |
| API_BASE_URL | http://localhost:80 | Base URL for generic REST API tools |
| IDRAC_BASE_URL | http://localhost:80 | Base URL for iDRAC/specific API tools |
| API_TIMEOUT | 30 | API request timeout (seconds) |
| LOG_LEVEL | INFO | Logging level |
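As a rough sketch, and assuming the server is built on FastMCP from the official mcp Python SDK (the actual server.py may be wired differently), these variables could be consumed like this:

import os
from mcp.server.fastmcp import FastMCP

# Defaults mirror the table above
mcp = FastMCP(
    "lean-mcp",
    host=os.environ.get("MCP_HOST", "0.0.0.0"),
    port=int(os.environ.get("MCP_PORT", "8888")),
)

# ... register tools here ...

if __name__ == "__main__":
    # stdio for Claude Desktop, sse for LM Studio
    mcp.run(transport=os.environ.get("MCP_TRANSPORT", "stdio"))

With that in place, MCP_TRANSPORT=sse python server.py serves the /sse endpoint that LM Studio expects.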
Docker
Build
docker build -t lean-mcp .
Run Options
stdio mode (Claude Desktop):
docker run -it lean-mcp
SSE mode (LM Studio):
docker run -p 8888:8888 -e MCP_TRANSPORT=sse lean-mcp
With custom model service (accessing host services):
docker run -it \
  -e MODEL_SERVICE_URL=http://host.docker.internal:80 \
  lean-mcp
Note: Use host.docker.internal instead of localhost when accessing services on your host machine from inside Docker, and always include the http:// prefix.
With host networking (access localhost services):
docker run --network=host lean-mcp
Project Structure
.
├── server.py # Main MCP server
├── tools/
│ ├── __init__.py
│ └── models.py # Example tools for model service
├── Dockerfile # Container definition
├── requirements.txt # Python dependencies
└── README.md # This file
Troubleshooting
Claude Desktop: "Server exits unexpectedly"
- Ensure you're using stdio transport (the default)
- Use the full absolute path to server.py in the config
- Check that Python is in your PATH
LM Studio: "404 Not Found"
- Set MCP_TRANSPORT=sse
- Use the /sse endpoint, not /mcp
- Verify the server is running on the correct port
Tools return errors
- This is normal if the target services aren't running
- Tools handle errors gracefully
- Configure MODEL_SERVICE_URL to point to your service (see the reachability check below)
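To rule out the MCP server itself, you can probe the target service directly. A quick check, assuming the service answers a plain GET at its base URL:

import os
import requests

url = os.environ.get("MODEL_SERVICE_URL", "http://localhost:8000")
try:
    response = requests.get(url, timeout=5)
    print(f"{url} -> HTTP {response.status_code}")
except requests.RequestException as exc:
    print(f"Service not reachable: {exc}")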
Development
# Create virtual environment
python -m venv venv
source venv/bin/activate # Windows: venv\Scripts\activate
# Install dependencies
pip install -r requirements.txt
# Run server
python server.py
License
MIT