thealexkul/lean-mcp

Lean MCP - Local Services Integration

Minimal MCP server for bridging AI assistants (Claude Desktop, LM Studio) to local API services.

Quick Start

Option 1: Docker (Recommended)

# Build
docker build -t lean-mcp .

# Run with stdio (for Claude Desktop)
docker run -it lean-mcp

# Run with SSE (for LM Studio)
docker run -p 8888:8888 -e MCP_TRANSPORT=sse lean-mcp

Option 2: Manual

# Install dependencies
pip install -r requirements.txt

# Run with stdio (Claude Desktop)
python server.py

# Run with SSE (LM Studio)  
MCP_TRANSPORT=sse python server.py

LLM Integration

Claude Desktop

Add to %APPDATA%\Claude\claude_desktop_config.json (Windows) or ~/Library/Application Support/Claude/claude_desktop_config.json (macOS):

{
  "mcpServers": {
    "lean-mcp": {
      "command": "python",
      "args": ["C:\\path\\to\\server.py"],
      "env": {
        "MODEL_SERVICE_URL": "http://localhost:8000"
      }
    }
  }
}

Or with Docker:

{
  "mcpServers": {
    "lean-mcp": {
      "command": "docker",
      "args": ["run", "-i", "lean-mcp"]
    }
  }
}

LM Studio

Configure in Settings → Developer → MCP Servers:

{
  "mcpServers": {
    "lean-mcp": {
      "url": "http://127.0.0.1:8888/sse"
    }
  }
}

Start server with:

MCP_TRANSPORT=sse python server.py

Available Tools

Model Service Tools

  1. get_available_models() - List available models from model service
  2. get_model_details(model_id) - Get details for specific model
  3. check_model_service_health() - Check model service status

iDRAC-Specific API Tools (Formatted JSON Strings)

  1. get_chassis() - Get chassis information (hardware, power state, status)
  2. get_system_info() - Get system details (model, CPU, memory, BIOS)
  3. get_thermal_info() - Get temperature and fan information
  4. get_power_info() - Get power supply and consumption data

Generic REST API Tools

  1. call_api_endpoint(endpoint, method) - Call any REST API endpoint
  2. call_api_with_body(endpoint, body, method) - Call API with JSON body
  3. check_api_health() - Check if API service is available

All tools return formatted strings optimized for LM Studio consumption.
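As a sketch of that formatted-string pattern, here is a hypothetical formatter in the style of get_thermal_info, turning a parsed JSON payload into a readable string. The payload shape and field names are illustrative assumptions, not the actual iDRAC/Redfish schema:

```python
# Hypothetical sketch: format a parsed thermal payload into a plain
# string, which LM Studio displays more reliably than a raw dict.
# Field names below are illustrative, not the real iDRAC schema.
def format_thermal_info(payload: dict) -> str:
    lines = ["Thermal status:"]
    for sensor in payload.get("temperatures", []):
        lines.append(f"  {sensor['name']}: {sensor['celsius']} C")
    for fan in payload.get("fans", []):
        lines.append(f"  {fan['name']}: {fan['rpm']} RPM")
    return "\n".join(lines)

sample = {
    "temperatures": [{"name": "CPU1", "celsius": 42}],
    "fans": [{"name": "Fan1", "rpm": 3600}],
}
print(format_thermal_info(sample))
```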

Adding New Tools

  1. Create a new file in tools/ (e.g., tools/database.py)
  2. Define tools with the registration function pattern:

For dictionary responses:

def register_tools(mcp_instance):
    @mcp_instance.tool()
    def your_tool_name(param: str) -> dict:
        """Tool description for AI"""
        # Your implementation
        return {"result": "data"}

For string responses (easier for LM Studio to display):

def register_tools(mcp_instance):
    @mcp_instance.tool()
    def your_tool_name(param: str) -> str:
        """Tool description for AI"""
        # Your implementation - return formatted string
        return f"Result: {data}"
  3. Import and register in server.py:

from tools import database
database.register_tools(mcp)

See tools/api_example.py for complete examples of REST API tools that return strings.
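To see the registration pattern end to end without running a real server, here is a self-contained sketch that stands a minimal stub in for the MCP instance. The stub class and the echo_upper tool are illustrative only; the real server applies the MCP SDK's decorator the same way:

```python
# Minimal stub standing in for the MCP instance, so the registration
# pattern above can be exercised standalone. Illustrative only.
class StubMCP:
    def __init__(self):
        self.tools = {}

    def tool(self):
        def decorator(fn):
            # Register the function under its own name, as the SDK does.
            self.tools[fn.__name__] = fn
            return fn
        return decorator

def register_tools(mcp_instance):
    @mcp_instance.tool()
    def echo_upper(text: str) -> str:
        """Return the input text upper-cased, as a formatted string."""
        return f"Result: {text.upper()}"

mcp = StubMCP()
register_tools(mcp)
print(mcp.tools["echo_upper"]("hello"))  # Result: HELLO
```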

Configuration

Environment variables:

Variable             Default                  Description
MCP_TRANSPORT        stdio                    Transport mode: stdio or sse
MCP_PORT             8888                     Port for SSE transport
MCP_HOST             0.0.0.0                  Host binding for SSE transport
MODEL_SERVICE_URL    http://localhost:8000    Model service API URL
API_BASE_URL         http://localhost:80      Base URL for generic REST API tools
IDRAC_BASE_URL       http://localhost:80      Base URL for iDRAC-specific API tools
API_TIMEOUT          30                       API request timeout (seconds)
LOG_LEVEL            INFO                     Logging level
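A minimal sketch of how the server might resolve these variables; the names and defaults mirror the table above, but the function itself is illustrative, not the actual server.py code:

```python
import os

def load_config(env=None):
    """Resolve settings from environment variables, falling back to the
    documented defaults. Illustrative sketch; server.py may differ."""
    env = os.environ if env is None else env
    return {
        "transport": env.get("MCP_TRANSPORT", "stdio"),
        "port": int(env.get("MCP_PORT", "8888")),
        "host": env.get("MCP_HOST", "0.0.0.0"),
        "model_service_url": env.get("MODEL_SERVICE_URL", "http://localhost:8000"),
        "api_base_url": env.get("API_BASE_URL", "http://localhost:80"),
        "api_timeout": int(env.get("API_TIMEOUT", "30")),
        "log_level": env.get("LOG_LEVEL", "INFO"),
    }

print(load_config({"MCP_TRANSPORT": "sse"})["transport"])  # sse
```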

Docker

Build

docker build -t lean-mcp .

Run Options

stdio mode (Claude Desktop):

docker run -it lean-mcp

SSE mode (LM Studio):

docker run -p 8888:8888 -e MCP_TRANSPORT=sse lean-mcp

With custom model service (accessing host services):

docker run -it \
  -e MODEL_SERVICE_URL=http://host.docker.internal:80 \
  lean-mcp

Note: Use host.docker.internal instead of localhost when accessing services on your host machine from inside Docker. Always include http:// prefix.

With host networking (access localhost services):

docker run --network=host lean-mcp

Project Structure

.
├── server.py           # Main MCP server
├── tools/
│   ├── __init__.py
│   └── models.py       # Example tools for model service
├── Dockerfile          # Container definition
├── requirements.txt    # Python dependencies
└── README.md          # This file

Troubleshooting

Claude Desktop: "Server exits unexpectedly"

  • Ensure you're using stdio transport (default)
  • Use full absolute path in config
  • Check Python is in PATH

LM Studio: "404 Not Found"

  • Set MCP_TRANSPORT=sse
  • Use endpoint /sse not /mcp
  • Verify server is running on correct port

Tools return errors

  • This is normal if target services aren't running
  • Tools handle errors gracefully
  • Configure MODEL_SERVICE_URL to point to your service
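The "handle errors gracefully" behavior can be sketched like this: a health-check tool catches connection failures and returns a readable string instead of raising, so the assistant can report the problem. The URL, /health endpoint path, and message wording are assumptions:

```python
import urllib.error
import urllib.request

def check_api_health(base_url="http://localhost:80", timeout=3):
    """Return a human-readable status string; never raise on network errors.
    Illustrative sketch of the graceful-error pattern, not the real tool."""
    try:
        with urllib.request.urlopen(f"{base_url}/health", timeout=timeout) as resp:
            return f"API healthy (HTTP {resp.status})"
    except (urllib.error.URLError, OSError) as exc:
        # Connection refused, timeout, DNS failure: report, don't crash.
        return f"API unavailable at {base_url}: {exc}"

# With nothing listening, this prints the error string instead of raising:
print(check_api_health("http://127.0.0.1:9", timeout=1))
```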

Development

# Create virtual environment
python -m venv venv
source venv/bin/activate  # Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt

# Run server
python server.py

License

MIT