erdalgunes/llm-mcp-server

LLM MCP Server

MCP (Model Context Protocol) server wrapper for Simon Willison's LLM CLI tool.

Features

  • Send prompts to LLM models
  • Chat with LLM models
  • List available models
  • Install new models/providers
  • OpenAI API key integration

Setup

Local Development

Prerequisites: Node.js, plus Simon Willison's llm CLI on your PATH (the server wraps it; if needed, install it with pip install llm).

  1. Install dependencies:
npm install
  2. Set your OpenAI API key:
export OPENAI_API_KEY=your-api-key
  3. Run the server:
npm start

Deploy to Render

  1. Fork/push this repository to GitHub
  2. Connect your GitHub repo to Render
  3. Add your OPENAI_API_KEY in Render's environment variables
  4. Deploy using the included render.yaml configuration
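The repository's included render.yaml is authoritative for this step. For orientation only, a Render blueprint for a Node web service typically looks roughly like the sketch below; the service name and field values here are illustrative assumptions, not the actual file's contents:

```yaml
services:
  - type: web
    name: llm-mcp-server      # illustrative name, not necessarily the repo's
    runtime: node
    buildCommand: npm install
    startCommand: npm start
    envVars:
      - key: OPENAI_API_KEY
        sync: false           # value is entered in the Render dashboard, not committed
```

Marking the key with sync: false keeps the secret out of the repository and prompts you to supply it in Render's environment-variable settings (step 3 above).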

MCP Client Configuration

Add to your Claude Desktop configuration (claude_desktop_config.json) or your MCP client's equivalent configuration file:

{
  "mcpServers": {
    "llm-cli": {
      "command": "node",
      "args": ["/path/to/llm-mcp/index.js"],
      "env": {
        "OPENAI_API_KEY": "your-api-key"
      }
    }
  }
}

Available Tools

  • prompt: Send a single prompt to an LLM
  • chat: Interactive chat with an LLM
  • list_models: List available models
  • install_model: Install new models or providers
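Since the server wraps the llm CLI, each tool call ultimately becomes a CLI invocation. As a minimal sketch, the prompt tool's inputs could be mapped onto llm's real -m (model) and -s (system prompt) flags like this; the helper name and input shape are hypothetical, and the actual index.js may build its command differently:

```javascript
// Hypothetical helper: translate `prompt` tool inputs into llm CLI arguments.
// `-m` and `-s` are real flags of Simon Willison's llm CLI; the mapping itself
// is an illustrative assumption, not the server's actual implementation.
function buildLlmArgs({ prompt, model, system }) {
  const args = [];
  if (model) args.push("-m", model);   // e.g. a model ID from list_models
  if (system) args.push("-s", system); // optional system prompt
  args.push(prompt);                   // positional prompt text comes last
  return args;
}
```

The server would then spawn the CLI with something like child_process.execFile("llm", buildLlmArgs(input)) and return the captured stdout as the tool result.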

Environment Variables

  • OPENAI_API_KEY: Your OpenAI API key for GPT models
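A wrapper like this would typically validate the key at startup rather than failing mid-request. A minimal sketch, with a hypothetical helper name (the actual index.js may handle this differently):

```javascript
// Hypothetical startup check: fail fast if the OpenAI key is missing,
// since OpenAI-backed models in the llm CLI require it.
function resolveOpenAIKey(env = process.env) {
  const key = env.OPENAI_API_KEY;
  if (!key) {
    throw new Error("OPENAI_API_KEY is not set; OpenAI models will be unavailable");
  }
  return key;
}
```

When launched via an MCP client, the key can also be injected through the client configuration's env block, as shown in the configuration example above.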