erdalgunes/llm-mcp-server
LLM MCP Server
MCP (Model Context Protocol) server wrapper for Simon Willison's LLM CLI tool.
Features
- Send prompts to LLM models
- Chat with LLM models
- List available models
- Install new models/providers
- OpenAI API key integration
Setup
Local Development
- Install dependencies: `npm install`
- Set your OpenAI API key: `export OPENAI_API_KEY=your-api-key`
- Run the server: `npm start`
Deploy to Render
- Fork/push this repository to GitHub
- Connect your GitHub repo to Render
- Add your `OPENAI_API_KEY` in Render's environment variables
- Deploy using the included `render.yaml` configuration
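The repository ships its own `render.yaml`; as a rough illustration of what a Node web service blueprint on Render looks like (the values below are assumptions, not the actual contents of the included file):

```yaml
services:
  - type: web
    name: llm-mcp-server        # hypothetical service name
    env: node
    buildCommand: npm install
    startCommand: npm start
    envVars:
      - key: OPENAI_API_KEY
        sync: false             # value is set in the Render dashboard, not committed
```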
MCP Client Configuration
Add to your Claude Desktop or other MCP client configuration:
```json
{
  "mcpServers": {
    "llm-cli": {
      "command": "node",
      "args": ["/path/to/llm-mcp/index.js"],
      "env": {
        "OPENAI_API_KEY": "your-api-key"
      }
    }
  }
}
```
Available Tools
- `prompt`: Send a single prompt to an LLM
- `chat`: Interactive chat with an LLM
- `list_models`: List available models
- `install_model`: Install new models or providers
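From an MCP client's perspective, each tool is invoked with a JSON-RPC `tools/call` request. A sketch of calling the `prompt` tool (the exact argument names this server expects are assumptions):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "prompt",
    "arguments": {
      "prompt": "Summarize MCP in one sentence",
      "model": "gpt-4o-mini"
    }
  }
}
```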
Environment Variables
- `OPENAI_API_KEY`: Your OpenAI API key for GPT models