mcp-server-litellm

itsDarianNgo/mcp-server-litellm



The LiteLLM MCP Server provides text completions with OpenAI models via LiteLLM.

The LiteLLM MCP Server gives developers and businesses a practical way to use OpenAI's language models for text-completion tasks. By integrating LiteLLM, it handles text-generation requests efficiently, making it a good fit for applications that need natural-language processing. The server is built on the Model Context Protocol (MCP), which standardizes model interactions and so improves compatibility and ease of use. It suits a wide range of applications, from chatbots and virtual assistants to content generation and data-analysis tools; installation is straightforward, and the server is designed to integrate cleanly into existing systems.

Features

  • Integration with LiteLLM for efficient text completion
  • Utilizes OpenAI models for high-quality language processing
  • Built on the Model Context Protocol for standardized interactions
  • Easy installation and integration into existing systems
  • Suitable for a wide range of applications, including chatbots and content generation
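The completion requests the server forwards follow the OpenAI-style chat-messages shape, which LiteLLM accepts and routes to the configured provider. A minimal sketch of assembling that payload (the helper name and model default are illustrative, not taken from this repository):

```python
def build_completion_request(prompt: str, model: str = "gpt-4o-mini") -> dict:
    """Assemble an OpenAI-style chat-completion payload.

    LiteLLM accepts this same messages format; the model name here
    is only an example, not this server's default.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

request = build_completion_request("Summarize MCP in one sentence.")
```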

Usage

Usage with local integration (stdio)

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("litellm")    # tools are registered via the @mcp.tool() decorator
mcp.run(transport="stdio")  # serve over stdin/stdout
```
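Under the stdio transport, client and server exchange JSON-RPC 2.0 messages over stdin/stdout, one message per line. A sketch of a `tools/call` request as a client would serialize it (the tool name `complete` is hypothetical):

```python
import json

# JSON-RPC 2.0 request invoking a (hypothetical) "complete" tool
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "complete",
        "arguments": {"prompt": "Hello, world"},
    },
}

# Serialized form written to the server's stdin
wire = json.dumps(request)
```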

Usage with local integration (IDE plugin)

```json
{
  "mcpServers": {
    "litellm": {
      "command": "python",
      "args": ["server.py"]
    }
  }
}
```

Usage with remote integration (SSE)

```python
from mcp.server.fastmcp import FastMCP

# host and port are FastMCP settings; run() only selects the transport
mcp = FastMCP("litellm", host="0.0.0.0", port=8000)
mcp.run(transport="sse")
```

Usage with remote integration (streamable HTTP)

```yaml
paths:
  /mcp:
    post:
      x-ms-agentic-protocol: mcp-streamable-1.0  # Copilot Studio integration
```
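A fuller sketch of that path item, with the surrounding OpenAPI fields a Copilot Studio connector expects (the summary and operation name are illustrative):

```yaml
paths:
  /mcp:
    post:
      summary: LiteLLM MCP server endpoint
      x-ms-agentic-protocol: mcp-streamable-1.0  # marks this operation as an MCP streamable endpoint
      operationId: InvokeMCP
      responses:
        '200':
          description: Success
```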

Usage with platform ecosystem integration (GitHub)

```json
{
  "command": "docker",
  "args": ["run", "-e", "GITHUB_PERSONAL_ACCESS_TOKEN", "ghcr.io/github/github-mcp-server"]
}
```

Usage with development frameworks (FastMCP)

```python
from mcp.server.fastmcp import FastMCP

app = FastMCP("demo")

@app.tool()
async def query(text: str) -> str:
    """Minimal tool stub; replace the body with real logic."""
    return text
```