llm-mcp

sandraschi/llm-mcp

The LLM MCP Server is a FastMCP 2.10-compliant server for managing local and cloud-based large language models (LLMs), with advanced features such as video generation and interactive chat.
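To give a sense of what a FastMCP 2.x server of this shape looks like, here is a minimal sketch. The tool body and its return data are placeholders for illustration, not the actual llm-mcp implementation.

```python
# Minimal FastMCP 2.x server sketch (hypothetical, not the real llm-mcp code).
from fastmcp import FastMCP

mcp = FastMCP("llm-mcp")

@mcp.tool()
def list_models() -> list[dict]:
    """List all available models from all providers."""
    # A real implementation would query the configured local and cloud providers.
    return [{"id": "example-model", "provider": "local", "loaded": False}]

if __name__ == "__main__":
    # Runs over stdio by default, the usual transport MCP clients use to launch servers.
    mcp.run()
```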

Tools

Functions exposed to the LLM to take actions (a client-side usage sketch follows the list below)

list_models: List all available models from all providers

get_model: Get details about a specific model

load_model: Load a model into memory

unload_model: Unload a model from memory

generate_text: Generate text using a loaded model
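A client-side sketch of calling these tools with FastMCP's Python Client. The server script path and the argument names (model_id, prompt) are assumptions and may differ from the server's actual tool schemas.

```python
# Hypothetical client-side usage; the script path and argument names
# (model_id, prompt) are assumptions, not the server's documented schema.
import asyncio
from fastmcp import Client

async def main():
    # Launch the server as a subprocess and talk to it over stdio.
    async with Client("llm_mcp_server.py") as client:
        models = await client.call_tool("list_models", {})
        print(models)

        # Load a model, generate text with it, then unload it.
        await client.call_tool("load_model", {"model_id": "example-model"})
        result = await client.call_tool(
            "generate_text",
            {"model_id": "example-model", "prompt": "Hello, world"},
        )
        print(result)
        await client.call_tool("unload_model", {"model_id": "example-model"})

asyncio.run(main())
```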

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources