ollama-mcp-server

paolodalprato/ollama-mcp-server

3.2


Ollama MCP Server is a self-contained Model Context Protocol (MCP) server for comprehensive management of a local Ollama installation, with zero external dependencies.

Tools

Functions the LLM can invoke to take actions

list_local_models

List installed models with details.
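The tool names suggest the server wraps Ollama's local REST API; a minimal sketch of what listing might look like, assuming Ollama's standard GET /api/tags endpoint on its default port 11434 (the function names here are hypothetical, not the project's actual code):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default listen address

def list_local_models(base_url: str = OLLAMA_URL) -> list:
    """Fetch installed models from Ollama's GET /api/tags endpoint.
    Each entry includes the model name, size, and modification date."""
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        return json.load(resp).get("models", [])

def format_model(entry: dict) -> str:
    """Render one model entry as 'name  (size in GB)' for display."""
    gb = entry.get("size", 0) / 1e9
    return f"{entry['name']}  ({gb:.1f} GB)"
```

Calling `list_local_models()` requires a running Ollama instance; `format_model` is a pure helper.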

local_llm_chat

Chat directly with local models.

download_model

Download models with progress tracking.

remove_model

Safely remove models from storage.

start_ollama_server

Start the Ollama server with a self-contained implementation.
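"Self-contained" here likely means spawning the `ollama serve` CLI command directly rather than relying on an external process manager. A sketch under that assumption (the function name is hypothetical):

```python
import shutil
import subprocess

def start_ollama(binary: str = "ollama"):
    """Launch `ollama serve` as a detached background process.
    Returns the Popen handle, or None if the binary is not on PATH."""
    path = shutil.which(binary)
    if path is None:
        return None  # Ollama is not installed or not on PATH
    return subprocess.Popen(
        [path, "serve"],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
```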

ollama_health_check

Comprehensive server health diagnostics.

system_resource_check

Hardware compatibility analysis.
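A zero-dependency resource check can be built from the standard library alone; a coarse sketch of the kind of snapshot such a tool might gather (the details of the project's actual checks are not documented here):

```python
import os
import platform
import shutil

def system_resource_check() -> dict:
    """Collect a coarse hardware snapshot: CPU count, OS and
    architecture, and free disk space (models can be several GB)."""
    disk = shutil.disk_usage("/")  # in practice, check the model store's volume
    return {
        "cpu_count": os.cpu_count(),
        "platform": platform.system(),
        "arch": platform.machine(),
        "disk_free_gb": round(disk.free / 1e9, 1),
    }
```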

suggest_models

AI-powered model recommendations based on user needs.

search_available_models

Search Ollama Hub by category.

check_download_progress

Monitor download progress with visual indicators.
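A "visual indicator" for download progress typically reduces to rendering completed/total bytes as a text bar; a small sketch of one such rendering (this is an illustration, not the server's actual output format):

```python
def progress_bar(completed: int, total: int, width: int = 20) -> str:
    """Render a textual progress bar like '[##########----------] 50%'."""
    if total <= 0:
        return "[" + "-" * width + "] ?%"  # total unknown yet
    frac = min(max(completed / total, 0.0), 1.0)
    filled = int(frac * width)
    pct = int(frac * 100)
    return "[" + "#" * filled + "-" * (width - filled) + f"] {pct}%"
```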

select_chat_model

Interactive model selection interface.

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources