ollama-mcp

etnlbck/ollama-mcp

The Ollama MCP Server is a local server that lets MCP clients interact with locally installed Ollama models through the Model Context Protocol.
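
As a minimal sketch of how a client might talk to this server, the TypeScript snippet below uses the MCP client SDK (`@modelcontextprotocol/sdk`) to launch the server over stdio and list the tools it advertises. The launch command, entry point, and client name are assumptions for illustration, not taken from this repository.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Launch the server as a child process and speak MCP over stdio.
  // The command and entry point are assumptions; use whatever this
  // repository's build actually produces.
  const transport = new StdioClientTransport({
    command: "node",
    args: ["dist/index.js"],
  });

  const client = new Client(
    { name: "example-client", version: "0.1.0" },
    { capabilities: {} }
  );
  await client.connect(transport);

  // Ask the server which tools it advertises (ollama_list_models,
  // ollama_chat, ollama_generate, ollama_pull_model, ollama_delete_model).
  const { tools } = await client.listTools();
  console.log(tools.map((tool) => tool.name));

  await client.close();
}

main().catch(console.error);
```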

Tools

Functions exposed to the LLM so it can take actions

ollama_list_models

Lists all available Ollama models on your system.

ollama_chat

Chat with a model using conversation history (see the call sketch after this tool list).

ollama_generate

Generate a response from a single prompt.

ollama_pull_model

Download a model from the Ollama registry.

ollama_delete_model

Remove a model from your local installation.
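
For illustration, here is a hedged TypeScript sketch of calling the `ollama_chat` tool from an MCP client. The launch command and the argument shape (`model`, `messages`) are assumptions that mirror Ollama's own chat API; consult the tool's `inputSchema` reported by `listTools()` for the fields this server actually expects.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function chatExample() {
  // Assumed launch command; adjust to how this server is actually started.
  const transport = new StdioClientTransport({
    command: "node",
    args: ["dist/index.js"],
  });
  const client = new Client(
    { name: "chat-example", version: "0.1.0" },
    { capabilities: {} }
  );
  await client.connect(transport);

  // The argument shape (model, messages) is an assumption that mirrors
  // Ollama's chat API; check the tool's inputSchema from listTools()
  // for the field names this server actually expects.
  const result = await client.callTool({
    name: "ollama_chat",
    arguments: {
      model: "llama3.2",
      messages: [{ role: "user", content: "Why is the sky blue?" }],
    },
  });

  console.log(result.content);
  await client.close();
}

chatExample().catch(console.error);
```

The other tools (`ollama_generate`, `ollama_pull_model`, `ollama_delete_model`) are invoked the same way, differing only in the `name` and `arguments` values passed to `callTool`.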

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources