hyzhak/ollama-mcp-server
The Ollama MCP Server connects Ollama's local LLM capabilities to the Model Context Protocol (MCP), giving MCP clients a single interface for managing and running models.
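For context, an MCP client typically launches a server like this from its configuration file. The entry below is a hypothetical sketch: the `command`, `args`, and module name are assumptions, since the exact install method for hyzhak/ollama-mcp-server is not stated here.

```json
{
  "mcpServers": {
    "ollama": {
      "command": "python",
      "args": ["-m", "ollama_mcp_server"]
    }
  }
}
```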
Tools
Functions exposed to the LLM to take actions
pull: Pull models from registries
run: Execute models with customizable prompts
chat_completion: OpenAI-compatible chat completion API
create: Create custom models from Modelfiles
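Each of the tools above is invoked over MCP's JSON-RPC 2.0 transport with a `tools/call` request. The sketch below builds such a request for the `chat_completion` tool; the argument names (`model`, `messages`) are assumptions modeled on the OpenAI chat API, not taken from this server's published schema.

```python
import json

def make_tool_call(tool_name, arguments, request_id=1):
    """Build an MCP JSON-RPC 2.0 tools/call request as a plain dict."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Hypothetical arguments, modeled on the OpenAI-compatible chat API.
request = make_tool_call(
    "chat_completion",
    {
        "model": "llama3",
        "messages": [{"role": "user", "content": "Hello!"}],
    },
)

print(json.dumps(request, indent=2))
```

The same helper would serve the other tools, e.g. `make_tool_call("pull", {"name": "llama3"})`, with whatever argument shape the server's tool schema actually declares.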
Prompts
Interactive templates invoked by user choice
No prompts
Resources
Contextual data attached and managed by the client