mcp-ollama
A Model Context Protocol (MCP) server for integrating Ollama with Claude Desktop or other MCP clients.
MCP Ollama is a server that exposes a locally running Ollama instance to Claude Desktop or any other MCP client. It requires Python 3.10 or higher and the Ollama application installed and running, with at least one model already pulled (for example, llama2). Once Claude Desktop is configured to launch the server over MCP, it can interact seamlessly with your local models. For development, clone the repository and sync its dependencies with uv; MCP Inspector can then be used to verify that the server is functioning correctly. The project is licensed under MIT, so it is open for modification and distribution.
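Configuring Claude Desktop generally means adding an entry to its claude_desktop_config.json file. The exact command and arguments depend on how you installed the server, so the entry below (launching a package named mcp-ollama via uvx) is an illustrative sketch rather than the canonical configuration:

```json
{
  "mcpServers": {
    "ollama": {
      "command": "uvx",
      "args": ["mcp-ollama"]
    }
  }
}
```

On macOS this file is typically found at ~/Library/Application Support/Claude/claude_desktop_config.json; restart Claude Desktop after editing it so the new server is picked up.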
Features
- list_models - List all downloaded Ollama models
- show_model - Get detailed information about a specific model
- ask_model - Ask a question to a specified model
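Over the wire, an MCP client invokes tools like these through the protocol's standard tools/call request. The argument names below (model, question) are assumptions about this server's tool schema, shown only to illustrate the shape of a call:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "ask_model",
    "arguments": {
      "model": "llama2",
      "question": "Summarize the Model Context Protocol in one sentence."
    }
  }
}
```

Clients such as Claude Desktop construct these requests automatically; MCP Inspector lets you issue them by hand, which is useful for confirming each tool responds as expected.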