ollama-mcp


An MCP (Model Context Protocol) server for Ollama that enables seamless integration between Ollama's local LLM models and MCP-compatible applications like Claude Desktop.

The Ollama MCP Server integrates Ollama's local LLM models with applications that support the Model Context Protocol (MCP). Through the server, clients can list available models, pull new models, chat with models via Ollama's chat API, and retrieve detailed model information. The server manages ports automatically and can be configured with environment variables. It runs on Node.js (installed via npm) and requires Ollama to be installed and running locally. Installation is straightforward, and the server can be added to any MCP-compatible application by editing that application's settings file.
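For Claude Desktop, registering an MCP server means adding an entry to its JSON settings file. A sketch of what such an entry might look like follows; the command, arguments, and environment-variable name here are assumptions for illustration (`OLLAMA_HOST` is Ollama's own endpoint variable), not the package's documented configuration:

```json
{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": ["ollama-mcp"],
      "env": {
        "OLLAMA_HOST": "http://localhost:11434"
      }
    }
  }
}
```

After editing the settings file, the host application typically needs a restart before the new server's tools become available.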

Features

  • List available Ollama models
  • Pull new models from Ollama
  • Chat with models using Ollama's chat API
  • Get detailed model information
  • Automatic port management
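Under the hood, the chat feature maps onto Ollama's REST chat endpoint (`POST /api/chat`). A minimal Python sketch of that call against a locally running Ollama instance; the payload shape follows Ollama's chat API, while the helper function names and the model tag are illustrative assumptions:

```python
import json
import urllib.request

# Ollama's default local endpoint
OLLAMA_URL = "http://localhost:11434"


def build_chat_request(model, messages, stream=False):
    """Build the JSON payload that Ollama's /api/chat endpoint expects."""
    return {"model": model, "messages": messages, "stream": stream}


def chat(model, messages):
    """Send a chat request to a locally running Ollama instance
    and return the assistant's reply text."""
    payload = json.dumps(build_chat_request(model, messages)).encode()
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/chat",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]


# Example payload for a single-turn chat (model tag is illustrative):
payload = build_chat_request("llama3.2", [{"role": "user", "content": "Hello"}])
```

With `stream` left at `False`, Ollama returns one complete JSON object; setting it to `True` yields newline-delimited JSON chunks instead, which an MCP server would need to accumulate before returning a result.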