Ollama MCP Server

An MCP server that bridges Claude Code to a locally running Ollama instance.

Features

  • ollama_generate: Single-turn text generation (supports vision models with image input)
  • ollama_chat: Multi-turn chat conversations (supports vision models with image input)
  • ollama_list: List available models
  • ollama_show: Show model details
  • ollama_pull: Download models
  • ollama_embeddings: Generate text embeddings
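
For a sense of what these tools look like on the wire, the sketch below drives the server from a standalone script using the @modelcontextprotocol/sdk client. The argument names (model, prompt) are assumptions inferred from the tool descriptions above, not a documented schema:

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the built server over stdio, exactly as Claude Code would.
const transport = new StdioClientTransport({
  command: "node",
  args: ["/path/to/ollama-mcp-server/dist/index.js"],
});

const client = new Client({ name: "demo-client", version: "1.0.0" });
await client.connect(transport);

// Argument names here are assumed from the tool list above.
const result = await client.callTool({
  name: "ollama_generate",
  arguments: { model: "llama3.2", prompt: "3 features of Rust" },
});
console.log(result.content);

await client.close();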

Supported Vision Models

  • llava - General-purpose vision model
  • llama3.2-vision - Meta's multimodal model
  • deepseek-ocr - OCR-specialized vision model
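
Vision models receive images as base64 strings through Ollama's REST API. The sketch below makes that underlying call directly against Ollama; judging from the usage examples further down, the MCP tools presumably accept an image path and perform this encoding for you:

import { readFile } from "node:fs/promises";

// Ollama's /api/generate accepts base64-encoded images for vision models.
const image = await readFile("/path/to/image.jpg");

const res = await fetch("http://localhost:11434/api/generate", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "llava",
    prompt: "Describe this image.",
    images: [image.toString("base64")],
    stream: false,
  }),
});
const { response } = await res.json();
console.log(response);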

Prerequisites

  1. Ollama installed and running

    # Install Ollama (macOS)
    brew install ollama
    
    # Start Ollama server
    ollama serve
    
  2. At least one model downloaded

    ollama pull llama3.2
    

Installation

cd ollama-mcp-server
npm install
npm run build

Claude Code Configuration

Method 1: Using CLI (Recommended)

# Add to local scope (current project)
claude mcp add --transport stdio ollama -- node /path/to/ollama-mcp-server/dist/index.js

# Add to user scope (all projects)
claude mcp add --transport stdio ollama --scope user -- node /path/to/ollama-mcp-server/dist/index.js

To add environment variables:

claude mcp add --transport stdio ollama \
  --env OLLAMA_BASE_URL=http://localhost:11434 \
  -- node /path/to/ollama-mcp-server/dist/index.js

Method 2: Manual Configuration

Project scope (.mcp.json in project root):

{
  "mcpServers": {
    "ollama": {
      "command": "node",
      "args": ["/path/to/ollama-mcp-server/dist/index.js"],
      "env": {
        "OLLAMA_BASE_URL": "http://localhost:11434"
      }
    }
  }
}

User scope (~/.claude.json):

{
  "mcpServers": {
    "ollama": {
      "command": "node",
      "args": ["/path/to/ollama-mcp-server/dist/index.js"],
      "env": {
        "OLLAMA_BASE_URL": "http://localhost:11434"
      }
    }
  }
}

Verify Installation

# List configured MCP servers
claude mcp list

# Inside Claude Code
/mcp

Auto-approve Tool Calls (Optional)

By default, Claude Code asks for confirmation each time an Ollama tool is called. To skip these confirmations, add the following to ~/.claude/settings.json; each entry follows Claude Code's mcp__<server>__<tool> naming pattern:

{
  "permissions": {
    "allow": [
      "mcp__ollama__ollama_generate",
      "mcp__ollama__ollama_chat",
      "mcp__ollama__ollama_list",
      "mcp__ollama__ollama_show",
      "mcp__ollama__ollama_pull",
      "mcp__ollama__ollama_embeddings"
    ]
  }
}

Environment Variables

Variable          Default                  Description
OLLAMA_BASE_URL   http://localhost:11434   Ollama server URL
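
Inside the server, the variable presumably resolves with a default fallback along these lines (a sketch, not the actual source):

// Fall back to the local Ollama endpoint when OLLAMA_BASE_URL is unset.
const baseUrl = process.env.OLLAMA_BASE_URL ?? "http://localhost:11434";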

Usage Examples

From Claude Code:

List Models

List available Ollama models

Text Generation

Generate "3 features of Rust" using Ollama's llama3.2 model

Chat

I'd like to have Ollama do a code review

Vision / Image Analysis

Analyze this image using llava: /path/to/image.jpg
Use deepseek-ocr to extract text from this document: /path/to/document.png
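
Embeddings

Generate an embedding for "Hello, world" using llama3.2

The ollama_embeddings tool wraps Ollama's embeddings endpoint. For reference, the raw API call underneath looks roughly like this (endpoint and fields per Ollama's documented API; how the tool names its own arguments is an assumption):

// POST /api/embeddings returns a numeric vector for the given prompt.
const res = await fetch("http://localhost:11434/api/embeddings", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ model: "llama3.2", prompt: "Hello, world" }),
});
const { embedding } = await res.json();
console.log(embedding.length); // vector dimensionality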

Troubleshooting

Cannot connect to Ollama

# Check if Ollama is running
curl http://localhost:11434/api/tags

# If not running
ollama serve

No models available

ollama pull llama3.2

MCP server not showing up

# Verify server is registered
claude mcp list

# Check server health
claude mcp get ollama

License

MIT