izardy/mcp-server


This document provides a structured overview of a minimal Ollama MCP server implemented in Python. The server lets MCP clients chat with a locally running model through Ollama's Python API.

Tools: 1 · Resources: 0 · Prompts: 0

📝 Minimal Ollama MCP Server (Python)

⚙️ How It Works

  • Imports MCP server utilities (mcp.server.stdio).
  • Defines a tool called "chat" that takes a prompt string.
  • Uses Ollama’s Python API (ollama.chat) to run the model locally.
  • Returns the model’s response back to the MCP client.
  • Runs over stdio so MCP clients can connect via pipes.

🚀 Usage

  1. Install dependencies:
    pip install mcp ollama
    
    (Make sure you have Ollama installed and running locally.)
  2. Save the script as ollama-mcp-server.py.
  3. Run it:
    python ollama-mcp-server.py
    
  4. Connect with your MCP client over stdio.
    Example:
    await client.connect("ollama-mcp-server.py")
    

🔑 Notes

  • The chat tool here is minimal: it just sends a prompt to Ollama and returns the text.
  • You can extend it with more tools (e.g., embeddings, generate, list_models).
  • The model name (llama3) can be swapped for any Ollama model you’ve pulled (mistral, gemma, etc.).