ollama_mcp_demo

dakofler/ollama_mcp_demo

This document provides an overview of setting up and using a custom Model Context Protocol (MCP) server that exposes Python functions as tools, and of integrating that server with Ollama.
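
The repository's source is not reproduced here, but a minimal server along these lines, assuming the official mcp Python SDK's FastMCP helper and a file named server.py (both assumptions, not confirmed by the listing), would look roughly like this:

```python
# server.py -- hypothetical file name; the actual demo may structure this differently.
from mcp.server.fastmcp import FastMCP

# A FastMCP instance; the server name is an arbitrary label shown to clients.
mcp = FastMCP("ollama_mcp_demo")


@mcp.tool()
def echo(message: str) -> str:
    """Echo back the input message."""
    return message


if __name__ == "__main__":
    # Serve over stdio so an MCP client can launch this script as a subprocess.
    mcp.run(transport="stdio")
```

FastMCP derives a tool's input schema from the function signature and docstring, which is how the echo tool listed below would be advertised to connecting clients.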

Tools

Functions exposed to the LLM so it can take actions

echo

A tool to echo back the input message.
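
How the demo wires this tool into Ollama is not shown in the listing, but one plausible sketch, assuming the mcp Python SDK's stdio client and the ollama Python package's tool-calling support (the client.py layout, the server.py path, and the llama3.1 model name are all assumptions), is:

```python
# client.py -- hypothetical glue code connecting an Ollama model to the MCP server.
import asyncio

import ollama
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the MCP server as a subprocess and talk to it over stdio.
    server = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Convert the server's tool list into the function-calling schema Ollama expects.
            listed = await session.list_tools()
            tools = [
                {
                    "type": "function",
                    "function": {
                        "name": t.name,
                        "description": t.description or "",
                        "parameters": t.inputSchema,
                    },
                }
                for t in listed.tools
            ]

            # Ask the model something that should trigger the echo tool.
            messages = [{"role": "user", "content": "Please echo back: hello"}]
            response = ollama.chat(model="llama3.1", messages=messages, tools=tools)

            # Forward any tool calls the model requested to the MCP server.
            for call in response.message.tool_calls or []:
                result = await session.call_tool(
                    call.function.name, dict(call.function.arguments)
                )
                print(result.content)


if __name__ == "__main__":
    asyncio.run(main())
```

In this sketch the MCP tool metadata maps directly onto Ollama's function-calling format: the client only needs to translate each tool's inputSchema into the parameters field and forward any requested tool calls back to the MCP session.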

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources