langgraph-mcp-ollama-example

claudobahn/langgraph-mcp-ollama-example



This repository demonstrates a minimal, fully local setup in which a Python LangGraph client connects to a local Model Context Protocol (MCP) server and uses an LLM served by Ollama.

Tools

Functions exposed to the LLM so it can take actions

add_numbers

Adds two numbers and returns the result.
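A minimal sketch of what such a tool might look like server-side. The parameter names `a` and `b` and the numeric types are assumptions, not taken from the repository; on an MCP server built with the official Python SDK, a function like this would typically be registered with a tool decorator so the client can expose it to the LLM.

```python
# Hypothetical standalone version of the add_numbers tool.
# Parameter names and types are assumptions for illustration;
# the actual repository may define them differently.
def add_numbers(a: float, b: float) -> float:
    """Add two numbers and return the result."""
    return a + b

# Example call, as a tool invocation from the LLM would
# resolve on the server side:
print(add_numbers(2.0, 3.0))  # → 5.0
```

When the LLM decides to use the tool, the MCP client forwards the call to the server, which runs the function and returns the result as the tool's output.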

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources