claudobahn/langgraph-mcp-ollama-example
This repository demonstrates a minimal, fully local setup in which a Python client built with LangGraph connects to a local Model Context Protocol (MCP) server and drives a locally hosted Ollama LLM.
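
The repository's exact wiring isn't reproduced here, but the following is a minimal sketch of the two pieces under stated assumptions: a local MCP server exposing one tool over stdio (using FastMCP from the official `mcp` Python SDK), and a LangGraph client that loads the server's tools via `langchain-mcp-adapters` and runs them against a local Ollama model. The file name `server.py`, the model name `llama3.1`, and the `add` tool are illustrative assumptions, and the `MultiServerMCPClient` / `get_tools` API shown follows recent `langchain-mcp-adapters` releases and may differ by version.

```python
# server.py -- minimal sketch of a local MCP server exposing one tool over
# stdio, using FastMCP from the official "mcp" Python SDK. The server name,
# tool, and file name are illustrative assumptions.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("example")


@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b


if __name__ == "__main__":
    mcp.run(transport="stdio")
```

```python
# client.py -- minimal sketch of a LangGraph ReAct agent backed by a local
# Ollama model, with tools loaded from the MCP server above. Assumes the
# packages langgraph, langchain-ollama, and langchain-mcp-adapters are
# installed and that Ollama is running locally.
import asyncio

from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_ollama import ChatOllama
from langgraph.prebuilt import create_react_agent


async def main() -> None:
    # Spawn the local MCP server as a subprocess and talk to it over stdio;
    # its tools are exposed to the agent as ordinary LangChain tools.
    client = MultiServerMCPClient(
        {
            "local": {
                "command": "python",
                "args": ["server.py"],  # hypothetical server script (see above)
                "transport": "stdio",
            }
        }
    )
    tools = await client.get_tools()

    # Chat model served by a locally running Ollama instance; the model name
    # is an assumption.
    llm = ChatOllama(model="llama3.1")

    # Prebuilt ReAct-style LangGraph agent that can call the MCP tools.
    agent = create_react_agent(llm, tools)

    result = await agent.ainvoke(
        {"messages": [("user", "Use the add tool to compute 2 + 3.")]}
    )
    print(result["messages"][-1].content)


if __name__ == "__main__":
    asyncio.run(main())
```

With Ollama running and the packages installed, running the client spawns the server as a subprocess over stdio, so no separate server process needs to be started by hand.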