# MCP Ollama Integration Demo

A practical demonstration of integrating local LLMs (Ollama) with Model Context Protocol (MCP) servers for enhanced AI capabilities.
## 🚀 Overview

This project demonstrates how to:

- Build MCP servers using FastMCP
- Create MCP clients with a configuration-driven architecture
- Integrate local LLMs (Ollama) with MCP tools
- Enable AI assistants to access external tools and services
## 📁 Project Structure

```
├── server.py                  # MCP weather server using FastMCP
├── config.json                # MCP server configuration
├── client.py                  # Basic MCP client
├── client_config.py           # Configuration-based MCP client
├── simple_ollama_mcp.py       # ✨ Ollama + MCP integration demo
├── ollama_mcp_client.py       # Advanced Ollama + MCP integration
└── test_ollama_connection.py  # Ollama connection tester
```
## 🛠️ Setup

### Prerequisites

- Python 3.10+
- Ollama installed with a model (e.g., `gpt-oss:20b`)
- MCP Python SDK
### Installation

1. Clone the repository:

   ```bash
   git clone https://github.com/Sanjaykrishnamurthy/mcp-ollama-integration.git
   cd mcp-ollama-integration
   ```

2. Install dependencies:

   ```bash
   pip install "mcp[cli]" openai
   ```

3. Start the Ollama server:

   ```bash
   ollama serve
   ```

4. Pull a model (if not already available):

   ```bash
   ollama pull gpt-oss:20b
   ```
## 🎯 Usage Examples

### 1. Basic MCP Client

```bash
python client.py
```

Tests basic MCP server connection and tool calling.
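For reference, a minimal client along these lines might look like the following. This is a sketch of what such a client exercises, using the MCP Python SDK's stdio client; it is not the repo's exact `client.py`:

```python
# Sketch of a basic stdio MCP client (not the repo's exact code).
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Spawn server.py as a subprocess and talk to it over stdio.
    server = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover the tools the server advertises.
            tools = await session.list_tools()
            print("Available tools:", [t.name for t in tools.tools])
            # Call the weather tool directly.
            result = await session.call_tool("get_weather", {"location": "London"})
            print(result.content[0].text)


asyncio.run(main())
```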
### 2. Configuration-Based Client

```bash
python client_config.py
```

Demonstrates a config-driven MCP client with multiple-server support.
### 3. Ollama + MCP Integration ⭐

```bash
python simple_ollama_mcp.py
```

Shows how a local LLM can use MCP tools to answer questions with real-time data.
**Sample Interaction:**

```
👤 User: What's the weather in Paris?
🎯 Detected location: Paris
🌤️ Getting weather data from MCP server...
🔧 MCP Result: The weather in Paris is sunny ☀️
🤖 Ollama Response: Paris is sunny today ☀️—perfect for a walk around the city!
```
## 🔧 How It Works

### MCP Server (server.py)

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("weather-server")

@mcp.tool()
async def get_weather(location: str = "London") -> str:
    """Get weather information for a location."""
    return f"The weather in {location} is sunny ☀️"

if __name__ == "__main__":
    mcp.run(transport="stdio")
```
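You can run `python server.py` to start the server standalone, but with the stdio transport a client normally spawns it as a subprocess itself, using the command and args from config.json.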
### Ollama Integration (simple_ollama_mcp.py)

1. **Question Analysis** - Ollama determines whether tools are needed
2. **Location Extraction** - Ollama extracts city names from questions
3. **Tool Execution** - the MCP client calls the weather server
4. **Response Formatting** - Ollama creates a natural-language response (see the sketch below)
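A condensed sketch of that flow is shown below. It assumes Ollama's OpenAI-compatible endpoint at `http://localhost:11434/v1` (which is why the `openai` package is installed); the prompts and the `answer()` helper are illustrative, not copied from the repo:

```python
# Illustrative sketch of the simple_ollama_mcp.py flow; the prompts and
# the answer() helper are hypothetical, not the repo's exact code.
import asyncio

from openai import OpenAI
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Ollama serves an OpenAI-compatible API on this endpoint by default;
# the api_key value is ignored but required by the client.
llm = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")


def ask(prompt: str) -> str:
    """One-shot completion against the local Ollama model."""
    reply = llm.chat.completions.create(
        model="gpt-oss:20b",
        messages=[{"role": "user", "content": prompt}],
    )
    return reply.choices[0].message.content.strip()


async def answer(question: str) -> str:
    server = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Steps 1-2: let the model pull the city name out of the question.
            location = ask(f"Extract only the city name from this question: {question}")
            # Step 3: call the MCP weather tool with that location.
            result = await session.call_tool("get_weather", {"location": location})
            weather = result.content[0].text
            # Step 4: let the model phrase a natural-language answer.
            return ask(f"Answer the question '{question}' using this data: {weather}")


if __name__ == "__main__":
    print(asyncio.run(answer("What's the weather in Paris?")))
```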
## 🌟 Key Features

- ✅ **FastMCP Server** - modern, type-safe MCP server implementation
- ✅ **Stdio Transport** - simple process-based communication
- ✅ **Configuration-Driven** - JSON-based server management
- ✅ **Local LLM Integration** - works with Ollama models
- ✅ **Tool Detection** - smart tool usage based on user intent
- ✅ **Error Handling** - graceful failure and recovery
## 🔄 Architecture

```
User Question
    ⬇️
Ollama LLM (Intent Detection)
    ⬇️
MCP Client (Tool Orchestration)
    ⬇️
MCP Server (Tool Execution)
    ⬇️
Ollama LLM (Response Formatting)
    ⬇️
Natural Language Response
```
## 📝 Configuration

Edit config.json to add more MCP servers:

```json
{
  "mcpServers": {
    "weather-server": {
      "command": "python",
      "args": ["server.py"],
      "env": {},
      "description": "Weather information server"
    },
    "your-server": {
      "command": "python",
      "args": ["your_server.py"],
      "env": {"API_KEY": "your-key"},
      "description": "Your custom server"
    }
  }
}
```
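As a sketch, a config-driven client could map each entry onto stdio launch parameters like this. The `load_server_params` helper is hypothetical, not necessarily how `client_config.py` does it:

```python
# Hypothetical helper showing how config.json entries map onto
# StdioServerParameters; not necessarily how client_config.py does it.
import json

from mcp import StdioServerParameters


def load_server_params(path: str = "config.json") -> dict[str, StdioServerParameters]:
    with open(path) as f:
        servers = json.load(f)["mcpServers"]
    return {
        name: StdioServerParameters(
            command=cfg["command"],
            args=cfg["args"],
            env=cfg.get("env") or None,  # empty {} falls back to the default env
        )
        for name, cfg in servers.items()
    }
```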
## 🚀 Extending the Project

### Add New Tools

```python
@mcp.tool()
async def your_tool(param: str) -> str:
    """Your tool description."""
    return f"Result for {param}"
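```

Once decorated with `@mcp.tool()`, the tool is advertised automatically via `list_tools`, and a client can invoke it like any other, e.g. `await session.call_tool("your_tool", {"param": "value"})`.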
### Support More Models

Replace `gpt-oss:20b` with any Ollama-compatible model:

- `llama2:7b`
- `llama2:70b`
- `codellama:7b`
- `mistral:7b`
## 🐛 Troubleshooting

### Common Issues

1. **Ollama Connection Failed**

   ```bash
   # Make sure Ollama is running
   ollama serve
   # Test connection
   python test_ollama_connection.py
   ```

2. **Model Not Found**

   ```bash
   # List available models
   ollama list
   # Pull missing model
   ollama pull gpt-oss:20b
   ```

3. **MCP Server Issues**

   ```bash
   # Test basic MCP functionality
   python client.py
   ```
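A minimal connectivity check along the lines of test_ollama_connection.py might look like this; a sketch assuming Ollama's OpenAI-compatible endpoint, not the repo's exact script:

```python
# Sketch of an Ollama connectivity check (not the repo's exact script),
# assuming the OpenAI-compatible endpoint on localhost:11434.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

try:
    models = client.models.list()
    print("Ollama is reachable. Models:", [m.id for m in models.data])
except Exception as exc:
    print("Cannot reach Ollama - is `ollama serve` running?", exc)
```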
## 🤝 Contributing

Contributions are welcome! Please feel free to submit pull requests or open issues for bugs and feature requests.
## 📄 License

This project is licensed under the MIT License - see the LICENSE file for details.
## 🙏 Acknowledgments

- Model Context Protocol for the excellent protocol specification
- Ollama for making local LLMs accessible
- FastMCP for the Python SDK