MCP Weather Assistant

A local MCP (Model Context Protocol) implementation using Mistral on Ollama that allows the model to fetch real-time weather data via a dedicated MCP server.

This project demonstrates how an LLM can safely interact with external APIs by using tool calls, with the client orchestrating the workflow between user input, model reasoning, tool execution, and final output.

Features

  • Local Mistral model hosted via Ollama
  • MCP client to manage tool discovery and calls
  • Weather MCP server fetching live weather data from Open-Meteo
  • Two-step reasoning loop: model decides if a tool is needed, tool executes, then model summarizes results naturally
  • Fully modular — new tools can be added easily

Project Structure

MCP-Weather-Assistant/
├── mcp_client.py           # MCP client connecting LLM to tools
├── mcp_weather_server.py   # MCP server exposing weather API
├── README.md               # Project documentation
└── requirements.txt        # Python dependencies

Requirements

  • Python 3.10+
  • Ollama (local model hosting)
  • A Mistral or LLaMA model installed in Ollama
  • The requests and Flask Python packages

Install dependencies:

pip install -r requirements.txt

requirements.txt

The requirements.txt file includes the following dependencies:

blinker==1.9.0
certifi==2025.10.5
charset-normalizer==3.4.3
click==8.3.0
colorama==0.4.6
Flask==3.1.2
idna==3.11
itsdangerous==2.2.0
Jinja2==3.1.6
MarkupSafe==3.0.3
requests==2.32.5
urllib3==2.5.0
Werkzeug==3.1.3

Setup

Clone this repository:

git clone https://github.com/hmgtech/mcp.git
cd mcp

Ensure Ollama is running locally:

ollama serve

Verify your model is installed (e.g., Mistral):

ollama list

You should see mistral (or your preferred model) in the list.

Optional: Modify mcp_client.py to point to your Ollama model:

MODEL_NAME = "mistral"

Running the Project

1️⃣ Start the Weather MCP Server

python mcp_weather_server.py

Output:

🌦️ MCP Weather Server running at http://localhost:8080
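
The server's exact code lives in mcp_weather_server.py; as an illustration only, a minimal Flask server exposing the /tools and /invoke endpoints described under Adding More Tools could look like the sketch below. The Open-Meteo geocoding and forecast URLs are real public endpoints; the tool schema and payload shapes are assumptions inferred from the example session later in this README.

# Minimal sketch of an MCP-style weather server (assumed shape, not the project's exact code).
import requests
from flask import Flask, jsonify, request

app = Flask(__name__)

TOOLS = [{
    "name": "get_weather",
    "description": "Get current weather for a city",
    "parameters": {"city": "string"},
}]

@app.get("/tools")
def list_tools():
    # Tool discovery endpoint polled by the MCP client.
    return jsonify(TOOLS)

@app.post("/invoke")
def invoke():
    # Expects a payload like {"tool": "get_weather", "arguments": {"city": "Tokyo"}}.
    payload = request.get_json()
    city = payload["arguments"]["city"]

    # Resolve the city name to coordinates via Open-Meteo's geocoding API.
    geo = requests.get(
        "https://geocoding-api.open-meteo.com/v1/search",
        params={"name": city, "count": 1},
    ).json()["results"][0]

    # Fetch current conditions from the Open-Meteo forecast API.
    weather = requests.get(
        "https://api.open-meteo.com/v1/forecast",
        params={
            "latitude": geo["latitude"],
            "longitude": geo["longitude"],
            "current_weather": "true",
        },
    ).json()["current_weather"]

    return jsonify({
        "city": city,
        "temperature": weather["temperature"],
        "windspeed": weather["windspeed"],
    })

if __name__ == "__main__":
    app.run(port=8080)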

2️⃣ Start the MCP Client

In a separate terminal:

python mcp_client.py

Output:

🤖 MCP Client started. Type your queries below.
🔍 Discovered MCP tools: ['get_weather']
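
The discovery step is driven by the MCP_REGISTRY described under Adding More Tools. A sketch of how the client might enumerate tools at startup (the registry contents and field names here are assumptions):

import requests

# Assumed registry shape: server name -> base URL (see Adding More Tools).
MCP_REGISTRY = {"weather": "http://localhost:8080"}

def discover_tools() -> dict:
    # Ask every registered MCP server which tools it exposes via GET /tools.
    tool_index = {}
    for base_url in MCP_REGISTRY.values():
        for tool in requests.get(f"{base_url}/tools").json():
            tool_index[tool["name"]] = base_url
    return tool_index

tools = discover_tools()
print(f"🔍 Discovered MCP tools: {list(tools)}")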

3️⃣ Interact with the Assistant

Example conversation:

You: What’s the weather in Tokyo?
🔧 Model requested MCP tool: get_weather({'city': 'Tokyo'})
🌐 Tool result: {'city': 'Tokyo', 'temperature': 22.1, 'windspeed': 3.7}
Assistant: The current temperature in Tokyo is around 22°C with gentle wind.

You: Write a haiku about clouds.
Assistant: Floating soft above, shadows drift across still fields, sky whispers in gray.

4️⃣ Exit

Type exit or quit to stop the client.

How It Works

  1. User input → sent to the Mistral model.
  2. Model decides if a tool call is required (e.g., get_weather).
  3. MCP client executes the tool via the MCP server.
  4. Tool result is sent back to the model for final reasoning.
  5. Model generates a human-readable answer that is returned to the user.

This architecture ensures:

  • Safe, auditable tool execution
  • Modular addition of new tools
  • Separation of model reasoning and external actions
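
One way the client-side loop behind steps 1–5 could be wired together is sketched below. The JSON tool-call contract the model is prompted to emit is an assumption (inferred from the example session above), and ask_model, discover_tools, and the /invoke payload mirror the earlier sketches rather than the project's actual code.

import json
import requests

TOOL_PROMPT = (
    "You can call the tool get_weather(city). If a tool is needed, reply ONLY with "
    'JSON like {"tool": "get_weather", "arguments": {"city": "Tokyo"}}. '
    "Otherwise answer the user directly.\n\nUser: "
)

def handle(user_input: str, tools: dict) -> str:
    # Step 1: let the model decide whether a tool call is required.
    first = ask_model(TOOL_PROMPT + user_input)
    try:
        call = json.loads(first)
    except json.JSONDecodeError:
        return first  # No tool needed; the model answered directly.

    # Step 2: execute the requested tool on its MCP server, then summarize.
    base_url = tools[call["tool"]]
    result = requests.post(f"{base_url}/invoke", json=call).json()
    return ask_model(
        f"User asked: {user_input}\nTool result: {result}\n"
        "Summarize this for the user in natural language."
    )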

Adding More Tools

  1. Create a new MCP server exposing your tool(s) via /tools and /invoke.
  2. Register the server in MCP_REGISTRY in mcp_client.py.
  3. Update the model prompt to include descriptions of the new tools.

The MCP client will automatically discover and route calls to the appropriate server.
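
For example, registering a hypothetical time server would only require a new registry entry (the name and URL below are placeholders, and the registry shape is assumed):

# mcp_client.py -- assumed registry shape; the "time" entry is a hypothetical example.
MCP_REGISTRY = {
    "weather": "http://localhost:8080",
    "time":    "http://localhost:8081",  # new server exposing /tools and /invoke
}

On the next start, the discovery step shown earlier picks up whatever tools the new server lists at /tools, and calls are routed to it by tool name.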

License

MIT License
© 2025 Hiteshkumar Gupta