mcp_weather_server_demo

hanktchen18/mcp_weather_server_demo

This is a Model Context Protocol (MCP) Server application providing real-time weather alerts and forecasts using the National Weather Service API.

MCP Weather Server Demo

A Python-based MCP server that provides real-time weather alerts and forecasts for US states using the National Weather Service API. This server can be integrated with AI assistants like Claude (via Claude Desktop) or Cursor IDE, enabling them to access and provide weather information through natural language interactions.

Requirements

  • Python 3.12 or higher
  • uv package manager (recommended) or pip
  • (Optional) OpenAI API key for interactive chat client

Installation

Clone this repository to your computer:

git clone https://github.com/yourusername/mcp_server.git
cd mcp_server

Install dependencies using uv:

uv sync

Or using pip:

pip install -r requirements.txt

How to Use

Running the MCP Server

The server can be run in two modes:

1. Standard I/O mode (for Claude Desktop integration):

uv run mcp run server/weather.py

2. SSE (Server-Sent Events) mode (for web integration):

uv run python mcpserver/server.py
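The two modes correspond to the transport passed to FastMCP's run(). A minimal sketch of how such a server might be wired up (the tool body, helper names, and command-line flag are illustrative assumptions, not the repository's actual code):

```python
import sys


def build_server():
    # Requires the `mcp` package; imported lazily so the transport
    # helper below can be read and tested without it installed.
    from mcp.server.fastmcp import FastMCP

    server = FastMCP("weather")

    @server.tool()
    async def get_alerts(state: str) -> str:
        """Return active NWS alerts for a two-letter US state code."""
        # Real implementation would fetch
        # https://api.weather.gov/alerts/active?area={state}
        return f"(stub) no alerts fetched for {state}"

    return server


def choose_transport(argv: list[str]) -> str:
    # stdio is the default (Claude Desktop); an --sse flag is assumed
    # here for web clients.
    return "sse" if "--sse" in argv else "stdio"


if __name__ == "__main__":
    build_server().run(transport=choose_transport(sys.argv[1:]))
```

FastMCP handles the protocol handshake for both transports; only the transport string changes between the two run modes.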

Using with Claude Desktop

  1. Install Claude Desktop (if not already installed)

  2. Configure Claude Desktop by editing the MCP config file:

    • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
    • Windows: %APPDATA%\Claude\claude_desktop_config.json
    • Linux: ~/.config/Claude/claude_desktop_config.json
  3. Add the server configuration:

{
  "mcpServers": {
    "weather": {
      "command": "uv",
      "args": [
        "run",
        "--with",
        "mcp[cli]",
        "mcp",
        "run",
        "/path/to/server/weather.py"
      ]
    }
  }
}

Using with Cursor IDE

Configure the server in ~/.cursor/mcp.json using the same entry as above, then restart Cursor for the configuration to take effect.

Interactive Chat Client

Run the interactive chat client to test the server:

uv run python server/client.py

Make sure to create a .env file with your OpenAI API key:

OPENAI_API_KEY=your_api_key_here

The client starts an interactive session where you can:

  • Ask questions about weather alerts for any US state
  • Get weather forecasts for specific locations
  • Have natural language conversations with context memory
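Context memory in a chat client like this typically amounts to replaying prior turns with each model request while capping the window size. A minimal stdlib sketch of that bookkeeping (class and method names are illustrative, not the client's actual code):

```python
class ConversationMemory:
    """Keeps a rolling window of chat turns to send with each request."""

    def __init__(self, max_turns: int = 10):
        self.max_turns = max_turns
        self.messages: list[dict] = []

    def add(self, role: str, content: str) -> None:
        self.messages.append({"role": role, "content": content})
        # Drop the oldest turns once the window is exceeded, keeping
        # only the most recent context for the model.
        if len(self.messages) > self.max_turns:
            self.messages = self.messages[-self.max_turns:]

    def as_payload(self, system_prompt: str) -> list[dict]:
        # The system prompt is re-attached on every request rather than
        # stored, so trimming never discards it.
        return [{"role": "system", "content": system_prompt}, *self.messages]
```

Each user question and assistant reply is add()ed; as_payload() produces the message list handed to the LLM on the next turn.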

Features

  • Real-time weather alerts - Get active weather alerts for any US state
  • Weather forecasts - Retrieve detailed forecasts by latitude/longitude
  • MCP protocol support - Full Model Context Protocol implementation
  • Interactive chat client - Test the server with a conversational interface
  • Memory-enabled conversations - Chat client maintains conversation context
  • Multiple transport modes - Support for stdio and SSE transports
  • Docker support - Containerized deployment option
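For the Docker option, a containerized setup might look like the following (base image, file layout, and exposed port are assumptions; check the repository's actual Dockerfile):

```dockerfile
FROM python:3.12-slim
WORKDIR /app
COPY . .
RUN pip install uv && uv sync --frozen
# SSE mode listens on a network port; stdio mode needs no port.
EXPOSE 8000
CMD ["uv", "run", "python", "mcpserver/server.py"]
```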

Available Tools

get_alerts(state: str)

Get weather alerts for a US state.

  • Parameters:
    • state: Two-letter US state code (e.g., "CA", "NY", "TX")
  • Returns: Formatted weather alerts including event type, severity, description, and timing information
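The National Weather Service exposes active alerts at https://api.weather.gov/alerts/active?area={state}, returning GeoJSON features. A sketch of how one feature might be formatted (the property names are the real NWS ones; the helper itself is an assumed name, not necessarily the server's code):

```python
def format_alert(feature: dict) -> str:
    """Render one NWS alert GeoJSON feature as readable text."""
    props = feature.get("properties", {})
    return "\n".join([
        f"Event: {props.get('event', 'Unknown')}",
        f"Severity: {props.get('severity', 'Unknown')}",
        f"Area: {props.get('areaDesc', 'Unknown')}",
        f"Effective: {props.get('effective', 'Unknown')}",
        f"Ends: {props.get('ends', 'Unknown')}",
        f"Description: {props.get('description', 'No description available')}",
    ])
```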

get_forecast(latitude: float, longitude: float)

Get weather forecast for a specific location.

  • Parameters:
    • latitude: Latitude coordinate
    • longitude: Longitude coordinate
  • Returns: Forecast for the next 5 periods, including temperature, wind conditions, and a detailed description for each period
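Forecasts involve two NWS calls: https://api.weather.gov/points/{lat},{lon} resolves the coordinates to a forecast URL, whose response carries a periods array with the actual forecast. A sketch of formatting the first five periods (the field names are the real NWS ones; the helper is illustrative):

```python
def format_forecast(periods: list[dict], limit: int = 5) -> str:
    """Summarize NWS forecast periods: name, temperature, wind, details."""
    lines = []
    for period in periods[:limit]:
        lines.append(
            f"{period['name']}: "
            f"{period['temperature']}°{period['temperatureUnit']}, "
            f"wind {period['windSpeed']} {period['windDirection']} - "
            f"{period['detailedForecast']}"
        )
    return "\n".join(lines)
```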

What I Learned

  • Implementing Model Context Protocol (MCP) servers
  • Building FastMCP-based servers with async Python
  • Integrating external APIs (National Weather Service)
  • Creating interactive chat interfaces with LangChain
  • Managing conversation memory in AI applications
  • Docker containerization for MCP servers
  • Configuring MCP servers for different AI platforms