# Open Weather MCP Server

MCP server for fetching weather data using the OpenWeather API.

A Python MCP server providing tools to call the OpenWeather One Call API 3.0 for timestamped weather data, and to start a new (stateless) AI Weather Assistant session that returns a human-readable weather summary.
## What this server provides
- `weather_at_time`: get weather for a coordinate pair at a UNIX timestamp via the One Call 3.0 timemachine endpoint
- `ai_weather_assistant_new_session`: start a new (stateless) AI assistant session and receive a human-readable weather summary via the Weather Overview endpoint
- Rolling 24h rate limiting: a server-level moving-window rate limiter enforces limits consistently across multiple clients/instances. Uses the `limits` library with in-memory storage by default; supports Redis via an environment variable.
References:

- One Call API 3.0 main docs: openweathermap.org/api/one-call-3
- Timestamp endpoint (timemachine): One Call API 3.0 docs
- Weather Overview (AI-generated summary): Dark Sky migration page (openweathermap.org/darksky-openweather-3)
## Install

- Python 3.10+
- From the repo root:

  ```bash
  pip install -r requirements.txt
  ```

- Create your env file:

  ```bash
  cp .env.example .env  # then edit .env and set OPENWEATHER_API_KEY
  ```
## Configure

- `OPENWEATHER_API_KEY` (required)
- `OPENWEATHER_BASE_URL` (default: `https://api.openweathermap.org`)
- `OPENWEATHER_RATE_LIMIT_DAILY` (default: `1000`)
- `RATE_LIMIT_STORAGE_URI` (optional; default in-memory): e.g. `redis://localhost:6379/0` for a distributed/global limiter across processes/hosts
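Putting those variables together, a minimal `.env` might look like the following sketch (the API key value is a placeholder; the optional lines show their defaults):

```ini
# Required: your OpenWeather API key
OPENWEATHER_API_KEY=your_api_key_here

# Optional overrides (defaults shown)
OPENWEATHER_BASE_URL=https://api.openweathermap.org
OPENWEATHER_RATE_LIMIT_DAILY=1000

# Uncomment to share the rate limiter across processes/hosts
# RATE_LIMIT_STORAGE_URI=redis://localhost:6379/0
```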
## Run (Streamable HTTP)

Start the server with FastMCP using the Streamable HTTP transport so multiple MCP clients can connect reliably. From the repo root:

```bash
python mcp_server.py
# or
fastmcp run mcp_server.py --transport http
```

By default, FastMCP's HTTP transport listens on `127.0.0.1:8000` with path `/mcp/`. You can customize:

```bash
fastmcp run mcp_server.py --transport http --host 0.0.0.0 --port 4200 --path /mcp/
```
Point your MCP-compatible host (e.g., Cursor, Claude Desktop) to the HTTP endpoint.
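For hosts that accept a `url` entry in an `mcpServers` config (e.g. Cursor's `mcp.json`), the connection could look like the sketch below; the exact config shape varies by host, and the `open-weather` key is an arbitrary name:

```json
{
  "mcpServers": {
    "open-weather": {
      "url": "http://127.0.0.1:8000/mcp/"
    }
  }
}
```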
## Tools

### weather_at_time

- Description: get weather data for coordinates at a specific UNIX timestamp (One Call 3.0 timemachine)
- Params:
  - `lat` (float) [-90..90]
  - `lon` (float) [-180..180]
  - `dt` (int, Unix UTC)
  - `units` (string; `standard` | `metric` | `imperial`; default `standard`)
  - `lang` (string; optional)
- Upstream: `GET /data/3.0/onecall/timemachine`
- Docs: One Call API 3.0
Example (conceptual):

```json
{
  "tool": "weather_at_time",
  "arguments": {"lat": 39.099724, "lon": -94.578331, "dt": 1643803200, "units": "metric"}
}
```
### ai_weather_assistant_new_session

- Description: start a new (stateless) AI Weather Assistant interaction and return a human-readable weather overview for the given coordinates. Under the hood this uses the One Call 3.0 Weather Overview endpoint to provide AI-generated summaries without session resume.
- Params:
  - `lat` (float) [-90..90]
  - `lon` (float) [-180..180]
  - `units` (string; `standard` | `metric` | `imperial`; default `standard`)
  - `lang` (string; optional)
  - `prompt` (string; optional)
- Upstream: `GET /data/3.0/onecall/overview`
- Docs: One Call API 3.0, Weather Overview mention
Example (conceptual):

```json
{
  "tool": "ai_weather_assistant_new_session",
  "arguments": {"lat": 40.7128, "lon": -74.0060, "units": "metric", "prompt": "What should I wear this afternoon?"}
}
```
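"Stateless session" here means each call is independent: one overview request, a fresh identifier, no stored state to resume. The sketch below illustrates that idea; it is an assumption about the design, not the server's code, and the `weather_overview` response field name is taken from the One Call 3.0 Weather Overview response.

```python
import json
import os
import urllib.parse
import urllib.request
import uuid

BASE_URL = os.environ.get("OPENWEATHER_BASE_URL", "https://api.openweathermap.org")

def summarize_payload(payload: dict) -> dict:
    """Wrap an overview response as a one-shot 'session' result."""
    return {
        "session_id": str(uuid.uuid4()),  # informational only; nothing resumes it
        "summary": payload.get("weather_overview", ""),
    }

def new_session(lat: float, lon: float, units: str = "standard") -> dict:
    """Call GET /data/3.0/onecall/overview and return a stateless session result."""
    params = urllib.parse.urlencode({
        "lat": lat, "lon": lon, "units": units,
        "appid": os.environ["OPENWEATHER_API_KEY"],
    })
    with urllib.request.urlopen(f"{BASE_URL}/data/3.0/onecall/overview?{params}",
                                timeout=10) as resp:
        return summarize_payload(json.load(resp))
```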
## Rate limiting

- Rolling window: 24 hours
- Limit: 1,000 calls (default)
- Enforced via a server-level moving-window limiter (the `limits` library)
- To enforce consistent limits across multiple processes or machines, set `RATE_LIMIT_STORAGE_URI` (e.g. to a Redis URI) so all instances share the same limiter storage
## Citations
- One Call API 3.0 docs and endpoints: openweathermap.org/api/one-call-3
- Weather data for timestamp (timemachine): openweathermap.org/api/one-call-3
- Weather overview and AI summary context: openweathermap.org/darksky-openweather-3
- FastMCP Streamable HTTP: gofastmcp.com/deployment/running-server