pcilli/weather-mcp-server
The Weather MCP Server is a specialized server designed to provide weather data using the Model Context Protocol (MCP).
Weather MCP Server Overview
TL;DR
Use this codebase to connect either ChatGPT or Gemini to weather data through an MCP server.
Repository Purpose
This repository provides a set of web applications and backend scripts for interacting with LLMs (such as Gemini and ChatGPT) and accessing live weather data. It leverages Streamlit for web interfaces and FastAPI for backend API endpoints, integrating with Google and OpenAI’s language models as well as public weather APIs.
File Summaries
1. streamlit_app_gemini.py
- Purpose: Web application interface built with Streamlit for interacting with Google’s Gemini LLM.
- Key Features:
- Accepts user queries and displays Gemini’s responses.
- Handles user session states and prompt management.
- Clean and minimal UI for LLM interaction.
- Fetches and displays weather data based on user input.
2. streamlit_app_chatgpt.py
- Purpose: Web application interface built with Streamlit for interacting with OpenAI’s ChatGPT LLM.
- Key Features:
- Similar to the Gemini app, but uses the OpenAI API.
- Handles user queries and displays ChatGPT responses.
- Provides session and prompt history.
- Fetches and displays weather data based on user input.
3. main.py
- Purpose: Main entry point for the FastAPI backend service.
- Key Features:
- Starts the API server.
- Sets up all routes and core API configuration.
- Includes the weather API router.
- Provides a health check endpoint.
4. weather_router.py
- Purpose: FastAPI router for handling weather-related API endpoints.
- Key Features:
- Contains routes for accessing and managing weather data.
- Handles API requests, data formatting, and error handling for weather integrations.
- Modular for easy integration into the main FastAPI application.
5. requirements.txt
- Purpose: Lists all required Python packages to run the codebase.
- Key Libraries:
  - fastapi, uvicorn (backend web API)
  - streamlit (web apps)
  - httpx, requests (HTTP requests)
  - python-dotenv (environment variable management)
  - openai, google-generativeai (LLM APIs)
  - pydantic (data validation)
How the Pieces Fit Together
- Frontend:
  The two Streamlit apps (streamlit_app_gemini.py and streamlit_app_chatgpt.py) provide user interfaces to interact with different LLMs and fetch weather data for cities of interest.
- Backend:
  main.py launches the FastAPI server, registering routes from weather_router.py. The backend provides APIs for retrieving live weather data and can serve as a backend for the Streamlit apps.
- Weather Integration:
  All weather API routes and logic are modularized in weather_router.py, which is included by the main FastAPI app.
- Dependencies:
  All dependencies are listed in requirements.txt for easy installation and reproducibility.
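In this arrangement, the Streamlit apps would receive JSON from the backend's weather routes and render it for the user. As a small illustration, here is a helper that turns such a response into a display string; the field names (`city`, `condition`, `temp_c`) are an assumed shape, not the repository's actual schema.

```python
import json

def summarize_weather(payload: str) -> str:
    """Turn a JSON weather response (assumed field names) into a one-line summary."""
    data = json.loads(payload)
    return f"{data['city']}: {data['condition']}, {data['temp_c']}°C"

print(summarize_weather('{"city": "London", "condition": "Cloudy", "temp_c": 12.0}'))
# → London: Cloudy, 12.0°C
```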
Getting Started
- Install requirements:
  pip install -r requirements.txt
- Run the backend API:
  uvicorn main:app --reload
- Launch Streamlit apps:
  streamlit run streamlit_app_gemini.py
  # or
  streamlit run streamlit_app_chatgpt.py
Environment Variables and Secrets
1 · OpenAI Key (OPENAI_API_KEY)
- Log in at https://platform.openai.com.
- Click “API Keys” in the left sidebar.
- Choose “Create new secret key”, give it a name, and copy it immediately.
- Add the key to OPENAI_API_KEY in the .env file.
2 · Gemini Key (GEMINI_API_KEY)
Option A — Google AI Studio (fast)
- Go to https://aistudio.google.com.
- Select API Keys → “Create API key”.
- Copy the key and store it in GEMINI_API_KEY.
Option B — Google Cloud Console
- Open the Cloud Console and create / select a project.
- Enable the Gemini API.
- Navigate to APIs & Services → Credentials → Create Credential → API key.
- (Optional) Add key restrictions, then copy it into GEMINI_API_KEY.
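With both keys created, they live as KEY=VALUE lines in a .env file (e.g. OPENAI_API_KEY=... and GEMINI_API_KEY=..., with placeholder values). The apps use python-dotenv's `load_dotenv()` to read that file; as a rough stdlib-only sketch of what that call does:

```python
import os
from pathlib import Path

def load_env(path: str = ".env") -> None:
    """Minimal stand-in for python-dotenv's load_dotenv: read KEY=VALUE lines,
    skip blanks and comments, and export values without overwriting variables
    that are already set in the environment."""
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        os.environ.setdefault(key.strip(), value.strip().strip('"'))
```

In the actual codebase, prefer `from dotenv import load_dotenv; load_dotenv()`, which also handles quoting, multiline values, and interpolation that this sketch ignores.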