# Weather API Complete MCP Server
This is an MCP (Model Context Protocol) server that provides authenticated access to the Weather API Complete API via Bearer tokens. It enables AI agents and LLMs to interact with Weather API Complete through standardized tools.
## Features
- 🔧 MCP Protocol: Built on the Model Context Protocol for seamless AI integration
- 🌐 Full API Access: Provides tools for interacting with Weather API Complete endpoints
- 🔐 Secure Authentication: Supports API key authentication via Bearer tokens
- 🐳 Docker Support: Easy deployment with Docker and Docker Compose
- ⚡ Async Operations: Built with FastMCP for efficient async handling
## API Documentation
- Weather API Complete base URL: https://api.weatherapi.com/v1
- API Documentation: https://www.weatherapi.com/docs/
## Available Tools
This server provides the following tools:
- `example_tool`: Placeholder tool (to be implemented)
- `get_api_info`: Get information about the API service and authentication status
Note: Replace `example_tool` with actual Weather API Complete API tools based on the documentation.
## Installation
### Using Docker (Recommended)
1. Clone this repository:

```bash
git clone https://github.com/Traia-IO/weather-api-complete-mcp-server.git
cd weather-api-complete-mcp-server
```

2. Set your API key:

```bash
export WEATHER_API_COMPLETE_API_KEY="your-api-key-here"
```

3. Run with Docker:

```bash
./run_local_docker.sh
```
### Using Docker Compose
1. Create a `.env` file with your configuration:

```
WEATHER_API_COMPLETE_API_KEY=your-api-key-here
PORT=8000
```

2. Start the server:

```bash
docker-compose up
```
### Manual Installation
1. Install dependencies using `uv`:

```bash
uv pip install -e .
```

2. Run the server:

```bash
WEATHER_API_COMPLETE_API_KEY="your-api-key-here" uv run python -m server
```
## Usage
### Health Check
Test if the server is running:
```bash
python mcp_health_check.py
```
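The repository's `mcp_health_check.py` handles this check. As a rough illustration of the idea only, a minimal reachability probe against the server's HTTP endpoint might look like the sketch below; the URL, the use of `httpx`, and the "any HTTP response counts as alive" logic are assumptions, not a description of the actual script:

```python
# Minimal reachability probe -- an illustrative sketch, not the repository's mcp_health_check.py.
import httpx


def check_server(url: str = "http://localhost:8000/mcp/") -> bool:
    """Return True if the MCP server answers at all (even with a 401/406 status)."""
    try:
        response = httpx.get(url, timeout=5.0)
        print(f"Server responded with HTTP {response.status_code}")
        return True
    except httpx.HTTPError as exc:
        print(f"Server unreachable: {exc}")
        return False


if __name__ == "__main__":
    check_server()
```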
### Using with CrewAI

```python
from traia_iatp.mcp.traia_mcp_adapter import create_mcp_adapter_with_auth

# Connect with authentication
with create_mcp_adapter_with_auth(
    url="http://localhost:8000/mcp/",
    api_key="your-api-key"
) as tools:
    # Use the tools
    for tool in tools:
        print(f"Available tool: {tool.name}")

    # Example usage
    result = await tool.example_tool(query="test")
    print(result)
```
## Authentication
This server requires API key authentication. Clients must provide their API key in the Authorization header:
```
Authorization: Bearer YOUR_API_KEY
```
The API key is then used to authenticate requests to the Weather API Complete API.
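For clients that talk to the server over plain HTTP rather than through the CrewAI adapter shown above, attaching the header could look like the following sketch; the endpoint path and the use of `httpx` are assumptions based on the rest of this README, not a prescribed client:

```python
import httpx

api_key = "your-api-key"  # the same value you export as WEATHER_API_COMPLETE_API_KEY

# Every request to the MCP endpoint should carry the Bearer token.
response = httpx.get(
    "http://localhost:8000/mcp/",
    headers={"Authorization": f"Bearer {api_key}"},
)
print(response.status_code)
```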
## Development
### Testing the Server
1. Start the server locally
2. Run the health check:
   ```bash
   python mcp_health_check.py
   ```
3. Test individual tools using the CrewAI adapter
### Adding New Tools
To add new tools, edit `server.py` and:
- Create API client functions for Weather API Complete endpoints (see the sketch after this list)
- Add `@mcp.tool()` decorated functions
- Update this README with the new tools
- Update `deployment_params.json` with the tool names in the `capabilities` array
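As a hedged sketch of what such a tool could look like: the function name `get_current_weather`, the `httpx` dependency, the `fastmcp` import path, and reading the key from the environment are all assumptions here; check `server.py` and the Weather API documentation for the real patterns used in this repository.

```python
# Hypothetical example of a new tool in server.py -- names and parameters are illustrative.
import os

import httpx
from fastmcp import FastMCP  # assumed import path; match whatever server.py already uses

mcp = FastMCP("weather-api-complete-mcp")

BASE_URL = "https://api.weatherapi.com/v1"
API_KEY = os.environ["WEATHER_API_COMPLETE_API_KEY"]  # documented as required below


@mcp.tool()
async def get_current_weather(query: str) -> dict:
    """Get current conditions for a location (city name, postcode, or lat,long)."""
    async with httpx.AsyncClient() as client:
        response = await client.get(
            f"{BASE_URL}/current.json",
            params={"key": API_KEY, "q": query},
        )
        response.raise_for_status()
        return response.json()
```

If you add a tool like this, remember to list its name (here, the hypothetical `get_current_weather`) in the `capabilities` array of `deployment_params.json`.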
## Deployment
### Deployment Configuration
The `deployment_params.json` file contains the deployment configuration for this MCP server:
```json
{
  "github_url": "https://github.com/Traia-IO/weather-api-complete-mcp-server",
  "mcp_server": {
    "name": "weather-api-complete-mcp",
    "description": "Weather api integration",
    "server_type": "streamable-http",
    "requires_api_key": true,
    "api_key_header": "Authorization",
    "capabilities": [
      // List all implemented tool names here
      "example_tool",
      "get_api_info"
    ]
  },
  "deployment_method": "cloud_run",
  "gcp_project_id": "traia-mcp-servers",
  "gcp_region": "us-central1",
  "tags": ["weather api complete", "api"],
  "ref": "main"
}
```
**Important:** Always update the `capabilities` array when you add or remove tools!
### Google Cloud Run
This server is designed to be deployed on Google Cloud Run. The deployment will:
- Build a container from the Dockerfile
- Deploy to Cloud Run with the specified configuration
- Expose the `/mcp` endpoint for client connections
### Environment Variables
- `PORT`: Server port (default: 8000)
- `STAGE`: Environment stage (default: MAINNET; options: MAINNET, TESTNET)
- `LOG_LEVEL`: Logging level (default: INFO)
- `WEATHER_API_COMPLETE_API_KEY`: Your Weather API Complete API key (required)
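As an illustration only (the actual `server.py` may read its configuration differently), these variables could be consumed at startup like so:

```python
import os

# Illustrative sketch of reading the documented variables; not taken from server.py.
PORT = int(os.getenv("PORT", "8000"))
STAGE = os.getenv("STAGE", "MAINNET")        # MAINNET or TESTNET
LOG_LEVEL = os.getenv("LOG_LEVEL", "INFO")
API_KEY = os.environ["WEATHER_API_COMPLETE_API_KEY"]  # required, so no default
```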
## Troubleshooting
- Server not starting: Check Docker logs with `docker logs <container-id>`
- Authentication errors: Ensure your API key is correctly set in the environment
- API errors: Verify your API key has the necessary permissions
- Tool errors: Check the server logs for detailed error messages
## Contributing
- Fork the repository
- Create a feature branch
- Implement new tools or improvements
- Update the README and deployment_params.json
- Submit a pull request