lshankarrao/mcp-server
The MCP Weather Application is a comprehensive implementation of the Model Context Protocol (MCP) designed to provide real-time weather data and AI-powered insights through a client-server architecture.
MCP Weather Server - Complete Implementation Guide
A production-ready Model Context Protocol (MCP) server with comprehensive weather services, AI-powered insights, and professional documentation.
Table of Contents
- Project Overview
- Architecture
- Quick Start
- Environment Setup
- Development Guide
- MCP Protocol Implementation
- Deployment Guide
- API Documentation
- Testing
- Troubleshooting
- Contributing
Project Overview
This is a complete, production-ready implementation of the Model Context Protocol (MCP) featuring:
Weather Services
- Real-time weather data from OpenWeatherMap API
- Multi-day forecasts with detailed conditions
- AI-powered insights using LangChain and OpenAI
- Travel advisories and activity recommendations
MCP Protocol Compliance
- Full JSON-RPC 2.0 implementation
- Standard MCP methods (initialize, tools, resources, prompts)
- WebSocket & HTTP transport support
- Error handling with proper MCP response formats
- Optional methods (completion, notifications)
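Errors are returned in the standard JSON-RPC 2.0 error-object shape. The code and message below illustrate that format rather than the exact values this server emits:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "error": {
    "code": -32601,
    "message": "Method not found"
  }
}
```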
Production Features
- Deployed on Railway with auto-scaling
- Comprehensive Swagger docs with interactive examples
- CORS configuration for cross-origin requests
- Health monitoring and status endpoints
- Container-ready with Dockerfile support
Architecture
MCP Weather Project/
├── server/                       # Python FastAPI MCP Server
│   ├── main.py                   # Application entry point
│   ├── mcp_server.py             # Core MCP protocol implementation
│   ├── models.py                 # Pydantic models & schemas
│   ├── weather_service.py        # OpenWeatherMap integration
│   ├── langchain_integration.py  # AI insights & analysis
│   ├── requirements.txt          # Python dependencies
│   ├── railway.toml              # Railway deployment config
│   └── Procfile                  # Process configuration
├── client/                       # React Next.js Client
│   ├── app/                      # Next.js 13+ app directory
│   ├── components/               # React components
│   ├── lib/                      # MCP client library
│   ├── types/                    # TypeScript definitions
│   └── package.json              # Node.js dependencies
├── docs/                         # Documentation
└── README.md                     # This file
Data Flow
graph TD
A[Next.js Client] -->|MCP Requests| B[FastAPI Server]
B -->|Weather Data| C[OpenWeatherMap API]
B -->|AI Analysis| D[OpenAI API via LangChain]
B -->|MCP Responses| A
E[Browser/Swagger UI] -->|HTTP/WebSocket| B
Quick Start
Option 1: Use the Deployed Server (Fastest)
The MCP server is live and ready to use:
# 1. Clone the repository
git clone https://github.com/lshankarrao/mcp-server.git
cd mcp-server
# 2. Start the client
cd client
npm install
npm run dev
# 3. Open your browser
# Client: http://localhost:3000
# Server Docs: https://mcp-server-production-3da3.up.railway.app/docs
Option 2: Full Local Development
# 1. Set up the server
cd server
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
pip install -r requirements.txt
# 2. Configure environment variables
cp .env.example .env
# Edit .env with your API keys (see Environment Setup)
# 3. Start the server
python main.py
# 4. Set up the client (in new terminal)
cd client
npm install
# 5. Configure client for localhost
echo "NEXT_PUBLIC_MCP_SERVER_URL=http://localhost:8000" > .env.local
# 6. Start the client
npm run dev
Environment Setup
Required API Keys
- OpenWeatherMap API Key (for weather data)
  - Sign up at: https://openweathermap.org/api
  - Free tier: 1000 calls/day
- OpenAI API Key (for AI insights)
  - Sign up at: https://platform.openai.com
  - Required for weather analysis features
Server Environment Variables
Create a server/.env file:
# Weather API Configuration
OPENWEATHERMAP_API_KEY=your_openweather_api_key_here
# AI/LangChain Configuration
OPENAI_API_KEY=your_openai_api_key_here
# Server Configuration (optional)
MCP_SERVER_HOST=0.0.0.0
MCP_SERVER_PORT=8000
# Railway Configuration (for deployment)
PORT=8000
RAILWAY_ENVIRONMENT=production
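These names match the .env keys above. How main.py actually reads them may differ, but a minimal python-dotenv sketch of the server-side loading looks like this:

```python
import os

from dotenv import load_dotenv  # python-dotenv, listed in requirements.txt

# Load variables from server/.env into the process environment
load_dotenv()

OPENWEATHERMAP_API_KEY = os.getenv("OPENWEATHERMAP_API_KEY")
OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")
HOST = os.getenv("MCP_SERVER_HOST", "0.0.0.0")
# Railway injects PORT; fall back to MCP_SERVER_PORT, then 8000
PORT = int(os.getenv("PORT") or os.getenv("MCP_SERVER_PORT") or 8000)
```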
Client Environment Variables
Create a client/.env.local file:
# MCP Server URL
NEXT_PUBLIC_MCP_SERVER_URL=http://localhost:8000
# For production deployment:
# NEXT_PUBLIC_MCP_SERVER_URL=https://mcp-server-production-3da3.up.railway.app
Development Guide
Server Development
Project Structure
server/
├── main.py                   # FastAPI app creation & uvicorn server
├── mcp_server.py             # MCP protocol implementation
├── models.py                 # Pydantic models for MCP & weather
├── weather_service.py        # Weather data fetching logic
├── langchain_integration.py  # AI-powered insights
└── requirements.txt          # Python dependencies
Key Classes
- MCPServer: Main MCP protocol handler
- WeatherService: OpenWeatherMap API integration
- WeatherLangChainService: AI analysis using LangChain
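As a rough, hypothetical sketch of how such a handler class can dispatch JSON-RPC methods (the real mcp_server.py will differ in names and wiring):

```python
from typing import Any, Callable, Dict

class MCPServerSketch:
    """Illustrative JSON-RPC 2.0 dispatcher; not the actual MCPServer implementation."""

    def __init__(self) -> None:
        # In the real server, WeatherService and WeatherLangChainService back these handlers
        self._handlers: Dict[str, Callable[[dict], Any]] = {"tools/list": self._list_tools}

    def handle_request(self, request: dict) -> dict:
        handler = self._handlers.get(request.get("method", ""))
        if handler is None:
            return {"jsonrpc": "2.0", "id": request.get("id"),
                    "error": {"code": -32601, "message": "Method not found"}}
        return {"jsonrpc": "2.0", "id": request.get("id"),
                "result": handler(request.get("params", {}))}

    def _list_tools(self, params: dict) -> dict:
        return {"tools": [{"name": "get_weather"}, {"name": "get_forecast"}]}
```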
Running Tests
cd server
python -m pytest tests/ # (if tests directory exists)
# Or test manually via Swagger UI: http://localhost:8000/docs
Client Development
Project Structure
client/
├── app/
│   ├── page.tsx           # Main weather app page
│   └── debug/page.tsx     # Environment debugging page
├── components/
│   ├── WeatherCard.tsx    # Weather data display
│   ├── MCPStatus.tsx      # Connection status indicator
│   └── ...
├── lib/
│   └── mcp-client.ts      # MCP protocol client
└── types/
    └── mcp.ts             # TypeScript definitions
Key Components
- MCPClient: MCP protocol communication
- WeatherCard: Weather data visualization
- MCPStatus: Real-time connection monitoring
Development Commands
cd client
npm run dev # Development server
npm run build # Production build
npm run start # Production server
npm run type-check # TypeScript validation
MCP Protocol Implementation
Supported Methods
| Method | Description | Parameters | Response |
|---|---|---|---|
| initialize | Initialize MCP connection | protocolVersion, capabilities, clientInfo | Server capabilities & info |
| tools/list | List available weather tools | None | Array of tool definitions |
| tools/call | Execute weather tools | name, arguments | Tool execution results |
| resources/list | List weather resources | None | Array of resource definitions |
| resources/read | Read resource content | uri | Resource content |
| prompts/list | List AI prompt templates | None | Array of prompt definitions |
| prompts/get | Get prompt template | name, arguments | Prompt content |
| completion/complete | Auto-completion support | argument | Completion suggestions |
| notifications/* | Progress notifications | Varies | Acknowledgment |
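For example, an initialize request built from the parameters in the table might look like the following; the protocolVersion and clientInfo values are placeholders, so check /mcp/methods or the Swagger examples for the values this server expects:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "0.1.0" }
  }
}
```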
Available Weather Tools
1. get_weather
   - Purpose: Current weather conditions
   - Parameters: location (required), units (optional)
   - Example: {"location": "Paris", "units": "metric"}
2. get_forecast
   - Purpose: Multi-day weather forecast
   - Parameters: location (required), days (optional, 1-7)
   - Example: {"location": "London", "days": 5}
3. get_weather_insights
   - Purpose: AI-powered weather analysis
   - Parameters: location (required), activity (optional)
   - Example: {"location": "Tokyo", "activity": "outdoor hiking"}
4. get_weather_summary_advisory
   - Purpose: Weather summary with travel advice
   - Parameters: location (required)
   - Example: {"location": "New York"}
MCP Request Example
{
"jsonrpc": "2.0",
"id": 1,
"method": "tools/call",
"params": {
"name": "get_weather",
"arguments": {
"location": "San Francisco",
"units": "imperial"
}
}
}
MCP Response Example
{
"jsonrpc": "2.0",
"id": 1,
"result": {
"content": [
{
"type": "text",
"text": "Weather in San Francisco: Temperature: 65ยฐF, Description: Partly cloudy, Humidity: 72%, Wind Speed: 8.5 mph"
}
],
"isError": false
}
}
Deployment Guide
Railway Deployment (Current Setup)
The server is deployed on Railway with automatic GitHub integration:
- Repository: https://github.com/lshankarrao/mcp-server
- Live URL: https://mcp-server-production-3da3.up.railway.app
- Auto-deploy: Pushes to the master branch trigger a rebuild
Railway Configuration Files
- railway.toml: Deployment settings
- Procfile: Process definition
- requirements.txt: Python dependencies
Environment Variables in Railway
OPENWEATHERMAP_API_KEY=your_key_here
OPENAI_API_KEY=your_key_here
PORT=8000
RAILWAY_ENVIRONMENT=production
Redeployment Process
# 1. Make changes locally
git add .
git commit -m "Your changes"
# 2. Push to trigger Railway rebuild
git push origin master
# 3. Monitor deployment
# Check Railway dashboard or server logs
Alternative Deployment Options
Vercel (Client)
cd client
npx vercel --prod
Heroku (Server)
cd server
heroku create your-mcp-server
git push heroku master
Docker (Local/Cloud)
# Create Dockerfile in server/
docker build -t mcp-weather-server .
docker run -p 8000:8000 mcp-weather-server
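If you still need to create that Dockerfile in server/, a minimal sketch could look like this; the base image and start command are assumptions, so adjust them to how main.py actually launches the server:

```dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 8000
CMD ["python", "main.py"]
```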
API Documentation
Interactive Documentation
- Swagger UI: https://mcp-server-production-3da3.up.railway.app/docs
- ReDoc: https://mcp-server-production-3da3.up.railway.app/redoc
- OpenAPI JSON: https://mcp-server-production-3da3.up.railway.app/openapi.json
Key Endpoints
| Endpoint | Method | Purpose | Documentation |
|---|---|---|---|
| /health | GET | Server health check | Server status & MCP compliance |
| /mcp | POST | MCP protocol endpoint | All MCP method execution |
| /mcp/methods | GET | MCP method reference | Complete method documentation |
| /mcp/ws | WebSocket | Real-time MCP communication | WebSocket MCP protocol |
| /docs | GET | Swagger UI | Interactive API documentation |
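To exercise the /mcp/ws endpoint outside the browser, a small sketch using the websockets package (already in the server dependencies) might look like this; it assumes the WebSocket accepts the same JSON-RPC payloads as POST /mcp:

```python
import asyncio
import json

import websockets  # pip install websockets

async def main() -> None:
    # Assumption: /mcp/ws speaks the same JSON-RPC 2.0 messages as the HTTP /mcp endpoint
    async with websockets.connect("ws://localhost:8000/mcp/ws") as ws:
        await ws.send(json.dumps({"jsonrpc": "2.0", "id": 1, "method": "tools/list"}))
        print(json.loads(await ws.recv()))

asyncio.run(main())
```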
Testing via Swagger
- Go to: https://mcp-server-production-3da3.up.railway.app/docs
- Click "Try it out" on the /mcp POST endpoint
- Use the provided examples:
  - Initialize MCP connection
  - List available tools
  - Call weather tools
  - Get AI insights
Testing
Manual Testing
Server Health Check
curl https://mcp-server-production-3da3.up.railway.app/health
MCP Protocol Test
curl -X POST https://mcp-server-production-3da3.up.railway.app/mcp \
-H "Content-Type: application/json" \
-d '{
"jsonrpc": "2.0",
"id": 1,
"method": "tools/list"
}'
Client Testing
- Environment Debug Page: http://localhost:3000/debug
- Main Application: http://localhost:3000
- Browser Console: Check MCP connection logs
Integration Testing
Test the complete MCP flow (a scripted sketch follows the steps below):
- Initialize connection
- List tools
- Call weather tool
- Verify response format
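A minimal sketch of that flow using httpx (already a server dependency); the initialize parameters are placeholders, and the assertion only checks the response shape documented above:

```python
import httpx

BASE_URL = "http://localhost:8000"  # or the Railway URL

def rpc(client: httpx.Client, method: str, params: dict | None = None, req_id: int = 1) -> dict:
    payload = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        payload["params"] = params
    response = client.post(f"{BASE_URL}/mcp", json=payload)
    response.raise_for_status()
    return response.json()

with httpx.Client(timeout=30) as client:
    # 1. Initialize connection (placeholder parameters)
    print(rpc(client, "initialize", {"protocolVersion": "2024-11-05",
                                     "capabilities": {},
                                     "clientInfo": {"name": "test", "version": "0.1.0"}}))
    # 2. List tools
    print(rpc(client, "tools/list"))
    # 3. Call a weather tool, then 4. verify the response format
    result = rpc(client, "tools/call",
                 {"name": "get_weather", "arguments": {"location": "Paris", "units": "metric"}})
    assert "result" in result and not result["result"].get("isError", False)
    print(result)
```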
Troubleshooting
Common Issues
Server Won't Start
# Check Python version (3.11+ required)
python --version
# Install dependencies
pip install -r requirements.txt
# Check environment variables
cat .env
# Check port availability
lsof -i :8000 # On Unix/Mac
netstat -ano | findstr :8000 # On Windows
Client Can't Connect to Server
# Check server URL in .env.local
cat client/.env.local
# Verify server is running
curl http://localhost:8000/health
# Check CORS settings in mcp_server.py
Weather Data Not Loading
# Verify OpenWeatherMap API key
echo $OPENWEATHERMAP_API_KEY
# Test API key manually
curl "https://api.openweathermap.org/data/2.5/weather?q=London&appid=YOUR_API_KEY"
AI Insights Not Working
# Verify OpenAI API key
echo $OPENAI_API_KEY
# Check OpenAI API status
curl https://api.openai.com/v1/models -H "Authorization: Bearer YOUR_API_KEY"
Debug Information
Server Logs
- Railway: Check deployment logs in Railway dashboard
- Local: Server outputs logs to console
Client Logs
- Browser: Open Developer Tools → Console
- Debug page: http://localhost:3000/debug
MCP Protocol Debugging
- Use Swagger UI to test individual MCP methods
- Check JSON-RPC 2.0 format compliance
- Verify request/response structure
Performance & Monitoring
Server Performance
- Railway Auto-scaling: Handles traffic spikes automatically
- Health Endpoint: Monitor server status via /health
- CORS Optimization: Dynamic origin handling for production
Monitoring
- Railway Dashboard: Deployment and resource monitoring
- Application Logs: Real-time server logs
- Client Status: Connection monitoring in UI
Security
API Key Management
- Environment variables only (never in code)
- Railway encrypted environment storage
- Separate keys for development/production
CORS Configuration
- Development: Specific localhost origins
- Production: Dynamic origin handling
- No credentials with wildcard origins
Input Validation
- Pydantic models for request validation (see the sketch below)
- Parameter sanitization
- Error handling without data exposure
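As a hedged illustration of that validation style (the real models live in models.py and may use different field names and constraints):

```python
from typing import Any, Optional

from pydantic import BaseModel, Field

class MCPRequestSketch(BaseModel):
    """Illustrative JSON-RPC request model; see models.py for the real definitions."""
    jsonrpc: str = Field(default="2.0", pattern=r"^2\.0$")
    id: Optional[int] = None
    method: str
    params: Optional[dict[str, Any]] = None

class WeatherToolArgs(BaseModel):
    """Example argument validation for a get_weather-style tool call."""
    location: str = Field(min_length=1)
    units: str = Field(default="metric", pattern=r"^(metric|imperial|standard)$")

# Invalid input raises pydantic.ValidationError before any external API call is made
WeatherToolArgs(location="Paris", units="metric")
```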
Dependencies
Server Dependencies
fastapi>=0.100.0 # Web framework
uvicorn[standard]>=0.23.0 # ASGI server
pydantic>=2.4.0 # Data validation
httpx>=0.25.0 # HTTP client
langchain>=0.0.350 # AI framework
langchain-openai>=0.0.1 # OpenAI integration
python-dotenv>=1.0.0 # Environment loading
websockets>=11.0.0 # WebSocket support
typing-extensions>=4.7.0 # Type hints
Client Dependencies
{
"next": "14.0.0",
"react": "18.0.0",
"typescript": "5.0.0",
"@heroicons/react": "2.0.0",
"tailwindcss": "3.0.0"
}
Development Workflow
Making Changes
1. Create feature branch:
   git checkout -b feature/your-feature-name
2. Make changes and test locally:
   # Server changes
   cd server && python main.py
   # Client changes
   cd client && npm run dev
3. Commit and push:
   git add .
   git commit -m "feat: your descriptive message"
   git push origin feature/your-feature-name
4. Deploy to production:
   git checkout master
   git merge feature/your-feature-name
   git push origin master  # Triggers Railway auto-deploy
Adding New MCP Methods
1. Add method to models.py:
   class MCPMethod(str, Enum):
       NEW_METHOD = "your/new_method"
2. Implement handler in mcp_server.py:
   def handle_new_method(self, request: MCPRequest) -> MCPResponse:
       # Implementation here
       pass
3. Add to request processor:
   elif request.method == "your/new_method":
       return self.handle_new_method(request)
4. Update documentation in Swagger
Contributing
How to Contribute
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Update documentation
- Submit a pull request
Contribution Guidelines
- Follow existing code style
- Add comprehensive docstrings
- Update README for significant changes
- Test locally before submitting
- Ensure MCP protocol compliance
Reporting Issues
When reporting issues, include:
- Server/client version
- Environment details (OS, Python/Node version)
- Steps to reproduce
- Error messages and logs
- Expected vs actual behavior
License
MIT License - see LICENSE file for details.
Acknowledgments
- Model Context Protocol: Anthropic for the MCP specification
- OpenWeatherMap: Weather data API
- OpenAI: AI-powered insights
- Railway: Cloud deployment platform
- FastAPI: High-performance web framework
- Next.js: React framework for the client
Support & Contact
- Repository: https://github.com/lshankarrao/mcp-server
- Live Demo: https://mcp-server-production-3da3.up.railway.app/docs
- Issues: Create GitHub issues for bugs/features
You now have everything needed to understand, modify, and extend this MCP Weather Server!
This README serves as a complete guide for developers taking over or contributing to this project. Keep it updated as the project evolves.