Chat Royale
AI-powered Clash Royale agent using Gemini 2.0 with MCP tools for real-time game data access.
Chat Royale is an intelligent chatbot that leverages the Clash Royale API through the Model Context Protocol (MCP) to provide natural language interactions with comprehensive game data.
🌐 Live: https://chat-royale.com
Architecture
Three containerized services:
MCP Server (src/mcp/) - Python FastMCP server exposing Clash Royale API tools:
- Players, Clans, Cards, Rankings
Backend API (src/backend/) - Node.js/Express server:
- Integrates Gemini with an MCP client via @modelcontextprotocol/sdk (see the sketch after this list)
- Standard JSON request/response (no streaming)
- Tool execution handled internally with iterative Gemini calls
Frontend (src/frontend/) - React/TypeScript UI:
- Simple chat interface with markdown support
- Standard HTTP requests to backend API
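A minimal sketch of how the backend might connect to the MCP server and discover its tools with @modelcontextprotocol/sdk. The transport URL, client name, and the get_player tool are illustrative assumptions, not values taken from the repository.

```typescript
// Sketch of the backend's MCP client setup (assumed HTTP transport and tool names).
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

export async function connectToMcp() {
  // Client identity is arbitrary metadata exchanged during the MCP handshake.
  const client = new Client({ name: "chat-royale-backend", version: "1.0.0" });

  // Assumed: the MCP container is reachable at this URL inside the Docker network.
  const transport = new StreamableHTTPClientTransport(new URL("http://mcp:8000/mcp"));
  await client.connect(transport);

  // Discover the Clash Royale tools (players, clans, cards, rankings) the server exposes.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  // Example call; "get_player" is a hypothetical tool name.
  const player = await client.callTool({ name: "get_player", arguments: { tag: "#2PP" } });
  return { client, tools, player };
}
```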
How It Works
- User sends message to frontend
- Frontend posts to backend /api/chat
- Backend connects to MCP server and discovers tools
- Gemini processes the message with the available tools
- If tools are needed: backend executes them via MCP and feeds results back to Gemini (see the loop sketch below)
- Final response returned as JSON to frontend
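The tool-execution step above could look roughly like the following. This is a sketch under the assumption that the backend reaches Gemini 2.0 through its OpenAI-compatible endpoint (which would explain the OpenAI SDK in the tech stack); the model id, endpoint, and message shapes are assumptions, and mcpClient is the connected client from the previous sketch.

```typescript
// Sketch of the iterative tool-call loop (assumed OpenAI-compatible Gemini endpoint).
import OpenAI from "openai";
import type { Client } from "@modelcontextprotocol/sdk/client/index.js";

const gemini = new OpenAI({
  apiKey: process.env.GEMINI_API_KEY,
  baseURL: "https://generativelanguage.googleapis.com/v1beta/openai/",
});

export async function chat(userMessage: string, mcpClient: Client) {
  // Translate MCP tool definitions into OpenAI-style function tools.
  const { tools } = await mcpClient.listTools();
  const toolDefs = tools.map((t) => ({
    type: "function" as const,
    function: { name: t.name, description: t.description ?? "", parameters: t.inputSchema },
  }));

  const messages: OpenAI.Chat.ChatCompletionMessageParam[] = [
    { role: "user", content: userMessage },
  ];

  // Keep calling the model until it answers without requesting more tools.
  while (true) {
    const completion = await gemini.chat.completions.create({
      model: "gemini-2.0-flash", // assumed model id
      messages,
      tools: toolDefs,
    });
    const msg = completion.choices[0].message;

    // No tool calls: this is the final answer returned as JSON to the frontend.
    if (!msg.tool_calls?.length) return msg.content;

    messages.push(msg);
    for (const call of msg.tool_calls) {
      // Execute the requested tool on the MCP server and feed the result back to the model.
      const result = await mcpClient.callTool({
        name: call.function.name,
        arguments: JSON.parse(call.function.arguments),
      });
      messages.push({
        role: "tool",
        tool_call_id: call.id,
        content: JSON.stringify(result.content),
      });
    }
  }
}
```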
Deployment
Production: AWS Lightsail with Cloudflare CDN
CI/CD: GitHub Actions auto-deploys on push to the main branch
# Deploy
docker-compose -f docker-compose.prod.yml up -d --build
Development
# Environment setup
# Backend: src/backend/.env - GEMINI_API_KEY
# MCP: src/mcp/.env - CR_API_KEY
# Run all services
docker-compose up -d
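Once the containers are up, the chat endpoint can be exercised directly. The port and request body shape below are assumptions for illustration; check docker-compose.yml and the backend route for the actual values.

```typescript
// Quick local smoke test of the chat endpoint (assumed port and payload shape).
const response = await fetch("http://localhost:3000/api/chat", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ message: "What cards are in the top ladder decks right now?" }),
});
console.log(await response.json());
```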
Tech Stack
- MCP Server: Python 3.12, FastMCP, Clash Royale API
- Backend: Node.js, Express, TypeScript, OpenAI SDK
- Frontend: React, TypeScript, Vite, TailwindCSS
- Deployment: Docker, AWS Lightsail, Cloudflare, GitHub Actions