mohdjami/mcp-chatbot
🚀 Market Intelligence AI Chatbot - Complete Project README
📋 Quick Overview
A production-ready AI-powered market intelligence chatbot that provides real-time stock data, market news, and financial analysis through natural conversations. Built using FastAPI, LangGraph, OpenAI GPT-4o-mini, and the Model Context Protocol (MCP) for modular tool integration.
🌐 Live Deployments:
- Chatbot API: https://mcp-chatbot-production.up.railway.app
- MCP Server: https://mcp-chatbot-production-f2f9.up.railway.app/mcp
- Frontend (for testing): https://mkt-wise-bot.lovable.app/
🎯 Project Summary
This project fulfills all requirements of the internship assignment plus multiple bonus features:
✅ Core Requirements
- ✅ Backend service running on MCP server architecture
- ✅ Python-based implementation (FastAPI + LangGraph)
- ✅ AI chatbot with context memory between questions
- ✅ Market/financial data capabilities
- ✅ Public testable API endpoint accessible via curl
- ✅ Hosted and deployed on Railway
⭐ Bonus Features Implemented
- ⭐ Streaming endpoint for real-time chat experience (like ChatGPT)
- ⭐ Redis caching for improved performance
- ⭐ Rate limiting for API protection
- ⭐ Tool verification - shows which MCP tools were actually used
- ⭐ Comprehensive error handling with helpful messages
- ⭐ Production-ready architecture with Docker support
- ⭐ Complete documentation with examples
🏗️ Architecture Overview
┌─────────────────────────────────────────────────────────────────┐
│ USER / CLIENT │
│ (curl, Postman, Frontend App) │
└────────────────────────────┬────────────────────────────────────┘
│ HTTP/HTTPS
▼
┌─────────────────────────────────────────────────────────────────┐
│ CHATBOT BACKEND (FastAPI + LangGraph) │
│ https://mcp-chatbot-production.up.railway.app │
│ │
│ ┌────────────────────────────────────────────────────────────┐ │
│ │ Endpoints: │ │
│ │ • POST /query - Non-streaming responses │ │
│ │ • POST /stream - Real-time streaming (SSE) │ │
│ │ • GET /health - Health check │ │
│ └────────────────────────────────────────────────────────────┘ │
│ │
│ ┌────────────────────────────────────────────────────────────┐ │
│ │ LangGraph Agent: │ │
│ │ • OpenAI GPT-4o-mini for reasoning │ │
│ │ • MemorySaver for conversation context │ │
│ │ • Agentic workflow (decides when to use tools) │ │
│ └────────────────────────────────────────────────────────────┘ │
│ │
│ ┌────────────────────────────────────────────────────────────┐ │
│ │ Bonus Features: │ │
│ │ • Redis caching (reduces API calls) │ │
│ │ • Rate limiting (token bucket algorithm) │ │
│ │ • CORS middleware │ │
│ │ • Comprehensive error handling │ │
│ └────────────────────────────────────────────────────────────┘ │
└────────────────────────────┬────────────────────────────────────┘
│ MCP Protocol (Streamable HTTP)
▼
┌─────────────────────────────────────────────────────────────────┐
│ MCP SERVER (Remote Market Intelligence Tools) │
│ https://mcp-chatbot-production-f2f9.up.railway.app/mcp │
│ │
│ ┌────────────────────────────────────────────────────────────┐ │
│ │ 5 Market Intelligence Tools: │ │
│ │ │ │
│ │ 1. get_stock_price - Real-time stock data │ │
│ │ 2. get_market_news - Latest financial news │ │
│ │ 3. get_stock_history - Historical price data │ │
│ │ 4. compare_stocks - Multi-stock comparison │ │
│ │ 5. get_market_summary - Major indices overview │ │
│ └────────────────────────────────────────────────────────────┘ │
│ │
│ Data Sources: │
│ • Yahoo Finance (yfinance) - Stock data │
│ • NewsAPI - Financial news articles │
└─────────────────────────────────────────────────────────────────┘
🧪 Testing the API
1. Basic Query (Non-Streaming)
# Simple question about stock price
curl -X POST https://mcp-chatbot-production.up.railway.app/query \
-H "Content-Type: application/json" \
-d '{
"message": "What is the current price of Apple stock?"
}'
Expected Response:
{
"response": "Apple (AAPL) is currently trading at $178.25, up $0.75 (+0.42%) from yesterday's close of $177.50. The stock has a market cap of approximately $2.8 trillion. The stock is performing well today with a day high of $179.50 and day low of $177.80.",
"session_id": "abc-123-def-456",
"tools_used": ["get_stock_price"]
}
2. Context-Aware Conversation
# First message - establish context
curl -X POST https://mcp-chatbot-production.up.railway.app/query \
-H "Content-Type: application/json" \
-d '{
"message": "What is Tesla stock price?",
"session_id": "my-session-123"
}'
# Follow-up - remembers we're talking about Tesla
curl -X POST https://mcp-chatbot-production.up.railway.app/query \
-H "Content-Type: application/json" \
-d '{
"message": "How does it compare to yesterday?",
"session_id": "my-session-123"
}'
3. Market News Query
curl -X POST https://mcp-chatbot-production.up.railway.app/query \
-H "Content-Type: application/json" \
-d '{
"message": "Give me the latest news about tech stocks"
}'
4. Stock Comparison
curl -X POST https://mcp-chatbot-production.up.railway.app/query \
-H "Content-Type: application/json" \
-d '{
"message": "Compare Apple, Microsoft, and Google stocks"
}'
5. Historical Data
curl -X POST https://mcp-chatbot-production.up.railway.app/query \
-H "Content-Type: application/json" \
-d '{
"message": "Show me Amazon stock performance over the last 6 months"
}'
6. Market Summary
curl -X POST https://mcp-chatbot-production.up.railway.app/query \
-H "Content-Type: application/json" \
-d '{
"message": "How are the major market indices doing today?"
}'
7. 🌟 BONUS: Streaming Response (Real-Time)
# Stream responses token-by-token like ChatGPT
curl -N -X POST https://mcp-chatbot-production.up.railway.app/stream \
-H "Content-Type: application/json" \
-d '{
"message": "What is Apple stock trading at right now?"
}'
Expected Stream Output:
event: message
data: {"type": "tool_call", "tool": "get_stock_price", "args": {"symbol": "AAPL"}}
event: message
data: {"type": "tool_executing", "message": "Fetching data..."}
event: message
data: {"type": "token", "content": "Apple "}
event: message
data: {"type": "token", "content": "(AAPL) "}
event: message
data: {"type": "token", "content": "is "}
event: message
data: {"type": "token", "content": "currently "}
...
event: message
data: {"type": "done", "tools_used": ["get_stock_price"], "session_id": "xxx"}
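On the client side, this stream can be consumed by collecting the `data:` lines and decoding each payload as JSON. A minimal parsing sketch (the `parse_sse` helper is hypothetical, not part of the project):

```python
import json

def parse_sse(lines):
    """Collect `data:` lines from an SSE stream and decode each payload as JSON."""
    events = []
    for line in lines:
        line = line.strip()
        if line.startswith("data:"):
            events.append(json.loads(line[len("data:"):].strip()))
    return events

# Sample frames shaped like the stream output above
stream = [
    "event: message",
    'data: {"type": "token", "content": "Apple "}',
    "event: message",
    'data: {"type": "done", "tools_used": ["get_stock_price"], "session_id": "xxx"}',
]
for ev in parse_sse(stream):
    print(ev["type"])  # token, done
```

A real client would read lines incrementally from the HTTP response body instead of a list, but the frame format is the same.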
8. Health Check
curl https://mcp-chatbot-production.up.railway.app/health
Response:
{
"status": "healthy",
"service": "Market Intelligence Chatbot",
"agent_ready": true
}
📡 API Documentation
Interactive Swagger UI
Visit: https://mcp-chatbot-production.up.railway.app/docs
Endpoints
POST /query
Non-streaming endpoint for chat queries
Request Body:
{
message: string; // User's question (1-2000 chars)
session_id?: string; // Optional session ID for context
}
Response:
{
response: string; // AI assistant's response
session_id: string; // Session ID (generated if not provided)
tools_used?: string[]; // MCP tools that were called
}
Rate Limit: 10 requests/minute per IP
POST /stream ⭐ BONUS
Real-time streaming responses using Server-Sent Events
Request Body: Same as /query
Response: Server-Sent Events stream with event types:
- tool_call - Agent decides to use a tool
- tool_executing - Tools are fetching data
- token - Individual response tokens (streaming!)
- done - Final summary with tools_used
- error - Error occurred
Rate Limit: 5 requests/minute per IP (stricter due to resource usage)
GET /health
Health check endpoint
Response:
{
status: "healthy",
service: string,
agent_ready: boolean
}
🛠️ Tech Stack
Backend (Chatbot)
- FastAPI - Modern async web framework
- LangGraph - Agentic workflow orchestration
- LangChain - LLM integration framework
- OpenAI GPT-4o-mini - Language model for reasoning
- MCP Client - Model Context Protocol client
- Redis (Upstash) - Caching and state management
- Uvicorn - ASGI server
MCP Server (Tools)
- MCP SDK - Model Context Protocol server
- yfinance - Yahoo Finance API for stock data
- NewsAPI - Financial news articles
- aiohttp - Async HTTP client
- FastMCP - MCP server framework
Deployment
- Railway - Cloud platform (both services)
- Docker - Containerization
- Python 3.11 - Runtime environment
🎨 Key Features Explained
1. Context Management (MemorySaver)
The chatbot remembers previous messages in a conversation:
# Example conversation flow:
User: "What is Apple stock price?"
Bot: "Apple is trading at $178.25..."
User: "How does that compare to yesterday?" # Remembers we're talking about Apple
Bot: "Apple's price is up $0.75 from yesterday's close of $177.50..."
How it works:
- Each conversation has a unique session_id
- LangGraph's MemorySaver stores message history
- Agent has access to full context when responding
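Conceptually, MemorySaver checkpoints the message history under a thread id derived from the session_id. A simplified in-memory stand-in (this is not the actual LangGraph API, just the idea):

```python
class SessionMemory:
    """Simplified stand-in for MemorySaver: message history keyed by session_id."""
    def __init__(self):
        self._store = {}

    def append(self, session_id: str, role: str, content: str) -> None:
        self._store.setdefault(session_id, []).append(
            {"role": role, "content": content})

    def history(self, session_id: str) -> list:
        return self._store.get(session_id, [])

mem = SessionMemory()
mem.append("my-session-123", "user", "What is Tesla stock price?")
mem.append("my-session-123", "assistant", "TSLA is trading at ...")
mem.append("my-session-123", "user", "How does it compare to yesterday?")
# Because the agent sees the full history, "it" resolves to Tesla.
print(len(mem.history("my-session-123")))  # 3
```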
2. 🌟 Streaming Responses (Bonus Feature)
Real-time token-by-token streaming like ChatGPT:
Benefits:
- Instant feedback to users
- Better user experience (feels faster)
- Progressive content loading
- Transparent tool usage (shows when fetching data)
Implementation:
- Server-Sent Events (SSE) protocol
- Async generators in Python
- LangGraph's streaming capabilities
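The shape of the streaming endpoint can be sketched as an async generator producing SSE frames; in FastAPI the generator would be wrapped in a StreamingResponse. This is a simplified illustration, not the project's actual code:

```python
import asyncio
import json

async def stream_events(tokens):
    """Yield SSE frames: one tool_call event, one frame per token, then done."""
    yield "event: message\ndata: " + json.dumps(
        {"type": "tool_call", "tool": "get_stock_price",
         "args": {"symbol": "AAPL"}}) + "\n\n"
    for tok in tokens:
        yield "event: message\ndata: " + json.dumps(
            {"type": "token", "content": tok}) + "\n\n"
    yield "event: message\ndata: " + json.dumps(
        {"type": "done", "tools_used": ["get_stock_price"]}) + "\n\n"

async def main():
    return [frame async for frame in stream_events(["Apple ", "(AAPL) "])]

frames = asyncio.run(main())
print(len(frames))  # 4: tool_call + 2 tokens + done
```

In the real agent the tokens come from LangGraph's streaming output rather than a fixed list.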
3. 🌟 Redis Caching (Bonus Feature)
Reduces API calls and improves response times:
What's cached:
- Stock price data (5 min TTL)
- Market news (5 min TTL)
- Historical data (5 min TTL)
- Tool results (automatic)
Benefits:
- Faster responses
- Reduced external API usage
- Lower costs
- Better reliability
Cache Hit Example:
First request: ❌ Cache MISS → Fetch from yfinance → 2.5s
Second request: ✅ Cache HIT → Return cached data → 0.1s
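The cache-hit behavior can be modeled with a tiny in-memory TTL cache. Redis handles expiry server-side (e.g. via SETEX); this dict-based sketch only illustrates the 5-minute TTL semantics, with an injectable clock for clarity:

```python
import time

class TTLCache:
    """Dict-based stand-in for the Redis cache: values expire after ttl seconds."""
    def __init__(self, ttl: float = 300.0, clock=time.monotonic):
        self.ttl = ttl
        self.clock = clock      # injectable, so expiry is easy to demonstrate
        self._data = {}

    def set(self, key, value):
        self._data[key] = (self.clock() + self.ttl, value)

    def get(self, key):
        entry = self._data.get(key)
        if entry is None:
            return None
        expires_at, value = entry
        if self.clock() >= expires_at:
            del self._data[key]  # lazy expiry, like a Redis TTL
            return None
        return value

now = [0.0]
cache = TTLCache(ttl=300.0, clock=lambda: now[0])
cache.set("price:AAPL", 178.25)
print(cache.get("price:AAPL"))  # 178.25 -> cache HIT
now[0] = 301.0                  # just past the 5-minute TTL
print(cache.get("price:AAPL"))  # None -> cache MISS
```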
4. 🌟 Rate Limiting (Bonus Feature)
Token bucket algorithm prevents abuse:
Limits:
- /query: 10 requests/minute per IP
- /stream: 5 requests/minute per IP
- /health: No limit
Response when rate limited:
{
"error": "Rate limit exceeded",
"message": "Too many requests. Please try again in 45 seconds.",
"retry_after": 45,
"limit": 10,
"window": "60 seconds"
}
Headers returned:
X-RateLimit-Limit: 10
X-RateLimit-Remaining: 7
X-RateLimit-Reset: 1697123456
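A token bucket with these limits can be sketched as follows (injectable clock for clarity; the project's actual implementation lives in utils/rate_limiter.py and may differ in detail):

```python
class TokenBucket:
    """Token bucket: `capacity` requests per `window` seconds, refilled continuously."""
    def __init__(self, capacity: int, window: float, clock):
        self.capacity = capacity
        self.rate = capacity / window   # tokens added per second
        self.tokens = float(capacity)
        self.clock = clock
        self.last = clock()

    def allow(self) -> bool:
        now = self.clock()
        # Refill proportionally to elapsed time, capped at capacity
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

t = [0.0]
bucket = TokenBucket(capacity=10, window=60.0, clock=lambda: t[0])
results = [bucket.allow() for _ in range(11)]
print(results.count(True))  # 10 allowed, the 11th is rejected
t[0] = 6.0                  # at 10/min, one token refills every 6 seconds
print(bucket.allow())       # True
```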
5. Agentic Workflow
The AI autonomously decides when and which tools to use:
Flow:
- User asks a question
- Agent analyzes the query
- Agent decides: "Do I need external data?"
- If yes → selects appropriate tool(s)
- Executes tool(s) via MCP
- Synthesizes results into natural language
- Returns response to user
Example:
Query: "Compare Apple and Tesla, and give me tech news"
Agent thinks:
✓ Need stock data → use get_stock_price (AAPL)
✓ Need stock data → use get_stock_price (TSLA)
✓ Need news → use get_market_news (tech)
Executes 3 tools, then synthesizes response.
6. MCP Tool Verification
Every response shows which tools were actually used:
{
"response": "...",
"tools_used": ["get_stock_price", "get_market_news"]
}
Why this matters:
- Transparency for debugging
- Verify correct tool selection
- Track tool usage patterns
- Validate agent behavior
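The execute-then-report loop behind tools_used can be sketched with a stub tool registry (the real agent dispatches over MCP and LangGraph decides which calls to make; the stubs here are hypothetical):

```python
def run_tools(tool_calls, registry):
    """Execute each requested tool and record which ones actually ran."""
    tools_used, results = [], []
    for name, args in tool_calls:
        results.append(registry[name](**args))
        tools_used.append(name)
    return results, tools_used

# Hypothetical stubs standing in for the MCP tools
registry = {
    "get_stock_price": lambda symbol: {"symbol": symbol, "price": 178.25},
    "get_market_news": lambda query: [{"title": f"News about {query}"}],
}
calls = [("get_stock_price", {"symbol": "AAPL"}),
         ("get_stock_price", {"symbol": "TSLA"}),
         ("get_market_news", {"query": "tech"})]
results, tools_used = run_tools(calls, registry)
print(sorted(set(tools_used)))  # ['get_market_news', 'get_stock_price']
```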
🔧 Environment Variables
Chatbot Backend
# Required
OPENAI_API_KEY=sk-your-openai-key
# MCP Configuration
MCP_TRANSPORT_PROTOCOL=http # or "stdio" for local
MCP_SERVER_URL=https://mcp-chatbot-production-f2f9.up.railway.app/mcp
# Optional - Redis (Upstash)
UPSTASH_REDIS_REST_URL=https://your-redis.upstash.io
UPSTASH_REDIS_REST_TOKEN=your-token
# Optional - Server
PORT=8080
LOG_LEVEL=INFO
OPENAI_MODEL=gpt-4o-mini
MCP Server
# Required
NEWS_API_KEY=your-newsapi-key
# Optional
PORT=8000
LOG_LEVEL=INFO
MCP_TRANSPORT_MODE=http # For Railway deployment
📊 Available MCP Tools
1. get_stock_price(symbol: str)
Get real-time stock price and company information.
Example:
{
"symbol": "AAPL",
"company_name": "Apple Inc.",
"current_price": 178.25,
"previous_close": 177.50,
"change": 0.75,
"change_percent": 0.42,
"volume": 45678900,
"market_cap": 2800000000000,
"day_high": 179.50,
"day_low": 177.80,
"52_week_high": 199.62,
"52_week_low": 164.08
}
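The change and change_percent fields follow directly from the two raw quote values returned by yfinance; a small helper sketch (the function name is hypothetical):

```python
def derive_change(current_price: float, previous_close: float) -> dict:
    """Compute the change fields of the payload from the two raw quote values."""
    change = round(current_price - previous_close, 2)
    change_percent = round(change / previous_close * 100, 2)
    return {"change": change, "change_percent": change_percent}

# Values from the example payload above
print(derive_change(178.25, 177.50))  # {'change': 0.75, 'change_percent': 0.42}
```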
2. get_market_news(query: str, num_articles: int)
Fetch latest financial news articles.
Parameters:
- query: Search term (default: "stock market")
- num_articles: Number of articles (default: 5, max: 10)
Example:
[
{
"title": "Tech Stocks Rally on Strong Earnings",
"description": "Major tech companies exceeded expectations...",
"url": "https://example.com/article",
"published_at": "2025-10-12T10:00:00Z",
"source": "Financial Times",
"author": "John Doe"
}
]
3. get_stock_history(symbol: str, period: str)
Historical price data for technical analysis.
Parameters:
- symbol: Stock ticker
- period: 1d, 5d, 1mo, 3mo, 6mo, 1y, 2y, 5y, 10y, ytd, max
Example:
{
"symbol": "AAPL",
"period": "1mo",
"data_points": 21,
"start_date": "2025-09-12",
"end_date": "2025-10-12",
"latest_close": 178.25,
"period_high": 182.50,
"period_low": 170.30,
"price_change": 7.95,
"price_change_percent": 4.67,
"average_volume": 55000000
}
4. compare_stocks(symbols: list[str])
Compare multiple stocks side by side (max 5).
Example:
{
"comparison": {
"AAPL": {
"company_name": "Apple Inc.",
"current_price": 178.25,
"market_cap": 2800000000000,
"pe_ratio": 29.5,
"dividend_yield": 0.52,
"52_week_high": 199.62,
"52_week_low": 164.08,
"beta": 1.25
},
"MSFT": { ... },
"GOOGL": { ... }
},
"symbols_analyzed": 3
}
5. get_market_summary()
Overview of major market indices.
Example:
{
"market_summary": {
"S&P 500": {
"symbol": "^GSPC",
"value": 4567.89,
"change": 23.45,
"change_percent": 0.52,
"high": 4589.12,
"low": 4545.67
},
"Dow Jones Industrial Average": { ... },
"NASDAQ Composite": { ... },
"Russell 2000": { ... }
}
}
🚀 Deployment Details
Railway Configuration
Chatbot Backend:
- URL: https://mcp-chatbot-production.up.railway.app
- Port: 8080 (provided via Railway's PORT variable)
- Build: Docker (Dockerfile)
- Health Check: /health endpoint
- Auto-restart: Enabled
MCP Server:
- URL: https://mcp-chatbot-production-f2f9.up.railway.app/mcp
- Port: 8000 (provided via Railway's PORT variable)
- Build: Docker (Dockerfile)
- Transport: Streamable HTTP
- Endpoint: /mcp (MCP protocol)
Docker Images
Both services use Python 3.11 slim images:
FROM python:3.11-slim
# Optimized for production
# Includes health checks
# Runs as non-root user
💡 Example Use Cases
1. Portfolio Tracking
curl -X POST https://mcp-chatbot-production.up.railway.app/query \
-H "Content-Type: application/json" \
-d '{
"message": "I own Apple, Tesla, and Amazon. How are they performing today?",
"session_id": "portfolio-tracker"
}'
2. Investment Research
# Multi-turn conversation
curl -X POST https://mcp-chatbot-production.up.railway.app/query \
-H "Content-Type: application/json" \
-d '{
"message": "Tell me about NVIDIA stock",
"session_id": "research-123"
}'
curl -X POST https://mcp-chatbot-production.up.railway.app/query \
-H "Content-Type: application/json" \
-d '{
"message": "What news is there about AI chip demand?",
"session_id": "research-123"
}'
curl -X POST https://mcp-chatbot-production.up.railway.app/query \
-H "Content-Type: application/json" \
-d '{
"message": "Compare it to AMD",
"session_id": "research-123"
}'
3. Market Monitoring
curl -X POST https://mcp-chatbot-production.up.railway.app/query \
-H "Content-Type: application/json" \
-d '{
"message": "Give me a market overview and the latest financial news"
}'
🎓 What I Learned
Technical Skills
- MCP Architecture: Building modular, reusable AI tool servers
- LangGraph: Creating agentic workflows with state management
- Async Python: Handling concurrent operations efficiently
- API Design: RESTful principles and streaming protocols
- Caching Strategies: Optimizing performance with Redis
- Rate Limiting: Implementing token bucket algorithm
- Docker & Deployment: Containerization and cloud deployment
- Error Handling: Graceful degradation and user-friendly errors
Design Patterns
- Separation of Concerns: MCP server vs chatbot backend
- Microservices: Independent, scalable services
- Context Managers: Python's async with for resource cleanup
- Middleware Pattern: Rate limiting, CORS, error handling
- Observer Pattern: Streaming with SSE
Best Practices
- Environment Variables: Never hardcode secrets
- Logging: Comprehensive logging for debugging
- Health Checks: Monitoring service availability
- Documentation: Clear README and API docs
- Testing: curl commands for easy verification
📁 Project Structure
mcp-chatbot/
├── chatbot-backend/ # FastAPI chatbot service
│ ├── agent/
│ │ ├── graph.py # LangGraph agent
│ │ └── state.py # State schema
│ ├── mcp_client/
│ │ └── client.py # MCP client wrapper
│ ├── utils/
│ │ ├── cache.py # Redis caching
│ │ ├── rate_limiter.py # Rate limiting
│ │ └── redis_client.py # Redis connection
│ ├── main.py # FastAPI app
│ ├── requirements.txt
│ ├── Dockerfile
│ └── .env
├── mcp_server_remote.py # Remote MCP server
├── mcp_server.py # Local MCP server (stdio)
├── requirements.txt
├── Dockerfile
└── README.md # This file!
🤝 Contributing & Feedback
This project was built for an internship selection process. Feedback and suggestions are welcome!
Contact:
- GitHub: https://github.com/mohdjami
- Email: mohdjamikhann@gmail.com
📜 License
Educational project for internship selection process.
🙏 Acknowledgments
- Anthropic - For the Model Context Protocol (MCP)
- LangChain/LangGraph - For agentic workflow framework
- OpenAI - For GPT-4 API
- Railway - For free hosting tier
- Yahoo Finance - For stock market data
- NewsAPI - For financial news
📚 Additional Resources
- MCP Documentation: https://modelcontextprotocol.io/
- LangGraph Docs: https://langchain-ai.github.io/langgraph/
- FastAPI Docs: https://fastapi.tiangolo.com/
- Railway Docs: https://docs.railway.app/
✨ Final Notes
This project demonstrates:
- ✅ Full-stack AI application development
- ✅ Production-ready architecture
- ✅ Clean, maintainable code
- ✅ Comprehensive documentation
- ✅ Modern best practices
- ✅ Bonus features beyond requirements
Ready for production use and easy to extend with more MCP tools! 🚀
Built with ❤️ using Python, FastAPI, LangGraph, and MCP