# 🚀 CryptoSentiment Intelligence MCP Server

Advanced AI-driven cryptocurrency sentiment analysis with quad-protocol support and multi-provider AI integration.

## ✨ Features

### 🔄 Quad-Protocol Support

- STDIO MCP - Claude Desktop integration
- HTTP REST API - Standard REST endpoints
- HTTP MCP - JSON-RPC 2.0 for n8n-nodes-mcp
- WebSocket MCP - Real-time streaming analysis
### 🤖 Multi-Provider AI Integration ⭐ NEW

- Primary: Ollama Cloud with `gpt-oss:120b` model for cost-effective analysis
- Fallback: Google Gemini with `gemini-2.0-flash` for reliability
- Smart Failover: Automatic provider switching with exponential backoff
- Enhanced Error Handling: Comprehensive retry logic and circuit breaker patterns
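The failover behavior described above can be sketched roughly as follows. This is an illustrative sketch only: the `Provider` shape, the `withFailover` name, and the timing constants are assumptions for clarity, not the server's actual internals.

```typescript
// Hypothetical provider shape; the real server's internal API may differ.
type Provider = { name: string; complete: (prompt: string) => Promise<string> };

// Try each provider in order, retrying with exponential backoff before
// falling through to the next one (e.g. Ollama Cloud first, then Gemini).
async function withFailover(
  providers: Provider[],
  prompt: string,
  maxRetries = 3,
  baseDelayMs = 250,
): Promise<string> {
  let lastError: unknown;
  for (const provider of providers) {
    for (let attempt = 0; attempt < maxRetries; attempt++) {
      try {
        return await provider.complete(prompt);
      } catch (err) {
        lastError = err;
        // Exponential backoff: 250ms, 500ms, 1000ms, ...
        await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt));
      }
    }
    // Retries exhausted for this provider; fall through to the next.
  }
  throw lastError;
}
```

The key design point is that retries are exhausted per provider before switching, so a transient blip on the primary does not immediately shift traffic to the fallback.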
### 🧠 Advanced AI Analysis Frameworks
- Sentiment Fusion - Multi-source sentiment aggregation with ML adaptation
- Behavioral Network Analysis - Whale and influencer behavior patterns
- Multi-Modal Processing - Text, image, and video content analysis
- Predictive Impact Modeling - ML-based market impact forecasting
- Quantum Correlation - Cross-sector event cascade analysis
### 📊 Comprehensive Market Intelligence
- Real-time cryptocurrency price integration
- Multi-source news aggregation (RSS feeds, Reddit, social media)
- Historical pattern matching and correlation analysis
- Risk assessment and actionable recommendations
- Geopolitical ripple effect analysis
### 🛡️ Production-Ready Features
- Advanced caching with Redis fallback to memory
- Rate limiting and API key authentication
- Comprehensive logging and health monitoring
- Graceful shutdown and connection management
- Docker containerization with multi-stage builds
- Enterprise-grade TypeScript with strict linting (0 errors)
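The "Redis with memory fallback" caching idea above can be sketched like this. The `RedisLike` interface and `FallbackCache` class are hypothetical stand-ins, not the server's real cache layer.

```typescript
// Minimal stand-in for a Redis client; any client exposing get/set with a TTL fits.
interface RedisLike {
  get(key: string): Promise<string | null>;
  set(key: string, value: string, ttlSeconds: number): Promise<void>;
}

// Cache that prefers Redis but degrades to an in-process Map when Redis is
// unavailable or errors, so analysis requests keep working.
class FallbackCache {
  private memory = new Map<string, { value: string; expiresAt: number }>();
  constructor(private redis?: RedisLike) {}

  async get(key: string): Promise<string | null> {
    if (this.redis) {
      try { return await this.redis.get(key); } catch { /* fall through to memory */ }
    }
    const entry = this.memory.get(key);
    if (!entry || entry.expiresAt < Date.now()) return null;
    return entry.value;
  }

  async set(key: string, value: string, ttlSeconds: number): Promise<void> {
    if (this.redis) {
      try { await this.redis.set(key, value, ttlSeconds); return; } catch { /* fall through */ }
    }
    this.memory.set(key, { value, expiresAt: Date.now() + ttlSeconds * 1000 });
  }
}
```

The memory tier enforces TTLs itself, so cached analyses expire on the same schedule whether or not Redis is up.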
## 🎯 Quick Start

### 1. Environment Setup

```bash
# Clone the repository
git clone https://github.com/kaayaan-ai/crypto-sentiment-intelligence-mcp-server
cd crypto-sentiment-intelligence-mcp-server

# Setup environment interactively
npm run setup-env

# Or copy template manually
cp .env.example .env && nano .env
```
### 2. Required Environment Variables

```bash
# AI Provider Configuration - Ollama Cloud Primary
OLLAMA_API_KEY=your-ollama-cloud-api-key
OLLAMA_BASE_URL=https://ollama.com
OLLAMA_MODEL=gpt-oss:120b

# AI Provider Configuration - Gemini Fallback
GEMINI_API_KEY=your-google-gemini-api-key
GEMINI_BASE_URL=https://generativelanguage.googleapis.com/v1beta
GEMINI_MODEL=gemini-2.0-flash

# Database Connections
MONGODB_URL=mongodb://localhost:27017/crypto_sentiment
REDIS_URL=redis://localhost:6379

# Optional: Price API
COINGECKO_API_KEY=your-coingecko-api-key
```
### 3. Quick Start Options

#### Option A: Docker (Recommended)

```bash
# Start with Docker Compose
docker-compose up -d

# Check logs
docker-compose logs -f crypto-sentiment-mcp

# Validate environment
npm run validate-env
```

#### Option B: Native Installation

```bash
# Install dependencies
npm install

# Build TypeScript
npm run build

# Validate environment and configuration
npm run validate-env

# Start STDIO MCP (Claude Desktop)
npm run dev

# Or start HTTP server (REST + MCP + WebSocket)
npm run http-server
```
## 🔧 Usage Examples

### Claude Desktop Integration

Add to your Claude Desktop MCP configuration:

```json
{
  "mcpServers": {
    "crypto-sentiment": {
      "command": "npx",
      "args": ["-s", "user", "crypto-sentiment-intelligence-mcp-server"]
    }
  }
}
```
### n8n Integration

Use the MCP Client node with these settings:

```json
{
  "connection": {
    "type": "HTTP MCP",
    "url": "http://localhost:4004/mcp"
  },
  "tools": ["analyze_crypto_sentiment"]
}
```
### REST API Usage

```bash
# Analyze latest crypto sentiment
curl -X POST http://localhost:4004/tools/analyze_crypto_sentiment \
  -H "Content-Type: application/json" \
  -d '{
    "query": "Bitcoin ETF news",
    "analysis_depth": "standard",
    "include_prices": true
  }'
```
### WebSocket Real-time Analysis

```javascript
const ws = new WebSocket('ws://localhost:4004/mcp/ws');

ws.onopen = () => {
  // Initialize the MCP session first
  ws.send(JSON.stringify({
    jsonrpc: '2.0',
    id: 1,
    method: 'initialize',
    params: {
      protocolVersion: '2024-11-05',
      capabilities: {},
      clientInfo: { name: 'MyApp', version: '1.0.0' }
    }
  }));
};

ws.onmessage = (event) => {
  const message = JSON.parse(event.data);
  // Once initialization completes, request analysis with real-time updates
  if (message.id === 1) {
    ws.send(JSON.stringify({
      jsonrpc: '2.0',
      id: 2,
      method: 'tools/call',
      params: {
        name: 'analyze_crypto_sentiment',
        arguments: {
          query: 'latest crypto sentiment',
          stream_updates: true
        }
      }
    }));
  }
};
```
## 📋 Analysis Tool Parameters

### analyze_crypto_sentiment

Parameters:

- `query` (required) - Search query or "latest" for general analysis
- `analysis_depth` - "quick" | "standard" | "deep" (default: "standard")
- `max_news_items` - 5-50 (default: 15)
- `time_range` - "1h" | "6h" | "12h" | "24h" (default: "6h")
- `include_prices` - Include current price data (default: true)
- `focus_coins` - Array of specific coins to prioritize (optional)
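As a rough illustration of how these parameters might be checked client-side before calling the tool: the server itself validates with Zod schemas, so the `validateAnalyzeParams` helper below is a hand-rolled equivalent for clarity, not the server's actual code.

```typescript
// Mirrors the parameter list above, applying the documented defaults.
interface AnalyzeParams {
  query: string;
  analysis_depth?: "quick" | "standard" | "deep";
  max_news_items?: number;
  time_range?: "1h" | "6h" | "12h" | "24h";
  include_prices?: boolean;
  focus_coins?: string[];
}

function validateAnalyzeParams(input: Record<string, unknown>): AnalyzeParams {
  if (typeof input.query !== "string" || input.query.length === 0) {
    throw new Error("query is required");
  }
  const query = input.query;
  const depth = (input.analysis_depth ?? "standard") as string;
  if (!["quick", "standard", "deep"].includes(depth)) {
    throw new Error('analysis_depth must be "quick" | "standard" | "deep"');
  }
  const items = (input.max_news_items ?? 15) as number;
  if (typeof items !== "number" || items < 5 || items > 50) {
    throw new Error("max_news_items must be between 5 and 50");
  }
  const range = (input.time_range ?? "6h") as string;
  if (!["1h", "6h", "12h", "24h"].includes(range)) {
    throw new Error('time_range must be "1h" | "6h" | "12h" | "24h"');
  }
  return {
    query,
    analysis_depth: depth as AnalyzeParams["analysis_depth"],
    max_news_items: items,
    time_range: range as AnalyzeParams["time_range"],
    include_prices: (input.include_prices ?? true) as boolean,
    focus_coins: input.focus_coins as string[] | undefined,
  };
}
```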
Example Response:

```json
{
  "overall_sentiment": "BULLISH",
  "confidence_score": 0.87,
  "processing_time_ms": 4200,
  "analysis_timestamp": "2025-09-23T17:15:00Z",
  "ai_model_used": "gpt-oss:120b",
  "market_signals": [
    {
      "headline": "BlackRock increases Bitcoin ETF allocation by 40%",
      "sentiment": "STRONGLY_POSITIVE",
      "affected_coins": ["BTC", "ETH"],
      "impact_prediction": {
        "timeframe": "SHORT_TERM",
        "magnitude": "HIGH",
        "direction": "BULLISH"
      },
      "price_context": {
        "BTC": { "current": 45250.33, "change_24h": 2.4 }
      },
      "ai_analysis": "Institutional adoption signal suggests sustained buying pressure",
      "recommendation": "STRONG_BUY_SIGNAL",
      "confidence_score": 0.92
    }
  ],
  "behavioral_insights": {
    "whale_activity": "INCREASED_ACCUMULATION",
    "social_sentiment": "FEAR_TO_GREED_TRANSITION",
    "influencer_alignment": 0.73,
    "volume_patterns": "ACCUMULATION",
    "retail_sentiment": "BULLISH"
  },
  "risk_assessment": {
    "level": "MEDIUM",
    "factors": ["regulatory_uncertainty", "macro_correlation"],
    "mitigation": "Consider dollar-cost averaging and stop-loss levels",
    "probability": 0.65,
    "impact_severity": "MODERATE"
  },
  "actionable_recommendations": [
    "Consider BTC accumulation on next dip below $44,000",
    "Monitor Ethereum for breakout above $2,850 resistance",
    "Set stop-loss at 8% below entry for risk management"
  ],
  "data_sources_count": 15
}
```
## 🐳 Docker Deployment

### Full Stack Deployment

```bash
# Deploy everything (MCP Server + Redis + MongoDB + Monitoring)
docker-compose up -d

# Scale HTTP servers
docker-compose up -d --scale crypto-sentiment-http=3

# Enable monitoring (Prometheus + Grafana)
docker-compose --profile monitoring up -d
```
### Kaayaan Stack Integration

```yaml
# Add to your existing docker-compose.yml
services:
  crypto-sentiment:
    image: kaayaan/crypto-sentiment-mcp:latest
    networks:
      - kaayaan_default
    environment:
      # AI Provider Configuration - Ollama Cloud Primary
      - OLLAMA_API_KEY=${OLLAMA_API_KEY}
      - OLLAMA_BASE_URL=${OLLAMA_BASE_URL}
      - OLLAMA_MODEL=${OLLAMA_MODEL}
      # AI Provider Configuration - Gemini Fallback
      - GEMINI_API_KEY=${GEMINI_API_KEY}
      - GEMINI_BASE_URL=${GEMINI_BASE_URL}
      - GEMINI_MODEL=${GEMINI_MODEL}
      # Database Configuration
      - MONGODB_URL=${MONGODB_URL}
      - REDIS_URL=${REDIS_URL}
      - COINGECKO_API_KEY=${COINGECKO_API_KEY}
```
## 🛠️ Development

### Local Development Setup

```bash
# Install dependencies
npm install

# Start development with auto-reload
npm run watch

# Run in development mode
npm run dev

# Test the server with the comprehensive suite
npm test

# Run linting and type checking
npm run test:quick

# Format code
npm run format
```
### Environment Validation

```bash
# Validate environment configuration
npm run validate-env

# Verbose validation with detailed output
npm run validate-env:verbose

# Interactive environment setup
npm run setup-env
```
### Testing Individual Components

```bash
# Test STDIO MCP protocol
echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test","version":"1.0.0"}}}' | node build/index.js

# Test tool execution
echo -e '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test","version":"1.0.0"}}}\n{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"analyze_crypto_sentiment","arguments":{"query":"Bitcoin news"}}}' | node build/index.js
```
### MCP Inspector

```bash
# Launch MCP Inspector for interactive testing
npm run inspector
```
## 📊 API Endpoints

### Health & Status

- `GET /health` - Health check with service status
- `GET /status` - Server status and capabilities
- `GET /metrics` - Performance metrics
### Tools

- `GET /tools` - List available tools with schemas
- `GET /tools/{name}/schema` - Get specific tool schema
- `POST /tools/analyze_crypto_sentiment` - Execute sentiment analysis
### MCP Protocol

- `POST /mcp` - JSON-RPC 2.0 MCP endpoint
- `POST /mcp/batch` - Batch request support
- `WebSocket /mcp/ws` - WebSocket MCP protocol
### Documentation

- `GET /docs` - API documentation with examples
- `GET /` - Server information and available endpoints
## 🔐 Security

### Authentication

```bash
# Set API key requirement
export REQUIRED_API_KEY="your-secure-api-key"

# Include in requests
curl -H "x-api-key: your-secure-api-key" http://localhost:4004/tools
```
### Rate Limiting

- Default: 100 requests per minute per IP
- Configurable via `RATE_LIMIT_MAX_REQUESTS`
- Returns 429 with a Retry-After header
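A client can honor the 429 behavior above by reading the Retry-After header before retrying. The helper below is a sketch for client code, not part of the server's API; it handles the delta-seconds and HTTP-date forms of the header.

```typescript
// Convert a Retry-After header value into a delay in milliseconds.
// Falls back to a caller-supplied default when the header is absent or unparseable.
function retryDelayMs(retryAfter: string | null, fallbackMs = 60_000): number {
  if (!retryAfter) return fallbackMs;
  const seconds = Number(retryAfter);          // delta-seconds form, e.g. "30"
  if (Number.isFinite(seconds) && seconds >= 0) return seconds * 1000;
  const date = Date.parse(retryAfter);         // HTTP-date form
  if (!Number.isNaN(date)) return Math.max(0, date - Date.now());
  return fallbackMs;
}
```

A typical use is `if (res.status === 429) await sleep(retryDelayMs(res.headers.get('retry-after')))` before reissuing the request.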
### Security Headers
- Helmet.js for security headers
- CORS configuration
- Input validation with Zod schemas
- SQL/NoSQL injection protection
## 📈 Performance

### Optimizations
- Redis caching with memory fallback
- Connection pooling for databases
- Async processing throughout
- Smart batching for external APIs
- Response compression for WebSocket
- Multi-provider AI with intelligent failover
### Benchmarks
- Response time: < 6 seconds for standard analysis (improved with Ollama Cloud)
- Concurrent requests: 1000+ connections supported
- Memory usage: ~512MB typical, 1GB limit
- CPU usage: Optimized for multi-core processing
- AI Provider latency: ~1.2-1.7 seconds (Ollama Cloud)
## 🔧 Configuration

### Environment Variables

| Variable | Description | Default |
|---|---|---|
| `NODE_ENV` | Environment mode | `development` |
| `PORT` | HTTP server port | `4004` |
| `LOG_LEVEL` | Logging level | `info` |
| `ENABLE_STDIO` | Enable STDIO MCP | `true` |
| `ENABLE_HTTP_REST` | Enable REST API | `true` |
| `ENABLE_HTTP_MCP` | Enable HTTP MCP | `true` |
| `ENABLE_WEBSOCKET` | Enable WebSocket | `true` |
| `OLLAMA_API_KEY` | Ollama Cloud API key | Required |
| `OLLAMA_BASE_URL` | Ollama Cloud base URL | `https://ollama.com` |
| `OLLAMA_MODEL` | Ollama model name | `gpt-oss:120b` |
| `GEMINI_API_KEY` | Google Gemini API key | Required for fallback |
| `GEMINI_BASE_URL` | Gemini API base URL | `https://generativelanguage.googleapis.com/v1beta` |
| `GEMINI_MODEL` | Gemini model name | `gemini-2.0-flash` |
| `MONGODB_URL` | MongoDB connection string | Required |
| `REDIS_URL` | Redis connection string | Required |
| `COINGECKO_API_KEY` | Price API key | Optional |
| `DEFAULT_ANALYSIS_DEPTH` | Default analysis depth | `standard` |
| `RATE_LIMIT_MAX_REQUESTS` | Requests per minute per IP | `100` |
### Advanced Configuration

```bash
# AI Configuration
AI_TIMEOUT_MS=30000
MAX_TOKENS=4000

# News Sources
RSS_FEEDS=https://cointelegraph.com/rss,https://coindesk.com/rss
REDDIT_FEEDS=https://reddit.com/r/CryptoCurrency/.rss

# Cache Configuration
PRICE_CACHE_TTL=300
NEWS_CACHE_TTL=600
ANALYSIS_CACHE_TTL=900

# Performance
MAX_CONCURRENT_ANALYSIS=5
CONNECTION_POOL_SIZE=10
REQUEST_TIMEOUT_MS=30000

# Database Configuration
MONGO_INITDB_ROOT_PASSWORD=crypto_sentiment_2025_secure_password!
REDIS_PASSWORD=crypto_sentiment_redis_2025_secure!
```
## 📝 Troubleshooting

### Common Issues

Server won't start:

```bash
# Check build
npm run build

# Verify environment
npm run validate-env

# Test connections
npm run validate-env:verbose
```
MCP Inspector errors:

```bash
# Ensure build is current
npm run build

# Check executable permissions
ls -la build/index.js

# Test with minimal input
echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"test","version":"1.0.0"}}}' | node build/index.js
```
Analysis failures:

- Verify both `OLLAMA_API_KEY` and `GEMINI_API_KEY` are set
- Check internet connectivity for news feeds
- Ensure sufficient API quotas on both providers
- Review logs: `docker-compose logs crypto-sentiment-mcp`
- Test individual provider availability
AI Provider Issues:

```bash
# Test primary provider (Ollama Cloud)
curl -X POST "https://ollama.com/api/chat" \
  -H "Authorization: Bearer $OLLAMA_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model":"gpt-oss:120b","messages":[{"role":"user","content":"test"}]}'

# Test fallback provider (Gemini)
curl -X POST "https://generativelanguage.googleapis.com/v1beta/models/gemini-2.0-flash:generateContent?key=$GEMINI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"contents":[{"parts":[{"text":"test"}]}]}'
```
### Debug Mode

```bash
# Enable debug logging
export LOG_LEVEL=debug

# Run with detailed output
npm run dev
```
## 🤝 Integration Examples

### Python Client

```python
import requests

def analyze_crypto_sentiment(query):
    response = requests.post(
        'http://localhost:4004/tools/analyze_crypto_sentiment',
        json={
            'query': query,
            'analysis_depth': 'standard',
            'include_prices': True
        }
    )
    return response.json()

result = analyze_crypto_sentiment("Bitcoin ETF approval")
print(f"Sentiment: {result['data']['overall_sentiment']}")
print(f"AI Model: {result['data']['ai_model_used']}")
print(f"Confidence: {result['data']['confidence_score']}")
```
### Node.js Client

```javascript
import axios from 'axios';

async function analyzeCrypto(query) {
  const response = await axios.post(
    'http://localhost:4004/tools/analyze_crypto_sentiment',
    {
      query,
      analysis_depth: 'deep',
      max_news_items: 20
    }
  );
  return response.data;
}

const result = await analyzeCrypto('Ethereum upgrade');
console.log(`Confidence: ${result.data.confidence_score}`);
console.log(`AI Provider: ${result.data.ai_model_used}`);
console.log(`Processing Time: ${result.data.processing_time_ms}ms`);
```
## 🎯 What's New in v1.1.0

### ⭐ Multi-Provider AI Integration

- Ollama Cloud Primary: Cost-effective `gpt-oss:120b` model integration
- Google Gemini Fallback: Reliable `gemini-2.0-flash` backup
- Smart Failover: Automatic provider switching with exponential backoff
- Enhanced Reliability: Circuit breaker patterns and comprehensive error handling
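A minimal sketch of the circuit-breaker pattern mentioned above; the class name, threshold, and cooldown values are illustrative, not the server's actual configuration.

```typescript
// Opens after `threshold` consecutive failures, blocks requests for
// `cooldownMs`, then allows a half-open probe; a success closes it again.
class CircuitBreaker {
  private failures = 0;
  private openedAt = 0;
  constructor(private threshold = 5, private cooldownMs = 30_000) {}

  canRequest(now = Date.now()): boolean {
    if (this.failures < this.threshold) return true;          // closed
    if (now - this.openedAt >= this.cooldownMs) return true;  // half-open probe
    return false;                                             // open
  }

  recordSuccess(): void {
    this.failures = 0; // close the circuit
  }

  recordFailure(now = Date.now()): void {
    this.failures++;
    // (Re)open the circuit whenever failures reach or exceed the threshold,
    // so a failed half-open probe restarts the cooldown window.
    if (this.failures >= this.threshold) this.openedAt = now;
  }
}
```

Paired with the provider failover, a breaker like this lets the server skip a provider it already knows is down instead of burning retries on every request.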
### 🔧 Quality Improvements
- Zero ESLint Errors: Enterprise-grade TypeScript with strict linting
- Improved Type Safety: Enhanced interfaces and proper error handling
- Better Environment Management: Interactive setup and validation scripts
- Enhanced Testing: Comprehensive test suite with proper cleanup
### 🚀 Performance Enhancements
- Faster Response Times: Optimized AI provider integration (~1.2-1.7s latency)
- Better Error Recovery: Robust retry mechanisms with smart backoff
- Improved Caching: Enhanced Redis integration with memory fallback
- Production Readiness: All quality gates validated and passing
## 📄 License

MIT License - see the license file for details.
## 🙏 Acknowledgments
- Ollama Cloud - Primary AI provider infrastructure
- Google Gemini - Reliable AI fallback provider
- CoinGecko - Cryptocurrency price data
- Model Context Protocol - Standardized AI tool integration
- Kaayaan AI Infrastructure - Production deployment platform
## 📞 Support
- Issues: GitHub Issues
- Documentation: API Reference
- Discord: Kaayaan Community
---

Built with ❤️ by Kaayaan AI Infrastructure

*Powering the future of cryptocurrency intelligence through advanced multi-provider AI analysis*