MCP Streaming Analytics Server
A Model Context Protocol (MCP) server for analyzing streaming platform data (music, video, podcasts) with advanced analytics capabilities including engagement tracking, churn analysis, funnel metrics, and cohort analysis.
Features
Data Model
The database includes realistic streaming platform data with:
- Users: subscription tiers, demographics, churn tracking
- Content: movies, series, songs, albums, podcasts
- Streaming Events: play history, completion rates, device/platform tracking
- Engagement Metrics: daily user activity, sessions, watch time
- Subscriptions & Payments: billing, revenue tracking
- Funnel Events: conversion tracking for signup, upgrades, etc.
- A/B Tests: experiment tracking and variant assignments
- Recommendations: recommendation performance tracking
- Search & Notifications: user interaction tracking
MCP Tools
- query_users - Query users with filters (subscription tier, country, status)
- analyze_engagement - Analyze user engagement metrics over time
- content_performance - Analyze content performance by type, genre, plays
- funnel_analysis - Conversion funnel analysis (signup, upgrade, etc.)
- churn_prediction - Identify at-risk users based on inactivity
- cohort_analysis - Cohort retention and revenue analysis
- custom_query - Execute custom SQL queries (SELECT only)
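Any MCP client can call these tools once the server is running. Below is a minimal sketch using the official MCP Python SDK's stdio client; the client-side API shown here and the exact tool argument names are assumptions of this example, not something defined by this project:
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Launch the analytics server as a subprocess and talk to it over stdio
    server = StdioServerParameters(command="python", args=["src/server.py"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Example: find at-risk premium users (argument names are illustrative)
            result = await session.call_tool(
                "churn_prediction",
                arguments={"subscription_tier": "premium", "days_threshold": 7},
            )
            for item in result.content:
                print(item)

asyncio.run(main())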
MCP Resources
- analytics://users/active - Active users by tier
- analytics://users/churn - Churn analysis by reason
- analytics://engagement/daily - Daily engagement trends
- analytics://content/popular - Top performing content
- analytics://revenue/summary - Revenue trends by month
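Resources are read over the same client session as tools; a short sketch that assumes the initialized session object from the tool-call example above:
# Inside an initialized ClientSession (see the tool-call sketch above)
summary = await session.read_resource("analytics://revenue/summary")
for content in summary.contents:
    print(content)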
Setup
Prerequisites
- Python 3.10+
- Docker & Docker Compose
- PostgreSQL (via Docker)
- uv (recommended) or another Python package manager
On macOS, uv can be installed with Homebrew:
brew install uv
Installation
- Start the database:
cd mcp-streaming-analytics
docker compose up -d
Wait for the database to initialize (check with docker compose logs -f).
- Create virtual environment and install dependencies:
# Create virtual environment
uv venv
# Activate virtual environment
source .venv/bin/activate # macOS/Linux
# OR
.venv\Scripts\activate # Windows
# Install dependencies
uv pip install -r requirements.txt
- Verify database connection:
docker exec -it streaming-db psql -U streaming_user -d streaming_analytics -c "SELECT COUNT(*) FROM users;"
Running the MCP Server
Development Mode
python src/server.py
With MCP Inspector (for testing)
npx @modelcontextprotocol/inspector python src/server.py
Using with Claude Desktop
Add to your Claude Desktop config (~/Library/Application Support/Claude/claude_desktop_config.json on macOS):
{
"mcpServers": {
"streaming-analytics": {
"command": "uv",
"args": [
"--directory",
"/path/to/mcp-streaming-analytics",
"run",
"python",
"src/server.py"
]
}
}
}
Note: Replace /path/to/mcp-streaming-analytics with your actual project path. Using uv run automatically manages the virtual environment. Database credentials are loaded automatically from the .env file.
Example Queries
Using Tools
Find at-risk premium users:
Use the churn_prediction tool with subscription_tier="premium" and days_threshold=7
Analyze signup funnel:
Use the funnel_analysis tool with funnel_name="signup"
Get top performing content:
Use the content_performance tool with content_type="series" and min_plays=2
Cohort retention analysis:
Use the cohort_analysis tool with cohort_month="2024-01" and metric="retention"
Custom SQL Queries
User engagement by country:
SELECT
u.country,
COUNT(DISTINCT u.user_id) as users,
AVG(m.engagement_score) as avg_engagement,
SUM(m.total_watch_time_seconds)/3600 as total_hours
FROM users u
JOIN daily_user_metrics m ON u.user_id = m.user_id
WHERE m.metric_date >= CURRENT_DATE - INTERVAL '30 days'
GROUP BY u.country
ORDER BY total_hours DESC
Content binge-watching analysis:
SELECT
c.title,
COUNT(*) as binge_sessions,
AVG(se.duration_seconds/60) as avg_minutes_per_session
FROM streaming_events se
JOIN content c ON se.content_id = c.content_id
WHERE se.is_binge_watch = true
GROUP BY c.title
ORDER BY binge_sessions DESC
LIMIT 10
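When running custom SQL from an MCP client, the statement is passed as an argument to the custom_query tool. A hedged sketch that reuses the client session from the earlier example; the argument name "sql" is an assumption about the tool's input schema:
# Run an ad-hoc SELECT through the custom_query tool (SELECT-only by design)
sql = """
SELECT c.title, COUNT(*) AS binge_sessions
FROM streaming_events se
JOIN content c ON se.content_id = c.content_id
WHERE se.is_binge_watch = true
GROUP BY c.title
ORDER BY binge_sessions DESC
LIMIT 10
"""
result = await session.call_tool("custom_query", arguments={"sql": sql})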
Database Schema
Key tables:
- users - User accounts with subscription and churn info
- content - Streaming content catalog
- streaming_events - Playback events with completion tracking
- user_sessions - User session tracking
- daily_user_metrics - Daily engagement snapshots
- subscriptions - Subscription lifecycle
- payment_transactions - Revenue tracking
- funnel_events - Conversion funnel tracking
- content_recommendations - Recommendation performance
- notifications - Notification engagement
Analytics Use Cases
- Engagement Analysis: Track DAU/MAU, session duration, content consumption
- Churn Prediction: Identify at-risk users, analyze churn reasons
- Content Performance: Measure views, completion rates, popularity trends
- Funnel Optimization: Analyze signup/upgrade conversion rates
- Cohort Analysis: Track user retention and LTV by cohort
- Revenue Analytics: Monitor MRR, ARPU, payment success rates
- A/B Testing: Evaluate experiment impact on key metrics
- Recommendation Performance: Track CTR and conversion of recommendations
Development
Add More Seed Data
Edit database/seeds/01_seed_data.sql and restart the database:
docker compose down -v
docker compose up -d
Extend the MCP Server
Add new tools or resources in src/server.py:
- Tools: Add to @mcp.list_tools() and @mcp.call_tool()
- Resources: Add to @mcp.list_resources() and @mcp.read_resource()
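As a rough illustration, a new tool is added by extending the existing handlers. The sketch below assumes the mcp object is the low-level MCP SDK Server instance implied by the decorators above; the tool name, its input schema, and the run_query helper are hypothetical:
from mcp.types import Tool, TextContent

@mcp.list_tools()
async def list_tools() -> list[Tool]:
    return [
        # ... existing tools ...
        Tool(
            name="top_countries",  # hypothetical example tool
            description="Top countries by total watch time",
            inputSchema={
                "type": "object",
                "properties": {"limit": {"type": "integer", "default": 10}},
            },
        ),
    ]

@mcp.call_tool()
async def call_tool(name: str, arguments: dict) -> list[TextContent]:
    if name == "top_countries":
        # run_query is a placeholder for however src/server.py executes SQL
        rows = run_query(
            "SELECT u.country, SUM(m.total_watch_time_seconds)/3600 AS hours "
            "FROM users u JOIN daily_user_metrics m ON u.user_id = m.user_id "
            "GROUP BY u.country ORDER BY hours DESC LIMIT %s",
            (arguments.get("limit", 10),),
        )
        return [TextContent(type="text", text=str(rows))]
    raise ValueError(f"Unknown tool: {name}")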
Troubleshooting
Database connection issues:
# Check if database is running
docker compose ps
# View logs
docker compose logs -f postgres
# Restart database
docker compose restart postgres
Reset database:
docker compose down -v
docker compose up -d
License
MIT