Microsoft Fabric Analytics MCP Server
A comprehensive Model Context Protocol (MCP) server that provides analytics capabilities and tools for interacting with the Microsoft Fabric data platform. This server enables AI assistants like Claude to seamlessly access, analyze, and monitor Microsoft Fabric resources through the standardized Model Context Protocol, bringing the power of Microsoft Fabric directly to your AI conversations.
Table of Contents
- Key Features
- Quick Start
- Tools & Capabilities
- Development & Testing
- Example Queries
- Authentication
- Architecture
- Configuration
- Contributing
- Security
- License
- Support
Key Features
- Complete Workspace Management - Create, delete, and manage Fabric workspaces with capacity assignment
- Enhanced CRUD Operations - Create, read, update, and delete all Fabric items (notebooks, lakehouses, datasets, reports)
- Advanced Notebook Management - Create, execute, and manage Fabric notebooks with 5 predefined templates
- Livy API Integration - Full Spark session and batch job management with real-time monitoring
- Comprehensive Spark Monitoring - Real-time monitoring across workspaces, items, and applications
- Multi-AI Assistant Support - Works with Claude Desktop, GitHub Copilot, and other MCP-compatible AI tools
- Enhanced Azure CLI Authentication - Zero-config setup with automatic token management
- Enterprise Authentication - Multiple auth methods (Bearer, Service Principal, Device Code, Interactive, Azure CLI)
- Analytics & Insights - Generate comprehensive monitoring dashboards with real-time metrics
- End-to-End Testing - Complete test suite with real workspace creation and job execution
- Advanced Token Management - Automatic token validation, refresh, and expiration handling
- Enterprise Deployment - Full Kubernetes and Azure deployment support with auto-scaling
- Docker Support - Containerized deployment with health checks and monitoring
- Monitoring & Observability - Built-in Prometheus metrics and Grafana dashboards
- 48 Total Tools - Comprehensive coverage of Fabric operations (up from 31 tools)
New Workspace Management Features
Latest Updates - Comprehensive Workspace Operations
The MCP server now includes 21 new workspace management tools that enable complete workspace lifecycle management:
Core Workspace Operations
- fabric_list_workspaces - List all accessible workspaces with detailed metadata
- fabric_create_workspace - Create new workspaces with custom configuration
- fabric_delete_workspace - Delete workspaces with confirmation and cleanup
- fabric_update_workspace - Update workspace properties and settings
- fabric_get_workspace - Get detailed workspace information and status
Capacity & Resource Management
- fabric_list_capacities - List all available Fabric capacities
- fabric_assign_workspace_to_capacity - Attach workspaces to dedicated capacity
- fabric_unassign_workspace_from_capacity - Move workspaces to shared capacity
- fabric_list_capacity_workspaces - List all workspaces in a capacity
Access Control & Security
- fabric_get_workspace_role_assignments - View workspace permissions
- fabric_add_workspace_role_assignment - Grant workspace access to users/groups
- fabric_update_workspace_role_assignment - Modify user permissions
- fabric_remove_workspace_role_assignment - Remove workspace access
Advanced Operations
- fabric_get_workspace_git_status - Check Git integration status
- fabric_connect_workspace_to_git - Enable Git integration for workspace
- fabric_disconnect_workspace_from_git - Disable Git integration
- fabric_update_workspace_git_connection - Modify Git repository settings
Environment & Pipeline Management
- fabric_list_workspace_environments - List all environments in workspace
- fabric_create_workspace_environment - Create new environments
- fabric_delete_workspace_environment - Remove environments
- fabric_list_workspace_data_pipelines - List data integration pipelines
- fabric_create_workspace_data_pipeline - Create new data pipelines
Real-World Scenarios Enabled
Automated Workspace Provisioning:
"Create a new workspace called 'Analytics-Q1-2025' and assign it to our premium capacity"
Multi-Workspace Analytics:
"List all workspaces in our tenant and show their capacity assignments"
Access Management:
"Add user john.doe@company.com as Admin to the Analytics workspace"
Environment Setup:
"Create a development environment in the Analytics workspace with Python and R libraries"
Git Integration:
"Connect the Analytics workspace to our GitHub repository for version control"
GitHub Copilot Integration
Perfect for GitHub Copilot - The enhanced workspace management works seamlessly with GitHub Copilot's built-in terminal, making it ideal for:
- Azure CLI Authentication - Uses your existing az login session
- Terminal-Based Operations - Natural workflow within your coding environment
- Rapid Prototyping - Quickly create test workspaces and environments
- Infrastructure as Code - Manage Fabric resources alongside your codebase
- CI/CD Integration - Automate workspace provisioning in deployment pipelines
GitHub Copilot Example Commands:
# Using Azure CLI auth, create a new workspace for our ML project
# List all workspaces and their Git integration status
# Set up a complete analytics environment with lakehouse and notebooks
End-to-End Testing with Real Workspaces
The MCP server now includes comprehensive end-to-end testing that creates real workspaces, assigns them to capacities, and executes actual jobs to validate the complete workflow:
# One-command end-to-end test
npm run test:e2e
What it tests:
- Workspace Creation - Creates real Fabric workspaces
- Capacity Assignment - Attaches workspaces to your Fabric capacity
- Item Creation - Creates notebooks, lakehouses, and other items
- Job Execution - Runs actual Spark jobs and monitors completion
- Resource Cleanup - Automatically removes all test resources
Deployment Options
Claude Desktop Integration
Recommended for AI Assistant Usage:
{
"mcpServers": {
"fabric-analytics": {
"command": "node",
"args": ["C:\\path\\to\\your\\build\\index.js"],
"cwd": "C:\\path\\to\\your\\project",
"env": {
"FABRIC_AUTH_METHOD": "bearer_token",
"FABRIC_TOKEN": "your_bearer_token_here",
"FABRIC_WORKSPACE_ID": "your_workspace_id",
"ENABLE_HEALTH_SERVER": "false"
}
}
}
}
Get a Bearer Token: Visit the Power BI Embed Setup page to generate tokens.
Important: Tokens expire after ~1 hour and need to be refreshed.
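Because tokens are short-lived, it can help to check how much lifetime yours has left before wiring it into the config. Below is a minimal sketch (not part of this project, just standard JWT decoding in Node.js) for reading the token's exp claim:

```typescript
// Hypothetical helper: read the `exp` claim from a JWT bearer token.
// Assumes the token is a standard three-part JWT (header.payload.signature).
function tokenExpiresAt(bearerToken: string): Date {
  const payloadB64 = bearerToken.split(".")[1];
  const payload = JSON.parse(Buffer.from(payloadB64, "base64url").toString("utf8"));
  return new Date(payload.exp * 1000); // `exp` is seconds since the Unix epoch
}

const expiry = tokenExpiresAt(process.env.FABRIC_TOKEN ?? "");
if (expiry.getTime() - Date.now() < 5 * 60 * 1000) {
  console.warn(`Token expires at ${expiry.toISOString()}; refresh it soon.`);
}
```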
Claude Desktop Authentication Fix
If you experience 60-second timeouts during startup, this is due to interactive authentication flows blocking in Claude Desktop's sandboxed environment. Solution:
1. Use the Bearer Token Method (Recommended):
   - Set FABRIC_AUTH_METHOD: "bearer_token" in your config
   - Provide FABRIC_TOKEN with a valid bearer token
   - This bypasses interactive authentication entirely
2. Alternative - Per-Tool Authentication:
   - Provide a token directly in tool calls: bearerToken: "your_token_here"
   - Or use simulation mode: bearerToken: "simulation"
3. Troubleshooting:
   - The server has 10-second timeout protection to prevent hanging
   - It falls back to simulation mode if authentication fails
   - Enhanced error messages provide clear guidance
Quick Fix: The server automatically prioritizes the FABRIC_TOKEN environment variable over interactive authentication flows, preventing Claude Desktop timeouts.
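In other words, static credentials short-circuit anything that could prompt. An illustrative sketch of that priority order (the server's actual logic may differ):

```typescript
// Illustrative only: static credentials win over anything interactive,
// so Claude Desktop's sandbox never waits on a browser or device-code prompt.
type AuthMethod = "bearer_token" | "service_principal" | "azure_cli" | "interactive";

function resolveAuthMethod(env: NodeJS.ProcessEnv): AuthMethod {
  if (env.FABRIC_TOKEN) return "bearer_token"; // highest priority, no prompts
  if (env.FABRIC_CLIENT_ID && env.FABRIC_CLIENT_SECRET) return "service_principal";
  return (env.FABRIC_AUTH_METHOD as AuthMethod) ?? "interactive";
}
```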
Local Development
# Clone and run locally
git clone https://github.com/santhoshravindran7/Fabric-Analytics-MCP.git
cd Fabric-Analytics-MCP
npm install && npm run build && npm start
Docker Deployment
# Using Docker Compose
docker-compose up -d
# Or standalone Docker
docker build -t fabric-analytics-mcp .
docker run -p 3000:3000 -e FABRIC_CLIENT_ID=xxx fabric-analytics-mcp
Azure Kubernetes Service (AKS)
# One-command enterprise deployment
export ACR_NAME="your-registry" FABRIC_CLIENT_ID="xxx" FABRIC_CLIENT_SECRET="yyy" FABRIC_TENANT_ID="zzz"
./scripts/setup-azure-resources.sh && ./scripts/build-and-push.sh && ./scripts/deploy-to-aks.sh
Azure MCP Server (Preview)
# Serverless deployment on Azure
az mcp server create --name "fabric-analytics-mcp" --repository "santhoshravindran7/Fabric-Analytics-MCP"
Tools & Capabilities
CRUD Operations for Fabric Items
- Tool: list-fabric-items
  - Description: List items in a Microsoft Fabric workspace (Lakehouses, Notebooks, etc.)
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token
    - workspaceId: Microsoft Fabric workspace ID
    - itemType: Filter by item type (optional)
- Tool: create-fabric-item
  - Description: Create new items in a Microsoft Fabric workspace
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token
    - workspaceId: Microsoft Fabric workspace ID
    - itemType: Type of item (Lakehouse, Notebook, Dataset, Report, Dashboard)
    - displayName: Display name for the new item
    - description: Optional description
- Tool: get-fabric-item
  - Description: Get detailed information about a specific Microsoft Fabric item
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token
    - workspaceId: Microsoft Fabric workspace ID
    - itemId: ID of the item to retrieve
- Tool: update-fabric-item
  - Description: Update existing items in a Microsoft Fabric workspace
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token
    - workspaceId: Microsoft Fabric workspace ID
    - itemId: ID of the item to update
    - displayName: New display name (optional)
    - description: New description (optional)
- Tool: delete-fabric-item
  - Description: Delete items from a Microsoft Fabric workspace
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token
    - workspaceId: Microsoft Fabric workspace ID
    - itemId: ID of the item to delete
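To get a feel for how these tools are invoked programmatically (outside Claude Desktop), here is a sketch using the official MCP TypeScript SDK in a Node ESM script. The tool name and parameters match the list above; the server path and workspace ID are placeholders:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the server over stdio, then call one of the CRUD tools.
const transport = new StdioClientTransport({
  command: "node",
  args: ["/ABSOLUTE/PATH/TO/PROJECT/build/index.js"],
});
const client = new Client({ name: "fabric-demo", version: "1.0.0" });
await client.connect(transport);

const result = await client.callTool({
  name: "list-fabric-items",
  arguments: {
    bearerToken: process.env.FABRIC_TOKEN ?? "simulation", // "simulation" = offline mode
    workspaceId: "your-workspace-id",
    itemType: "Lakehouse", // optional filter
  },
});
console.log(result.content);
```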
Query Fabric Dataset (Enhanced)
- Tool: query-fabric-dataset
  - Description: Execute SQL or KQL queries against Microsoft Fabric datasets
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token (optional; uses simulation if not provided)
    - workspaceId: Microsoft Fabric workspace ID
    - datasetName: Name of the dataset to query
    - query: SQL or KQL query to execute
Execute Fabric Notebook
- Tool: execute-fabric-notebook
  - Description: Execute a notebook in a Microsoft Fabric workspace
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token
    - workspaceId: Microsoft Fabric workspace ID
    - notebookId: ID of the notebook to execute
    - parameters: Optional parameters to pass to the notebook
Get Analytics Metrics
- Tool: get-fabric-metrics
  - Description: Retrieve performance and usage metrics for Microsoft Fabric items
  - Parameters:
    - workspaceId: Microsoft Fabric workspace ID
    - itemId: Item ID (dataset, report, etc.)
    - timeRange: Time range for metrics (1h, 24h, 7d, 30d)
    - metrics: List of metrics to analyze
Analyze Data Model
- Tool: analyze-fabric-model
  - Description: Analyze a Microsoft Fabric data model and get optimization recommendations
  - Parameters:
    - workspaceId: Microsoft Fabric workspace ID
    - itemId: Item ID to analyze
Generate Analytics Report
- Tool: generate-fabric-report
  - Description: Generate comprehensive analytics reports for Microsoft Fabric workspaces
  - Parameters:
    - workspaceId: Microsoft Fabric workspace ID
    - reportType: Type of report (performance, usage, health, summary)
Livy API Integration (Sessions & Batch Jobs)
Session Management
- Tool: create-livy-session
  - Description: Create a new Livy session for interactive Spark/SQL execution
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token
    - workspaceId: Microsoft Fabric workspace ID
    - lakehouseId: Microsoft Fabric lakehouse ID
    - sessionConfig: Optional session configuration
- Tool: get-livy-session
  - Description: Get details of a Livy session
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token
    - workspaceId: Microsoft Fabric workspace ID
    - lakehouseId: Microsoft Fabric lakehouse ID
    - sessionId: Livy session ID
- Tool: list-livy-sessions
  - Description: List all Livy sessions in a lakehouse
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token
    - workspaceId: Microsoft Fabric workspace ID
    - lakehouseId: Microsoft Fabric lakehouse ID
- Tool: delete-livy-session
  - Description: Delete a Livy session
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token
    - workspaceId: Microsoft Fabric workspace ID
    - lakehouseId: Microsoft Fabric lakehouse ID
    - sessionId: Livy session ID
Statement Execution
- Tool: execute-livy-statement
  - Description: Execute SQL or Spark statements in a Livy session
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token
    - workspaceId: Microsoft Fabric workspace ID
    - lakehouseId: Microsoft Fabric lakehouse ID
    - sessionId: Livy session ID
    - code: SQL or Spark code to execute
    - kind: Statement type (sql, spark, etc.)
- Tool: get-livy-statement
  - Description: Get status and results of a Livy statement
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token
    - workspaceId: Microsoft Fabric workspace ID
    - lakehouseId: Microsoft Fabric lakehouse ID
    - sessionId: Livy session ID
    - statementId: Statement ID
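Putting the session and statement tools together, an interactive SQL round-trip looks roughly like this. This sketch reuses the client from the CRUD example above; response parsing is left as a placeholder because the exact payload shape is not documented here:

```typescript
const bearerToken = process.env.FABRIC_TOKEN ?? "simulation";
const workspaceId = "your-workspace-id";
const lakehouseId = "your-lakehouse-id";

// 1. Create a session; its ID comes back in the tool's response content.
const session = await client.callTool({
  name: "create-livy-session",
  arguments: { bearerToken, workspaceId, lakehouseId },
});
const sessionId = "session-id-from-response"; // placeholder: parse from `session.content`

// 2. Run a SQL statement inside the session.
await client.callTool({
  name: "execute-livy-statement",
  arguments: {
    bearerToken, workspaceId, lakehouseId, sessionId,
    code: "SELECT COUNT(*) FROM my_table",
    kind: "sql",
  },
});

// 3. Poll get-livy-statement with the returned statementId until it completes,
//    then call delete-livy-session to free the Spark resources.
```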
Batch Job Management
- Tool: create-livy-batch
  - Description: Create a new Livy batch job for long-running operations
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token
    - workspaceId: Microsoft Fabric workspace ID
    - lakehouseId: Microsoft Fabric lakehouse ID
    - batchConfig: Batch job configuration
- Tool: get-livy-batch
  - Description: Get details of a Livy batch job
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token
    - workspaceId: Microsoft Fabric workspace ID
    - lakehouseId: Microsoft Fabric lakehouse ID
    - batchId: Batch job ID
- Tool: list-livy-batches
  - Description: List all Livy batch jobs in a lakehouse
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token
    - workspaceId: Microsoft Fabric workspace ID
    - lakehouseId: Microsoft Fabric lakehouse ID
- Tool: delete-livy-batch
  - Description: Delete a Livy batch job
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token
    - workspaceId: Microsoft Fabric workspace ID
    - lakehouseId: Microsoft Fabric lakehouse ID
    - batchId: Batch job ID
Spark Application Monitoring
Workspace-Level Monitoring
- Tool: get-workspace-spark-applications
  - Description: Get all Spark applications in a Microsoft Fabric workspace
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token
    - workspaceId: Microsoft Fabric workspace ID
    - continuationToken: Optional token for pagination (see the paging sketch below)
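A paging sketch for the continuation token, using the same placeholders as the earlier examples. The token's exact location in the response payload depends on the server's output format, so the parsing step is hypothetical:

```typescript
// Fetch every page of Spark applications in a workspace.
let continuationToken: string | undefined = undefined;
do {
  const page = await client.callTool({
    name: "get-workspace-spark-applications",
    arguments: { bearerToken, workspaceId, continuationToken },
  });
  // ...process page.content here...
  continuationToken = undefined; // placeholder: set from the token in the response, if any
} while (continuationToken);
```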
Item-Specific Monitoring
- Tool: get-notebook-spark-applications
  - Description: Get all Spark applications for a specific notebook
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token
    - workspaceId: Microsoft Fabric workspace ID
    - notebookId: Notebook ID
    - continuationToken: Optional token for pagination
- Tool: get-lakehouse-spark-applications
  - Description: Get all Spark applications for a specific lakehouse
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token
    - workspaceId: Microsoft Fabric workspace ID
    - lakehouseId: Lakehouse ID
    - continuationToken: Optional token for pagination
- Tool: get-spark-job-definition-applications
  - Description: Get all Spark applications for a specific Spark Job Definition
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token
    - workspaceId: Microsoft Fabric workspace ID
    - sparkJobDefinitionId: Spark Job Definition ID
    - continuationToken: Optional token for pagination
Application Management
- Tool: get-spark-application-details
  - Description: Get detailed information about a specific Spark application
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token
    - workspaceId: Microsoft Fabric workspace ID
    - livyId: Livy session ID
- Tool: cancel-spark-application
  - Description: Cancel a running Spark application
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token
    - workspaceId: Microsoft Fabric workspace ID
    - livyId: Livy session ID
Monitoring Dashboard
- Tool: get-spark-monitoring-dashboard
  - Description: Generate a comprehensive monitoring dashboard with analytics
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token
    - workspaceId: Microsoft Fabric workspace ID
Notebook Management
The MCP server provides comprehensive notebook management capabilities with predefined templates and custom notebook support.
Create Notebook from Template
- Tool: create-fabric-notebook
  - Description: Create new Fabric notebooks from predefined templates or custom definitions
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token
    - workspaceId: Microsoft Fabric workspace ID
    - displayName: Display name for the new notebook
    - template: Template type (blank, sales_analysis, nyc_taxi_analysis, data_exploration, machine_learning, custom)
    - customNotebook: Custom notebook definition (required if template is 'custom')
    - environmentId: Optional environment ID to attach
    - lakehouseId: Optional default lakehouse ID
    - lakehouseName: Optional default lakehouse name
Available Templates:
- blank: Basic notebook with minimal setup
- sales_analysis: Comprehensive sales data analysis with sample dataset
- nyc_taxi_analysis: NYC taxi trip data analysis with sample dataset
- data_exploration: Structured data exploration template
- machine_learning: Complete ML workflow template
- custom: Use your own notebook definition
Get Notebook Definition
- Tool: get-fabric-notebook-definition
  - Description: Retrieve the notebook definition (cells, metadata) from an existing Fabric notebook
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token
    - workspaceId: Microsoft Fabric workspace ID
    - notebookId: ID of the notebook to retrieve
    - format: Format to return (ipynb or fabricGitSource)
Update Notebook Definition
- Tool: update-fabric-notebook-definition
  - Description: Update the notebook definition (cells, metadata) of an existing Fabric notebook
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token
    - workspaceId: Microsoft Fabric workspace ID
    - notebookId: ID of the notebook to update
    - notebookDefinition: Updated notebook definition object
Execute Notebook
- Tool: run-fabric-notebook
  - Description: Execute a Fabric notebook on-demand with optional parameters and configuration (see the example after the feature list below)
  - Parameters:
    - bearerToken: Microsoft Fabric bearer token
    - workspaceId: Microsoft Fabric workspace ID
    - notebookId: ID of the notebook to run
    - parameters: Optional notebook parameters (key-value pairs with types)
    - configuration: Optional execution configuration (environment, lakehouse, pools, etc.)
Features:
- Base64-encoded notebook payload support
- Comprehensive metadata management
- Environment and lakehouse integration
- Parameterized notebook execution
- Spark configuration support
- Support for multiple programming languages (Python, Scala, SQL, R)
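As an illustration, a parameterized run-fabric-notebook call might look like the following. The {value, type} parameter shape and the configuration keys are assumptions based on the descriptions above, not a documented contract:

```typescript
await client.callTool({
  name: "run-fabric-notebook",
  arguments: {
    bearerToken, workspaceId,
    notebookId: "your-notebook-id",
    // Key-value pairs with types, per the parameter description above (shape assumed):
    parameters: {
      run_date: { value: "2025-01-15", type: "string" },
      sample_rate: { value: 0.1, type: "float" },
    },
    // Execution configuration (keys assumed): environment, default lakehouse, etc.
    configuration: { defaultLakehouse: { name: "Analytics Hub" } },
  },
});
```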
Quick Start
Installation Methods
Choose your preferred installation method:
Option 1: Python Package (PyPI) - Recommended
# Install via pip (easiest method)
pip install fabric-analytics-mcp
# Verify installation
fabric-analytics --version
# Start the server
fabric-analytics-mcp start
Option 2: NPM Package
# Install globally via npm
npm install -g mcp-for-microsoft-fabric-analytics
# Verify installation
fabric-analytics --version
# Start the server
fabric-analytics
# Or using npx (no installation required)
npx mcp-for-microsoft-fabric-analytics
Option 3: Universal Installation Script
For automated setup with environment configuration:
Unix/Linux/macOS:
# Download and run universal installer
curl -fsSL https://raw.githubusercontent.com/santhoshravindran7/Fabric-Analytics-MCP/main/scripts/install-universal.sh | bash
# Or with options for full setup
curl -fsSL https://raw.githubusercontent.com/santhoshravindran7/Fabric-Analytics-MCP/main/scripts/install-universal.sh | bash -s -- --method pip --config --env --test
Windows (PowerShell):
# Download and run Windows installer
iex ((New-Object System.Net.WebClient).DownloadString('https://raw.githubusercontent.com/santhoshravindran7/Fabric-Analytics-MCP/main/scripts/install-windows.ps1'))
# Or with options for full setup
& ([scriptblock]::Create((iwr https://raw.githubusercontent.com/santhoshravindran7/Fabric-Analytics-MCP/main/scripts/install-windows.ps1).Content)) -Method pip -Config -Environment -Test
Option 4: Docker
# Clone repository
git clone https://github.com/santhoshravindran7/Fabric-Analytics-MCP.git
cd Fabric-Analytics-MCP
# Build and run with Docker
docker build -t fabric-analytics-mcp .
docker run -d --name fabric-mcp -p 3000:3000 --env-file .env fabric-analytics-mcp
See the repository's deployment documentation for detailed Docker and Kubernetes deployment options.
Option 5: From Source (Development)
# Clone and build from source
git clone https://github.com/santhoshravindran7/Fabric-Analytics-MCP.git
cd Fabric-Analytics-MCP
npm install
npm run build   # All configuration files included!
Configuration
Set up your environment variables:
export FABRIC_AUTH_METHOD=bearer_token # or service_principal, interactive
export FABRIC_CLIENT_ID=your-client-id
export FABRIC_CLIENT_SECRET=your-client-secret
export FABRIC_TENANT_ID=your-tenant-id
export FABRIC_DEFAULT_WORKSPACE_ID=your-workspace-id
Claude Desktop Setup
Add to your Claude Desktop config:
Windows: %APPDATA%\Claude\claude_desktop_config.json
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
For PyPI Installation:
{
"mcpServers": {
"fabric-analytics": {
"command": "fabric-analytics-mcp",
"args": ["start"],
"env": {
"FABRIC_AUTH_METHOD": "bearer_token"
}
}
}
}
For NPM Installation:
{
"mcpServers": {
"fabric-analytics": {
"command": "fabric-analytics",
"env": {
"FABRIC_AUTH_METHOD": "bearer_token"
}
}
}
}
For Source Installation:
{
"mcpServers": {
"fabric-analytics": {
"command": "node",
"args": ["/ABSOLUTE/PATH/TO/PROJECT/build/index.js"]
}
}
}
Start Using
Restart Claude Desktop and try these queries:
- "List all workspaces I have access to"
- "Find workspace named 'Analytics'"
- "List all items in my Fabric workspace [your-workspace-id]"
- "Create a new lakehouse called 'Analytics Hub'"
- "Show me all running Spark applications"
- "Execute this SQL query: SELECT * FROM my_table LIMIT 10"
Development & Testing
Running the Server
npm start # Production mode
npm run dev # Development mode with auto-reload
Testing Livy API Integration
For comprehensive testing of Spark functionality, install Python dependencies:
pip install -r livy_requirements.txt
Available Test Scripts:
- livy_api_test.ipynb - Interactive notebook for step-by-step testing
- comprehensive_livy_test.py - Full-featured test with error handling
- spark_monitoring_test.py - Spark application monitoring tests
- mcp_spark_monitoring_demo.py - MCP server integration demo
Example Queries
Once connected to Claude Desktop, you can ask natural language questions like:
CRUD Operations:
- "List all Lakehouses in my workspace"
- "Create a new Notebook called 'Data Analysis'"
- "Update the description of my lakehouse"
- "Delete the test notebook from my workspace"
Notebook Management:
- "Create a sales analysis notebook with sample data"
- "Generate a new NYC taxi analysis notebook"
- "Create a machine learning notebook template"
- "Get the definition of my existing notebook"
- "Run my notebook with specific parameters"
- "Update my notebook with new cells"
Data Operations:
- "Query the sales dataset to get total revenue by region"
- "Execute my analytics notebook with today's date"
Analytics:
- "Get performance metrics for the last 24 hours"
- "Analyze my data model and provide optimization recommendations"
- "Generate a usage report for my workspace"
Livy API Operations:
- "Create a Livy session for interactive Spark analysis"
- "Execute SQL query 'SELECT * FROM my_table LIMIT 10'"
- "Run Spark code to show all tables"
- "Monitor my batch job progress"
Spark Application Monitoring:
- "Show me all Spark applications in my workspace"
- "What's the status of my notebook Spark jobs?"
- "Generate a comprehensive Spark monitoring dashboard"
- "Show me recent failed applications"
- "Cancel the problematic Spark application"
Authentication
This MCP server supports multiple authentication methods powered by the Microsoft Authentication Library (MSAL):
For Claude Desktop: Use Bearer Token Authentication (method 1 below) for the best experience and compatibility.
Claude Desktop Fix: Recent updates prevent authentication timeouts by prioritizing bearer tokens and adding timeout protection for interactive authentication flows.
1. Bearer Token Authentication (Recommended for Claude Desktop)
Perfect for AI assistants and interactive usage:
For Claude Desktop:
- Visit the Power BI Embed Setup page
- Generate a bearer token for your workspace
- Add it to your claude_desktop_config.json
- No timeout issues - this bypasses interactive authentication entirely
For Testing:
# All test scripts will prompt for authentication method
python enhanced_auth_test.py
2. Service Principal Authentication (Recommended for Production)
Use Azure AD application credentials:
- Client ID (Application ID)
- Client Secret
- Tenant ID (Directory ID)
Environment Variables Setup:
export FABRIC_AUTH_METHOD="service_principal"
export FABRIC_CLIENT_ID="your-app-client-id"
export FABRIC_CLIENT_SECRET="your-app-client-secret"
export FABRIC_TENANT_ID="your-tenant-id"
export FABRIC_DEFAULT_WORKSPACE_ID="your-workspace-id"
Claude Desktop Configuration:
{
"mcpServers": {
"fabric-analytics": {
"command": "node",
"args": ["/path/to/build/index.js"],
"env": {
"FABRIC_AUTH_METHOD": "service_principal",
"FABRIC_CLIENT_ID": "your-client-id",
"FABRIC_CLIENT_SECRET": "your-client-secret",
"FABRIC_TENANT_ID": "your-tenant-id"
}
}
}
}
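Under the hood, service-principal auth is the standard MSAL client-credentials flow. Here is a standalone sketch of that flow with @azure/msal-node, using the usual Fabric API scope; the server's internal wiring may differ:

```typescript
import { ConfidentialClientApplication } from "@azure/msal-node";

const app = new ConfidentialClientApplication({
  auth: {
    clientId: process.env.FABRIC_CLIENT_ID!,
    clientSecret: process.env.FABRIC_CLIENT_SECRET!,
    authority: `https://login.microsoftonline.com/${process.env.FABRIC_TENANT_ID}`,
  },
});

// Client-credentials flow: no user interaction, suitable for production services.
const result = await app.acquireTokenByClientCredential({
  scopes: ["https://api.fabric.microsoft.com/.default"],
});
console.log("Token acquired, expires:", result?.expiresOn);
```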
3. Device Code Authentication
Sign in with a browser on another device (great for headless environments):
export FABRIC_AUTH_METHOD="device_code"
export FABRIC_CLIENT_ID="your-client-id"
export FABRIC_TENANT_ID="your-tenant-id"
4. Interactive Authentication
Automatic browser-based authentication:
export FABRIC_AUTH_METHOD="interactive"
export FABRIC_CLIENT_ID="your-client-id"
export FABRIC_TENANT_ID="your-tenant-id"
5. Azure CLI Authentication (Recommended for Local Development)
Use your existing Azure CLI login for seamless local testing:
export FABRIC_AUTH_METHOD="azure_cli"
Prerequisites:
- Install the Azure CLI: winget install Microsoft.AzureCLI (Windows), or download it from Microsoft
- Log in to Azure: az login
- Set the active subscription: az account set --subscription "your-subscription-name"
Benefits:
- Zero Configuration - Uses your existing Azure login
- Instant Setup - No app registration or client secrets needed
- Multi-Account Support - Switch Azure accounts easily
- Perfect for Development - Seamless local testing experience
Quick Test:
# Verify Azure CLI setup
npm run test:azure-cli
# Start the MCP server with Azure CLI auth (PowerShell)
$env:FABRIC_AUTH_METHOD="azure_cli"; npm start
Pro Tip: Azure CLI authentication is perfect for developers who want to quickly test the MCP server without complex Azure AD app setup. Just az login and you're ready to go!
Complete Authentication Setup
Detailed guides (in the repository docs):
- Complete Azure AD setup
- Ready-to-use configurations
Authentication Testing
Check your authentication status:
"Check my Fabric authentication status"
"What authentication method am I using?"
"Test my Microsoft Fabric authentication setup"
Security Best Practices
- Never commit authentication tokens to version control
- Use Service Principal authentication for production deployments
- Device Code flow is perfect for CI/CD and headless environments
- Interactive authentication is ideal for development and testing
- All tokens are automatically validated and include expiration checking
Note: The MCP server seamlessly handles token validation and provides clear error messages for authentication issues.
Azure Kubernetes Service (AKS) Deployment
Deploy the MCP server as a scalable service on Azure Kubernetes Service for enterprise production use.
Quick AKS Deployment
Prerequisites
- Azure CLI installed and configured
- Docker installed
- kubectl installed
- Azure subscription with AKS permissions
1. Build and Push Docker Image
# Build the Docker image
npm run docker:build
# Tag and push to Azure Container Registry
npm run docker:push
2. Deploy to AKS
# Create Azure resources and deploy
./scripts/deploy-to-aks.sh
3. Access the MCP Server
Once deployed, your MCP server will be available at:
https://your-aks-cluster.region.cloudapp.azure.com/mcp
Architecture Overview
The AKS deployment includes:
- Horizontal Pod Autoscaler (3-10 pods based on CPU/memory)
- Azure Load Balancer for high availability
- SSL/TLS termination with Azure Application Gateway
- ConfigMaps for environment configuration
- Secrets for secure credential storage
- Health checks and readiness probes
- Resource limits and quality of service guarantees
Deployment Files
All Kubernetes manifests are located in the /k8s directory:
- namespace.yaml - Dedicated namespace
- deployment.yaml - Application deployment with scaling
- service.yaml - Load balancer service
- ingress.yaml - External access and SSL
- configmap.yaml - Configuration management
- secret.yaml - Secure credential storage
- hpa.yaml - Horizontal Pod Autoscaler
Configuration
Configure the deployment by setting these environment variables:
export AZURE_SUBSCRIPTION_ID="your-subscription-id"
export AZURE_RESOURCE_GROUP="fabric-mcp-rg"
export AKS_CLUSTER_NAME="fabric-mcp-cluster"
export ACR_NAME="fabricmcpregistry"
export DOMAIN_NAME="your-domain.com"
Production Security
The AKS deployment includes enterprise-grade security:
- Non-root container execution
- Read-only root filesystem
- Secret management via Azure Key Vault integration
- Network policies for traffic isolation
- RBAC with minimal required permissions
- Pod security standards enforcement
Monitoring & Scaling
- Azure Monitor integration for logs and metrics
- Application Insights for performance monitoring
- Prometheus metrics endpoint for custom monitoring
- Auto-scaling based on CPU (70%) and memory (80%) thresholds
- Health checks for automatic pod restart
CI/CD Integration
The deployment scripts support:
- Azure DevOps pipelines
- GitHub Actions workflows
- Automated testing before deployment
- Blue-green deployments for zero downtime
- Rollback capabilities for quick recovery
Detailed Guide: See the repository's AKS deployment documentation for complete setup instructions.
Azure Model Context Protocol Server (Preview)
Microsoft Azure now offers a preview service for hosting MCP servers natively. This eliminates the need for custom infrastructure management.
Azure MCP Server Deployment
Prerequisites
- Azure subscription with MCP preview access
- Azure CLI with MCP extensions
Deploy to Azure MCP Service
# Login to Azure
az login
# Enable MCP preview features
az extension add --name mcp-preview
# Deploy the MCP server
az mcp server create \
--name "fabric-analytics-mcp" \
--resource-group "your-rg" \
--source-type "github" \
--repository "santhoshravindran7/Fabric-Analytics-MCP" \
--branch "main" \
--auth-method "service-principal"
Configure Authentication
# Set up service principal authentication
az mcp server config set \
--name "fabric-analytics-mcp" \
--setting "FABRIC_CLIENT_ID=your-client-id" \
--setting "FABRIC_CLIENT_SECRET=your-secret" \
--setting "FABRIC_TENANT_ID=your-tenant-id"
Access Your MCP Server
# Get the server endpoint
az mcp server show --name "fabric-analytics-mcp" --query "endpoint"
Azure MCP Server Features
- Automatic scaling based on usage
- Built-in monitoring and logging
- Integrated security with Azure AD
- Zero infrastructure management
- Global CDN for low latency
- Automatic SSL/TLS certificates
Cost Optimization
Azure MCP Server offers:
- Pay-per-request pricing model
- Automatic hibernation during idle periods
- Resource sharing across multiple clients
- No minimum infrastructure costs
Learn More: Azure MCP Server Documentation
Note: Azure MCP Server is currently in preview. Check Azure Preview Terms for service availability and limitations.
Architecture
This MCP server is built with:
- TypeScript for type-safe development
- MCP SDK for Model Context Protocol implementation
- Zod for schema validation and input sanitization
- Node.js runtime environment
Configuration
The server uses the following configuration files:
- tsconfig.json - TypeScript compiler configuration
- package.json - Node.js package configuration
- .vscode/mcp.json - MCP server configuration for VS Code
Development
Project Structure
├── src/
│   ├── index.ts           # Main MCP server implementation
│   └── fabric-client.ts   # Microsoft Fabric API client
├── build/                 # Compiled JavaScript output
├── tests/                 # Test scripts and notebooks
├── .vscode/               # VS Code configuration
├── package.json
├── tsconfig.json
└── README.md
Adding New Tools
To add new tools to the server:
- Define the input schema using Zod
- Implement the tool using server.tool() (see the sketch below)
- Add error handling and validation
- Update documentation
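A minimal sketch of what a new tool registration could look like. The tool name and logic are hypothetical; the pattern follows the MCP SDK's server.tool() API with a Zod parameter schema:

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

const server = new McpServer({ name: "fabric-analytics", version: "1.0.0" });

// Hypothetical tool: return a one-line summary for a workspace.
server.tool(
  "get-workspace-summary",
  {
    bearerToken: z.string().describe("Microsoft Fabric bearer token"),
    workspaceId: z.string().describe("Microsoft Fabric workspace ID"),
  },
  async ({ workspaceId }) => {
    try {
      // ...call the Fabric API here and build a real summary...
      return { content: [{ type: "text" as const, text: `Summary for ${workspaceId}` }] };
    } catch (err) {
      return { content: [{ type: "text" as const, text: `Error: ${err}` }], isError: true };
    }
  }
);
```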
API Integration
This server includes:
Production Ready:
- Full Microsoft Fabric Livy API integration
- Spark session lifecycle management
- Statement execution with SQL and Spark support
- Batch job management for long-running operations
- Comprehensive error handling and retry logic
- Real-time polling and result retrieval
Demonstration Features:
- CRUD operations (configurable for real APIs)
- Analytics and metrics (extensible framework)
- Data model analysis (template implementation)
Testing
End-to-End Testing
The MCP server includes comprehensive end-to-end testing that creates real workspaces, items, and jobs to validate complete functionality using Azure CLI authentication.
Quick Setup for E2E Testing
# 1. Set up end-to-end testing environment
npm run setup:e2e
# 2. Run the comprehensive end-to-end test
npm run test:e2e
What the E2E Test Does
The end-to-end test creates a complete workflow in your Microsoft Fabric tenant:
- Validates Azure CLI Authentication - Uses your existing az login session
- Creates a Test Workspace - New workspace with unique naming
- Attaches to Capacity - Links the workspace to your Fabric capacity (optional)
- Creates Notebooks & Lakehouses - Test items for validation
- Runs Real Jobs - Executes a notebook with actual Spark code
- Monitors Execution - Tracks job status and completion
- Cleans Up Resources - Removes all created test resources
E2E Test Configuration
The setup script creates a .env.e2e configuration file:
# Example configuration
FABRIC_CAPACITY_ID=your-capacity-id-here # Optional: for capacity testing
E2E_TEST_TIMEOUT=300000 # 5 minutes per operation
E2E_CLEANUP_ON_FAILURE=true # Clean up on test failure
E2E_RETRY_COUNT=3 # Retry failed operations
E2E Test Features
- Real Resource Creation - Creates actual Fabric workspaces and items
- Azure CLI Integration - Uses your existing Azure authentication
- Capacity Assignment - Tests workspace-to-capacity attachment
- Job Execution - Runs real Spark jobs and monitors completion
- Automatic Cleanup - Removes all test resources automatically
- Comprehensive Logging - Detailed logging of all operations
- Error Handling - Robust error handling and recovery
Prerequisites for E2E Testing
- Azure CLI installed and logged in: az login
- Microsoft Fabric access with permissions to:
  - Create workspaces
  - Create notebooks and lakehouses
  - Run Spark jobs
  - (Optional) Assign workspaces to capacity
- Fabric capacity (optional but recommended):
  - Set FABRIC_CAPACITY_ID in .env.e2e for capacity testing
  - Without a capacity, the workspace will use shared capacity
Running E2E Tests
# Complete setup and run
npm run setup:e2e && npm run test:e2e
# Or run individual steps
npm run setup:e2e # Set up environment
npm run test:e2e # Run end-to-end test
# Direct execution
node setup-e2e.cjs # Setup script
node test-end-to-end.cjs # Test script
E2E Test Output
The test provides comprehensive output including:
Starting End-to-End Test for Microsoft Fabric Analytics MCP Server
✅ MCP Server Startup (1234ms)
✅ Azure CLI Authentication
✅ Workspace Creation
✅ Capacity Attachment
✅ Notebook Creation
✅ Lakehouse Creation
✅ Item Validation
✅ Job Execution

TEST SUMMARY
================
✅ MCP Server Startup (2341ms)
✅ Azure CLI Authentication
✅ Workspace Creation
✅ Capacity Attachment
✅ Notebook Creation
✅ Lakehouse Creation
✅ Item Validation
✅ Job Execution
Total: 8 | Passed: 8 | Failed: 0
Important Notes for E2E Testing
- Creates Real Resources: The test creates actual workspaces and items in your Fabric tenant
- Requires Permissions: Ensure you have necessary Fabric permissions
- Uses Capacity: Jobs may consume capacity units if using dedicated capacity
- Automatic Cleanup: All resources are automatically deleted after testing
- Network Dependent: Requires stable internet connection for API calls
Unit & Integration Testing
Prerequisites
# Install Python dependencies for API testing
pip install -r livy_requirements.txt
Available Test Scripts
- livy_api_test.ipynb - Interactive Jupyter notebook for step-by-step testing
- comprehensive_livy_test.py - Full-featured test with error handling
- simple_livy_test.py - Simple test following example patterns
- livy_batch_test.py - Batch job testing capabilities
- spark_monitoring_test.py - Spark application monitoring tests
Quick Testing
- Interactive Testing:
  jupyter notebook livy_api_test.ipynb
- Command-Line Testing:
  python simple_livy_test.py
  python spark_monitoring_test.py
- Comprehensive Testing:
  python comprehensive_livy_test.py
Contributing
We welcome contributions! Here's how to get started:
- Fork the repository
- Create a feature branch (git checkout -b feature/amazing-feature)
- Make your changes and add tests if applicable
- Commit your changes (git commit -m 'Add amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
Development Guidelines
- Follow TypeScript best practices
- Add JSDoc comments for new functions
- Update tests for any new functionality
- Update documentation as needed
- See the contributing guide for detailed guidelines
Security
- Never commit authentication tokens to version control
- Use environment variables for sensitive configuration
- Follow Microsoft Fabric security best practices
- Report security issues privately via GitHub security advisories
- See the security policy for full details
License
This project is licensed under the MIT License - see the LICENSE file for details.
Support
For issues and questions:
- Check the MCP documentation
- Review the Microsoft Fabric API documentation
- Open an issue in this repository
- Join the community discussions
Acknowledgments
- Microsoft Fabric Analytics team for the comprehensive data platform and analytics capabilities
- Microsoft Fabric Platform teams for the robust API platform and infrastructure
- Bogdan Crivat and Chris Finlan for the inspiring brainstorming conversation that gave me the idea to open-source this project
- Anthropic for the Model Context Protocol specification
This project began as a weekend hack exploring AI integration with Microsoft Fabric. The idea to open-source it came out of a casual conversation with Chris and Bogdan about making AI tooling more accessible, and what started as a personal experiment over a weekend is now available for everyone to build upon.