Microsoft Fabric Analytics MCP Server
A comprehensive Model Context Protocol (MCP) server that provides analytics capabilities and tools for interacting with Microsoft Fabric data platform. This server enables AI assistants like Claude to seamlessly access, analyze, and monitor Microsoft Fabric resources through standardized MCP protocols, bringing the power of Microsoft Fabric directly to your AI conversations.
Table of Contents
- Key Features
- Quick Start
- Tools & Capabilities
- Development & Testing
- Example Queries
- Authentication
- Architecture
- Configuration
- Contributing
- Security
- License
- Support
Key Features
- Complete CRUD Operations - Create, read, update, and delete Fabric items
- Notebook Management - Create, execute, and manage Fabric notebooks with templates
- Livy API Integration - Full Spark session and batch job management
- Spark Application Monitoring - Real-time monitoring across workspaces and items
- Claude Desktop Ready - Plug-and-play integration with Claude Desktop
- Enterprise Authentication - Multiple auth methods (Bearer Token, Service Principal, Device Code, Interactive)
- MSAL Integration - Microsoft Authentication Library for secure enterprise access
- Analytics & Insights - Generate comprehensive monitoring dashboards
- Comprehensive Testing - Extensive test suite with real API validation
- Token Management - Automatic token validation and expiration handling
- Enterprise Deployment - Full Kubernetes and Azure deployment support
- Docker Support - Containerized deployment with health checks
- Monitoring & Observability - Built-in Prometheus metrics and Grafana dashboards
- Azure MCP Server - Native Azure hosting option (preview)
Deployment Options
Claude Desktop Integration
Recommended for AI Assistant Usage:
{
  "mcpServers": {
    "fabric-analytics": {
      "command": "node",
      "args": ["C:\\path\\to\\your\\build\\index.js"],
      "cwd": "C:\\path\\to\\your\\project",
      "env": {
        "FABRIC_AUTH_METHOD": "bearer_token",
        "FABRIC_TOKEN": "your_bearer_token_here",
        "FABRIC_WORKSPACE_ID": "your_workspace_id",
        "ENABLE_HEALTH_SERVER": "false"
      }
    }
  }
}
Get a Bearer Token: visit Power BI Embed Setup to generate tokens.
Important: tokens expire after ~1 hour and need to be refreshed.
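Because tokens are short-lived, it can help to check how long one has left before a call fails. A minimal sketch in plain Python (no external libraries) that reads the `exp` claim from a JWT-style bearer token without verifying the signature; the helper name is illustrative, not part of this server:

```python
import base64
import json
import time

def token_seconds_remaining(token: str) -> float:
    """Decode the JWT payload (second dot-separated segment) and
    return seconds until the 'exp' claim. No signature check."""
    payload_b64 = token.split(".")[1]
    # Restore base64url padding stripped from the token
    payload_b64 += "=" * (-len(payload_b64) % 4)
    claims = json.loads(base64.urlsafe_b64decode(payload_b64))
    return claims["exp"] - time.time()

# Build a fake token to demonstrate (header.payload.signature)
fake_payload = base64.urlsafe_b64encode(
    json.dumps({"exp": int(time.time()) + 3600}).encode()
).decode().rstrip("=")
fake_token = f"eyJhbGciOiJub25lIn0.{fake_payload}.sig"
print(token_seconds_remaining(fake_token) > 3500)  # → True
```

A check like this can trigger a token refresh proactively instead of waiting for a 401 from the API.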
Claude Desktop Authentication Fix
If you experience 60-second timeouts during startup, interactive authentication flows are blocking inside Claude Desktop's sandboxed environment. Solutions:
1. Use the Bearer Token method (recommended):
   - Set `FABRIC_AUTH_METHOD: "bearer_token"` in your config
   - Provide `FABRIC_TOKEN` with a valid bearer token
   - This bypasses interactive authentication entirely
2. Alternative - per-tool authentication:
   - Provide the token directly in tool calls: `bearerToken: "your_token_here"`
   - Or use simulation mode: `bearerToken: "simulation"`
3. Troubleshooting:
   - The server has 10-second timeout protection to prevent hanging
   - It falls back to simulation mode if authentication fails
   - Enhanced error messages provide clear guidance
Quick Fix: the server automatically prioritizes the `FABRIC_TOKEN` environment variable over interactive authentication flows, preventing Claude Desktop timeouts.
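The priority described above can be pictured as a small selection function. This is an illustrative sketch of the documented behavior, not the server's actual code:

```python
def pick_auth_method(env: dict) -> str:
    """Mimic the documented priority: an explicit FABRIC_TOKEN wins,
    otherwise fall back to the configured method, else simulation."""
    if env.get("FABRIC_TOKEN"):
        return "bearer_token"
    method = env.get("FABRIC_AUTH_METHOD")
    if method in ("service_principal", "device_code", "interactive"):
        return method
    return "simulation"

print(pick_auth_method({"FABRIC_TOKEN": "abc", "FABRIC_AUTH_METHOD": "interactive"}))  # → bearer_token
print(pick_auth_method({"FABRIC_AUTH_METHOD": "service_principal"}))  # → service_principal
print(pick_auth_method({}))  # → simulation
```

Note that a present `FABRIC_TOKEN` overrides even an explicitly configured interactive method, which is what prevents the sandbox timeout.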
Local Development
# Clone and run locally
git clone https://github.com/santhoshravindran7/Fabric-Analytics-MCP.git
cd Fabric-Analytics-MCP
npm install && npm run build && npm start
Docker Deployment
# Using Docker Compose
docker-compose up -d
# Or standalone Docker
docker build -t fabric-analytics-mcp .
docker run -p 3000:3000 -e FABRIC_CLIENT_ID=xxx fabric-analytics-mcp
Azure Kubernetes Service (AKS)
# One-command enterprise deployment
export ACR_NAME="your-registry" FABRIC_CLIENT_ID="xxx" FABRIC_CLIENT_SECRET="yyy" FABRIC_TENANT_ID="zzz"
./scripts/setup-azure-resources.sh && ./scripts/build-and-push.sh && ./scripts/deploy-to-aks.sh
Azure MCP Server (Preview)
# Serverless deployment on Azure
az mcp server create --name "fabric-analytics-mcp" --repository "santhoshravindran7/Fabric-Analytics-MCP"
Tools & Capabilities
CRUD Operations for Fabric Items
- Tool: `list-fabric-items`
  - Description: List items in a Microsoft Fabric workspace (Lakehouses, Notebooks, etc.)
  - Parameters:
    - `bearerToken`: Microsoft Fabric bearer token
    - `workspaceId`: Microsoft Fabric workspace ID
    - `itemType`: Filter by item type (optional)
- Tool: `create-fabric-item`
  - Description: Create new items in a Microsoft Fabric workspace
  - Parameters:
    - `bearerToken`: Microsoft Fabric bearer token
    - `workspaceId`: Microsoft Fabric workspace ID
    - `itemType`: Type of item (Lakehouse, Notebook, Dataset, Report, Dashboard)
    - `displayName`: Display name for the new item
    - `description`: Optional description
- Tool: `get-fabric-item`
  - Description: Get detailed information about a specific Microsoft Fabric item
  - Parameters:
    - `bearerToken`: Microsoft Fabric bearer token
    - `workspaceId`: Microsoft Fabric workspace ID
    - `itemId`: ID of the item to retrieve
- Tool: `update-fabric-item`
  - Description: Update existing items in a Microsoft Fabric workspace
  - Parameters:
    - `bearerToken`: Microsoft Fabric bearer token
    - `workspaceId`: Microsoft Fabric workspace ID
    - `itemId`: ID of the item to update
    - `displayName`: New display name (optional)
    - `description`: New description (optional)
- Tool: `delete-fabric-item`
  - Description: Delete items from a Microsoft Fabric workspace
  - Parameters:
    - `bearerToken`: Microsoft Fabric bearer token
    - `workspaceId`: Microsoft Fabric workspace ID
    - `itemId`: ID of the item to delete
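Under the hood, these CRUD tools map onto the Microsoft Fabric REST API. As a rough sketch, a list-items request can be composed like this (the endpoint path follows the public Fabric REST API's workspace items collection; the exact requests this server issues may differ):

```python
from urllib.parse import urlencode

# Public Microsoft Fabric REST API base URL
FABRIC_API = "https://api.fabric.microsoft.com/v1"

def build_list_items_request(bearer_token, workspace_id, item_type=None):
    """Compose URL and headers for a workspace item listing call."""
    url = f"{FABRIC_API}/workspaces/{workspace_id}/items"
    if item_type:
        # Optional filter, matching the tool's itemType parameter
        url += "?" + urlencode({"type": item_type})
    headers = {"Authorization": f"Bearer {bearer_token}"}
    return url, headers

url, headers = build_list_items_request("token123", "ws-42", "Lakehouse")
print(url)      # → https://api.fabric.microsoft.com/v1/workspaces/ws-42/items?type=Lakehouse
print(headers["Authorization"])  # → Bearer token123
```

The returned URL and headers would then be passed to any HTTP client; no request is sent in this sketch.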
Query Fabric Dataset (Enhanced)
- Tool: `query-fabric-dataset`
- Description: Execute SQL or KQL queries against Microsoft Fabric datasets
- Parameters:
  - `bearerToken`: Microsoft Fabric bearer token (optional - uses simulation if not provided)
  - `workspaceId`: Microsoft Fabric workspace ID
  - `datasetName`: Name of the dataset to query
  - `query`: SQL or KQL query to execute
Execute Fabric Notebook
- Tool: `execute-fabric-notebook`
- Description: Execute a notebook in a Microsoft Fabric workspace
- Parameters:
  - `bearerToken`: Microsoft Fabric bearer token
  - `workspaceId`: Microsoft Fabric workspace ID
  - `notebookId`: ID of the notebook to execute
  - `parameters`: Optional parameters to pass to the notebook
Get Analytics Metrics
- Tool: `get-fabric-metrics`
- Description: Retrieve performance and usage metrics for Microsoft Fabric items
- Parameters:
  - `workspaceId`: Microsoft Fabric workspace ID
  - `itemId`: Item ID (dataset, report, etc.)
  - `timeRange`: Time range for metrics (1h, 24h, 7d, 30d)
  - `metrics`: List of metrics to analyze
Analyze Data Model
- Tool: `analyze-fabric-model`
- Description: Analyze a Microsoft Fabric data model and get optimization recommendations
- Parameters:
  - `workspaceId`: Microsoft Fabric workspace ID
  - `itemId`: Item ID to analyze
Generate Analytics Report
- Tool: `generate-fabric-report`
- Description: Generate comprehensive analytics reports for Microsoft Fabric workspaces
- Parameters:
  - `workspaceId`: Microsoft Fabric workspace ID
  - `reportType`: Type of report (performance, usage, health, summary)
Livy API Integration (Sessions & Batch Jobs)
Session Management
- Tool: `create-livy-session`
  - Description: Create a new Livy session for interactive Spark/SQL execution
  - Parameters:
    - `bearerToken`: Microsoft Fabric bearer token
    - `workspaceId`: Microsoft Fabric workspace ID
    - `lakehouseId`: Microsoft Fabric lakehouse ID
    - `sessionConfig`: Optional session configuration
- Tool: `get-livy-session`
  - Description: Get details of a Livy session
  - Parameters:
    - `bearerToken`: Microsoft Fabric bearer token
    - `workspaceId`: Microsoft Fabric workspace ID
    - `lakehouseId`: Microsoft Fabric lakehouse ID
    - `sessionId`: Livy session ID
- Tool: `list-livy-sessions`
  - Description: List all Livy sessions in a lakehouse
  - Parameters:
    - `bearerToken`: Microsoft Fabric bearer token
    - `workspaceId`: Microsoft Fabric workspace ID
    - `lakehouseId`: Microsoft Fabric lakehouse ID
- Tool: `delete-livy-session`
  - Description: Delete a Livy session
  - Parameters:
    - `bearerToken`: Microsoft Fabric bearer token
    - `workspaceId`: Microsoft Fabric workspace ID
    - `lakehouseId`: Microsoft Fabric lakehouse ID
    - `sessionId`: Livy session ID
Statement Execution
- Tool: `execute-livy-statement`
  - Description: Execute SQL or Spark statements in a Livy session
  - Parameters:
    - `bearerToken`: Microsoft Fabric bearer token
    - `workspaceId`: Microsoft Fabric workspace ID
    - `lakehouseId`: Microsoft Fabric lakehouse ID
    - `sessionId`: Livy session ID
    - `code`: SQL or Spark code to execute
    - `kind`: Statement type (sql, spark, etc.)
- Tool: `get-livy-statement`
  - Description: Get status and results of a Livy statement
  - Parameters:
    - `bearerToken`: Microsoft Fabric bearer token
    - `workspaceId`: Microsoft Fabric workspace ID
    - `lakehouseId`: Microsoft Fabric lakehouse ID
    - `sessionId`: Livy session ID
    - `statementId`: Statement ID
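A typical interactive flow pairs `execute-livy-statement` with repeated `get-livy-statement` calls until the statement leaves the running state. The polling pattern, sketched against a stubbed client (`FakeLivyClient` and its return shape are illustrative stand-ins, not this server's API):

```python
import itertools

class FakeLivyClient:
    """Stand-in for the real Livy calls: reports 'running' twice,
    then a finished statement with a result."""
    def __init__(self):
        self._states = itertools.chain(["running", "running"],
                                       itertools.repeat("available"))
    def get_statement(self, session_id, statement_id):
        state = next(self._states)
        result = {"data": [[42]]} if state == "available" else None
        return {"id": statement_id, "state": state, "output": result}

def wait_for_statement(client, session_id, statement_id, max_polls=10):
    """Poll until the statement is no longer running (no sleep in this demo)."""
    for _ in range(max_polls):
        status = client.get_statement(session_id, statement_id)
        if status["state"] != "running":
            return status
    raise TimeoutError("statement did not finish")

final = wait_for_statement(FakeLivyClient(), session_id=7, statement_id=0)
print(final["state"])           # → available
print(final["output"]["data"])  # → [[42]]
```

In a real loop you would add a short sleep between polls and handle error states alongside the success state.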
Batch Job Management
- Tool: `create-livy-batch`
  - Description: Create a new Livy batch job for long-running operations
  - Parameters:
    - `bearerToken`: Microsoft Fabric bearer token
    - `workspaceId`: Microsoft Fabric workspace ID
    - `lakehouseId`: Microsoft Fabric lakehouse ID
    - `batchConfig`: Batch job configuration
- Tool: `get-livy-batch`
  - Description: Get details of a Livy batch job
  - Parameters:
    - `bearerToken`: Microsoft Fabric bearer token
    - `workspaceId`: Microsoft Fabric workspace ID
    - `lakehouseId`: Microsoft Fabric lakehouse ID
    - `batchId`: Batch job ID
- Tool: `list-livy-batches`
  - Description: List all Livy batch jobs in a lakehouse
  - Parameters:
    - `bearerToken`: Microsoft Fabric bearer token
    - `workspaceId`: Microsoft Fabric workspace ID
    - `lakehouseId`: Microsoft Fabric lakehouse ID
- Tool: `delete-livy-batch`
  - Description: Delete a Livy batch job
  - Parameters:
    - `bearerToken`: Microsoft Fabric bearer token
    - `workspaceId`: Microsoft Fabric workspace ID
    - `lakehouseId`: Microsoft Fabric lakehouse ID
    - `batchId`: Batch job ID
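The `batchConfig` parameter follows the shape of a standard Livy batch submission. A hedged example of what such a configuration might look like — field names follow the open-source Apache Livy batch API, the file path is a placeholder, and the exact schema this server accepts may differ:

```python
import json

# Illustrative Livy-style batch configuration: main file, arguments,
# and Spark conf overrides for a long-running job.
batch_config = {
    "name": "nightly-aggregation",
    "file": "path/to/your/jobs/aggregate.py",   # placeholder location
    "args": ["--date", "2024-01-01"],
    "conf": {"spark.executor.memory": "4g", "spark.executor.cores": "2"},
}
print(json.dumps(batch_config, indent=2))
```

This dictionary would be passed as the `batchConfig` argument of `create-livy-batch`.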
Spark Application Monitoring
Workspace-Level Monitoring
- Tool: `get-workspace-spark-applications`
- Description: Get all Spark applications in a Microsoft Fabric workspace
- Parameters:
  - `bearerToken`: Microsoft Fabric bearer token
  - `workspaceId`: Microsoft Fabric workspace ID
  - `continuationToken`: Optional token for pagination
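The `continuationToken` follows the usual cursor-pagination pattern: call, collect the page, and repeat while a token is returned. Sketched here against a stubbed fetch function (the function and response fields are illustrative, not the server's actual API):

```python
def fetch_page(continuation_token=None):
    """Stub standing in for get-workspace-spark-applications:
    serves two pages of results."""
    pages = {
        None: {"applications": ["app-1", "app-2"], "continuationToken": "page2"},
        "page2": {"applications": ["app-3"], "continuationToken": None},
    }
    return pages[continuation_token]

def fetch_all_applications():
    """Follow continuation tokens until the cursor is exhausted."""
    apps, token = [], None
    while True:
        page = fetch_page(token)
        apps.extend(page["applications"])
        token = page.get("continuationToken")
        if not token:
            return apps

print(fetch_all_applications())  # → ['app-1', 'app-2', 'app-3']
```

The same loop shape applies to every monitoring tool below that accepts a `continuationToken`.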
Item-Specific Monitoring
- Tool: `get-notebook-spark-applications`
  - Description: Get all Spark applications for a specific notebook
  - Parameters:
    - `bearerToken`: Microsoft Fabric bearer token
    - `workspaceId`: Microsoft Fabric workspace ID
    - `notebookId`: Notebook ID
    - `continuationToken`: Optional token for pagination
- Tool: `get-lakehouse-spark-applications`
  - Description: Get all Spark applications for a specific lakehouse
  - Parameters:
    - `bearerToken`: Microsoft Fabric bearer token
    - `workspaceId`: Microsoft Fabric workspace ID
    - `lakehouseId`: Lakehouse ID
    - `continuationToken`: Optional token for pagination
- Tool: `get-spark-job-definition-applications`
  - Description: Get all Spark applications for a specific Spark Job Definition
  - Parameters:
    - `bearerToken`: Microsoft Fabric bearer token
    - `workspaceId`: Microsoft Fabric workspace ID
    - `sparkJobDefinitionId`: Spark Job Definition ID
    - `continuationToken`: Optional token for pagination
Application Management
- Tool: `get-spark-application-details`
  - Description: Get detailed information about a specific Spark application
  - Parameters:
    - `bearerToken`: Microsoft Fabric bearer token
    - `workspaceId`: Microsoft Fabric workspace ID
    - `livyId`: Livy session ID
- Tool: `cancel-spark-application`
  - Description: Cancel a running Spark application
  - Parameters:
    - `bearerToken`: Microsoft Fabric bearer token
    - `workspaceId`: Microsoft Fabric workspace ID
    - `livyId`: Livy session ID
Monitoring Dashboard
- Tool: `get-spark-monitoring-dashboard`
- Description: Generate a comprehensive monitoring dashboard with analytics
- Parameters:
  - `bearerToken`: Microsoft Fabric bearer token
  - `workspaceId`: Microsoft Fabric workspace ID
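Conceptually, the dashboard tool reduces raw application records to summary analytics. A toy aggregation over sample data — the record fields here are assumptions for illustration, not the server's actual schema:

```python
from collections import Counter

# Sample Spark application records (fields are illustrative)
applications = [
    {"name": "etl-daily", "state": "Completed", "durationSec": 420},
    {"name": "ml-train", "state": "Failed", "durationSec": 95},
    {"name": "adhoc-query", "state": "Running", "durationSec": 30},
    {"name": "etl-daily", "state": "Completed", "durationSec": 380},
]

def summarize(apps):
    """Roll up application records into dashboard-style summary stats."""
    states = Counter(a["state"] for a in apps)
    completed = [a["durationSec"] for a in apps if a["state"] == "Completed"]
    return {
        "total": len(apps),
        "byState": dict(states),
        "avgCompletedDurationSec": sum(completed) / len(completed) if completed else 0,
    }

print(summarize(applications))
# → {'total': 4, 'byState': {'Completed': 2, 'Failed': 1, 'Running': 1}, 'avgCompletedDurationSec': 400.0}
```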
Notebook Management
The MCP server provides comprehensive notebook management capabilities with predefined templates and custom notebook support.
Create Notebook from Template
- Tool: `create-fabric-notebook`
- Description: Create new Fabric notebooks from predefined templates or custom definitions
- Parameters:
  - `bearerToken`: Microsoft Fabric bearer token
  - `workspaceId`: Microsoft Fabric workspace ID
  - `displayName`: Display name for the new notebook
  - `template`: Template type (blank, sales_analysis, nyc_taxi_analysis, data_exploration, machine_learning, custom)
  - `customNotebook`: Custom notebook definition (required if template is 'custom')
  - `environmentId`: Optional environment ID to attach
  - `lakehouseId`: Optional default lakehouse ID
  - `lakehouseName`: Optional default lakehouse name
Available Templates:
- blank: Basic notebook with minimal setup
- sales_analysis: Comprehensive sales data analysis with sample dataset
- nyc_taxi_analysis: NYC taxi trip data analysis with sample dataset
- data_exploration: Structured data exploration template
- machine_learning: Complete ML workflow template
- custom: Use your own notebook definition
Get Notebook Definition
- Tool: `get-fabric-notebook-definition`
- Description: Retrieve the notebook definition (cells, metadata) from an existing Fabric notebook
- Parameters:
  - `bearerToken`: Microsoft Fabric bearer token
  - `workspaceId`: Microsoft Fabric workspace ID
  - `notebookId`: ID of the notebook to retrieve
  - `format`: Format to return (ipynb or fabricGitSource)
Update Notebook Definition
- Tool: `update-fabric-notebook-definition`
- Description: Update the notebook definition (cells, metadata) of an existing Fabric notebook
- Parameters:
  - `bearerToken`: Microsoft Fabric bearer token
  - `workspaceId`: Microsoft Fabric workspace ID
  - `notebookId`: ID of the notebook to update
  - `notebookDefinition`: Updated notebook definition object
Execute Notebook
- Tool: `run-fabric-notebook`
- Description: Execute a Fabric notebook on demand with optional parameters and configuration
- Parameters:
  - `bearerToken`: Microsoft Fabric bearer token
  - `workspaceId`: Microsoft Fabric workspace ID
  - `notebookId`: ID of the notebook to run
  - `parameters`: Optional notebook parameters (key-value pairs with types)
  - `configuration`: Optional execution configuration (environment, lakehouse, pools, etc.)
Features:
- Base64-encoded notebook payload support
- Comprehensive metadata management
- Environment and lakehouse integration
- Parameterized notebook execution
- Spark configuration support
- Support for multiple programming languages (Python, Scala, SQL, R)
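The base64 payload support mentioned above means a notebook definition travels as an encoded ipynb document. A minimal sketch of round-tripping a one-cell notebook — the ipynb fields follow the standard Jupyter notebook format, while the transport wrapper is simplified for illustration:

```python
import base64
import json

# A minimal one-cell ipynb document (Jupyter notebook format v4)
notebook = {
    "nbformat": 4,
    "nbformat_minor": 5,
    "metadata": {"language_info": {"name": "python"}},
    "cells": [{"cell_type": "code", "source": ["print('hello fabric')"],
               "metadata": {}, "outputs": [], "execution_count": None}],
}

# Encode for transport, as an item-definition payload would be
payload = base64.b64encode(json.dumps(notebook).encode()).decode()

# Decode on the other side and recover the cell source
decoded = json.loads(base64.b64decode(payload))
print(decoded["cells"][0]["source"][0])  # → print('hello fabric')
```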
Quick Start
Prerequisites
- Node.js 18+ and npm
- Microsoft Fabric workspace access
- Claude Desktop (for AI integration)
Installation & Setup

1. Clone and Install

   git clone https://github.com/santhoshravindran7/Fabric-Analytics-MCP.git
   cd Fabric-Analytics-MCP
   npm install
   npm run build

   Note: all essential configuration files (`tsconfig.json`, `jest.config.json`, etc.) are included in the repository; previous build issues have been resolved.

2. Configure Claude Desktop

   Add to your Claude Desktop config:
   - Windows: `%APPDATA%\Claude\claude_desktop_config.json`
   - macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`

   {
     "mcpServers": {
       "fabric-analytics": {
         "command": "node",
         "args": ["/ABSOLUTE/PATH/TO/PROJECT/build/index.js"]
       }
     }
   }

3. Start Using

   Restart Claude Desktop and try these queries:
   - "List all items in my Fabric workspace [your-workspace-id]"
   - "Create a new lakehouse called 'Analytics Hub'"
   - "Show me all running Spark applications"
   - "Execute this SQL query: SELECT * FROM my_table LIMIT 10"
Development & Testing
Running the Server
npm start # Production mode
npm run dev # Development mode with auto-reload
Testing Livy API Integration
For comprehensive testing of Spark functionality, install the Python dependencies:
pip install -r livy_requirements.txt
Available Test Scripts:
- `livy_api_test.ipynb` - Interactive Jupyter notebook for step-by-step testing
- `comprehensive_livy_test.py` - Full-featured test with error handling
- `simple_livy_test.py` - Simple test following example patterns
- `livy_batch_test.py` - Batch job testing capabilities
- `livy_setup.py` - Quick setup and configuration helper
- `spark_monitoring_test.py` - Spark application monitoring tests
- `mcp_spark_monitoring_demo.py` - MCP server integration demo
Claude Desktop Integration
Add this configuration to your Claude Desktop config file:
Windows: %APPDATA%\Claude\claude_desktop_config.json
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
{
  "mcpServers": {
    "fabric-analytics": {
      "command": "node",
      "args": ["/ABSOLUTE/PATH/TO/PROJECT/build/index.js"]
    }
  }
}
You're ready! Restart Claude Desktop and start asking questions about your Microsoft Fabric data.
Example Queries
Once connected to Claude Desktop, you can ask natural language questions like:
CRUD Operations:
- "List all Lakehouses in my workspace"
- "Create a new Notebook called 'Data Analysis'"
- "Update the description of my lakehouse"
- "Delete the test notebook from my workspace"
Notebook Management:
- "Create a sales analysis notebook with sample data"
- "Generate a new NYC taxi analysis notebook"
- "Create a machine learning notebook template"
- "Get the definition of my existing notebook"
- "Run my notebook with specific parameters"
- "Update my notebook with new cells"
Data Operations:
- "Query the sales dataset to get total revenue by region"
- "Execute my analytics notebook with today's date"
Analytics:
- "Get performance metrics for the last 24 hours"
- "Analyze my data model and provide optimization recommendations"
- "Generate a usage report for my workspace"
Livy API Operations:
- "Create a Livy session for interactive Spark analysis"
- "Execute SQL query 'SELECT * FROM my_table LIMIT 10'"
- "Run Spark code to show all tables"
- "Monitor my batch job progress"
Spark Application Monitoring:
- "Show me all Spark applications in my workspace"
- "What's the status of my notebook Spark jobs?"
- "Generate a comprehensive Spark monitoring dashboard"
- "Show me recent failed applications"
- "Cancel the problematic Spark application"
Authentication
This MCP server supports multiple authentication methods powered by Microsoft Authentication Library (MSAL):
For Claude Desktop: use Bearer Token Authentication (method 1) for the best experience and compatibility.
Claude Desktop Fix: recent updates prevent authentication timeouts by prioritizing bearer tokens and adding timeout protection for interactive authentication flows.
1. Bearer Token Authentication (Recommended for Claude Desktop)
Perfect for AI assistants and interactive usage:
For Claude Desktop:
- Visit Power BI Embed Setup
- Generate a bearer token for your workspace
- Add it to your `claude_desktop_config.json`
- No timeout issues - this bypasses interactive authentication entirely
For Testing:
# All test scripts will prompt for authentication method
python enhanced_auth_test.py
2. Service Principal Authentication (Recommended for Production)
Use Azure AD application credentials:
- Client ID (Application ID)
- Client Secret
- Tenant ID (Directory ID)
Environment Variables Setup:
export FABRIC_AUTH_METHOD="service_principal"
export FABRIC_CLIENT_ID="your-app-client-id"
export FABRIC_CLIENT_SECRET="your-app-client-secret"
export FABRIC_TENANT_ID="your-tenant-id"
export FABRIC_DEFAULT_WORKSPACE_ID="your-workspace-id"
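For reference, the service-principal flow exchanges these values for a token at the Azure AD v2 token endpoint using the OAuth2 client-credentials grant. A sketch of the request that MSAL performs on your behalf — values are placeholders, nothing is sent here, and the scope shown is the commonly used Fabric default scope:

```python
from urllib.parse import urlencode

def client_credentials_request(tenant_id, client_id, client_secret):
    """Build the Azure AD v2 client-credentials token request (not sent)."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # .default scope requests all app permissions granted to the principal
        "scope": "https://api.fabric.microsoft.com/.default",
    })
    return url, body

url, body = client_credentials_request("my-tenant", "my-client", "my-secret")
print(url)  # → https://login.microsoftonline.com/my-tenant/oauth2/v2.0/token
print("grant_type=client_credentials" in body)  # → True
```

In practice the server delegates this exchange to MSAL, which also caches and refreshes the resulting token.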
Claude Desktop Configuration:
{
  "mcpServers": {
    "fabric-analytics": {
      "command": "node",
      "args": ["/path/to/build/index.js"],
      "env": {
        "FABRIC_AUTH_METHOD": "service_principal",
        "FABRIC_CLIENT_ID": "your-client-id",
        "FABRIC_CLIENT_SECRET": "your-client-secret",
        "FABRIC_TENANT_ID": "your-tenant-id"
      }
    }
  }
}
3. Device Code Authentication
Sign in with browser on another device (great for headless environments):
export FABRIC_AUTH_METHOD="device_code"
export FABRIC_CLIENT_ID="your-client-id"
export FABRIC_TENANT_ID="your-tenant-id"
4. Interactive Authentication
Automatic browser-based authentication:
export FABRIC_AUTH_METHOD="interactive"
export FABRIC_CLIENT_ID="your-client-id"
export FABRIC_TENANT_ID="your-tenant-id"
Complete Authentication Setup
Detailed Guides:
- Complete Azure AD setup
- Ready-to-use configurations
Authentication Testing
Check your authentication status:
"Check my Fabric authentication status"
"What authentication method am I using?"
"Test my Microsoft Fabric authentication setup"
Security Best Practices
- Never commit authentication tokens to version control
- Use Service Principal authentication for production deployments
- Device Code flow is perfect for CI/CD and headless environments
- Interactive authentication is ideal for development and testing
- All tokens are automatically validated and include expiration checking
Note: The MCP server seamlessly handles token validation and provides clear error messages for authentication issues.
Azure Kubernetes Service (AKS) Deployment
Deploy the MCP server as a scalable service on Azure Kubernetes Service for enterprise production use.
Quick AKS Deployment
Prerequisites
- Azure CLI installed and configured
- Docker installed
- kubectl installed
- Azure subscription with AKS permissions
1. Build and Push Docker Image
# Build the Docker image
npm run docker:build
# Tag and push to Azure Container Registry
npm run docker:push
2. Deploy to AKS
# Create Azure resources and deploy
./scripts/deploy-to-aks.sh
3. Access the MCP Server
Once deployed, your MCP server will be available at:
https://your-aks-cluster.region.cloudapp.azure.com/mcp
Architecture Overview
The AKS deployment includes:
- Horizontal Pod Autoscaler (3-10 pods based on CPU/memory)
- Azure Load Balancer for high availability
- SSL/TLS termination with Azure Application Gateway
- ConfigMaps for environment configuration
- Secrets for secure credential storage
- Health checks and readiness probes
- Resource limits and quality of service guarantees
Deployment Files
All Kubernetes manifests are located in the `/k8s` directory:
- `namespace.yaml` - Dedicated namespace
- `deployment.yaml` - Application deployment with scaling
- `service.yaml` - Load balancer service
- `ingress.yaml` - External access and SSL
- `configmap.yaml` - Configuration management
- `secret.yaml` - Secure credential storage
- `hpa.yaml` - Horizontal Pod Autoscaler
Configuration
Configure the deployment by setting these environment variables:
export AZURE_SUBSCRIPTION_ID="your-subscription-id"
export AZURE_RESOURCE_GROUP="fabric-mcp-rg"
export AKS_CLUSTER_NAME="fabric-mcp-cluster"
export ACR_NAME="fabricmcpregistry"
export DOMAIN_NAME="your-domain.com"
Production Security
The AKS deployment includes enterprise-grade security:
- Non-root container execution
- Read-only root filesystem
- Secret management via Azure Key Vault integration
- Network policies for traffic isolation
- RBAC with minimal required permissions
- Pod security standards enforcement
Monitoring & Scaling
- Azure Monitor integration for logs and metrics
- Application Insights for performance monitoring
- Prometheus metrics endpoint for custom monitoring
- Auto-scaling based on CPU (70%) and memory (80%) thresholds
- Health checks for automatic pod restart
CI/CD Integration
The deployment scripts support:
- Azure DevOps pipelines
- GitHub Actions workflows
- Automated testing before deployment
- Blue-green deployments for zero downtime
- Rollback capabilities for quick recovery
Detailed Guide: see the dedicated deployment guide for complete setup instructions.
Azure Model Context Protocol Server (Preview)
Microsoft Azure now offers a preview service for hosting MCP servers natively. This eliminates the need for custom infrastructure management.
Azure MCP Server Deployment
Prerequisites
- Azure subscription with MCP preview access
- Azure CLI with MCP extensions
Deploy to Azure MCP Service
# Login to Azure
az login
# Enable MCP preview features
az extension add --name mcp-preview
# Deploy the MCP server
az mcp server create \
--name "fabric-analytics-mcp" \
--resource-group "your-rg" \
--source-type "github" \
--repository "santhoshravindran7/Fabric-Analytics-MCP" \
--branch "main" \
--auth-method "service-principal"
Configure Authentication
# Set up service principal authentication
az mcp server config set \
--name "fabric-analytics-mcp" \
--setting "FABRIC_CLIENT_ID=your-client-id" \
--setting "FABRIC_CLIENT_SECRET=your-secret" \
--setting "FABRIC_TENANT_ID=your-tenant-id"
Access Your MCP Server
# Get the server endpoint
az mcp server show --name "fabric-analytics-mcp" --query "endpoint"
Azure MCP Server Features
- Automatic scaling based on usage
- Built-in monitoring and logging
- Integrated security with Azure AD
- Zero infrastructure management
- Global CDN for low latency
- Automatic SSL/TLS certificates
Cost Optimization
Azure MCP Server offers:
- Pay-per-request pricing model
- Automatic hibernation during idle periods
- Resource sharing across multiple clients
- No minimum infrastructure costs
Learn More: Azure MCP Server Documentation
Note: Azure MCP Server is currently in preview. Check Azure Preview Terms for service availability and limitations.
Architecture
This MCP server is built with:
- TypeScript for type-safe development
- MCP SDK for Model Context Protocol implementation
- Zod for schema validation and input sanitization
- Node.js runtime environment
Configuration
The server uses the following configuration files:
- `tsconfig.json` - TypeScript compiler configuration
- `package.json` - Node.js package configuration
- `.vscode/mcp.json` - MCP server configuration for VS Code
Development
Project Structure
├── src/
│   ├── index.ts          # Main MCP server implementation
│   └── fabric-client.ts  # Microsoft Fabric API client
├── build/                # Compiled JavaScript output
├── tests/                # Test scripts and notebooks
├── .vscode/              # VS Code configuration
├── package.json
├── tsconfig.json
└── README.md
Adding New Tools
To add new tools to the server:
- Define the input schema using Zod
- Implement the tool using `server.tool()`
- Add error handling and validation
- Update documentation
API Integration
This server includes:
Production Ready:
- Full Microsoft Fabric Livy API integration
- Spark session lifecycle management
- Statement execution with SQL and Spark support
- Batch job management for long-running operations
- Comprehensive error handling and retry logic
- Real-time polling and result retrieval
Demonstration Features:
- CRUD operations (configurable for real APIs)
- Analytics and metrics (extensible framework)
- Data model analysis (template implementation)
Testing
Prerequisites
# Install Python dependencies for API testing
pip install -r livy_requirements.txt
Available Test Scripts
livy_api_test.ipynb
- Interactive Jupyter notebook for step-by-step testingcomprehensive_livy_test.py
- Full-featured test with error handlingsimple_livy_test.py
- Simple test following example patternslivy_batch_test.py
- Batch job testing capabilitiesspark_monitoring_test.py
- Spark application monitoring tests
Quick Testing
- Interactive Testing:
  jupyter notebook livy_api_test.ipynb
- Command Line Testing:
  python simple_livy_test.py
  python spark_monitoring_test.py
- Comprehensive Testing:
  python comprehensive_livy_test.py
Contributing
We welcome contributions! Here's how to get started:
1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Make your changes and add tests if applicable
4. Commit your changes (`git commit -m 'Add amazing feature'`)
5. Push to the branch (`git push origin feature/amazing-feature`)
6. Open a Pull Request
Development Guidelines
- Follow TypeScript best practices
- Add JSDoc comments for new functions
- Update tests for any new functionality
- Update documentation as needed
- See the project's contribution guidelines for details
Security
- Never commit authentication tokens to version control
- Use environment variables for sensitive configuration
- Follow Microsoft Fabric security best practices
- Report security issues privately via GitHub security advisories
- See the repository's security policy for full details
License
This project is licensed under the MIT License - see the LICENSE file for details.
Support
For issues and questions:
- Check the MCP documentation
- Review the Microsoft Fabric API documentation
- Open an issue in this repository
- Join the community discussions
Acknowledgments
- Microsoft Fabric Analytics team for the comprehensive data platform and analytics capabilities
- Microsoft Fabric Platform teams for the robust API platform and infrastructure
- Bogdan Crivat and Chris Finlan for the inspiring brainstorming conversation that gave me the idea to open-source this project
- Anthropic for the Model Context Protocol specification
This project began as a weekend hack exploring AI integration with Microsoft Fabric. The idea to open-source it took shape during a casual conversation with Chris and Bogdan about making AI tooling more accessible. What started as a personal experiment over a weekend is now available for everyone to build upon.