FastMCP - Multi-Cloud FinOps Copilot
An MCP (Model Context Protocol) server that connects Gemini-powered assistants with FinOps insights across AWS, GCP, and Azure. Perform natural language-based cost breakdowns, audits, and usage summaries, all locally and securely.
Why Use FastMCP?
Managing multi-cloud costs is complex. FastMCP allows you to:
- Ask AI "How much did we spend on Azure last month?"
- Run a cost-saving audit across AWS and GCP in one prompt
- Receive budget summaries from all major cloud providers
Powered by LangChain, LangGraph, and Gemini Pro, this tool makes FinOps conversational and cross-platform.
Demo Videos
Getting Started with FastMCP: https://www.loom.com/share/0d875df356574d31a2ea16c8d809b2dc
Advanced FinOps Analysis: https://www.loom.com/share/54ecf1a34d6d42ce9ea35c8d160a676f
Features
- Supports AWS, GCP, and Azure
- Natural language queries via Gemini Pro
- Cost breakdowns, FinOps audits, and budget status
- CLI or FastAPI-compatible architecture
- Quick analysis with actionable recommendations for high-cost resources
- Credentials never leave your machine (uses local SDK/CLI auth)
Installation
1. Prerequisites
- Python 3.11+
- Poetry
- CLI tools:
  - aws CLI (for AWS)
  - gcloud CLI (for GCP)
  - az CLI (for Azure)
2. Clone & Setup
git clone https://github.com/Eazy-Ops/multi-cloud-finops-mcp-server.git
cd multi-cloud-finops-mcp-server
# Install dependencies
poetry install
# Activate the virtualenv created by Poetry
poetry shell
Authentication Setup
AWS
aws configure --profile your-profile
You'll be prompted for:
- Access Key ID
- Secret Access Key
- Region
- Output format (e.g. json)
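The AWS tools build on your locally configured profile through boto3 (see Test Dependencies below). As an optional sanity check that the profile works before pointing FastMCP at it, you can run a short snippet like the one below; the profile name is a placeholder for whatever you configured above, and this snippet is illustrative rather than part of the repo.

```python
# Optional sanity check (illustrative, not part of this repo): confirm the
# configured profile is usable by boto3 before running FinOps queries.
import boto3

session = boto3.Session(profile_name="your-profile")  # same name used in `aws configure`
sts = session.client("sts")
print(sts.get_caller_identity()["Account"])  # prints the account ID if auth succeeds
```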
GCP
Option 1: Use Application Default Credentials (ADC)
gcloud auth application-default login
Option 2: Use Service Account JSON
Pass the file path to service_account_key_path when calling GCP functions.
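As a rough illustration of what happens with that key file, the google-auth library builds credentials from a service account JSON as shown below. The file path is a placeholder, and the repo's actual GCP tool functions may construct credentials differently.

```python
# Illustrative sketch only: how a service account key file is typically turned
# into credentials with google-auth; this project's own functions may differ.
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file(
    "/path/to/service-account-key.json"  # placeholder path
)
print(credentials.service_account_email)  # confirms the key file was parsed
```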
Azure
az login
For service principal auth (optional):
export AZURE_TENANT_ID=your-tenant-id
export AZURE_CLIENT_ID=your-client-id
export AZURE_CLIENT_SECRET=your-client-secret
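These are the standard variables read by the azure-identity library's EnvironmentCredential (used by DefaultAzureCredential) for service principal authentication. A minimal sketch to confirm the credential resolves, assuming azure-identity is installed:

```python
# Illustrative sketch: azure-identity picks up AZURE_TENANT_ID, AZURE_CLIENT_ID,
# and AZURE_CLIENT_SECRET from the environment when building the credential.
from azure.identity import DefaultAzureCredential

credential = DefaultAzureCredential()
# Requesting an ARM token confirms the service principal can authenticate.
token = credential.get_token("https://management.azure.com/.default")
print("Token acquired; expires at (epoch):", token.expires_on)
```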
Usage
CLI Entry Point
poetry run python -m mcp.server.fastmcp
OR
python main.py
Then ask questions in Claude Desktop, Amazon Q, or any MCP-compatible client:
- "Run a FinOps audit for AWS us-east-1 profile"
- "Get last 30 days GCP cost breakdown for my service account"
- "How many stopped Azure VMs in west-europe?"
Example Prompts
Azure quick network analysis for subscription_id id
OK. I've analyzed the Azure network resources for subscription ID 8a4e83d9-b1b3-485f-a7d1-0aab11284c12 and found some cost optimization opportunities:

- Expensive NAT Gateways: The NAT Gateway testNAT in centralus has low usage. Consider replacing it with a NAT VM for cost savings, using Service Endpoints where possible, or reviewing whether NAT is actually needed.
- Idle Load Balancers: The load balancer v2-api-lb in westus has had no traffic in the last 30 days. It's recommended to delete it, but ensure no critical services depend on it before doing so.
- Unused Public IPs: Several public IPs are not associated with any resources and can be deleted to avoid unnecessary costs. These include testIPforNAT, DBA-VPN-PRITUNL-ip, OpenVPN-ip, DBA-VM-01-ip, DBA-VM-02-ip, DBA-VM-04-ip, DBAVM03ip245, DBA-QM-STAGE-ip, and QM-LOGS-ip. Remember, Azure charges for unassociated public IPs.

This analysis includes details about Load Balancers, NAT Gateways, Network Security Groups (NSGs), and Public IPs. Notably, there are no unused NSGs identified in this analysis. The detailed information about each resource, including its location, SKU, and tags, is available in the raw output if you need to dive deeper.
AWS cost breakdown for the current month (profile: any_locally_configured_profile)
Total Cost: $1046.54

Cost By Service:

- Amazon EC2 - Compute: $0.01
- Amazon EC2 - Other: $100.88
- Amazon Elastic Load Balancing: $125.31
- Amazon Relational Database Service: $12.04
- Amazon Simple Storage Service (S3): $3.52
- Amazon Virtual Private Cloud (VPC): $87.26
- Amazon MQ: $677.81
- AWS CloudShell: $0.00
- AWS CloudTrail: $0.00
- AWS Config: $3.73
- AWS Cost Explorer: $0.95
- AWS Glue: $0.00
- AWS Key Management Service (KMS): $2.89
- AWS Lambda: $0.00
- AWS Secrets Manager: $0.39
- AWS Security Hub: $9.58
- AWS Service Catalog: $0.00
- AWS Step Functions: $0.00
- AWS WAF: $15.42
- Amazon CloudFront: $0.00
- Amazon DynamoDB: $0.00
- Amazon Glacier: $0.00
- Amazon GuardDuty: $4.41
- Amazon Route 53: $0.21
- Amazon Simple Notification Service (SNS): $0.00
- Amazon Simple Queue Service (SQS): $0.00
- Amazon Simple Workflow Service (SWF): $0.00
- Amazon SimpleDB: $0.00
- AmazonCloudWatch: $2.13
AWS RDS instances with CPU utilization under 2 percent
The following RDS instances have CPU utilization under 2%:

- instance-reader (db.r6g.2xlarge): 1.65% average CPU utilization. Consider downsizing.
Break down GCP spend in the last 7 days (project id: project_id)
Total Cost: $447.48

Cost By Service:

- BigQuery: $0.00
- Cloud DNS: $0.38
- Cloud Logging: $3.04
- Cloud Memorystore for Redis: $33.76
- Cloud Monitoring: $0.00
- Cloud SQL: $63.54
- Cloud Speech API: $0.00
- Cloud Storage: $0.00
- Compute Engine: $230.60
- Gemini API: $3.67
- Kubernetes Engine: $68.89
- Networking: $41.14
- Secret Manager: $2.45
Testing
The project includes comprehensive unit tests for all cloud provider tools. Tests are organized by cloud provider and cover both success and error handling scenarios.
Running Tests
Run All Tests
python run_tests.py
Run Tests for Specific Cloud Provider
AWS Tests:
python run_tests.py aws
GCP Tests:
python run_tests.py gcp
Azure Tests:
python run_tests.py azure
Run Individual Test Files
AWS Tools:
python -m unittest tests.test_aws_tools
GCP Tools:
python -m unittest tests.test_gcp_tools
Azure Tools:
python -m unittest tests.test_azure_tools
Run Specific Test Methods
python -m unittest tests.test_aws_tools.TestAWSTools.test_analyze_aws_disks_success
Test Coverage
The test suite covers:
- Success Scenarios: Testing tool functionality with valid inputs and mocked cloud responses
- Error Handling: Testing graceful handling of API failures, authentication errors, and network issues
- Parameter Validation: Ensuring tools handle various input parameters correctly
- Mock Integration: Using mocked cloud SDK clients to avoid actual API calls during testing
Test Structure
tests/
├── test_aws_tools.py      # AWS FinOps tools tests
├── test_gcp_tools.py      # GCP FinOps tools tests
├── test_azure_tools.py    # Azure FinOps tools tests
└── run_tests.py           # Test runner script
Test Dependencies
Tests use the following mocking strategies:
- AWS: Mocks boto3 session and client methods
- GCP: Mocks Google Cloud client libraries and service methods
- Azure: Mocks Azure SDK clients and management operations
All tests are designed to run without requiring actual cloud credentials or making real API calls.
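As a rough sketch of the boto3 mocking pattern described above (the real tests in tests/test_aws_tools.py may be organized differently), a test can patch boto3.Session so that no credentials or network access are needed:

```python
# Illustrative sketch of the mocking strategy, not the repo's actual test code.
import unittest
from unittest.mock import MagicMock, patch

import boto3


class TestBoto3MockingSketch(unittest.TestCase):
    @patch("boto3.Session")
    def test_cost_explorer_without_real_credentials(self, mock_session_cls):
        # The patched Session returns a mocked Cost Explorer ("ce") client.
        mock_client = MagicMock()
        mock_client.get_cost_and_usage.return_value = {"ResultsByTime": []}
        mock_session_cls.return_value.client.return_value = mock_client

        session = boto3.Session(profile_name="any-profile")  # no real credentials used
        response = session.client("ce").get_cost_and_usage(
            TimePeriod={"Start": "2024-01-01", "End": "2024-01-31"},
            Granularity="MONTHLY",
            Metrics=["UnblendedCost"],
        )

        self.assertEqual(response["ResultsByTime"], [])
        mock_client.get_cost_and_usage.assert_called_once()


if __name__ == "__main__":
    unittest.main()
```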
License
This project is licensed under the MIT License. See the LICENSE file for details.
Contributing
We welcome contributions from the community! Please see our contributing guidelines for more information.
When submitting a pull request, please use our pull request template to help us review your contribution.
Code of Conduct
We have a Code of Conduct that we expect all contributors to follow. Please read it before contributing.
Running with Docker Compose
You can run FastMCP using Docker Compose for easy setup and isolation.
1. Build the Docker image
docker compose build
2. Run the FinOps CLI
docker compose run finops-cli
You can now enter prompts just as you would with the CLI.