Cloud Native Architecture MCP Server
An MCP (Model Context Protocol) server that provides tools to generate architecture diagrams for cloud-native infrastructure:
- Kubernetes cluster diagrams
- AWS infrastructure diagrams
- GCP infrastructure diagrams
Built using the Diagrams library to create professional, visual architecture diagrams programmatically.
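For context, the Diagrams library expresses infrastructure as Python code and renders it with Graphviz; the >> operator draws a connection between nodes. A minimal standalone sketch (illustrative only, not this server's code) that produces a small Kubernetes diagram:
from diagrams import Cluster, Diagram
from diagrams.k8s.compute import Deployment
from diagrams.k8s.network import Ingress, Service
# Renders microservices-app.png without opening an image viewer
with Diagram("microservices-app", show=False):
    ingress = Ingress("main-ingress")
    with Cluster("Production Namespace"):  # grouped like a namespace cluster
        svc = Service("api-svc")
        api = Deployment("api-server")
    ingress >> svc >> api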
Features
- Three specialized tools for different cloud platforms
- Visual diagram generation with proper cloud provider icons
- Cluster/VPC grouping support for organizing components
- Connection mapping between components
- Returns diagrams as images directly in MCP responses
Installation
From PyPI (Recommended)
pip install cloud-native-architecture-mcp
Or use with uvx for on-demand execution:
uvx cloud-native-architecture-mcp
From Source
git clone https://github.com/yourusername/cloud-native-architecture-mcp-server
cd cloud-native-architecture-mcp-server
pip install -e .
Prerequisites
This package requires Graphviz to be installed on your system:
macOS:
brew install graphviz
Ubuntu/Debian:
sudo apt-get install graphviz
Windows: Download from graphviz.org
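To confirm Graphviz is available on your PATH, check its version:
dot -V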
Usage with MCP Clients
Claude Desktop
Add to your claude_desktop_config.json:
{
"mcpServers": {
"cloud-architecture": {
"command": "uvx",
"args": ["cloud-native-architecture-mcp"]
}
}
}
AgentGateway
Add to your AgentGateway configuration:
mcp_servers:
  - name: cloud-architecture
    stdio:
      cmd: uvx
      args: ["cloud-native-architecture-mcp"]
Available Tools
1. build-kubernetes-diagram
Build Kubernetes architecture diagrams with support for:
- Deployments, StatefulSets, DaemonSets, Jobs, Pods
- Services, Ingress
- PVCs, PVs, StorageClass
- ConfigMaps, Secrets
- HPA (Horizontal Pod Autoscaler)
- Namespace clustering
Example Input:
{
"name": "microservices-app",
"components": [
{"type": "deployment", "name": "api-server", "replicas": 3},
{"type": "service", "name": "api-svc"},
{"type": "ingress", "name": "main-ingress"},
{"type": "deployment", "name": "worker", "replicas": 2},
{"type": "pvc", "name": "shared-storage"}
],
"clusters": [
{
"name": "Production Namespace",
"components": ["api-server", "api-svc", "worker"]
}
],
"connections": [
{"from": "main-ingress", "to": "api-svc", "label": "HTTPS"},
{"from": "api-svc", "to": "api-server"},
{"from": "api-server", "to": "shared-storage"}
]
}
2. build-aws-diagram
Build AWS infrastructure diagrams with support for:
- Compute: EC2, ECS, EKS, Lambda
- Database: RDS, DynamoDB, ElastiCache, Redshift
- Storage: S3, EBS, EFS
- Network: ALB, NLB, ELB, CloudFront, Route53, VPC
- Integration: SQS, SNS, EventBridge
- VPC grouping
Example Input:
{
"name": "webapp-infrastructure",
"components": [
{"type": "route53", "name": "dns"},
{"type": "alb", "name": "load-balancer"},
{"type": "ec2", "name": "web-server"},
{"type": "rds", "name": "postgres-db"},
{"type": "s3", "name": "assets-bucket"},
{"type": "elasticache", "name": "redis-cache"}
],
"vpcs": [
{
"name": "Production VPC",
"components": ["web-server", "postgres-db", "redis-cache", "load-balancer"]
}
],
"connections": [
{"from": "dns", "to": "load-balancer"},
{"from": "load-balancer", "to": "web-server"},
{"from": "web-server", "to": "postgres-db"},
{"from": "web-server", "to": "redis-cache"},
{"from": "web-server", "to": "assets-bucket"}
]
}
3. build-gcp-diagram
Build GCP infrastructure diagrams with support for:
- Compute: GCE, GKE, Cloud Functions
- Database: Cloud SQL, Firestore, BigTable, Spanner
- Storage: GCS, Persistent Disk
- Network: Load Balancing, Cloud DNS, VPC
- Analytics: BigQuery, Dataflow, Pub/Sub
- VPC/Network grouping
Example Input:
{
"name": "data-processing-pipeline",
"components": [
{"type": "gcs", "name": "input-bucket"},
{"type": "functions", "name": "process-files"},
{"type": "pubsub", "name": "events"},
{"type": "dataflow", "name": "etl-pipeline"},
{"type": "bigquery", "name": "data-warehouse"}
],
"connections": [
{"from": "input-bucket", "to": "process-files", "label": "trigger"},
{"from": "process-files", "to": "events"},
{"from": "events", "to": "etl-pipeline"},
{"from": "etl-pipeline", "to": "data-warehouse"}
]
}
Development
Setup Development Environment
# Clone the repository
git clone https://github.com/yourusername/cloud-native-architecture-mcp-server
cd cloud-native-architecture-mcp-server
# Create virtual environment
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
# Install in development mode with dev dependencies
pip install -e ".[dev]"
Testing Locally
You can test the MCP server directly:
# Run the server
python -m cloud_native_architecture_mcp.server
Or use with an MCP client like the MCP Inspector:
npx @modelcontextprotocol/inspector uvx cloud-native-architecture-mcp
Publishing to PyPI
Prerequisites
pip install build twine
Build and Publish
- Build the package:
python -m build
- Test on TestPyPI first:
twine upload --repository testpypi dist/*
- Install from TestPyPI to verify:
pip install --index-url https://test.pypi.org/simple/ cloud-native-architecture-mcp
- Publish to PyPI:
twine upload dist/*
- Verify installation:
pip install cloud-native-architecture-mcp
Architecture
cloud-native-architecture-mcp-server/
├── src/
│ └── cloud_native_architecture_mcp/
│ ├── __init__.py
│ └── server.py # Main MCP server implementation
├── pyproject.toml # Package configuration
├── README.md
└── LICENSE
How It Works
- MCP Client (Claude Desktop, AgentGateway, etc.) calls one of the three tools
- MCP Server receives the component configuration (JSON)
- Diagrams Library generates the architecture diagram using Graphviz
- Server returns the diagram as a base64-encoded PNG image
- Client displays the visual diagram to the user
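As a rough illustration of steps 3-5, the rendering and encoding might look like the sketch below; the function name and node types are illustrative, not the server's actual code.
import base64
from diagrams import Diagram
from diagrams.aws.compute import EC2
from diagrams.aws.network import ELB
def render_diagram_png(name: str) -> str:
    # Diagrams writes <name>.png via Graphviz; show=False suppresses the viewer
    with Diagram(name, filename=name, show=False):
        ELB("load-balancer") >> EC2("web-server")
    # Base64-encode the PNG so it can be embedded in an MCP image response
    with open(f"{name}.png", "rb") as f:
        return base64.b64encode(f.read()).decode("ascii")
The server then returns this encoded string as MCP image content with MIME type image/png, which the client renders inline.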
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
MIT License - see the LICENSE file for details.
Support
For issues, questions, or contributions, please visit the GitHub repository.
Why So Much JSON?
MCP communicates over the JSON-RPC protocol. Every tool an MCP server exposes, along with its parameters and responses, is described and transmitted as JSON; without those JSON schemas, an MCP client would not know which parameters to send or what a tool actually does.
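For example, a tools/call request from the client to this server looks roughly like this on the wire (the id and arguments are abbreviated and purely illustrative):
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "build-kubernetes-diagram",
    "arguments": {
      "name": "microservices-app",
      "components": [{"type": "deployment", "name": "api-server", "replicas": 3}]
    }
  }
}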