Workshop MCP Server
AI-Powered Development Assistant with code review, GitHub PR automation, OpenShift test generation & execution, must-gather analysis, and cluster debugging capabilities.
Features
- Code Review - Line-by-line analysis with security, performance, and quality checks
- PR Review - Automated GitHub PR reviews with comments
- OpenShift Testing - Test generation (Gherkin/YAML/Go/Shell) and execution
- Must-Gather Analyzer - Cluster health assessment with AI-powered diagnostics
- Cluster Debugger - Intelligent issue analysis with fix recommendations
- Web GUI - Beautiful interface for all tools
Quick Start
🚀 Automated Setup (Recommended)
# 1. Clone repository
git clone https://github.com/gangwgr/workshop-mcp-server.git
cd workshop-mcp-server
# 2. Auto-configure for your AI system
./setup-ai.sh all # Configure all AI systems (Gemini, OpenAI, Claude)
# OR choose specific:
# ./setup-ai.sh gemini # Only Google Gemini
# ./setup-ai.sh openai # Only OpenAI GPT-4
# ./setup-ai.sh claude # Only Claude Desktop
# 3. Start web GUI
./setup-ai.sh start
# 4. Open browser: http://127.0.0.1:8080
📝 Manual Setup
# 1. Clone and setup
git clone https://github.com/gangwgr/workshop-mcp-server.git
cd workshop-mcp-server
python3 -m venv .venv
source .venv/bin/activate
# 2. Install dependencies
pip install -e .
pip install -r integrations/requirements-integrations.txt
# 3. Configure API keys (create .env file)
cp .env.template .env
# Edit .env and add your API keys
# 4. (Optional) Authenticate for GitHub/OpenShift features
gh auth login
oc login https://api.your-cluster.example.com:6443
# 5. Run web GUI
cd web_gui
python app.py
# 6. Open browser: http://127.0.0.1:8080
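For step 3 above, the .env file holds your provider credentials. A minimal sketch, assuming the template exposes variables along these lines (confirm the exact names in .env.template):
# Hypothetical variable names - check .env.template for the real ones
OPENAI_API_KEY=sk-...
GEMINI_API_KEY=your-gemini-key
GITHUB_TOKEN=ghp_...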
For detailed setup and prerequisites, see the project documentation.
Usage
Web GUI (Recommended)
Access at http://127.0.0.1:8080 for an intuitive interface to all tools.
MCP Server (Claude Desktop)
pip install -e .
python -m workshop_mcp_server.server
Configure Claude Desktop to use this server (example configuration below).
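As a rough sketch, the entry in claude_desktop_config.json usually takes the shape below; the server name and interpreter path are placeholders for your local checkout:
{
  "mcpServers": {
    "workshop-mcp-server": {
      "command": "/path/to/workshop-mcp-server/.venv/bin/python",
      "args": ["-m", "workshop_mcp_server.server"]
    }
  }
}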
AI System Integrations
Supports multiple AI systems out of the box:
- 🤖 Google Gemini - Direct API & Function Calling
- 🧠 OpenAI GPT-4 - Function Calling & Interactive Chat
- 💬 Anthropic Claude - MCP Protocol (Claude Desktop)
- 🔗 LangChain - Universal integration for ANY LLM
Auto-configuration available:
./setup-ai.sh all # Configure all AI systems with guided setup
# Gemini Example
from integrations.gemini_integration import GeminiMCPClient
client = GeminiMCPClient()
result = client.review_code(code="def test(): pass", with_gemini_analysis=True)
# GPT-4 Example
from integrations.openai_integration import OpenAIMCPClient
client = OpenAIMCPClient()
response = client.chat("Debug my OpenShift API server")
# LangChain (Universal - works with ANY LLM)
from integrations.langchain_integration import create_mcp_agent
from langchain_google_genai import ChatGoogleGenerativeAI
agent = create_mcp_agent(ChatGoogleGenerativeAI(model="gemini-pro"))
response = agent.run("Review this code for security issues: ...")
See the integrations/ directory for the complete guide.
Available Tools
| Tool | Description |
|---|---|
| review_code_line_by_line | Line-by-line code review with severity filtering |
| post_pr_review_comments | Post GitHub PR review comments |
| generate_ocp_test_case | Generate OpenShift tests (Gherkin/YAML/Go/Shell) |
| execute_ocp_test_step_by_step | Execute tests with real-time progress |
| debug_ocp_test_failure | Intelligent test failure analysis |
| analyze_mustgather_bundle | Analyze must-gather bundles with AI |
| debug_openshift_cluster | Cluster debugging with diagnostics |
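For programmatic use outside Claude Desktop, these tools can also be called through the MCP Python SDK's stdio client. The sketch below is illustrative only: the argument name passed to review_code_line_by_line is an assumption, so check the tool's declared input schema before relying on it.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Launch the server over stdio, the same way Claude Desktop would
    server = StdioServerParameters(command="python", args=["-m", "workshop_mcp_server.server"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()  # enumerate the tools from the table above
            # Argument names are illustrative; consult the tool's input schema
            result = await session.call_tool("review_code_line_by_line", arguments={"code": "def test(): pass"})
            print(result)

asyncio.run(main())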
Architecture
Users (Web Browser / Claude Desktop)
↓
Web GUI / MCP Server / CLI Tools
↓
Tool Layer (Python Functions)
↓
GitHub API / OpenShift Cluster / Must-Gather Files
Tech Stack: Python 3.8+, Flask, MCP Protocol, GitHub CLI, OpenShift CLI
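To make the layering concrete, here is a hypothetical Flask sketch of how the Web GUI layer could delegate to the tool layer; the route, payload shape, and function body are illustrative and do not reflect the actual app.py.
from flask import Flask, jsonify, request

app = Flask(__name__)

def review_code_line_by_line(code: str) -> dict:
    # Stand-in for the real tool-layer function
    return {"findings": [], "lines_reviewed": len(code.splitlines())}

@app.route("/api/review", methods=["POST"])
def review():
    # Web GUI layer: parse the request, call the tool layer, return JSON to the browser
    payload = request.get_json(force=True)
    return jsonify(review_code_line_by_line(payload.get("code", "")))

if __name__ == "__main__":
    app.run(host="127.0.0.1", port=8080)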
Deployment
Production (Gunicorn)
pip install gunicorn
cd web_gui
gunicorn -w 4 -b 0.0.0.0:8080 app:app
Docker
docker build -t workshop-mcp-server .
docker run -p 8080:8080 -v ~/.kube/config:/root/.kube/config workshop-mcp-server
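The build step assumes a Dockerfile at the repository root. If you need to supply one yourself, a minimal sketch derived from the manual setup steps (not the project's official image definition) could look like this:
FROM python:3.9-slim
WORKDIR /app
COPY . .
RUN pip install -e . && pip install -r integrations/requirements-integrations.txt
EXPOSE 8080
CMD ["python", "web_gui/app.py"]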
OpenShift
oc new-app python:3.9~https://github.com/your-username/workshop-mcp-server.git \
--context-dir=web_gui --name=mcp-server
oc expose svc/mcp-server
Troubleshooting
| Issue | Solution |
|---|---|
| Port 8080 in use | lsof -i :8080 then kill -9 <PID> |
| Module not found | Activate venv: source .venv/bin/activate |
| GitHub auth failed | gh auth logout && gh auth login |
| OpenShift cluster unreachable | Re-run oc login https://api.your-cluster.example.com:6443 |
See the project documentation for more help.
Contributing
- Fork the repository
- Create feature branch
- Make changes with tests
- Submit pull request
License
MIT License - see the LICENSE file.
Support
- Documentation: see the guides included in this repository
- Issues: https://github.com/gangwgr/workshop-mcp-server/issues
Built for DevOps, SRE, and Developer Teams