GitLab MCP Server

Python 3.8+ | MIT License | MCP Compatible

An MCP (Model Context Protocol) server for integrating GitLab with AI assistants like Cursor, ChatGPT, and any polymcp-compatible client. Manage merge requests, analyze CI/CD pipelines, create ADR documents, and more.

What it does

This project exposes GitLab APIs through the MCP protocol, allowing AI assistants to:

  • List and manage merge requests
  • Analyze failed pipeline jobs with fix suggestions
  • Create ADR (Architecture Decision Record) documents in Markdown
  • View CI/CD job logs
  • Trigger pipelines and retry failed jobs
  • Deploy to AWS, Azure, and GCP

Requirements

  • Python 3.8 or higher
  • PolyMCP (for AI agent integration)
  • FastAPI and uvicorn (for HTTP server)

Installation

git clone https://github.com/poly-mcp/Gitlab-MCP-Server.git
cd Gitlab-MCP-Server

pip install -r requirements.txt

Contents of requirements.txt:

fastapi>=0.104.0
uvicorn>=0.24.0
aiohttp>=3.9.0
pyyaml>=6.0
docstring-parser>=0.15
python-dotenv>=1.0.0
pydantic>=2.0.0

Then install PolyMCP:

pip install polymcp==1.2.4

Configuration

Create a .env file in the project root:

# For use with real GitLab
GITLAB_BASE_URL=https://gitlab.com/api/v4
GITLAB_TOKEN=glpat-xxxxxxxxxxxx
GITLAB_PROJECT_ID=12345678

# Optional - security settings
SAFE_MODE=true
DRY_RUN=false
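
Since python-dotenv is among the dependencies, these settings can be picked up automatically at startup. A minimal sketch of how they might be read (the variable names match the .env above; this is an illustration, not the server's actual code):

#!/usr/bin/env python3
"""Illustrative loader for the .env settings above."""
import os
from dotenv import load_dotenv

load_dotenv()  # reads .env from the project root

GITLAB_BASE_URL = os.getenv("GITLAB_BASE_URL", "https://gitlab.com/api/v4")
GITLAB_TOKEN = os.getenv("GITLAB_TOKEN")            # required for real GitLab access
GITLAB_PROJECT_ID = os.getenv("GITLAB_PROJECT_ID")
SAFE_MODE = os.getenv("SAFE_MODE", "true").lower() == "true"
DRY_RUN = os.getenv("DRY_RUN", "false").lower() == "true"

if not GITLAB_TOKEN:
    raise RuntimeError("GITLAB_TOKEN is required")  # same error shown in Troubleshooting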

Usage with PolyMCP

This server is fully compatible with the polymcp library. Here is how to use it:

1. Start the server

# Production mode (requires token)
python gitlab_mcp_server.py --http --port 8000

2. Connect with polymcp

Create a file gitlab_chat.py:

#!/usr/bin/env python3
"""GitLab MCP Chat with PolyMCP"""
import asyncio
from polymcp.polyagent import UnifiedPolyAgent, OllamaProvider

async def main():
    # Configure the LLM provider (you can use OpenAI, Anthropic, Ollama, etc.)
    llm = OllamaProvider(model="gpt-oss:120b-cloud", temperature=0.1)
    
    # Point to the GitLab MCP server
    mcp_servers = ["http://localhost:8000/mcp"]
    
    agent = UnifiedPolyAgent(
        llm_provider=llm, 
        mcp_servers=mcp_servers,  
        verbose=True
    )
    
    async with agent:
        print("\nGitLab MCP Server connected!\n")
        print("Available commands:")
        print("- 'show me open merge requests'")
        print("- 'analyze failed jobs'")
        print("- 'create an ADR for cloud migration'")
        print("- 'exit' to quit\n")
        
        while True:
            user_input = input("\nYou: ")
            
            if user_input.lower() in ['exit', 'quit']:
                print("Session ended.")
                break
            
            result = await agent.run_async(user_input, max_steps=5)
            print(f"\nGitLab Assistant: {result}")

if __name__ == "__main__":
    asyncio.run(main())

3. Run it

python gitlab_chat.py

Example session:

GitLab MCP Server connected!

You: show me open merge requests in project mygroup/myproject

GitLab Assistant: I found 3 open merge requests:

1. MR !42 - "Fix authentication bug" 
   Author: john_doe
   Branch: bugfix/auth -> main
   
2. MR !43 - "Add caching layer"
   Author: jane_smith  
   Branch: feature/cache -> main

3. MR !44 - "Update dependencies"
   Author: bob_wilson
   Branch: chore/deps -> main

You: analyze why the pipeline is failing

GitLab Assistant: I analyzed pipeline #12345. There are 2 failed jobs:

1. test:unit (stage: test)
   Error: Snapshot test mismatch
   Suggestion: Run 'npm test -- -u' to update snapshots

2. security:sonar (stage: security)
   Error: Code coverage below threshold (67% < 80%)
   Suggestion: Add tests for uncovered functions

You: exit
Session ended.

Usage with Cursor

Method 1 - Direct import

Copy cursor_tools.py to your project and use it directly:

from cursor_tools import *

# List merge requests
mrs = list_open_merge_requests("mygroup/myproject")

# Analyze pipeline
analysis = analyze_pipeline_failures("mygroup/myproject")

# Create ADR
adr = create_architecture_decision(
    title="Kubernetes Adoption",
    context="We need to scale horizontally",
    decision="We will migrate to Kubernetes on GKE",
    consequences="Higher operational complexity but better scalability"
)

Method 2 - MCP Configuration

Add to .cursor/mcp_config.json:

{
  "mcpServers": {
    "gitlab": {
      "command": "python",
      "args": ["gitlab_mcp_server.py", "--mode", "stdio"]
    }
  }
}
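
If you prefer to pass the GitLab credentials explicitly instead of relying on a .env file in the working directory, MCP client configurations generally accept an env block for stdio servers; a hedged variant of the same file, assuming Cursor forwards the env block to the process:

{
  "mcpServers": {
    "gitlab": {
      "command": "python",
      "args": ["gitlab_mcp_server.py", "--mode", "stdio"],
      "env": {
        "GITLAB_BASE_URL": "https://gitlab.com/api/v4",
        "GITLAB_TOKEN": "glpat-xxxxxxxxxxxx",
        "GITLAB_PROJECT_ID": "12345678"
      }
    }
  }
}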

Usage with ChatGPT

  1. Start the server and expose it publicly (with ngrok or similar):

python gitlab_mcp_server.py --http --port 8000
ngrok http 8000

  2. Create a Custom GPT with Actions pointing to the ngrok URL

  3. Or use Code Interpreter by uploading gitlab_assistant.py

Safety Features

The server includes built-in protections:

| Feature | Default | What it does |
| --- | --- | --- |
| Safe Mode | ON | Blocks write operations until you're ready |
| Dry Run | OFF | Test operations without executing them |
| Project Allowlist | * (all allowed) | Use * for all, empty to block all, or list specific projects |
| Rate Limiting | 60/min | Prevents API abuse |

💡 Start with SAFE_MODE=true to explore safely, then disable it when you're ready to make changes.
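
For example, a cautious starting .env might look like this (SAFE_MODE, DRY_RUN, and ALLOWED_PROJECTS are the variable names used elsewhere in this README; the comma-separated list format for ALLOWED_PROJECTS is an assumption):

# Explore read-only first
SAFE_MODE=true
DRY_RUN=true
# Limit the server to specific projects instead of * (all)
ALLOWED_PROJECTS=mygroup/myproject,mygroup/other-project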

Available Tools

Merge Request Management

| Tool | Description |
| --- | --- |
| list_merge_requests | List merge requests with filters (state, author, assignee) |
| get_merge_request_details | Get MR details including changes and discussions |
| create_merge_request | Create a new merge request |
| approve_merge_request | Approve a merge request |
| merge_merge_request | Merge a merge request into target branch |
| rebase_merge_request | Rebase a merge request onto target branch |
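
With the PolyMCP agent from the earlier example, these tools are driven by plain-language prompts and the agent picks the matching tool call. A minimal, self-contained sketch (the project path and MR number are illustrative):

#!/usr/bin/env python3
"""Drive the merge-request tools through the PolyMCP agent (illustrative prompts)."""
import asyncio
from polymcp.polyagent import UnifiedPolyAgent, OllamaProvider

async def review_merge_requests():
    llm = OllamaProvider(model="gpt-oss:120b-cloud", temperature=0.1)
    agent = UnifiedPolyAgent(llm_provider=llm, mcp_servers=["http://localhost:8000/mcp"])
    async with agent:
        # The agent maps each prompt to list_merge_requests, get_merge_request_details, etc.
        print(await agent.run_async("list open merge requests in mygroup/myproject", max_steps=5))
        print(await agent.run_async("show the changes and discussions of MR !42", max_steps=5))
        print(await agent.run_async("approve and merge MR !42 if its pipeline is green", max_steps=5))

if __name__ == "__main__":
    asyncio.run(review_merge_requests())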

Code Search

| Tool | Description |
| --- | --- |
| search_code | Search for code across project files |

Pipeline & CI/CD

| Tool | Description |
| --- | --- |
| list_pipeline_jobs | List all jobs in a pipeline with status |
| get_job_log | Get the log output of a specific job |
| analyze_failed_jobs | Analyze failures and suggest fixes |
| trigger_pipeline | Trigger a new pipeline run |
| retry_pipeline | Retry all failed jobs in a pipeline |
| cancel_pipeline | Cancel a running pipeline |
| retry_failed_job | Retry a specific failed job |
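
Failure triage usually chains several of these tools (list jobs, fetch a log, analyze, retry). In the gitlab_chat.py session above that just means a few follow-up prompts; the ones below are illustrative:

You: list the jobs of the latest pipeline in mygroup/myproject
You: show me the log of the failed test:unit job
You: analyze the failed jobs and suggest fixes
You: retry the failed jobs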

ADR (Architecture Decision Records)

| Tool | Description |
| --- | --- |
| create_adr_document | Create an ADR document in Markdown format |
| commit_adr_to_gitlab | Commit ADR to repository with optional MR |
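
The two ADR tools pair naturally: generate the document first, then commit it to the repository, optionally opening an MR for review. In the chat session that is just two prompts; both are illustrative (the title and rationale reuse the example values from the Cursor section above):

You: create an ADR titled "Kubernetes Adoption" - we need to scale horizontally, so we will migrate to Kubernetes on GKE
You: commit that ADR to the repository and open a merge request for review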

Cloud Deployment

| Tool | Description |
| --- | --- |
| deploy_to_cloud | Deploy to AWS, Azure, or GCP via pipeline |

Security

To use the server safely with Cursor or other AI assistants:

  1. Create a GitLab token with minimal permissions (only read_api and read_repository)
  2. Enable SAFE_MODE=true in the .env file to disable destructive operations
  3. Use DRY_RUN=true to simulate operations without executing them
  4. Limit accessible projects by configuring ALLOWED_PROJECTS

The server tracks all operations and provides usage statistics.
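
To confirm that a freshly created token from step 1 actually authenticates before wiring it into the server, a quick call to GitLab's standard /user endpoint is enough (aiohttp is already a dependency; this check is a sketch, not part of the server):

#!/usr/bin/env python3
"""Sanity-check the GitLab token from .env against the /user endpoint."""
import asyncio
import os

import aiohttp
from dotenv import load_dotenv

async def check_token():
    load_dotenv()
    base = os.getenv("GITLAB_BASE_URL", "https://gitlab.com/api/v4")
    headers = {"PRIVATE-TOKEN": os.environ["GITLAB_TOKEN"]}
    async with aiohttp.ClientSession(headers=headers) as session:
        async with session.get(f"{base}/user") as resp:
            resp.raise_for_status()
            user = await resp.json()
            print(f"Token OK - authenticated as {user['username']}")

if __name__ == "__main__":
    asyncio.run(check_token())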

Project Structure

Gitlab-MCP-Server/
├── gitlab_mcp_server.py    # Main server
├── cursor_tools.py         # Wrapper for Cursor
├── gitlab_chat.py          # Client for PolyMCP
├── gitlab_assistant.py     # Client for ChatGPT
├── .env.example            # Configuration template
├── requirements.txt        # Dependencies
└── README.md

Troubleshooting

| Error | Solution |
| --- | --- |
| GITLAB_TOKEN is required | Create .env file with your token |
| Operation blocked by safe mode | Set SAFE_MODE=false in .env |
| Access denied to project | Check token permissions or ALLOWED_PROJECTS |
| Request timed out | Increase MAX_RETRIES in .env |

Contributing

Contributions are welcome. Open an issue to discuss significant changes before proceeding with a pull request.

License

MIT License - see LICENSE file for details.
