
Argo Watcher MCP Server

A simple service that exposes an argo-watcher instance as a set of tools via the Model Context Protocol (MCP), allowing AI agents and other clients to query deployment history.


[!IMPORTANT] This project is currently a Proof of Concept (PoC). It was built to explore the integration between Argo Watcher and the Model Context Protocol (MCP). As such, it may be subject to significant changes or be abandoned in the future. Please use it with this understanding.

Features

  • Exposes argo-watcher deployment tasks as MCP tools.
  • Filters deployments by application name and time range.
  • Packaged as a production-ready Docker container.
  • Simple, dependency-isolated architecture.

Prerequisites

  • Python 3.13+
  • Poetry for dependency management.
  • Docker for containerized deployment.
  • A running instance of argo-watcher.

Usage

This section outlines the full process for setting up the environment and running the interactive AI chat client.

  1. Bootstrap argo-watcher Service

    The MCP server depends on a running argo-watcher instance. You can quickly bootstrap this service using the official docker-compose.yml from the argo-watcher repository.

    # In a separate terminal, from the argo-watcher project directory:
    docker-compose up
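
    Before moving on, it can help to confirm that argo-watcher is actually reachable. The snippet below is a minimal sketch: the port (8080) and the /healthz path are assumptions, so check the docker-compose.yml you are using for the real values.

    ```shell
    # Probe the argo-watcher API. Port 8080 and the /healthz path are
    # assumptions -- adjust them to match your docker-compose.yml.
    if curl -fsS "http://localhost:8080/healthz" >/dev/null 2>&1; then
      status="up"
    else
      status="not reachable yet"
    fi
    echo "argo-watcher is ${status}"
    ```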
    
  2. Start argo-watcher-mcp Server

    With argo-watcher running, start the MCP server. This project includes a convenience task for this purpose.

    # From this project's root directory:
    task run
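
    Once the server is up, an MCP client discovers what it can call via the protocol's standard tools/list request. The message below is a sketch of that JSON-RPC call (the id value is arbitrary; the transport and endpoint depend on how this server is configured):

    ```json
    {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
    ```

    The response lists each tool's name and the JSON schema of its arguments, which is how a client would learn about the deployment-history tool and its filter parameters.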
    
  3. Configure OpenAI Credentials

    The AI chat client requires an OpenAI API key. Export it as an environment variable in the terminal where you plan to run the chat.

    export OPENAI_API_KEY="sk-..."
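
    A missing key usually only surfaces as an error after the chat client has started, so a quick pre-flight check in the same shell can save a round trip (a small generic sketch, nothing project-specific):

    ```shell
    # Verify the OpenAI key is present in this shell before running the chat.
    if [ -z "${OPENAI_API_KEY:-}" ]; then
      key_state="missing"
      echo "OPENAI_API_KEY is not set in this shell" >&2
    else
      key_state="set"
      echo "OPENAI_API_KEY is set"
    fi
    ```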
    
  4. Launch the Interactive AI Chat

    Finally, run the AI chat client using its pre-configured task.

    # This will start the interactive chat session.
    task chat
    
  5. Ask Questions

    The script will enter an interactive loop. You can now ask questions about your deployments in natural language.
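
    For example, prompts along these lines map naturally onto the deployment-history tool (illustrative only: the application name my-app is a placeholder, the phrasing is up to you, and the answers depend on the data in your argo-watcher instance):

    ```
    What was deployed in the last 24 hours?
    Show me the deployment history for my-app.
    Did any deployments fail this week?
    ```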

Showcase

Contributing

As this is a PoC, formal contributions are not the primary focus. However, if you find a bug or have a suggestion, feel free to open an issue.