prometheus-mcp-server
This is an MCP server that lets LLMs interact with a running Prometheus instance via its API to perform tasks such as generating and executing PromQL queries, listing and analyzing metrics, and more.
The Prometheus MCP Server facilitates interaction between large language models (LLMs) and a running Prometheus instance. Using the Model Context Protocol (MCP), it lets LLMs generate and execute PromQL queries, list and analyze metrics, and perform related operations, extending their data analysis and monitoring capabilities. The server exposes a set of tools that surface detailed insight into the Prometheus environment, including alert management, runtime information, and target discovery. This setup is particularly useful for developers and data scientists who need real-time interaction with monitoring data.
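As a rough illustration of how an MCP client would drive this server, here is a minimal sketch using the official MCP Python SDK (the mcp package) over stdio. The prometheus-mcp-server launch command, the PROMETHEUS_URL environment variable, and the query argument name are illustrative assumptions, not confirmed details of this project.

```python
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Assumed launch command and environment variable; check the project's
    # documentation for the actual entry point and configuration keys.
    server = StdioServerParameters(
        command="prometheus-mcp-server",
        env={"PROMETHEUS_URL": "http://localhost:9090"},
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover the tools the server advertises.
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])
            # The "query" argument name is an assumption about the tool schema.
            result = await session.call_tool("execute_query", {"query": "up"})
            print(result.content)

asyncio.run(main())
```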
Features
- Integration with the Prometheus API for executing PromQL queries (a direct API sketch follows this list).
- Tools for managing and analyzing Prometheus alerts and metrics.
- Capability to list and interact with Prometheus targets and rules.
- Access to Prometheus runtime and build information.
- Support for local LLM interaction using Ollama.
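To show what the query integration wraps, the sketch below issues an instant query directly against the standard Prometheus HTTP API (/api/v1/query). The Prometheus URL is an assumption for a local instance; this is not the server's own code.

```python
import requests

PROMETHEUS_URL = "http://localhost:9090"  # assumed local Prometheus instance

def instant_query(promql: str) -> dict:
    """Run an instant PromQL query via the standard Prometheus HTTP API."""
    resp = requests.get(
        f"{PROMETHEUS_URL}/api/v1/query",
        params={"query": promql},
        timeout=10,
    )
    resp.raise_for_status()
    payload = resp.json()
    if payload.get("status") != "success":
        raise RuntimeError(payload.get("error", "query failed"))
    return payload["data"]

# Example: average target availability per job.
print(instant_query("avg by (job) (up)"))
```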
Tools
- alertmanagers: Get overview of Prometheus Alertmanager discovery
- build_info: Get Prometheus build information
- execute_query: Execute an instant query against the Prometheus datasource
- flags: Get runtime flags
- list_alerts: List all active alerts
- list_rules: List all alerting and recording rules that are loaded
- list_targets: Get overview of Prometheus target discovery
- runtime_info: Get Prometheus runtime information
- tsdb_stats: Get usage and cardinality statistics from the TSDB
- wal_replay_status: Get current WAL replay status
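For orientation, the tools above line up with well-known endpoints of the Prometheus HTTP API. The mapping below is an assumption based on the standard API documentation, not a confirmed detail of this server's implementation.

```python
# Likely correspondence between the server's tools and standard
# Prometheus HTTP API endpoints (assumed, not verified against the code).
TOOL_TO_ENDPOINT = {
    "alertmanagers":     "/api/v1/alertmanagers",
    "build_info":        "/api/v1/status/buildinfo",
    "execute_query":     "/api/v1/query",
    "flags":             "/api/v1/status/flags",
    "list_alerts":       "/api/v1/alerts",
    "list_rules":        "/api/v1/rules",
    "list_targets":      "/api/v1/targets",
    "runtime_info":      "/api/v1/status/runtimeinfo",
    "tsdb_stats":        "/api/v1/status/tsdb",
    "wal_replay_status": "/api/v1/status/walreplay",
}
```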