Dynatrace MCP


A powerful Model Context Protocol (MCP) server that provides AI assistants with comprehensive access to Dynatrace's observability platform. Features dual authentication architecture, Davis CoPilot AI integration, and 24 specialized tools for monitoring, automation, and operational intelligence.

🚀 What's New in v2.6.0

  • 📊 Grail Budget Tracking: Advanced budget monitoring system with real-time usage tracking, warnings, and limits
  • 🏷️ Entity Tagging: Add custom tags to monitored entities for better organization and filtering
  • 📈 HTTP Server Mode: Run as a standalone HTTP server for broader integration possibilities
  • 🔧 Enhanced DQL Execution: Improved metadata extraction, error handling, and budget integration
  • 📧 Enhanced Email System: Professional email capabilities with HTML/text support and multiple recipients
  • 🧪 Comprehensive Testing: 38 test cases covering all major functionality with 83% success rate
  • 📚 Extended Documentation: Updated guides, examples, and troubleshooting resources

Costs

Important: While this local MCP server is provided for free, using it to access data in Dynatrace Grail may incur additional costs based on your Dynatrace consumption model. This affects the execute_dql tool and other capabilities that query Dynatrace Grail storage; costs depend on the volume of data scanned and billed (GB).

Before using this MCP server extensively, please:

  1. Review your current Dynatrace consumption model and pricing
  2. Understand the cost implications of the specific data you plan to query (logs, events, metrics) - see Dynatrace Pricing and Rate Card
  3. Start with smaller timeframes (e.g., 12h-24h) and make use of buckets to reduce the cost impact; an example follows below

Note: We will be providing a way to monitor Query Usage of the dynatrace-mcp-server in the future.
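
For illustration, a cost-conscious execute_dql call might look like the sketch below. This is a hedged example: the argument name dqlStatement and the bucket name default_logs are assumptions, and the DQL simply combines a 12-hour window with an explicit bucket filter so the query scans less data.

// Hypothetical arguments for the execute_dql tool; the argument name
// "dqlStatement" and the bucket name "default_logs" are assumptions.
const executeDqlArgs = {
  dqlStatement: [
    'fetch logs, from:now()-12h',                  // short timeframe keeps GB scanned low
    '| filter dt.system.bucket == "default_logs"', // restrict the scan to a single bucket
    '| summarize count(), by:{loglevel}',
    '| limit 20',
  ].join('\n'),
};

console.log(JSON.stringify(executeDqlArgs, null, 2));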


๐Ÿ† Production Ready & Tested

v2.6.0 has been extensively tested with real Dynatrace environments:

  • ✅ 20/24 tools working perfectly (83% success rate)
  • ✅ All core monitoring functions operational (DQL, entities, dashboards)
  • ✅ AI integration fully functional (Davis CoPilot, natural language processing)
  • ✅ Budget tracking prevents cost overruns with real-time monitoring
  • ✅ Communication tools validated (Slack, Email with professional formatting)
  • ✅ Automation workflows tested and operational

Minor OAuth scope adjustments are needed for the remaining 4 tools; these are permission-related issues that are easily resolved by adding the missing scopes.

Quick Start

1. Add to Your MCP Client

Configure your MCP client (Claude Desktop, Cline, etc.) by adding this server to your mcp.json:

{
  "mcpServers": {
    "dynatrace": {
      "command": "npx",
      "args": ["@theharithsa/dynatrace-mcp-server"],
      "env": {
        "OAUTH_CLIENT_ID": "dt0s02.ABC123...",
        "OAUTH_CLIENT_SECRET": "dt0s02.ABC123.DEF456...",
        "DT_ENVIRONMENT": "https://abc12345.apps.dynatrace.com"
      }
    }
  }
}

2. Get Your Dynatrace Credentials

  1. Go to your Dynatrace environment → Settings → Platform Management → OAuth clients
  2. Create a new OAuth client with the required OAuth scopes
  3. Copy the Client ID and Secret

3. Start Using

Your AI assistant can now:

  • 🤖 AI-Powered Queries: Convert natural language to DQL using Davis CoPilot
  • 📊 Monitor & Analyze: Query problems, vulnerabilities, and execute DQL statements
  • 🏗️ Entity Management: Find, tag, and manage monitored entities with ownership info
  • 📈 Dashboard Operations: Create, delete, and share dashboards with access control
  • 🤖 Automation: Create workflows and execute custom TypeScript functions
  • 💬 Communication: Send Slack messages and professional emails
  • 📊 Budget Control: Track and manage Grail query usage with budget limits

โ„น๏ธ On startup the server now verifies your Dynatrace connection with up to three exponential-backoff retries. Invalid URLs, credentials, or missing scopes will fail fast with detailed guidance so you can correct the configuration before using any tools.

Configuration

Basic Configuration

{
  "mcpServers": {
    "dynatrace": {
      "command": "npx",
      "args": ["@theharithsa/dynatrace-mcp-server"],
      "env": {
        "OAUTH_CLIENT_ID": "dt0s02.your-client-id",
        "OAUTH_CLIENT_SECRET": "dt0s02.your-client-id.your-client-secret",
        "DT_ENVIRONMENT": "https://your-tenant.apps.dynatrace.com"
      }
    }
  }
}

Alternative: Global Installation

# Install globally
npm install -g @theharithsa/dynatrace-mcp-server

# Then use in mcp.json
{
  "mcpServers": {
    "dynatrace": {
      "command": "dynatrace-mcp-server",
      "env": {
        "OAUTH_CLIENT_ID": "dt0s02.your-client-id",
        "OAUTH_CLIENT_SECRET": "dt0s02.your-client-id.your-client-secret",
        "DT_ENVIRONMENT": "https://your-tenant.apps.dynatrace.com"
      }
    }
  }
}

HTTP Server Mode (NEW!)

Run as a standalone HTTP server for broader integration possibilities:

# Install globally first
npm install -g @theharithsa/dynatrace-mcp-server

# Run as HTTP server (requires all environment variables set)
dynatrace-mcp-server --http-port 3000

# Server will be available at http://localhost:3000
# Endpoints:
# GET  /health      - Health check
# POST /tools/list  - List available tools  
# POST /tools/call  - Execute tool calls
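
A minimal client sketch against these endpoints, using the built-in fetch of Node 18+, might look like the following. The request and response payload shapes are assumptions, so treat this as a starting point rather than a contract.

// Sketch: call the HTTP endpoints listed above (payload shapes are assumed).
const base = 'http://localhost:3000';

async function main(): Promise<void> {
  const health = await fetch(`${base}/health`);
  console.log('health:', await health.text());

  const tools = await fetch(`${base}/tools/list`, { method: 'POST' });
  console.log('tools:', await tools.json());

  const result = await fetch(`${base}/tools/call`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ name: 'get_environment_info', arguments: {} }),
  });
  console.log('result:', await result.json());
}

main().catch(console.error);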

Environment Variables for HTTP Mode:

export OAUTH_CLIENT_ID="dt0s02.your-client-id"
export OAUTH_CLIENT_SECRET="dt0s02.your-client-id.your-client-secret"
export DT_ENVIRONMENT="https://your-tenant.apps.dynatrace.com"
export DT_GRAIL_QUERY_BUDGET_GB="100"  # Optional: Set Grail budget limit
export LOG_LEVEL="info"                # Optional: debug, info, warn, error

Docker Support:

FROM node:18-alpine
WORKDIR /app
RUN npm install -g @theharithsa/dynatrace-mcp-server
EXPOSE 3000
CMD ["dynatrace-mcp-server", "--http-port", "3000"]

Available Tools (24 Total)

🤖 Davis CoPilot AI Integration (3 tools)

  • generate_dql_from_natural_language - Convert natural language to DQL queries using Davis CoPilot AI
  • explain_dql_in_natural_language - Get plain English explanations of complex DQL statements
  • chat_with_davis_copilot - AI-powered assistant for Dynatrace questions and troubleshooting

๐Ÿ” Monitoring & Observability (4 tools)

  • get_environment_info - Get Dynatrace environment details and configuration
  • list_problems - List all active problems in your environment
  • list_vulnerabilities - List security vulnerabilities detected by Dynatrace
  • get_kubernetes_events - Get Kubernetes cluster events and status

📊 Data Querying & Analysis (2 tools)

  • execute_dql - Execute Dynatrace Query Language statements with budget tracking
  • verify_dql - Validate DQL syntax and structure before execution

๐Ÿ—๏ธ Entity Management & Tagging (4 tools)

  • find_entity_by_name - Find monitored entities by name across all entity types
  • get_entity_details - Get detailed information about specific monitored entities
  • add_entity_tags - Add custom tags to Dynatrace monitored entities
  • get_ownership - Get ownership information and team assignments for entities

📈 Dashboard & Document Management (4 tools)

  • create_dashboard - Create dashboards from JSON files in bulk
  • bulk_delete_dashboards - Delete multiple dashboards by document IDs
  • share_document_env - Share documents across environments with access control
  • direct_share_document - Share documents directly with specific recipients

🤖 Automation & Workflows (3 tools)

  • create_workflow_for_notification - Create notification workflows for problem alerts
  • make_workflow_public - Make private workflows publicly accessible
  • execute_typescript - Execute custom TypeScript code via Dynatrace Function Executor

💬 Communication (2 tools)

  • send_slack_message - Send messages via Dynatrace Slack integration
  • send_email - Send professional emails with HTML/text support via Dynatrace Email API

📊 Budget & Usage Management (2 tools)

  • get_grail_budget_status - Monitor current Grail query budget usage and limits
  • reset_grail_budget - Reset the budget tracker when limits are exceeded

Davis CoPilot AI Integration

Overview

Davis CoPilot AI integration brings intelligent query generation and natural language processing to your Dynatrace MCP workflows. This feature is perfect for:

  • Converting natural language requests into powerful DQL queries
  • Understanding complex DQL statements in plain English
  • Getting AI-powered assistance for Dynatrace-related questions

Key Features

Natural Language to DQL

Transform plain English into powerful queries:

Input: "Show me CPU usage for all hosts in the last hour"
↓ Davis CoPilot AI ↓
Generated: timeseries from:now()-1h, by:{dt.entity.host}, cpuUsage = avg(dt.host.cpu.usage)

DQL Explanation

Understand complex queries in plain English:

Input: fetch spans | filter duration > 5s | summarize avg(duration), by:{service.name}
↓ Davis CoPilot AI ↓
"This query retrieves all spans with duration longer than 5 seconds, then calculates the average duration grouped by service name"

AI Assistant

Get contextual help for any Dynatrace topic, from troubleshooting to best practices.

Workflow Integration

The recommended AI workflow is (a code sketch follows the list):

  1. Generate: Use generate_dql_from_natural_language to create queries from natural language
  2. Verify: Use verify_dql to validate DQL syntax and structure
  3. Execute: Use execute_dql to run queries with automatic budget tracking
  4. Monitor: Use get_grail_budget_status to check query costs and usage
  5. Reset: Use reset_grail_budget if budget limits are exceeded
  6. Iterate: Refine the query based on results and costs, then repeat
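
A TypeScript sketch of that loop, assuming a generic callTool(name, args) helper exposed by your MCP client (argument names follow the usage examples later in this section and may differ):

// Sketch of the generate, verify, execute, monitor loop (not the server's own code).
type CallTool = (name: string, args: Record<string, unknown>) => Promise<unknown>;

async function runAiQueryLoop(callTool: CallTool, request: string) {
  // 1. Generate DQL from natural language
  const generated = await callTool('generate_dql_from_natural_language', { text: request });
  const dql = String(generated); // assumption: the result can be reduced to a DQL string

  // 2. Verify syntax, 3. execute with budget tracking, 4. check budget usage
  await callTool('verify_dql', { dql });
  const rows = await callTool('execute_dql', { dql });
  const budget = await callTool('get_grail_budget_status', {});

  return { rows, budget };
}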

Required Scopes for Davis CoPilot

Add these scopes to your OAuth client:

davis-copilot:nl2dql:execute
davis-copilot:dql2nl:execute
davis-copilot:conversations:execute

Usage Examples

Generate DQL from Natural Language
{
  "tool": "generate_dql_from_natural_language",
  "arguments": {
    "text": "Show me CPU usage for all hosts in the last hour"
  }
}

Result: Generated DQL ready for execution with verification token.

Explain Complex DQL
{
  "tool": "explain_dql_in_natural_language", 
  "arguments": {
    "dql": "timeseries from:now()-1h, by:{dt.entity.host}, cpuUsage = avg(dt.host.cpu.usage)"
  }
}

Result: Plain English explanation of query logic and data sources.

Chat with Davis CoPilot
{
  "tool": "chat_with_davis_copilot",
  "arguments": {
    "text": "How do I optimize database query performance in my Java application?",
    "context": "We're seeing high response times in our e-commerce application"
  }
}

Environment Variables

Core Required Variables

  • OAUTH_CLIENT_ID (required): Dynatrace OAuth Client ID, e.g. dt0s02.ABC123...
  • OAUTH_CLIENT_SECRET (required): Dynatrace OAuth Client Secret, e.g. dt0s02.ABC123.DEF456...
  • DT_ENVIRONMENT (required): Dynatrace environment URL (Platform API), e.g. https://abc12345.apps.dynatrace.com
  • DT_PLATFORM_TOKEN (required for Davis CoPilot): Platform API token for Davis CoPilot, e.g. dt0s16.XYZ789...

Budget & Logging Configuration (NEW!)

  • DT_GRAIL_QUERY_BUDGET_GB: Grail query budget limit in GB, e.g. 100 (default: 10)
  • LOG_LEVEL: Logging level (debug/info/warn/error), e.g. info (default: info)
  • HTTP_PORT: Port for HTTP server mode, e.g. 3000 (default: none)

OAuth Configuration (Optional)

  • OAUTH_TOKEN_URL: OAuth token endpoint (default: https://sso.dynatrace.com/sso/oauth2/token)
  • OAUTH_URN: OAuth resource URN (default: urn:dtaccount:<your-account-urn-guid>)

OpenTelemetry Tracing (Optional)

  • OTEL_EXPORTER_OTLP_ENDPOINT: OTLP endpoint for traces, e.g. https://abc12345.live.dynatrace.com/api/v2/otlp/v1/traces
  • DYNATRACE_API_TOKEN: API token for trace/log export, e.g. dt0c01.ABC123...
  • DYNATRACE_LOG_INGEST_URL: Log ingest endpoint, e.g. https://abc12345.live.dynatrace.com/api/v2/logs/ingest
  • OTEL_RESOURCE_ATTRIBUTES: OpenTelemetry resource attributes, e.g. service.name=dynatrace-mcp-server,service.version=2.2.0

Slack Integration (Optional)

  • SLACK_CONNECTION_ID: Slack connection ID from Dynatrace (default: none)

Document Sharing (Optional)

  • DT_SHARE_RECIPIENTS: Comma-separated list of user/group IDs (default: none)
  • DT_SHARE_TYPE: Type of recipients (user/group) (default: group)

OpenKit Telemetry (Usage Analytics)

  • DT_MCP_DISABLE_TELEMETRY: Disable usage telemetry collection by setting it to 'true' (default: false)
  • DT_MCP_TELEMETRY_ENDPOINT_URL: Custom OpenKit telemetry endpoint (default: Dynatrace production self-monitoring)
  • DT_MCP_TELEMETRY_APPLICATION_ID: Custom OpenKit application ID (default: the MCP server's default application ID)
  • DT_MCP_TELEMETRY_DEVICE_ID: Custom device ID for consistent identification (default: auto-generated from hostname)
  • DT_MCP_TELEMETRY_USER_ID: User identifier for session tagging (default: auto-generated from hostname and username)

The MCP server collects anonymous usage analytics using the Dynatrace OpenKit SDK to help improve the tool. User identification supports multiple sources with the following fallback priority (a code sketch of the resolution follows the list):

  1. Environment variable DT_MCP_TELEMETRY_USER_ID (highest priority)
  2. Auto-generated from system hostname and username: mcp-user-xxxxxxxx (medium priority)
  3. Generic fallback: mcp-anonymous-user (lowest priority)

You can completely disable telemetry by setting DT_MCP_DISABLE_TELEMETRY=true.
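
A sketch of that resolution order (illustrative only, not the server's actual code):

import { createHash } from 'node:crypto';
import { hostname, userInfo } from 'node:os';

// Resolve the telemetry user ID with the fallback priority described above.
function resolveTelemetryUserId(): string {
  const fromEnv = process.env.DT_MCP_TELEMETRY_USER_ID;
  if (fromEnv) {
    return fromEnv; // 1. explicit environment variable
  }
  try {
    const digest = createHash('sha256')
      .update(`${hostname()}:${userInfo().username}`)
      .digest('hex')
      .slice(0, 8);
    return `mcp-user-${digest}`; // 2. derived from hostname and username
  } catch {
    return 'mcp-anonymous-user'; // 3. generic fallback
  }
}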

Complete Configuration Example

{
  "mcpServers": {
    "dynatrace": {
      "command": "npx",
      "args": ["@theharithsa/dynatrace-mcp-server"],
      "env": {
        "OAUTH_CLIENT_ID": "dt0s02.ABC123...",
        "OAUTH_CLIENT_SECRET": "dt0s02.ABC123.DEF456...",
        "DT_ENVIRONMENT": "https://abc12345.apps.dynatrace.com",
        "OTEL_EXPORTER_OTLP_ENDPOINT": "https://abc12345.live.dynatrace.com/api/v2/otlp/v1/traces",
        "DYNATRACE_API_TOKEN": "dt0c01.XYZ789...",
        "DYNATRACE_LOG_INGEST_URL": "https://abc12345.live.dynatrace.com/api/v2/logs/ingest",
        "SLACK_CONNECTION_ID": "your-slack-connection-id",
        "DT_SHARE_RECIPIENTS": "group-id-1,group-id-2",
        "DT_SHARE_TYPE": "group"
      }
    }
  }
}

Authentication

๐Ÿ” Dual Authentication Architecture

Version 2.5.0 introduces a powerful dual authentication system that automatically routes requests to the appropriate Dynatrace API endpoints (a simplified routing sketch follows the list below):

1. OAuth Client Authentication (Primary)
  • Purpose: Davis CoPilot AI, advanced platform features, and app execution
  • Endpoint: apps.dynatrace.com
  • Token Format: dt0s02.CLIENT_ID and dt0s02.CLIENT_ID.CLIENT_SECRET
  • Configuration: OAUTH_CLIENT_ID and OAUTH_CLIENT_SECRET
2. API Token Authentication (Secondary)
  • Purpose: Entity operations, tagging, basic data access
  • Endpoint: live.dynatrace.com
  • Token Format: dt0c01.API_TOKEN
  • Configuration: DT_API_TOKEN (optional for entity operations)
3. Platform Token Authentication (Tertiary)
  • Purpose: Environment information and platform management
  • Endpoint: apps.dynatrace.com
  • Token Format: dt0s16.PLATFORM_TOKEN
  • Configuration: DT_PLATFORM_TOKEN (optional for environment info)
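
A simplified sketch of how these credential types map to hosts and Authorization headers, based on the descriptions above (the server's actual routing logic may differ):

// Sketch: map each credential type to its target host and Authorization header.
type AuthMode = 'oauth' | 'platform-token' | 'api-token';

function buildAuth(mode: AuthMode, token: string): { host: string; headers: Record<string, string> } {
  if (mode === 'api-token') {
    // dt0c01.* API token, used against the classic APIs on live.dynatrace.com
    return {
      host: 'https://<your-tenant>.live.dynatrace.com',
      headers: { Authorization: `Api-Token ${token}` },
    };
  }
  // 'oauth' (dt0s02.* client credentials exchanged for a bearer token) and
  // 'platform-token' (dt0s16.*) both target the platform APIs on apps.dynatrace.com
  return {
    host: 'https://<your-tenant>.apps.dynatrace.com',
    headers: { Authorization: `Bearer ${token}` },
  };
}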

Required OAuth Scopes

🤖 Davis CoPilot AI (Core Features):

  • davis-copilot:nl2dql:execute - Natural language to DQL conversion
  • davis-copilot:dql2nl:execute - DQL explanation in natural language
  • davis-copilot:conversations:execute - AI-powered conversations

๐Ÿ—๏ธ Platform & App Engine:

  • app-engine:apps:run - Execute Dynatrace apps
  • app-engine:functions:run - Execute TypeScript functions

📊 Data & Query Engine:

  • storage:buckets:read - Access bucket metadata for Grail queries
  • storage:events:read - Read problems and other event data via DQL
  • storage:security.events:read - Read security events (vulnerabilities) via DQL
  • storage:logs:read, storage:metrics:read, storage:bizevents:read, storage:spans:read, storage:system:read, storage:user.events:read, storage:user.sessions:read, storage:entities:read - Required for full execute_dql coverage across data domains
  • environment-api:entities:read - Entity lookups and ownership information
  • environment-api:entities:write - Entity tagging (when using OAuth)

๐Ÿ” Monitoring & Security:

  • (Covered by the DQL scopes above for problems and vulnerabilities)

🔧 Automation & Workflows:

  • automation:workflows:write - Workflow creation
  • automation:workflows:read - Workflow management
  • automation:workflows:run - Workflow execution

📋 Documents & Dashboards:

  • document:documents:write - Dashboard creation
  • document:documents:delete - Dashboard deletion
  • document:environment-shares:write - Document sharing
  • document:direct-shares:write - Direct document sharing

💬 Communication:

  • email:emails:send - Email notifications
  • app-settings:objects:read - Slack integration
  • settings:objects:read - Ownership information

Setting Up Authentication

Step 1: Create OAuth Client
  1. Navigate to Settings → Platform Management → OAuth clients
  2. Click Create OAuth client
  3. Set Client type to Public
  4. Add all required scopes from the list above
  5. Save and copy the Client ID and Secret
Step 2: (Optional) Generate API Token
  1. Go to Settings → Access Tokens → Generate new token
  2. Add scopes: entities.read, entities.write, problems.read
  3. Copy the token (format: dt0c01.XXXXXX)
Step 3: Configuration

Minimal Configuration (OAuth only):

{
  "OAUTH_CLIENT_ID": "dt0s02.ABC123...",
  "OAUTH_CLIENT_SECRET": "dt0s02.ABC123.DEF456...",
  "DT_ENVIRONMENT": "https://abc12345.apps.dynatrace.com"
}

Full Configuration (All features):

{
  "OAUTH_CLIENT_ID": "dt0s02.ABC123...",
  "OAUTH_CLIENT_SECRET": "dt0s02.ABC123.DEF456...",
  "DT_ENVIRONMENT": "https://abc12345.apps.dynatrace.com",
  "DT_API_TOKEN": "dt0c01.XYZ789...",
  "DT_PLATFORM_TOKEN": "dt0s16.PLATFORM123..."
}

Advanced Usage

Custom Dashboard Creation

Place JSON dashboard files in a /dashboards folder and use the create_dashboard tool to bulk-create them.

TypeScript Code Execution

Execute custom logic using the Dynatrace Function Executor:

// Example: Query and process data
export default async function ({ entityId }) {
  // Your custom TypeScript code here
  return { processed: true, entityId };
}
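
To run a function like this through the execute_typescript tool, the call might look roughly like the sketch below; the argument names (sourceCode, payload) are assumptions, so confirm them against the tool's schema in your MCP client.

// Hypothetical execute_typescript arguments; "sourceCode" and "payload" are assumed names.
const executeTypescriptArgs = {
  sourceCode: `export default async function ({ entityId }) {
  return { processed: true, entityId };
}`,
  payload: { entityId: 'HOST-ABC123' },
};

console.log(JSON.stringify(executeTypescriptArgs, null, 2));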

Slack Integration

Configure Slack notifications by setting up a Slack connection in Dynatrace and providing the SLACK_CONNECTION_ID.

Email Integration

Send professional emails with rich formatting using the send_email tool:

// Example: Send alert notification
{
  "toRecipients": ["oncall@company.com"],
  "ccRecipients": ["team-lead@company.com"],
  "subject": "๐Ÿšจ Critical Alert: High CPU Usage",
  "body": "**Alert Details:**\n- Server: web-prod-01\n- CPU Usage: 95%\n- Duration: 15 minutes\n\n**Action Required**: Immediate investigation needed.",
  "contentType": "text/plain"
}

Key Features:

  • Support for To, CC, and BCC recipients (up to 100 total)
  • HTML and plain text content types
  • Professional formatting with markdown support
  • Comprehensive error handling and delivery tracking
  • Integration with Dynatrace tenant domain validation

Development

For Code Customization

If you need to modify the server code:

# Install the package for customization
npm install @theharithsa/dynatrace-mcp-server

# Clone and modify the source
git clone https://github.com/theharithsa/dynatrace-mcp-otel.git
cd dynatrace-mcp-otel
npm install
npm run build

Local Development

{
  "mcpServers": {
    "dynatrace-local": {
      "command": "node",
      "args": ["dist/index.js"],
      "cwd": "/path/to/dynatrace-mcp-otel",
      "env": {
        "OAUTH_CLIENT_ID": "dt0s02.your-client-id",
        "OAUTH_CLIENT_SECRET": "dt0s02.your-client-id.your-client-secret",
        "DT_ENVIRONMENT": "https://your-tenant.apps.dynatrace.com"
      }
    }
  }
}

Installation Options

# NPX (recommended for most users)
npx @theharithsa/dynatrace-mcp-server

# Global installation
npm install -g @theharithsa/dynatrace-mcp-server

# Local project installation
npm install @theharithsa/dynatrace-mcp-server

Dynatrace MCP OpenTelemetry Integration

Observability Features

Log Correlation
  • All logs include dt.security_context field set to dynatrace_mcp_otel
  • Logs are tagged with logType: build-logs for filtering
  • Logs are automatically correlated with traces via standard OpenTelemetry attributes
CI/CD Observability

Our GitHub Actions workflows are instrumented with OpenTelemetry using inception-health/otel-action.

Configuration in GitHub Actions
- name: Setup OpenTelemetry
  uses: inception-health/otel-action@v2
  with:
    dsn: ${{ vars.OTEL_EXPORTER_OTLP_ENDPOINT }}
    service_name: 'dynatrace-mcp-server-build'
    access_token: ${{ secrets.DYNATRACE_API_TOKEN }}
    log_url: ${{ vars.DYNATRACE_LOG_INGEST_URL }}
    build_type: ${{ github.ref == 'refs/heads/dev' && 'dev' || 'prod' }}
Required Variables/Secrets
  • OTEL_EXPORTER_OTLP_ENDPOINT: Dynatrace OTLP endpoint URL
  • DYNATRACE_API_TOKEN: API token with ingest permission
  • DYNATRACE_LOG_INGEST_URL: Dynatrace log ingest URL
Known Issues
  • In version 1.0.8, trace ingestion might not work correctly in all environments, but logging functionality works as expected

Version History

  • 2.6.0: Added Grail budget tracking, entity tagging, HTTP server mode, enhanced testing (24 tools with 83% success rate)
  • 2.5.0: Enhanced authentication architecture with dual OAuth/API token support, improved platform integration
  • 2.3.0: Added comprehensive workflow automation and document sharing capabilities
  • 2.2.0: Added comprehensive email integration with send_email tool supporting HTML/plain text, multiple recipients, and professional formatting
  • 2.1.0: Added Davis CoPilot AI integration with natural language processing capabilities
  • 2.0.0: Updated package structure and naming; enhanced configuration options
  • 1.0.8: Switched to standard OpenTelemetry GitHub Action; enhanced logging with security context
  • 1.0.7: // ...existing version history...

Support


Note: This MCP server is designed for AI assistant integration. For standalone use cases, consider using the Dynatrace CLI or API directly.