Dynatrace MCP
A powerful Model Context Protocol (MCP) server that provides AI assistants with comprehensive access to Dynatrace's observability platform. Features dual authentication architecture, Davis CoPilot AI integration, and 24 specialized tools for monitoring, automation, and operational intelligence.
🚀 What's New in v2.6.0
- 📊 Grail Budget Tracking: Advanced budget monitoring system with real-time usage tracking, warnings, and limits
- 🏷️ Entity Tagging: Add custom tags to monitored entities for better organization and filtering
- 📈 HTTP Server Mode: Run as a standalone HTTP server for broader integration possibilities
- 🔧 Enhanced DQL Execution: Improved metadata extraction, error handling, and budget integration
- 📧 Enhanced Email System: Professional email capabilities with HTML/text support and multiple recipients
- 🧪 Comprehensive Testing: 38 test cases covering all major functionality with 83% success rate
- 📚 Extended Documentation: Updated guides, examples, and troubleshooting resources
Costs
Important: While this local MCP server is provided for free, using it to access data in Dynatrace Grail may incur additional costs based on your Dynatrace consumption model. This affects the execute_dql tool and other capabilities that query Dynatrace Grail storage; costs depend on the volume of data scanned and billed (GB).
Before using this MCP server extensively, please:
- Review your current Dynatrace consumption model and pricing
- Understand the cost implications of the specific data you plan to query (logs, events, metrics) - see Dynatrace Pricing and Rate Card
- Start with smaller timeframes (e.g., 12h-24h) and make use of buckets to reduce the cost impact
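For illustration, a narrow, cost-conscious query might look like the following (the bucket name, timeframe, and fields are examples only; adjust them to your environment and data):
fetch logs, from:now()-12h
| filter dt.system.bucket == "default_logs"
| summarize count()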
Note: We will be providing a way to monitor Query Usage of the dynatrace-mcp-server in the future.
Table of Contents
- Quick Start
- Configuration
- Available Tools
- Davis CoPilot AI Integration
- Environment Variables
- Authentication
- Advanced Usage
- Development
- Dynatrace MCP OpenTelemetry Integration
🏆 Production Ready & Tested
v2.6.0 has been extensively tested with real Dynatrace environments:
- ✅ 20/24 tools working perfectly (83% success rate)
- ✅ All core monitoring functions operational (DQL, entities, dashboards)
- ✅ AI integration fully functional (Davis CoPilot, natural language processing)
- ✅ Budget tracking prevents cost overruns with real-time monitoring
- ✅ Communication tools validated (Slack, Email with professional formatting)
- ✅ Automation workflows tested and operational
Only minor OAuth scope adjustments are needed for the remaining 4 tools - permission-related issues that are easy to resolve.
Quick Start
1. Add to Your MCP Client
Configure your MCP client (Claude Desktop, Cline, etc.) by adding this server to your mcp.json:
{
"mcpServers": {
"dynatrace": {
"command": "npx",
"args": ["@theharithsa/dynatrace-mcp-server"],
"env": {
"OAUTH_CLIENT_ID": "dt0s02.ABC123...",
"OAUTH_CLIENT_SECRET": "dt0s02.ABC123.DEF456...",
"DT_ENVIRONMENT": "https://abc12345.apps.dynatrace.com"
}
}
}
}
2. Get Your Dynatrace Credentials
- Go to your Dynatrace environment → Settings → Platform Management → OAuth clients
- Create a new OAuth client with the required OAuth scopes
- Copy the Client ID and Secret
3. Start Using
Your AI assistant can now:
- 🤖 AI-Powered Queries: Convert natural language to DQL using Davis CoPilot
- 📊 Monitor & Analyze: Query problems, vulnerabilities, and execute DQL statements
- 🏗️ Entity Management: Find, tag, and manage monitored entities with ownership info
- 📈 Dashboard Operations: Create, delete, and share dashboards with access control
- 🤖 Automation: Create workflows and execute custom TypeScript functions
- 💬 Communication: Send Slack messages and professional emails
- 📊 Budget Control: Track and manage Grail query usage with budget limits
ℹ️ On startup the server now verifies your Dynatrace connection with up to three exponential-backoff retries. Invalid URLs, credentials, or missing scopes will fail fast with detailed guidance so you can correct the configuration before using any tools.
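As a rough illustration of that retry behavior, here is a minimal TypeScript sketch - not the server's actual code, and the delay values are assumptions beyond the "up to three retries" described above:
// Minimal sketch of an exponential-backoff startup check (illustrative only).
async function verifyConnection(check: () => Promise<void>, maxRetries = 3): Promise<void> {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      await check();                            // e.g. a lightweight environment-info request
      return;                                   // connection verified
    } catch (err) {
      if (attempt === maxRetries) throw err;    // fail fast with the underlying error
      const delayMs = 1000 * 2 ** attempt;      // 1s, 2s, 4s, ...
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}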
Configuration
Basic Configuration
{
"mcpServers": {
"dynatrace": {
"command": "npx",
"args": ["@theharithsa/dynatrace-mcp-server"],
"env": {
"OAUTH_CLIENT_ID": "dt0s02.your-client-id",
"OAUTH_CLIENT_SECRET": "dt0s02.your-client-id.your-client-secret",
"DT_ENVIRONMENT": "https://your-tenant.apps.dynatrace.com"
}
}
}
}
Alternative: Global Installation
# Install globally
npm install -g @theharithsa/dynatrace-mcp-server
# Then use in mcp.json
{
"mcpServers": {
"dynatrace": {
"command": "dynatrace-mcp-server",
"env": {
"OAUTH_CLIENT_ID": "dt0s02.your-client-id",
"OAUTH_CLIENT_SECRET": "dt0s02.your-client-id.your-client-secret",
"DT_ENVIRONMENT": "https://your-tenant.apps.dynatrace.com"
}
}
}
}
HTTP Server Mode (NEW!)
Run as a standalone HTTP server for broader integration possibilities:
# Install globally first
npm install -g @theharithsa/dynatrace-mcp-server
# Run as HTTP server (requires all environment variables set)
dynatrace-mcp-server --http-port 3000
# Server will be available at http://localhost:3000
# Endpoints:
# GET /health - Health check
# POST /tools/list - List available tools
# POST /tools/call - Execute tool calls
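Once the server is running, the endpoints above can be called from any HTTP client. Below is a minimal sketch using Node 18+ fetch; the request and response payload shapes (a name plus arguments body for /tools/call) are assumptions, not a documented contract:
// Illustrative only: calling the HTTP endpoints listed above (Node 18+, ES module).
const base = 'http://localhost:3000';

// Health check
const health = await fetch(`${base}/health`);
console.log('health status:', health.status);

// List available tools
const tools = await fetch(`${base}/tools/list`, { method: 'POST' });
console.log('tools:', await tools.json());

// Call a tool (body shape is an assumption)
const result = await fetch(`${base}/tools/call`, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ name: 'list_problems', arguments: {} }),
});
console.log('result:', await result.json());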
Environment Variables for HTTP Mode:
export OAUTH_CLIENT_ID="dt0s02.your-client-id"
export OAUTH_CLIENT_SECRET="dt0s02.your-client-id.your-client-secret"
export DT_ENVIRONMENT="https://your-tenant.apps.dynatrace.com"
export DT_GRAIL_QUERY_BUDGET_GB="100" # Optional: Set Grail budget limit
export LOG_LEVEL="info" # Optional: debug, info, warn, error
Docker Support:
FROM node:18-alpine
WORKDIR /app
RUN npm install -g @theharithsa/dynatrace-mcp-server
EXPOSE 3000
CMD ["dynatrace-mcp-server", "--http-port", "3000"]
Available Tools (24 Total)
🤖 Davis CoPilot AI Integration (3 tools)
- generate_dql_from_natural_language - Convert natural language to DQL queries using Davis CoPilot AI
- explain_dql_in_natural_language - Get plain English explanations of complex DQL statements
- chat_with_davis_copilot - AI-powered assistant for Dynatrace questions and troubleshooting
🔍 Monitoring & Observability (4 tools)
- get_environment_info - Get Dynatrace environment details and configuration
- list_problems - List all active problems in your environment
- list_vulnerabilities - List security vulnerabilities detected by Dynatrace
- get_kubernetes_events - Get Kubernetes cluster events and status
📊 Data Querying & Analysis (2 tools)
- execute_dql - Execute Dynatrace Query Language statements with budget tracking
- verify_dql - Validate DQL syntax and structure before execution
🏗️ Entity Management & Tagging (4 tools)
- find_entity_by_name - Find monitored entities by name across all entity types
- get_entity_details - Get detailed information about specific monitored entities
- add_entity_tags - Add custom tags to Dynatrace monitored entities
- get_ownership - Get ownership information and team assignments for entities
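For example, a tagging call might look like this (the argument names entityId and tags are illustrative, not the tool's documented schema):
{
  "tool": "add_entity_tags",
  "arguments": {
    "entityId": "HOST-1234567890ABCDEF",
    "tags": [{ "key": "team", "value": "sre" }]
  }
}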
📈 Dashboard & Document Management (4 tools)
- create_dashboard - Create dashboards from JSON files in bulk
- bulk_delete_dashboards - Delete multiple dashboards by document IDs
- share_document_env - Share documents across environments with access control
- direct_share_document - Share documents directly with specific recipients
🤖 Automation & Workflows (3 tools)
- create_workflow_for_notification - Create notification workflows for problem alerts
- make_workflow_public - Make private workflows publicly accessible
- execute_typescript - Execute custom TypeScript code via Dynatrace Function Executor
💬 Communication (2 tools)
- send_slack_message - Send messages via Dynatrace Slack integration
- send_email - Send professional emails with HTML/text support via Dynatrace Email API
📊 Budget & Usage Management (2 tools)
- get_grail_budget_status - Monitor current Grail query budget usage and limits
- reset_grail_budget - Reset the budget tracker when limits are exceeded
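For example, checking the budget before running larger queries (the empty arguments object is an assumption; the tool may not require any input):
{
  "tool": "get_grail_budget_status",
  "arguments": {}
}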
Davis CoPilot AI Integration
Overview
Davis CoPilot AI integration brings intelligent query generation and natural language processing to your Dynatrace MCP workflows. This feature is perfect for:
- Converting natural language requests into powerful DQL queries
- Understanding complex DQL statements in plain English
- Getting AI-powered assistance for Dynatrace-related questions
Key Features
Natural Language to DQL
Transform plain English into powerful queries:
Input: "Show me CPU usage for all hosts in the last hour"
↓ Davis CoPilot AI ↓
Generated: timeseries from:now()-1h, by:{dt.entity.host}, cpuUsage = avg(dt.host.cpu.usage)
DQL Explanation
Understand complex queries in plain English:
Input: fetch spans | filter duration > 5s | summarize avg(duration) by service.name
↓ Davis CoPilot AI ↓
"This query retrieves all spans with duration longer than 5 seconds, then calculates the average duration grouped by service name"
AI Assistant
Get contextual help for any Dynatrace topic, from troubleshooting to best practices.
Workflow Integration
The recommended AI workflow is:
- Generate: Use generate_dql_from_natural_language to create queries from natural language
- Verify: Use verify_dql to validate DQL syntax and structure
- Execute: Use execute_dql to run queries with automatic budget tracking (see the example calls below)
- Monitor: Use get_grail_budget_status to check query costs and usage
- Reset: Use reset_grail_budget if budget limits are exceeded
- Iterate: Refine queries based on results and costs, then repeat
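A sketch of the Verify and Execute steps as tool calls, reusing the query generated earlier (the dql argument name is an assumption about the tool schemas):
{
  "tool": "verify_dql",
  "arguments": {
    "dql": "timeseries from:now()-1h, by:{dt.entity.host}, cpuUsage = avg(dt.host.cpu.usage)"
  }
}
{
  "tool": "execute_dql",
  "arguments": {
    "dql": "timeseries from:now()-1h, by:{dt.entity.host}, cpuUsage = avg(dt.host.cpu.usage)"
  }
}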
Required Scopes for Davis CoPilot
Add these scopes to your OAuth client:
davis-copilot:nl2dql:execute
davis-copilot:dql2nl:execute
davis-copilot:conversations:execute
Usage Examples
Generate DQL from Natural Language
{
"tool": "generate_dql_from_natural_language",
"arguments": {
"text": "Show me CPU usage for all hosts in the last hour"
}
}
Result: Generated DQL ready for execution with verification token.
Explain Complex DQL
{
"tool": "explain_dql_in_natural_language",
"arguments": {
"dql": "timeseries from:now()-1h, by:{dt.entity.host}, cpuUsage = avg(dt.host.cpu.usage)"
}
}
Result: Plain English explanation of query logic and data sources.
Chat with Davis CoPilot
{
"tool": "chat_with_davis_copilot",
"arguments": {
"text": "How do I optimize database query performance in my Java application?",
"context": "We're seeing high response times in our e-commerce application"
}
}
Environment Variables
Core Required Variables
| Variable | Description | Example | Required |
|---|---|---|---|
| OAUTH_CLIENT_ID | Dynatrace OAuth Client ID | dt0s02.ABC123... | ✅ |
| OAUTH_CLIENT_SECRET | Dynatrace OAuth Client Secret | dt0s02.ABC123.DEF456... | ✅ |
| DT_ENVIRONMENT | Dynatrace environment URL (Platform API) | https://abc12345.apps.dynatrace.com | ✅ |
| DT_PLATFORM_TOKEN | Platform API token for Davis CoPilot | dt0s16.XYZ789... | ✅ (for Davis CoPilot) |
Budget & Logging Configuration (NEW!)
| Variable | Description | Example | Default |
|---|---|---|---|
| DT_GRAIL_QUERY_BUDGET_GB | Grail query budget limit in GB | 100 | 10 |
| LOG_LEVEL | Logging level (debug/info/warn/error) | info | info |
| HTTP_PORT | Port for HTTP server mode | 3000 | None |
OAuth Configuration (Optional)
| Variable | Description | Default |
|---|---|---|
| OAUTH_TOKEN_URL | OAuth token endpoint | https://sso.dynatrace.com/sso/oauth2/token |
| OAUTH_URN | OAuth resource URN | urn:dtaccount:<your-account-urn-guid> |
OpenTelemetry Tracing (Optional)
The MCP server now includes automatic OpenTelemetry instrumentation using @theharithsa/opentelemetry-instrumentation-mcp. This provides comprehensive distributed tracing of all MCP tool calls with zero configuration required.
Features:
- 🔄 Automatic instrumentation of MCP tool calls
- 📊 Parent-child span relationships for complete traces
- 🚀 Drop-in solution with auto-registration
- 📈 OTLP export with Dynatrace support
- 🔍 Error tracking and exception recording
| Variable | Description | Example | Required |
|---|---|---|---|
| OTEL_EXPORTER_OTLP_ENDPOINT | OTLP endpoint for traces | https://abc12345.live.dynatrace.com/api/v2/otlp/v1/traces | ✅ |
| OTEL_EXPORTER_OTLP_HEADERS | OTLP headers (comma-separated) | Authorization=Api-Token dt0c01.ABC123... | ✅ |
Example Configuration:
# Single header
OTEL_EXPORTER_OTLP_ENDPOINT=https://abc12345.live.dynatrace.com/api/v2/otlp/v1/traces
OTEL_EXPORTER_OTLP_HEADERS=Authorization=Api-Token dt0c01.ABC123...
# Multiple headers (comma-separated)
OTEL_EXPORTER_OTLP_HEADERS=Authorization=Api-Token dt0c01.ABC123...,Custom-Header=value
Trace Structure:
Each tool call creates a hierarchical span structure:
Tool.{toolName} (Parent Span)
├── mcp.tool:{toolName} (Child Span - auto-instrumented)
│ ├── Duration, status, attributes
│ └── Error tracking and exceptions
└── Additional child spans from operations
Slack Integration (Optional)
| Variable | Description | Default |
|---|---|---|
| SLACK_CONNECTION_ID | Slack connection ID from Dynatrace | None |
Document Sharing (Optional)
| Variable | Description | Default |
|---|---|---|
| DT_SHARE_RECIPIENTS | Comma-separated list of user/group IDs | None |
| DT_SHARE_TYPE | Type of recipients (user/group) | group |
OpenKit Telemetry (Usage Analytics)
| Variable | Description | Default |
|---|---|---|
| DT_MCP_DISABLE_TELEMETRY | Disable usage telemetry collection (set to 'true') | false |
| DT_MCP_TELEMETRY_ENDPOINT_URL | Custom OpenKit telemetry endpoint | Dynatrace production self-monitoring |
| DT_MCP_TELEMETRY_APPLICATION_ID | Custom OpenKit application ID | Default MCP server app ID |
| DT_MCP_TELEMETRY_DEVICE_ID | Custom device ID for consistent identification | Auto-generated from hostname |
| DT_MCP_TELEMETRY_USER_ID | User identifier for session tagging | Auto-generated from hostname+username |
The MCP server collects anonymous usage analytics using Dynatrace OpenKit SDK to help improve the tool. User identification supports multiple sources with fallback priority:
- Environment variable DT_MCP_TELEMETRY_USER_ID (highest priority)
- Auto-generated from system hostname and username: mcp-user-xxxxxxxx (medium priority)
- Generic fallback: mcp-anonymous-user (lowest priority)
You can completely disable telemetry by setting DT_MCP_DISABLE_TELEMETRY=true.
Complete Configuration Example
{
"mcpServers": {
"dynatrace": {
"command": "npx",
"args": ["@theharithsa/dynatrace-mcp-server"],
"env": {
"OAUTH_CLIENT_ID": "dt0s02.ABC123...",
"OAUTH_CLIENT_SECRET": "dt0s02.ABC123.DEF456...",
"DT_ENVIRONMENT": "https://abc12345.apps.dynatrace.com",
"OTEL_EXPORTER_OTLP_ENDPOINT": "https://abc12345.live.dynatrace.com/api/v2/otlp/v1/traces",
"OTEL_EXPORTER_OTLP_HEADERS": "Authorization=Api-Token dt0c01.XYZ789...",
"SLACK_CONNECTION_ID": "your-slack-connection-id",
"DT_SHARE_RECIPIENTS": "group-id-1,group-id-2",
"DT_SHARE_TYPE": "group",
"DT_GRAIL_QUERY_BUDGET_GB": "100"
}
}
}
}
Authentication
🔐 Dual Authentication Architecture
Version 2.5.0 introduces a powerful dual authentication system that automatically routes requests to the appropriate Dynatrace API endpoints:
1. OAuth Client Authentication (Primary)
- Purpose: Davis CoPilot AI, advanced platform features, and app execution
- Endpoint: apps.dynatrace.com
- Token Format: dt0s02.CLIENT_ID and dt0s02.CLIENT_ID.CLIENT_SECRET
- Configuration: OAUTH_CLIENT_ID and OAUTH_CLIENT_SECRET
2. API Token Authentication (Secondary)
- Purpose: Entity operations, tagging, basic data access
- Endpoint: live.dynatrace.com
- Token Format: dt0c01.API_TOKEN
- Configuration: DT_API_TOKEN (optional for entity operations)
3. Platform Token Authentication (Tertiary)
- Purpose: Environment information and platform management
- Endpoint: apps.dynatrace.com
- Token Format: dt0s16.PLATFORM_TOKEN
- Configuration: DT_PLATFORM_TOKEN (optional for environment info)
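The TypeScript sketch below illustrates the routing idea described above. It is a simplified picture under assumed names, not the server's actual implementation; in particular, deriving the live.dynatrace.com URL from DT_ENVIRONMENT is hypothetical.
// Illustrative only: map a request purpose to the endpoint/credential pair described above.
type Purpose = 'copilot-and-apps' | 'entity-operations' | 'platform-info';

interface AuthContext {
  dtEnvironment: string;      // e.g. https://abc12345.apps.dynatrace.com
  oauthAccessToken?: string;  // obtained via OAUTH_CLIENT_ID / OAUTH_CLIENT_SECRET
  apiToken?: string;          // DT_API_TOKEN (dt0c01.*)
  platformToken?: string;     // DT_PLATFORM_TOKEN (dt0s16.*)
}

function resolveAuth(purpose: Purpose, ctx: AuthContext) {
  switch (purpose) {
    case 'copilot-and-apps':  // Davis CoPilot, app execution -> apps.dynatrace.com + OAuth
      return { baseUrl: ctx.dtEnvironment, authorization: `Bearer ${ctx.oauthAccessToken}` };
    case 'entity-operations': // entities, tagging -> live.dynatrace.com + API token
      return {
        baseUrl: ctx.dtEnvironment.replace('.apps.', '.live.'), // hypothetical URL mapping
        authorization: `Api-Token ${ctx.apiToken}`,
      };
    case 'platform-info':     // environment info -> apps.dynatrace.com + platform token
      return { baseUrl: ctx.dtEnvironment, authorization: `Bearer ${ctx.platformToken}` };
  }
}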
Required OAuth Scopes
🤖 Davis CoPilot AI (Core Features):
- davis-copilot:nl2dql:execute - Natural language to DQL conversion
- davis-copilot:dql2nl:execute - DQL explanation in natural language
- davis-copilot:conversations:execute - AI-powered conversations
🏗️ Platform & App Engine:
- app-engine:apps:run - Execute Dynatrace apps
- app-engine:functions:run - Execute TypeScript functions
📊 Data & Query Engine:
- storage:buckets:read - Access bucket metadata for Grail queries
- storage:events:read - Read problems and other event data via DQL
- storage:security.events:read - Read security events (vulnerabilities) via DQL
- storage:logs:read, storage:metrics:read, storage:bizevents:read, storage:spans:read, storage:system:read, storage:user.events:read, storage:user.sessions:read, storage:entities:read - Required for full execute_dql coverage across data domains
- environment-api:entities:read - Entity lookups and ownership information
- environment-api:entities:write - Entity tagging (when using OAuth)
🔍 Monitoring & Security:
- (Covered by the DQL scopes above for problems and vulnerabilities)
🔧 Automation & Workflows:
- automation:workflows:write - Workflow creation
- automation:workflows:read - Workflow management
- automation:workflows:run - Workflow execution
📋 Documents & Dashboards:
- document:documents:write - Dashboard creation
- document:documents:delete - Dashboard deletion
- document:environment-shares:write - Document sharing
- document:direct-shares:write - Direct document sharing
💬 Communication:
- email:emails:send - Email notifications
- app-settings:objects:read - Slack integration
- settings:objects:read - Ownership information
Setting Up Authentication
Step 1: Create OAuth Client
- Navigate to Settings → Platform Management → OAuth clients
- Click Create OAuth client
- Set Client type to Public
- Add all required scopes from the list above
- Save and copy the Client ID and Secret
Step 2: (Optional) Generate API Token
- Go to Settings → Access Tokens → Generate new token
- Add scopes: entities.read, entities.write, problems.read
- Copy the token (format: dt0c01.XXXXXX)
Step 3: Configuration
Minimal Configuration (OAuth only):
{
"OAUTH_CLIENT_ID": "dt0s02.ABC123...",
"OAUTH_CLIENT_SECRET": "dt0s02.ABC123.DEF456...",
"DT_ENVIRONMENT": "https://abc12345.apps.dynatrace.com"
}
Full Configuration (All features):
{
"OAUTH_CLIENT_ID": "dt0s02.ABC123...",
"OAUTH_CLIENT_SECRET": "dt0s02.ABC123.DEF456...",
"DT_ENVIRONMENT": "https://abc12345.apps.dynatrace.com",
"DT_API_TOKEN": "dt0c01.XYZ789...",
"DT_PLATFORM_TOKEN": "dt0s16.PLATFORM123..."
}
Advanced Usage
Custom Dashboard Creation
Place JSON dashboard files in a /dashboards folder and use the create_dashboard tool to bulk-create them.
TypeScript Code Execution
Execute custom logic using the Dynatrace Function Executor:
// Example: Query and process data
export default async function ({ entityId }) {
// Your custom TypeScript code here
return { processed: true, entityId };
}
Slack Integration
Configure Slack notifications by setting up a Slack connection in Dynatrace and providing the SLACK_CONNECTION_ID.
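A call might then look like this (the channel and message argument names are illustrative, not the tool's documented schema):
{
  "tool": "send_slack_message",
  "arguments": {
    "channel": "#ops-alerts",
    "message": "Problem detected on web-prod-01 - please investigate"
  }
}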
Email Integration
Send professional emails with rich formatting using the send_email tool:
// Example: Send alert notification
{
"toRecipients": ["oncall@company.com"],
"ccRecipients": ["team-lead@company.com"],
"subject": "🚨 Critical Alert: High CPU Usage",
"body": "**Alert Details:**\n- Server: web-prod-01\n- CPU Usage: 95%\n- Duration: 15 minutes\n\n**Action Required**: Immediate investigation needed.",
"contentType": "text/plain"
}
Key Features:
- Support for To, CC, and BCC recipients (up to 100 total)
- HTML and plain text content types
- Professional formatting with markdown support
- Comprehensive error handling and delivery tracking
- Integration with Dynatrace tenant domain validation
Development
For Code Customization
If you need to modify the server code:
# Install the package for customization
npm install @theharithsa/dynatrace-mcp-server
# Clone and modify the source
git clone https://github.com/theharithsa/dynatrace-mcp-otel.git
cd dynatrace-mcp-otel
npm install
npm run build
Local Development
{
"mcpServers": {
"dynatrace-local": {
"command": "node",
"args": ["dist/index.js"],
"cwd": "/path/to/dynatrace-mcp-otel",
"env": {
"OAUTH_CLIENT_ID": "dt0s02.your-client-id",
"OAUTH_CLIENT_SECRET": "dt0s02.your-client-id.your-client-secret",
"DT_ENVIRONMENT": "https://your-tenant.apps.dynatrace.com"
}
}
}
}
Installation Options
# NPX (recommended for most users)
npx @theharithsa/dynatrace-mcp-server
# Global installation
npm install -g @theharithsa/dynatrace-mcp-server
# Local project installation
npm install @theharithsa/dynatrace-mcp-server
Dynatrace MCP OpenTelemetry Integration
The Dynatrace MCP Server includes optional OpenTelemetry tracing support using @theharithsa/opentelemetry-instrumentation-mcp v1.0.2. This provides comprehensive observability for all MCP tool invocations, including automatic span creation, context propagation, and error tracking.
Features
- Automatic Instrumentation: All 24 MCP tools are automatically traced without code changes
- Parent-Child Span Relationships: Tool invocations create parent spans, with automatic child span creation for nested operations
- Rich Metadata: Captures tool names, arguments, results, success/failure status, and execution duration
- Error Tracking: Automatic exception recording with full stack traces
- Dynatrace Integration: Direct export to Dynatrace via OTLP for unified observability
Configuration
1. Environment Variables
Add the following to your .env file:
# Required: Dynatrace OTLP endpoint
OTEL_EXPORTER_OTLP_ENDPOINT=https://your-environment.live.dynatrace.com/api/v2/otlp
# Required: Dynatrace API token (v1.0.2 format)
DYNATRACE_API_TOKEN=dt0c01.XXXXXXXXXXXXXXXXXXXXXXXX.YYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYYY
Important: The DYNATRACE_API_TOKEN must have the openTelemetryTrace.ingest scope. See Dynatrace API Token Setup below.
2. Automatic Registration
The OpenTelemetry instrumentation is automatically registered at server startup via:
import '@theharithsa/opentelemetry-instrumentation-mcp/register';
No additional configuration needed in code!
Trace Structure
Each tool invocation creates the following span structure:
Tool.tool_name (parent span)
├─ Attributes:
│ ├─ Tool.name: "tool_name"
│ ├─ Tool.args: "{...json args...}"
│ ├─ mcp.tool.success: true/false
│ └─ mcp.tool.result.length: 1234
└─ Child spans (automatically created by instrumentation library)
└─ [Internal tool operations...]
Example: Executing execute_dql creates:
- Parent span: Tool.execute_dql with the DQL statement in args
- Child spans for HTTP requests, JSON parsing, etc.
Dynatrace API Token Setup
To enable trace export, your Dynatrace API token requires specific permissions:
- Go to Dynatrace → Access Tokens → Generate new token
- Set token name: MCP Server OpenTelemetry
- Add required scopes:
  - ✅ openTelemetryTrace.ingest (required for trace export)
  - ✅ Other scopes for your MCP tools (see Required Scopes)
- Generate and copy the token
- Add to .env as DYNATRACE_API_TOKEN
Troubleshooting
401 Unauthorized Error
Symptom: Traces fail to export with "Token Authentication failed"
Solution: Verify your API token has the openTelemetryTrace.ingest scope:
- Check token scopes in Dynatrace UI
- Regenerate token with correct scopes if needed
- Update .env with the new token
- Restart the MCP server
Note: MCP tools will continue to work even if trace export fails. Tracing is completely optional.
No Traces in Dynatrace
Checklist:
- ✅ OTEL_EXPORTER_OTLP_ENDPOINT points to the correct Dynatrace environment
- ✅ DYNATRACE_API_TOKEN is set and has the openTelemetryTrace.ingest scope
- ✅ Endpoint URL ends with /api/v2/otlp (not /v1/traces)
- ✅ Token is valid and not expired
- ✅ Network connectivity to Dynatrace environment
Disabling Tracing
To disable OpenTelemetry tracing entirely:
- Remove or comment out the environment variables in .env
- The instrumentation will gracefully skip trace export
- All MCP tools continue to function normally
CI/CD Observability
Our GitHub Actions workflows are instrumented with OpenTelemetry using inception-health/otel-action.
Configuration in GitHub Actions
- name: Setup OpenTelemetry
uses: inception-health/otel-action@v2
with:
dsn: ${{ vars.OTEL_EXPORTER_OTLP_ENDPOINT }}
service_name: 'dynatrace-mcp-server-build'
access_token: ${{ secrets.DYNATRACE_API_TOKEN }}
log_url: ${{ vars.DYNATRACE_LOG_INGEST_URL }}
build_type: ${{ github.ref == 'refs/heads/dev' && 'dev' || 'prod' }}
Required Variables/Secrets
- OTEL_EXPORTER_OTLP_ENDPOINT: Dynatrace OTLP endpoint URL
- DYNATRACE_API_TOKEN: API token with ingest permission
- DYNATRACE_LOG_INGEST_URL: Dynatrace log ingest URL
Log Correlation
- All logs include the dt.security_context field set to dynatrace_mcp_otel
- Logs are tagged with logType: build-logs for filtering
- Logs are automatically correlated with traces via standard OpenTelemetry attributes
Version Compatibility
- Current Version: @theharithsa/opentelemetry-instrumentation-mcp v1.0.2
- OpenTelemetry API: v1.x
- Export Protocol: OTLP/HTTP
- Dynatrace Platform: All versions supporting OTLP ingestion
Version History
- 2.6.0: Added Grail budget tracking, entity tagging, HTTP server mode, enhanced testing (24 tools with 83% success rate)
- 2.5.0: Enhanced authentication architecture with dual OAuth/API token support, improved platform integration
- 2.3.0: Added comprehensive workflow automation and document sharing capabilities
- 2.2.0: Added comprehensive email integration with send_email tool supporting HTML/plain text, multiple recipients, and professional formatting
- 2.1.0: Added Davis CoPilot AI integration with natural language processing capabilities
- 2.0.0: Updated package structure and naming; enhanced configuration options
- 1.0.8: Switched to standard OpenTelemetry GitHub Action; enhanced logging with security context
- 1.0.7: // ...existing version history...
Support
- Issues: Report issues on GitHub
- Documentation: Dynatrace Platform Documentation
- MCP Protocol: Model Context Protocol
Note: This MCP server is designed for AI assistant integration. For standalone use cases, consider using the Dynatrace CLI or API directly.