ja2z/mcp-server
Sigma MCP Server Deployment Guide
Overview
This guide walks you through deploying the Sigma MCP Server to AWS Lambda with API Gateway, providing a scalable and secure environment for handling MCP requests.
Sample Questions to test the MCP Server
- "What's the status of the Sigma MCP server?"
- "Check the Sigma MCP server connection"
- "Test the Sigma MCP server connectivity"
- "Is the Sigma MCP server working?"
- "Get the current status of the Sigma MCP server"
- "Verify the Sigma MCP server is operational"
Architecture
```
Claude/MCP Client → API Gateway → Lambda → DynamoDB (cache)
                                    ↓
                        Secrets Manager (credentials)
                                    ↓
                                Sigma API
```
Prerequisites
- AWS CLI configured with appropriate permissions
- Terraform installed (v1.0+)
- Node.js 18+ and npm
- Sigma API credentials (client ID and secret)
Step 1: Prepare the Lambda Package
```bash
# Install dependencies
npm install

# Build the TypeScript code
npm run build

# Create deployment package
npm run package
```
This creates `sigma-mcp-server.zip` with your compiled code and dependencies.
Step 2: Configure Terraform Variables
Create a `terraform.tfvars` file:

```hcl
aws_region     = "us-east-1"
environment    = "dev"
sigma_base_url = "https://api.sigmacomputing.com"
```
Step 3: Deploy Infrastructure
```bash
# Initialize Terraform
terraform init

# Plan the deployment
terraform plan

# Apply the changes
terraform apply
```
Important: After deployment, update the Secrets Manager secret with your actual Sigma credentials:
```bash
aws secretsmanager update-secret \
  --secret-id "sigma-api-credentials-dev" \
  --secret-string '{"clientId":"YOUR_ACTUAL_CLIENT_ID","clientSecret":"YOUR_ACTUAL_SECRET"}'
```
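At runtime the Lambda reads this secret back. Here is a minimal sketch of the parsing step, assuming the JSON shape stored above; the `parseSigmaCredentials` helper is illustrative and not part of the actual codebase:

```typescript
// Hypothetical helper: parse the SecretString payload retrieved from
// Secrets Manager into typed credentials. Field names match the JSON
// written by the update-secret command above.
interface SigmaCredentials {
  clientId: string;
  clientSecret: string;
}

function parseSigmaCredentials(secretString: string): SigmaCredentials {
  const parsed = JSON.parse(secretString);
  if (typeof parsed.clientId !== "string" || typeof parsed.clientSecret !== "string") {
    throw new Error("Secret is missing clientId or clientSecret");
  }
  return { clientId: parsed.clientId, clientSecret: parsed.clientSecret };
}
```

Validating the fields up front turns a misconfigured secret into an immediate, readable error instead of a failed Sigma API call later.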
Step 4: Initial Cache Population
The document cache needs to be populated before the MCP server can search documents. You can do this by:
Option A: One-time Script
Create a simple script to populate the cache:
```typescript
// populate-cache.ts
import { SigmaApiClient } from './src/sigma-client.js';
import { DocumentCache } from './src/document-cache.js';

async function populateCache() {
  const client = new SigmaApiClient({
    baseUrl: process.env.SIGMA_BASE_URL!,
    clientId: process.env.SIGMA_CLIENT_ID!,
    clientSecret: process.env.SIGMA_CLIENT_SECRET!,
  });
  const cache = new DocumentCache(process.env.CACHE_TABLE_NAME!);

  await client.initialize();
  await cache.initialize();
  await cache.refreshCache(client);
  console.log('Cache populated successfully');
}

populateCache().catch(console.error);
```
Option B: Lambda Function Invocation
You can invoke the Lambda directly to trigger cache refresh (you'd need to add an endpoint for this).
Step 5: Configure Claude Desktop
Add your MCP server to Claude Desktop's configuration:
```json
{
  "mcpServers": {
    "sigma-analytics": {
      "command": "node",
      "args": [
        "path/to/mcp-client-script.js"
      ],
      "env": {
        "API_GATEWAY_URL": "https://your-api-id.execute-api.region.amazonaws.com/dev"
      }
    }
  }
}
```
You'll need to create a client script that communicates with your API Gateway endpoint instead of stdio.
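A minimal sketch of such a bridge, assuming the endpoint accepts JSON-RPC messages POSTed as the raw request body; the `buildRequest` helper and the one-message-per-line framing are illustrative choices, not the project's actual protocol:

```typescript
// Hypothetical stdio-to-HTTP bridge: reads newline-delimited JSON-RPC
// messages from stdin, forwards each to the API Gateway endpoint, and
// writes responses to stdout for Claude Desktop to consume.
import * as readline from "node:readline";

// Build the HTTP request options for a single JSON-RPC message.
function buildRequest(message: string): { method: string; headers: Record<string, string>; body: string } {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: message,
  };
}

async function main(endpoint: string) {
  const rl = readline.createInterface({ input: process.stdin });
  for await (const line of rl) {
    if (!line.trim()) continue;
    const res = await fetch(endpoint, buildRequest(line)); // Node 18+ global fetch
    process.stdout.write((await res.text()) + "\n");
  }
}

// Only start the loop when an endpoint is configured, so the module
// can be loaded and tested without opening stdin.
if (process.env.API_GATEWAY_URL) {
  main(process.env.API_GATEWAY_URL).catch((err) => {
    process.stderr.write(`bridge error: ${err}\n`);
    process.exit(1);
  });
}
```

Reading from stdin and writing to stdout preserves the stdio transport Claude Desktop expects, while each message is relayed over HTTPS to the deployed Lambda.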
API Gateway Setup Details
The Terraform creates:
- REST API - Main API Gateway resource
- Proxy Resource - `{proxy+}` to catch all paths
- ANY Method - Accepts all HTTP methods
- Lambda Integration - Routes requests to your Lambda function
- Deployment - Creates a stage (dev/prod) with invoke URL
API Gateway Flow:
1. Client sends an HTTP POST with the MCP request in the body
2. API Gateway forwards it to Lambda via the AWS_PROXY integration
3. Lambda processes the MCP request and returns a response
4. API Gateway returns the response to the client
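The Lambda side of this flow can be sketched as follows. The event and result shapes follow the standard AWS_PROXY contract (declared inline here to keep the sketch self-contained), while `handleMcpRequest` is a hypothetical stand-in for the real server logic:

```typescript
// Minimal shape of an AWS_PROXY Lambda handler for MCP requests.
interface ProxyEvent { body: string | null; }
interface ProxyResult { statusCode: number; headers: Record<string, string>; body: string; }

// Hypothetical dispatcher standing in for the real MCP server logic.
async function handleMcpRequest(request: { method?: string }): Promise<object> {
  return { jsonrpc: "2.0", result: { ok: true, method: request.method } };
}

async function handler(event: ProxyEvent): Promise<ProxyResult> {
  const headers = { "Content-Type": "application/json" };
  try {
    // API Gateway delivers the client's POST body as a string.
    const request = JSON.parse(event.body ?? "");
    const response = await handleMcpRequest(request);
    return { statusCode: 200, headers, body: JSON.stringify(response) };
  } catch (err) {
    // Malformed JSON becomes a 400 rather than an unhandled invocation error.
    return { statusCode: 400, headers, body: JSON.stringify({ error: String(err) }) };
  }
}
```

With AWS_PROXY integration, the handler's `statusCode`, `headers`, and `body` map directly onto the HTTP response API Gateway sends back.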
Environment Variables
The Lambda function uses these environment variables (set by Terraform):
- `SIGMA_BASE_URL` - Sigma API endpoint
- `CACHE_TABLE_NAME` - DynamoDB table name
- `NODE_ENV` - Environment (dev/prod)
Credentials are loaded from AWS Secrets Manager automatically.
Monitoring and Logs
- CloudWatch Logs: `/aws/lambda/sigma-mcp-server-{environment}`
- API Gateway Logs: Can be enabled in the API Gateway console
- DynamoDB Metrics: Available in CloudWatch
Outputs
After deployment, Terraform provides:
```bash
# Get the API Gateway URL
terraform output api_gateway_url

# Get other resource names
terraform output lambda_function_name
terraform output dynamodb_table_name
terraform output secrets_manager_secret_name
```
Local Testing
Before deploying to AWS, you can test the MCP server locally:
1. Set up Environment Variables
Create a `.env` file in the project root (this file is already in `.gitignore`):
```bash
# Sigma API Configuration
SIGMA_CLIENT_ID=your_actual_sigma_client_id
SIGMA_CLIENT_SECRET=your_actual_sigma_client_secret
SIGMA_BASE_URL=https://api.sigmacomputing.com

# Cache Configuration
# Set to 'true' to skip caching entirely (for local testing)
# Set to 'false' or omit to use DynamoDB cache (for production)
SKIP_CACHE=true

# AWS Configuration (for local testing, these can be empty or use localstack)
AWS_REGION=us-east-1
CACHE_TABLE_NAME=sigma-documents-cache

# Environment
NODE_ENV=development
```
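As a sketch of how the server might interpret the `SKIP_CACHE` flag described above (the `isCacheSkipped` helper is illustrative, not the project's actual code):

```typescript
// Hypothetical helper: treat only the exact string "true" as enabled,
// so an unset or "false" value falls through to the DynamoDB cache.
function isCacheSkipped(env: Record<string, string | undefined>): boolean {
  return env.SKIP_CACHE === "true";
}
```

Comparing against the literal string keeps the production default (caching on) safe when the variable is simply omitted.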
2. Install Dependencies
```bash
npm install
```
3. Test the Heartbeat
Run the local test script to verify connectivity:
```bash
npm run test:local
```
This will:
- Check your environment variables
- Build the TypeScript code
- Start the MCP server
- Send a heartbeat request
- Display the response with server status
4. Manual Testing
You can also run the server manually and interact with it:
```bash
# Build the project
npm run build

# Start the server
npm start
```
The server will run on stdio and wait for MCP requests.
Note: When using `SKIP_CACHE=true`, the server will fetch data directly from the Sigma API for each request, which is useful for testing but may be slower than using cached data.
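To exercise the stdio server by hand, you can pipe it a single JSON-RPC message. Here is a sketch of building a `tools/call` message; the tool name `heartbeat` is an assumption based on the sample questions above and may differ in the actual server:

```typescript
// Build a JSON-RPC 2.0 tools/call message for a hypothetical
// "heartbeat" tool, suitable for piping into the stdio server.
function buildToolCall(id: number, tool: string, args: object = {}): string {
  return JSON.stringify({
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name: tool, arguments: args },
  });
}

// Print one message per line; stdio MCP servers expect newline-delimited JSON.
console.log(buildToolCall(1, "heartbeat"));
```

Piping the printed line into `npm start` should yield the server's response on stdout.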
Testing
Test the deployment:
Debugging
The MCP server includes comprehensive debugging capabilities to help troubleshoot issues with REST API calls to Sigma.
Enabling Debug Mode
To enable debug mode, set the `DEBUG_MODE` environment variable:

```bash
export DEBUG_MODE=true
```
Or add it to your `.env` file:

```bash
DEBUG_MODE=true
```
Debug Output
When debug mode is enabled, you'll see detailed logging for:
- MCP Server Operations: Tool calls, resource requests, error handling
- Sigma API Calls: HTTP requests, responses, authentication
- Document Analytics: Cache operations, data fetching, parsing
- Token Management: Token refresh, expiry, authentication status
Testing with Debug Mode
Use the test script to run the analyze_documents tool with debugging:
```bash
# Build the project first
npm run build

# Run the test with debugging
node test-analyze-documents.js
```
Debug Log Format
Debug messages follow this format:
- 🔍 [DEBUG] - Information and progress
- ✅ [DEBUG] - Success messages
- ❌ [DEBUG] - Error messages
- ⚠️ [DEBUG] - Warnings
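A logger producing this format might look like the following sketch (`debugLog` and its level names are illustrative; the real implementation may differ). Writing to stderr is deliberate here, since an stdio MCP server reserves stdout for protocol messages:

```typescript
// Hypothetical debug logger matching the emoji-prefixed format above.
type Level = "info" | "success" | "error" | "warn";

const PREFIX: Record<Level, string> = {
  info: "🔍 [DEBUG]",
  success: "✅ [DEBUG]",
  error: "❌ [DEBUG]",
  warn: "⚠️ [DEBUG]",
};

// Pure formatting step, separated so it can be tested in isolation.
function formatDebug(level: Level, message: string): string {
  return `${PREFIX[level]} ${message}`;
}

// Suppress output entirely unless DEBUG_MODE is set to "true".
function debugLog(level: Level, message: string): void {
  if (process.env.DEBUG_MODE !== "true") return;
  console.error(formatDebug(level, message)); // stderr keeps stdout clean for MCP traffic
}
```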
Common Debug Scenarios
- Authentication Issues: Check token refresh logs
- API Call Failures: Look for HTTP status codes and error responses
- Data Parsing Issues: Check JSONL parsing logs
- Cache Problems: Verify cache hit/miss patterns
Troubleshooting Steps
- Check Environment Variables: Ensure all required variables are set
- Verify Sigma API Credentials: Test with the heartbeat tool first
- Review Network Connectivity: Check if Sigma API endpoints are reachable
- Examine Cache Status: Verify document cache is working properly
Security Considerations
- API Gateway has no authentication in this prototype - consider adding API keys or IAM auth for production
- Secrets Manager stores credentials securely with automatic rotation capability
- IAM roles follow least-privilege principle
- VPC - Consider deploying the Lambda in a VPC for additional network security
Scaling and Performance
- Lambda: Auto-scales, cold starts ~1-2 seconds
- DynamoDB: On-demand billing scales automatically
- API Gateway: Handles up to 10,000 requests per second by default
- Cache Strategy: In-memory cache in Lambda for fast lookups, DynamoDB for persistence
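The cache strategy above can be sketched as a small read-through cache; the `TwoTierCache` name and TTL handling are illustrative, and the project's actual `DocumentCache` may differ:

```typescript
// Hypothetical read-through cache: an in-memory Map for hot lookups
// within a warm Lambda, falling back to a slower persistent store
// (DynamoDB in production).
interface Store<T> {
  get(key: string): Promise<T | undefined>;
  put(key: string, value: T): Promise<void>;
}

class TwoTierCache<T> {
  private memory = new Map<string, { value: T; expires: number }>();

  constructor(private store: Store<T>, private ttlMs: number) {}

  async get(key: string): Promise<T | undefined> {
    const hit = this.memory.get(key);
    if (hit && hit.expires > Date.now()) return hit.value; // fast in-memory path
    const value = await this.store.get(key);               // slow persistent path
    if (value !== undefined) {
      this.memory.set(key, { value, expires: Date.now() + this.ttlMs });
    }
    return value;
  }
}
```

Because Lambda containers are reused between invocations, the in-memory tier survives across requests while warm, and the persistent tier covers cold starts.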
Troubleshooting
Common issues:
- "Secret not found" - Update Secrets Manager with real credentials
- "Table not found" - Ensure DynamoDB table exists and Lambda has permissions
- Cold starts - First request after idle time takes longer
- CORS errors - API Gateway includes CORS headers
Check CloudWatch Logs for detailed error information.