
# Amazon Bedrock AgentCore Gateway (MCP Server)


A production-ready Model Context Protocol server implementation using Amazon Bedrock AgentCore Gateway: a secure, OAuth-authenticated bridge between Bedrock AI agents and custom tools, with automatic weather lookup and personalized greetings.


## Architecture

```
┌───────────┐     ┌───────────┐     ┌─────────────────────────────────────────────────┐
│    MCP    │────▶│ Entra ID  │     │                  AWS Cloud                      │
│   Client  │◀────│  (PKCE)   │     │                                                 │
└───────────┘     └───────────┘     │  ┌───────────────────────────────────────────┐  │
      │                │            │  │      Bedrock AgentCore Gateway            │  │
      │           OAuth Token       │  │  • JWT validation (OIDC)                  │  │
      │                │            │  │  • Semantic tool search                   │  │
      │                ▼            │  │  • Interceptor: JWT → User Info           │  │
      └───────────▶ Gateway ────────┼──│  • Target: tool_schema.json routing       │  │
                      URL           │  └──────────────────────┬────────────────────┘  │
                                    │                         │                       │
                                    │  ┌──────────────────────▼────────────────────┐  │
                                    │  │         Main Lambda (Rust) + Interceptor  │  │
                                    │  │  • Weather lookup (Open-Meteo API)        │  │
                                    │  │  • Personalized greetings                 │  │
                                    │  └──────────────────────┬────────────────────┘  │
                                    │                         │                       │
                                    │  ┌──────────────────────┴──┐                    │
                                    │  │     CloudWatch Logs     │                    │
                                    │  └─────────────────────────┘                    │
                                    └─────────────────────────┬───────────────────────┘
                                                              ▼
                                                   ┌─────────────────┐
                                                   │  Open-Meteo API │
                                                    └─────────────────┘
```

**Stack**: ARM64 Lambdas (128MB, ~1.3MB UPX) | Entra ID OAuth | CloudWatch (3d retention)

**Features**: JWT token decoding, automatic weather lookup, personalized user greetings, secure header propagation, dynamic schema generation

**License**: MIT

## Model Context Protocol Implementation

This is a **Model Context Protocol (MCP) server** implemented as an AWS Lambda function for Amazon Bedrock AgentCore. MCP is an open specification that enables AI agents to discover and interact with external tools and APIs in a standardized way. This server uses the `rmcp` crate's `#[tool]` macro for MCP-compliant schema generation.

The Bedrock AgentCore Gateway is configured with the `SEMANTIC` search type, which enables intelligent tool selection: the Gateway can understand natural-language queries, match them against tool descriptions and parameters, and provide context-aware tool recommendations, significantly improving an agent's ability to use the available tools.

## Features

- **ARM64/Graviton** - 20% cheaper, UPX compressed to 1.3MB per Lambda
- **Secretless OAuth** - PKCE flow, no client secrets
- **JWT Token Decoding** - Automatic user information extraction from Entra ID tokens with expiry validation
- **Gateway Interceptor** - Header propagation and identity resolution between gateway and tools
- **Dynamic Schema Generation** - Tool schemas automatically generated and deployed
- **Regional NAT Gateway** - Auto HA across AZs, stable egress IP
- **Zero Unsafe** - No `unwrap/expect/panic/unsafe`, strict lints
- **Concurrency Limits** - Function-level concurrent execution limits to prevent cost overruns
- **Event Notifications** - SNS notifications for infrastructure events
- **Structured Logging** - JSON logs for CloudWatch
- **Dead Letter Queue** - Failed invocations stored in encrypted SQS for debugging
- **Auto Schemas** - Generated from code annotations
- **Fast Cold Start** - Minimal deps, optimized binary
- **Cost Optimized** - Minimum memory (128MB), conservative timeouts, low concurrency limits
- **Principle of Least Privilege** - IAM policies scoped to specific resources
- **Resource Cleanup** - Terraform properly manages all resources
- **Free Tier** - Typical usage $0/month
- **Smart Weather Lookup** - Automatic geocoding and weather data retrieval from Open-Meteo
- **Personalized Greetings** - Context-aware user greetings with automatic name extraction

## One-Time Backend Setup

Before you can deploy, you need to run a one-time setup command to create the Terraform backend infrastructure:

```bash
make setup-backend
```

This command will:

1. Prompt you for a unique S3 bucket name
2. Create the S3 bucket for Terraform state storage
3. Enable versioning and encryption on the bucket
4. Configure native S3 state locking (Terraform 1.10+)
5. Generate the `iac/backend.config` file

After setup, you can deploy your infrastructure with:

```bash
make deploy
```

**Important**: The `backend.config` file is essential for all Terraform operations. The Makefiles include smart backend checking that will guide you if this file is missing.

## Quick Start

```bash
make setup-backend # One-time backend setup (S3 with native locking)
make deploy        # Build and deploy to AWS
make test-token    # Get OAuth token + launch MCP Inspector
```

The `test-token` command automatically copies the token to the clipboard (macOS/Linux/WSL) and provides instructions for testing with the MCP Inspector.

## Ephemeral Pull Request Environments

This repository automatically creates isolated test environments for each pull request:

- 🌱 **Automatic Deployment**: When you open a non-draft PR, an ephemeral environment is automatically created
- 🔗 **Isolated Testing**: Each PR gets its own Gateway URL and backend resources
- 🧪 **Easy Testing**: Use the same `make test-token` workflow to test your changes
- 🗑️ **Automatic Cleanup**: Environments are destroyed when the PR is closed or merged

### Manual Environment Management

To manually trigger an environment deployment or destruction:

```bash
# Deploy a manual environment (replace 123 with your PR number)
gh workflow run preview-environment.yml -f action=deploy -f pr_number=123

# Destroy a manual environment
gh workflow run preview-environment.yml -f action=destroy -f pr_number=123
```

**Note**: When running manually, the `pr_number` input is required to namespace the environment resources (e.g., `preview-123`). Use the actual PR number if you are debugging a specific PR, or any unique number for a scratch environment.

## Automated Dependency Updates

Dependabot automatically creates PRs for:

- 🦀 **Rust dependencies** - `Cargo.toml` updates
- 🏗️ **Terraform providers** - AWS, Entra ID, and other providers
- ⚙️ **GitHub Actions** - Workflow action updates

Updates are automatically tested and merged when all checks pass.

## Example: Weather Tool

The included weather tool demonstrates the pattern (a condensed sketch follows the list):

- Simple location-based weather lookup (just provide "Kolkata" or "Sydney")
- Automatic geocoding to coordinates (Open-Meteo API)
- Smart default weather parameters (weather code, min/max temperature)
- Automatic timezone detection and localization
- Direct API integration with Open-Meteo weather service
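The sketch below uses `reqwest` and `anyhow` from the stack above. The endpoints are Open-Meteo's public APIs, but the type and function names are illustrative rather than the repository's actual code:

```rust
use anyhow::{Context, Result};
use serde::Deserialize;

#[derive(Deserialize)]
struct GeoResults {
    results: Option<Vec<GeoHit>>,
}

#[derive(Deserialize)]
struct GeoHit {
    latitude: f64,
    longitude: f64,
}

// Sketch only: resolve a place name to coordinates, then fetch the daily forecast.
async fn weather_for(location: &str) -> Result<serde_json::Value> {
    let client = reqwest::Client::new();

    // Step 1: geocode "Kolkata" -> (lat, lon); take the top match.
    let geo: GeoResults = client
        .get("https://geocoding-api.open-meteo.com/v1/search")
        .query(&[("name", location), ("count", "1")])
        .send()
        .await?
        .json()
        .await?;
    let hit = geo
        .results
        .and_then(|r| r.into_iter().next())
        .with_context(|| format!("no geocoding match for {location}"))?;

    // Step 2: default daily parameters (weather code, min/max temperature),
    // with the timezone auto-detected by Open-Meteo.
    let forecast: serde_json::Value = client
        .get("https://api.open-meteo.com/v1/forecast")
        .query(&[
            ("latitude", hit.latitude.to_string()),
            ("longitude", hit.longitude.to_string()),
            ("daily", "weather_code,temperature_2m_max,temperature_2m_min".into()),
            ("timezone", "auto".into()),
        ])
        .send()
        .await?
        .json()
        .await?;
    Ok(forecast)
}
```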

## Example: Personalized Greeting Tool

The personalized greeting tool demonstrates the following (see the sketch after this list):

- Zero-configuration user personalization
- Automatic JWT token parsing and user identity extraction
- Contextual responses based on authenticated user information
- Secure header propagation between gateway and tools
- Graceful fallback for missing user information
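Conceptually, the zero-configuration personalization reduces to decoding the JWT payload that the interceptor forwards. A minimal sketch of that idea, assuming the Gateway has already validated the token signature (per the architecture above); the claim and function names are illustrative, not the repository's exact code:

```rust
use anyhow::{Context, Result};
use base64::{engine::general_purpose::URL_SAFE_NO_PAD, Engine as _};
use serde::Deserialize;

#[derive(Deserialize)]
struct Claims {
    name: Option<String>, // display-name claim (assumed here)
    exp: i64,             // expiry, seconds since the Unix epoch
}

// Sketch only: decode the payload segment of `header.payload.signature`,
// validate expiry, and fall back gracefully when no name is present.
fn greeting_from_token(token: &str, now: i64) -> Result<String> {
    let payload_b64 = token.split('.').nth(1).context("malformed JWT")?;
    let payload = URL_SAFE_NO_PAD.decode(payload_b64)?;
    let claims: Claims = serde_json::from_slice(&payload)?;
    if claims.exp <= now {
        anyhow::bail!("token expired");
    }
    Ok(match claims.name {
        Some(name) => format!("Hello, {name}!"),
        None => "Hello there!".to_string(),
    })
}
```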

## Prerequisites

- Rust (edition 2024)
- cargo-lambda: `cargo install cargo-lambda`
- UPX: `brew install upx` (macOS) | `apt install upx-ucl` (Linux)
- Zig: `brew install zig` (macOS) | `apt install zig` (Linux)
- jq: `brew install jq` (macOS) | `apt install jq` (Linux)
- Terraform (latest)
- AWS CLI (configured)
- Azure CLI (configured)

**Note**: Running `make release` will automatically install missing tools locally.

## Initial Setup for GitHub Template Repositories

When using this repository as a GitHub template, you'll need to set up several secrets in your repository settings for the GitHub Actions workflows to function properly.

**Resource Naming**: The system automatically generates unique resource names by appending a random suffix (e.g., `aws-agentcore-gateway-a1b2c3`) to prevent conflicts when multiple deployments exist in the same AWS account.

### Required GitHub Secrets

| Secret Name | Description | Setup Instructions |
| --- | --- | --- |
| `AWS_IAM_ROLE_ARN` | AWS IAM Role ARN for GitHub Actions OIDC authentication | AWS GitHub Actions Setup |
| `AZURE_CLIENT_ID` | Entra ID App Registration Client ID | Azure GitHub Actions Setup |
| `AZURE_TENANT_ID` | Entra ID Tenant ID | Azure GitHub Actions Setup |
| `TF_BACKEND_BUCKET` | S3 bucket name for Terraform state storage | Run `make setup-backend` after setting AWS credentials |
| `APP_PRIVATE_KEY` | PEM private key for the GitHub App @brown-ninja-bot (multi-line). Used to mint short-lived installation tokens for CI automation. | Create a GitHub App (Settings → Developer settings → GitHub Apps), generate and download the private key, then add the PEM contents as the secret `APP_PRIVATE_KEY` in this repository's Settings → Secrets & variables → Actions. |
| `APP_ID` | Numeric GitHub App ID for @brown-ninja-bot. Used together with the private key to mint JWTs. | Add the numeric App ID as the secret `APP_ID` in repository secrets. |

To validate your GitHub App setup, you can use the provided test workflow:

```bash
# Trigger the test workflow which mints an installation token and validates it
# from the Actions tab: "Test: GitHub App Installation Token" → Run workflow
# or via CLI:
# gh workflow run test-github-app-token.yml
```

### Optional GitHub Secrets

| Secret Name | Description | Default |
| --- | --- | --- |
| `PROJECT_NAME_SUFFIX` | Custom suffix for resource names (e.g., "prod", "dev"). If not set, a random suffix is auto-generated | Random 6-char string |

### Setting Up AWS Authentication

1. Follow GitHub's documentation to configure OIDC between GitHub and AWS
2. Create an IAM role with the necessary permissions for Lambda, API Gateway, and S3
3. Set the `AWS_IAM_ROLE_ARN` secret to the ARN of this role

### Setting Up Entra ID Authentication

1. Follow GitHub's documentation to configure OIDC between GitHub and Azure
2. Register a GitHub Actions application in Entra ID
3. Set the `AZURE_CLIENT_ID` and `AZURE_TENANT_ID` secrets

### Setting Up Terraform Backend

After configuring AWS authentication:

1. Run `make setup-backend` locally to create the S3 bucket. This command also automatically adds the `TF_BACKEND_BUCKET` value to your local `.env` file.
2. Use `make update-secrets` to push these values to your GitHub repository secrets.

### Updating GitHub Secrets

To update your GitHub repository secrets for both GitHub Actions and Dependabot, create a `.env` file in the root of the project with the secrets you wish to update (e.g., `MY_SECRET="myvalue"`). You can use the provided `.env.example` file as a template for the required and optional secrets.

Then, run the following command:

```bash
make update-secrets
```

This command reads the `.env` file and uses the `gh` CLI to set or update the corresponding repository secrets for both GitHub Actions and Dependabot.

**Important**: Ensure your `.env` file is in your `.gitignore` to prevent accidentally committing sensitive information.

## Using with opencode.ai

This repository is pre-configured to work with opencode.ai, an AI-powered development assistant that can help you build, debug, and maintain your MCP server. The project includes:

- Pre-configured GitHub Actions workflows that integrate with opencode.ai
- Automatic schema generation for tool discovery
- Standardized MCP implementation patterns
- Built-in testing and debugging tools

To use opencode.ai with this project:

1. Visit opencode.ai and sign up for an account
2. Install the opencode CLI: `npm install -g opencode`
3. Authenticate: `opencode login`
4. Navigate to your project directory and run: `opencode`

The opencode assistant will automatically detect your project structure and provide context-aware help for:

- Adding new tools and capabilities
- Debugging deployment issues
- Optimizing performance
- Following MCP best practices
- Integrating with other AI services

For more information, see the opencode.ai GitHub documentation.

## Structure

```
src/
├── main.rs              # Main Lambda bootstrap + tracing
├── handler.rs           # Main Lambda event handler
├── lib.rs               # Library crate
├── models/              # Request/response types (JsonSchema)
│   ├── mod.rs
│   ├── weather.rs
│   └── open_meteo.rs
├── tools/               # Tool implementations (#[tool] macro)
│   ├── mod.rs
│   ├── weather.rs
│   └── personalized.rs
├── http/                # Global HTTP client
│   ├── mod.rs
│   └── client.rs
└── bin/
    ├── generate_schema.rs  # Schema generation utility
    └── interceptor.rs      # Gateway interceptor Lambda
iac/
├── main.tf              # Terraform infrastructure
└── ...
```

## Usage

### Build & Test

```bash
make schema    # Generate tool_schema.json
make build     # Debug build
make release   # ARM64 + UPX (~1.3MB)
make test      # Run tests
make all       # Test + release build
cargo clippy   # Run clippy lints
cargo fmt      # Format code
```

### Deploy

```bash
make setup-backend # One-time backend setup
make deploy        # Build and deploy to AWS
make tf-destroy    # Destroy infrastructure
```

### Development

```bash
make test-token   # OAuth + Inspector (token auto-copied)
make test-lambda  # Direct Lambda test
make logs         # Tail CloudWatch logs
make login        # AWS + Azure auth
make clean        # Remove tokens/backups
```

### Advanced Terraform Operations

```bash
make tf-init    # Initialize Terraform
make tf-plan    # Plan changes
make tf-apply   # Apply changes
make tf-destroy # Destroy infrastructure
```

For full infrastructure commands: `cd iac && make help`

## Troubleshooting

### Gateway Exception Logging

Control Gateway exception logging verbosity in `iac/terraform.tfvars`:

```hcl
# Disabled (default) - Minimal error information for security
gateway_exception_level = null

# Error level - Only error messages
gateway_exception_level = "ERROR"

# Warning level - Warning and error messages
gateway_exception_level = "WARN"

# Info level - Informational, warning, and error messages
gateway_exception_level = "INFO"

# Debug level - Most verbose logging (use only for troubleshooting)
gateway_exception_level = "DEBUG"
```

Redeploy: `cd iac && terraform apply -auto-approve`

⚠️ **Security considerations**: Higher verbosity levels may expose sensitive information in error responses. Use `DEBUG`/`INFO` only while troubleshooting, never in production, and set the level back to `null` afterwards.

### Lambda Debug Logs

Edit `iac/variables.tf`:

variable "rust_log_level" {
  default = "debug"  # or "trace"
}

Redeploy and view logs: `make logs`

**Production**: Set to `"info"` to avoid logging sensitive payloads.

### Common Issues

| Issue | Solution |
| --- | --- |
| "Access denied" | Gateway IAM needs both `bedrock.amazonaws.com` AND `bedrock-agentcore.amazonaws.com` principals |
| "Invalid Bearer token" | Token needs the `api://CLIENT_ID/access_as_user` scope. Run `make test-token` |
| Lambda timeout | Increase `lambda_timeout` in `iac/variables.tf` |

## Commands

### Main Commands

| Command | Description |
| --- | --- |
| `make help` | Show all commands with colored output |
| `make schema` | Generate `tool_schema.json` |
| `make build` | Debug build |
| `make release` | ARM64 + UPX production build |
| `make test` | Run tests |
| `make all` | Test + release build |
| `make deploy` | Build and deploy to AWS (smart backend checking) |
| `make setup-backend` | One-time backend setup |
| `make test-token` | OAuth + Inspector (clipboard) |
| `make test-lambda` | Direct Lambda test |
| `make logs` | Tail CloudWatch logs |
| `make update-deps` | Update all dependencies |

### Infrastructure Commands

| Command | Description |
| --- | --- |
| `make login` | AWS + Azure auth |
| `make tf-init` | Initialize Terraform (smart backend checking) |
| `make tf-plan` | Plan Terraform changes |
| `make tf-apply` | Apply Terraform changes |
| `make tf-destroy` | Destroy infrastructure (generate schema first) |
| `make clean` | Remove tokens/backups |
| `make oauth-config` | Display OAuth configuration details |
| `make add-redirect-url` | Add custom OAuth redirect URL to Entra ID app |
| `make remove-redirect-url` | Remove custom OAuth redirect URL from Entra ID app |
For advanced infrastructure commands: `cd iac && make help`

## Schema Generation

Amazon Bedrock AgentCore schemas are generated from code using the `rmcp` crate's `#[tool]` macro:

```rust
use rmcp::tool;

#[tool(description = "Get current weather for a location")]
pub async fn get_weather(request: WeatherRequest) -> Result<WeatherResponse> {
    // implementation
}
```

Run `make schema` → generates `tool_schema.json` with:

- Tool name from the function name
- Description from the macro attribute
- Input/output schemas from types (via `schemars`; illustrated below)
- Bedrock-compatible format (no enums, inlined types)
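The schemas themselves come from `schemars` derives. A minimal standalone illustration of the mechanism, using a stand-in request type rather than the repository's exact model:

```rust
use schemars::{schema_for, JsonSchema};
use serde::Deserialize;

#[derive(Deserialize, JsonSchema)]
pub struct WeatherRequest {
    /// Doc comments and `#[schemars(description = ...)]` both surface
    /// as `description` fields in the generated schema.
    pub location: String,
}

fn main() -> anyhow::Result<()> {
    // Roughly what the generator does for each registered tool's types.
    let schema = schema_for!(WeatherRequest);
    println!("{}", serde_json::to_string_pretty(&schema)?);
    Ok(())
}
```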

## Adding Tools

1. **Model** (`src/models/your_tool.rs`):

```rust
#[derive(Debug, Deserialize, JsonSchema)]
pub struct YourRequest {
    #[schemars(description = "Input description")]
    pub input: String,
}
```
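Since step 3 registers both a request and a response type, the model file typically also defines the response. A sketch with illustrative field names:

```rust
use schemars::JsonSchema;
use serde::Serialize;

// Illustrative response type; field names are examples only.
#[derive(Debug, Serialize, JsonSchema)]
pub struct YourResponse {
    #[schemars(description = "Output description")]
    pub output: String,
}
```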

See `src/tools/personalized.rs` for a complete example that demonstrates:

- Extracting user information from interceptor-passed data
- Creating personalized responses
- Proper error handling with context

2. **Tool** (`src/tools/your_tool.rs`):

```rust
#[tool(description = "Clear, detailed description")]
pub async fn your_tool(request: YourRequest) -> Result<YourResponse> {
    // implementation
}
```

3. **Register** in `src/bin/generate_schema.rs`:

```rust
tool_entry!(
    aws_lambda_mcp::tools::your_tool::your_tool_tool_attr(),
    YourRequest,
    YourResponse
),
```

4. **Generate**: run `make schema`

5. **Route**: Update `handler.rs` to call your tool (a dispatch sketch follows)
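The routing step amounts to dispatching on the tool name carried by the invocation. A hedged sketch only, since the repository's actual event shape may differ; `get_weather` and `your_tool` refer to the tool functions from the snippets above, and all other names are illustrative:

```rust
use anyhow::{bail, Result};
use serde_json::Value;

// Sketch only: dispatch the Gateway invocation to the matching tool by name.
async fn route(tool_name: &str, args: Value) -> Result<Value> {
    match tool_name {
        "get_weather" => {
            let request = serde_json::from_value(args)?;
            let response = get_weather(request).await?;
            Ok(serde_json::to_value(response)?)
        }
        "your_tool" => {
            let request = serde_json::from_value(args)?;
            let response = your_tool(request).await?;
            Ok(serde_json::to_value(response)?)
        }
        other => bail!("unknown tool: {other}"),
    }
}
```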

## Configuration

**Lambda** (`Cargo.toml`):

```toml
[package.metadata.lambda.deploy]
memory = 128
timeout = 30
tracing = "active"
```

**Infrastructure**: Edit `iac/terraform.tfvars` for custom settings.

## Coding Standards

See for full guidelines.

Rules:

  • Result<T> + ? with .context()
  • #[must_use] on pure functions
  • #[derive(Debug, Serialize, Deserialize, JsonSchema)] on types
  • ❌ No unwrap/expect/panic/unsafe
  • ❌ No blocking I/O in async
  • ❌ No wildcard imports
  • ❌ No hardcoded secrets
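The first rule in practice, as a minimal sketch (the function name and file path are arbitrary examples):

```rust
use anyhow::{Context, Result};

// `?` plus `.context()` keeps errors descriptive with no unwrap/expect/panic.
fn read_settings(path: &str) -> Result<String> {
    std::fs::read_to_string(path)
        .with_context(|| format!("failed to read settings at {path}"))
}
```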

**Dependencies**: `lambda_runtime` | `tokio` | `serde` | `schemars` | `reqwest` | `tracing` | `anyhow` | `rmcp`

## Contributing

1. Read
2. Run `cargo clippy -- -D warnings`
3. Run `make schema` if models changed
4. Ensure `make release` succeeds
5. Ensure `make test` passes