scarcebytes/bigquery-mcp-server

BigQuery MCP Server on Google Cloud Platform

This project deploys a Model Context Protocol (MCP) server that provides BigQuery querying capabilities over Streamable HTTP. The server can be deployed to any GCP account using Terraform and runs on Google Cloud Run.

Features

  • BigQuery Integration: Query any BigQuery table with SQL
  • MCP Protocol: Follows the Model Context Protocol specification
  • Streamable HTTP: Production-ready HTTP transport for MCP
  • Cloud Run Deployment: Serverless, scalable deployment
  • Terraform Infrastructure: Infrastructure as Code for reproducible deployments
  • Security: Service account-based authentication with minimal permissions

Tools Available

The MCP server exposes the following tools:

  1. bigquery_query: Execute SQL queries against BigQuery
  2. list_datasets: List all datasets in the project
  3. list_tables: List all tables in a specific dataset
  4. describe_table: Get schema and metadata for a table

Prerequisites

Before deploying, ensure you have:

  1. Google Cloud CLI (gcloud) installed and configured
  2. Terraform (>= 1.0) installed
  3. Docker installed
  4. A GCP project with billing enabled
  5. Owner or Editor permissions on the GCP project

Quick Start

Option 1: Automated Deployment

Use the provided deployment script for a fully automated setup:

# Clone this repository or copy the files
cd archy_gcp

# Run the automated deployment
./deploy.sh

# Or run specific steps:
./deploy.sh deps     # Check dependencies
./deploy.sh auth     # Set up authentication
./deploy.sh infra    # Deploy infrastructure only
./deploy.sh container # Build and deploy container only
./deploy.sh test     # Test the deployment

Option 2: Manual Deployment

  1. Set up authentication:

    gcloud auth login
    gcloud config set project YOUR_PROJECT_ID
    gcloud auth application-default login
    
  2. Create terraform.tfvars:

    project_id = "your-gcp-project-id"
    project_name = "mcp-bigquery"
    region = "us-central1"
    bigquery_location = "US"
    bigquery_datasets = ["public"]
    allow_public_access = false
    create_sample_data = true
    
  3. Deploy infrastructure:

    terraform init
    terraform plan
    terraform apply
    
  4. Build and deploy container:

    # Get repository URL from Terraform output
    REPO_URL=$(terraform output -raw artifact_registry_repo)
    
    # Build and push image
    docker build -t $REPO_URL/mcp-bigquery:latest .
    docker push $REPO_URL/mcp-bigquery:latest
    
    # Update Cloud Run service
    gcloud run services update mcp-bigquery-mcp-service \
      --image=$REPO_URL/mcp-bigquery:latest \
      --region=us-central1
    

Configuration

Terraform Variables

| Variable | Description | Default |
|---|---|---|
| project_id | GCP project ID | Required |
| project_name | Short name for resource naming | mcp-bigquery |
| region | GCP region for resources | us-central1 |
| bigquery_location | BigQuery location | US |
| bigquery_datasets | List of datasets to grant access to | ["public"] |
| allow_public_access | Allow public access to the service | false |
| create_sample_data | Create sample dataset and table | true |
| min_instances | Minimum Cloud Run instances | 0 |
| max_instances | Maximum Cloud Run instances | 10 |
| cpu_limit | CPU limit per container | "1" |
| memory_limit | Memory limit per container | "512Mi" |

Environment Variables

The application supports these environment variables:

  • BIGQUERY_PROJECT: Override the default project for BigQuery operations
  • BIGQUERY_LOCATION: Override the default location for BigQuery operations
  • PORT: Port to run the server on (default: 8000)
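As a sketch, the server can resolve these variables with standard-library lookups; the fallback logic and the `load_config` helper below are illustrative, not the server's actual code:

```python
import os

def load_config(default_project: str = "my-gcp-project") -> dict:
    """Resolve server settings from environment variables, with fallbacks.

    BIGQUERY_PROJECT and BIGQUERY_LOCATION override the defaults;
    PORT controls the listening port (8000 if unset).
    """
    return {
        "project": os.environ.get("BIGQUERY_PROJECT", default_project),
        "location": os.environ.get("BIGQUERY_LOCATION", "US"),
        "port": int(os.environ.get("PORT", "8000")),
    }
```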

Architecture

┌─────────────────┐    ┌──────────────────┐    ┌─────────────────┐
│   MCP Client    │────│   Cloud Run      │────│   BigQuery      │
│  (Claude, etc.) │    │  (MCP Server)    │    │   (Data)        │
└─────────────────┘    └──────────────────┘    └─────────────────┘
                              │
                       ┌──────────────────┐
                       │ Service Account  │
                       │ (Authentication) │
                       └──────────────────┘

Usage Examples

Once deployed, you can use the MCP server with any MCP-compatible client. The server will be available at the endpoint shown in the Terraform output.

Query BigQuery Table

{
  "method": "tools/call",
  "params": {
    "name": "bigquery_query",
    "arguments": {
      "sql": "SELECT * FROM `your-project.public.cars` LIMIT 10"
    }
  }
}

List Available Datasets

{
  "method": "tools/call",
  "params": {
    "name": "list_datasets"
  }
}

Describe Table Schema

{
  "method": "tools/call",
  "params": {
    "name": "describe_table",
    "arguments": {
      "dataset_id": "public",
      "table_id": "cars"
    }
  }
}

Security

The deployment follows security best practices:

  • Service Account: Dedicated service account with minimal required permissions
  • IAM: Granular BigQuery roles (roles/bigquery.dataViewer, roles/bigquery.jobUser)
  • Network: Cloud Run service can be configured for private access
  • Container: Non-root user execution
  • Authentication: Google Cloud authentication for BigQuery access

Monitoring

The deployment includes:

  • Health Checks: Built-in health endpoint at /health
  • Cloud Run Metrics: Automatic monitoring in Google Cloud Console
  • Logging: Structured logging to Cloud Logging
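A deployment check can simply poll the `/health` endpoint; this helper is illustrative (the base URL is whatever the Terraform output reports):

```python
import urllib.request

def check_health(base_url: str, timeout: float = 5.0) -> bool:
    """Return True if the service's /health endpoint answers HTTP 200."""
    try:
        with urllib.request.urlopen(f"{base_url}/health", timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        # Connection refused, DNS failure, timeout, or HTTP error status.
        return False
```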

Authentication

The MCP server supports API key authentication for security.

API Key Configuration

  1. Set API Key (Required for Production):

    # In terraform.tfvars
    mcp_api_key = "your-secure-api-key-here"
    
  2. Generate Secure API Key:

    python3 -c "import secrets; print('mcp_' + secrets.token_urlsafe(32))"
    
  3. Disable Authentication (Development Only):

    # In terraform.tfvars  
    mcp_api_key = ""
    

Authentication Methods

The server accepts API keys in multiple formats:

# Bearer token format
Authorization: Bearer mcp_your-api-key-here

# ApiKey format  
Authorization: ApiKey mcp_your-api-key-here

# Direct format
Authorization: mcp_your-api-key-here

Connect to Claude Desktop

Since Claude Desktop doesn't yet support HTTP MCP servers natively, you need to use the mcp-remote proxy package to connect to your deployed MCP server.

Add this to your Claude Desktop MCP configuration:

With Authentication (Recommended):

{
  "mcpServers": {
    "bigquery": {
      "command": "npx",
      "args": [
        "-y", "mcp-remote",
        "https://mcp-bigquery-mcp-service-5beo7r4imq-uc.a.run.app/mcp",
        "--header", "Authorization: Bearer mcp_euzj2hYK0-MboHa45DucWq-CtJD-AAiRyvgelCwLSbQ"
      ]
    }
  }
}

Without Authentication (Development Only):

{
  "mcpServers": {
    "bigquery": {
      "command": "npx", 
      "args": ["-y", "mcp-remote", "https://mcp-bigquery-mcp-service-5beo7r4imq-uc.a.run.app/mcp"]
    }
  }
}

Note: The mcp-remote package acts as a proxy between Claude Desktop (which expects stdio/local connections) and your HTTP-based MCP server. This is a temporary workaround until Claude Desktop adds native HTTP transport support.

Testing Authentication

# Test without API key (should fail)
curl -s "https://mcp-bigquery-mcp-service-5beo7r4imq-uc.a.run.app/mcp" \
  -X POST -H "Content-Type: application/json"

# Test with API key (should work)  
curl -s "https://mcp-bigquery-mcp-service-5beo7r4imq-uc.a.run.app/mcp" \
  -X POST -H "Authorization: Bearer mcp_your-api-key-here" \
  -H "Content-Type: application/json"

Troubleshooting

Common Issues

  1. Authentication Errors:

    gcloud auth application-default login
    
  2. Permission Denied:

    • Ensure your account has sufficient permissions
    • Check service account IAM bindings
  3. BigQuery Access Denied:

    • Verify dataset permissions in bigquery_datasets variable
    • Check if datasets exist in the specified project
  4. Container Build Failures:

    • Ensure Docker is running
    • Check Artifact Registry permissions

Debugging

View Cloud Run logs:

gcloud run services logs read mcp-bigquery-mcp-service --region=us-central1

Test the service locally:

python server_bigquery_mcp.py
curl http://localhost:8000/health

Cleanup

To remove all resources:

terraform destroy

This will remove:

  • Cloud Run service
  • Service account
  • Artifact Registry repository
  • BigQuery datasets (if created by Terraform)

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Test the deployment
  5. Submit a pull request

License

This project is licensed under the MIT License - see the LICENSE file for details.

Support

For issues and questions:

  1. Check the troubleshooting section
  2. Review Cloud Run and BigQuery logs
  3. Open an issue in the repository