BigQuery MCP Server on Google Cloud Platform
This project deploys a Model Context Protocol (MCP) server that provides BigQuery querying capabilities over Streamable HTTP. The server can be deployed to any GCP account using Terraform and runs on Google Cloud Run.
Features
- BigQuery Integration: Query any BigQuery table with SQL
- MCP Protocol: Follows the Model Context Protocol specification
- Streamable HTTP: Production-ready HTTP transport for MCP
- Cloud Run Deployment: Serverless, scalable deployment
- Terraform Infrastructure: Infrastructure as Code for reproducible deployments
- Security: Service account-based authentication with minimal permissions
Tools Available
The MCP server exposes the following tools:
- bigquery_query: Execute SQL queries against BigQuery
- list_datasets: List all datasets in the project
- list_tables: List all tables in a specific dataset
- describe_table: Get schema and metadata for a table
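For illustration, the four tools above could be described with MCP-style tool definitions like the following sketch. The tool names match the list above; the input schemas are assumptions for illustration and may differ from what this server actually advertises.

```python
# Hypothetical MCP tool manifest for this server. Tool names come from the
# list above; descriptions and input schemas are illustrative assumptions.
TOOLS = [
    {
        "name": "bigquery_query",
        "description": "Execute a SQL query against BigQuery",
        "inputSchema": {
            "type": "object",
            "properties": {"sql": {"type": "string"}},
            "required": ["sql"],
        },
    },
    {
        "name": "list_datasets",
        "description": "List all datasets in the project",
        "inputSchema": {"type": "object", "properties": {}},
    },
    {
        "name": "list_tables",
        "description": "List all tables in a specific dataset",
        "inputSchema": {
            "type": "object",
            "properties": {"dataset_id": {"type": "string"}},
            "required": ["dataset_id"],
        },
    },
    {
        "name": "describe_table",
        "description": "Get schema and metadata for a table",
        "inputSchema": {
            "type": "object",
            "properties": {
                "dataset_id": {"type": "string"},
                "table_id": {"type": "string"},
            },
            "required": ["dataset_id", "table_id"],
        },
    },
]
```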
Prerequisites
Before deploying, ensure you have:
- Google Cloud CLI (gcloud) installed and configured
- Terraform (>= 1.0) installed
- Docker installed
- A GCP project with billing enabled
- Owner or Editor permissions on the GCP project
Quick Start
Option 1: Automated Deployment
Use the provided deployment script for a fully automated setup:
# Clone this repository or copy the files
cd archy_gcp
# Run the automated deployment
./deploy.sh
# Or run specific steps:
./deploy.sh deps # Check dependencies
./deploy.sh auth # Set up authentication
./deploy.sh infra # Deploy infrastructure only
./deploy.sh container # Build and deploy container only
./deploy.sh test # Test the deployment
Option 2: Manual Deployment
1. Set up authentication:
gcloud auth login
gcloud config set project YOUR_PROJECT_ID
gcloud auth application-default login
2. Create terraform.tfvars:
project_id          = "your-gcp-project-id"
project_name        = "mcp-bigquery"
region              = "us-central1"
bigquery_location   = "US"
bigquery_datasets   = ["public"]
allow_public_access = false
create_sample_data  = true
3. Deploy infrastructure:
terraform init
terraform plan
terraform apply
4. Build and deploy container:
# Get repository URL from Terraform output
REPO_URL=$(terraform output -raw artifact_registry_repo)
# Build and push image
docker build -t $REPO_URL/mcp-bigquery:latest .
docker push $REPO_URL/mcp-bigquery:latest
# Update Cloud Run service
gcloud run services update mcp-bigquery-mcp-service \
  --image=$REPO_URL/mcp-bigquery:latest \
  --region=us-central1
Configuration
Terraform Variables
| Variable | Description | Default |
|---|---|---|
| project_id | GCP project ID | Required |
| project_name | Short name for resource naming | mcp-bigquery |
| region | GCP region for resources | us-central1 |
| bigquery_location | BigQuery location | US |
| bigquery_datasets | List of datasets to grant access to | ["public"] |
| allow_public_access | Allow public access to the service | false |
| create_sample_data | Create sample dataset and table | true |
| min_instances | Minimum Cloud Run instances | 0 |
| max_instances | Maximum Cloud Run instances | 10 |
| cpu_limit | CPU limit per container | "1" |
| memory_limit | Memory limit per container | "512Mi" |
Environment Variables
The application supports these environment variables:
- BIGQUERY_PROJECT: Override the default project for BigQuery operations
- BIGQUERY_LOCATION: Override the default location for BigQuery operations
- PORT: Port to run the server on (default: 8000)
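A minimal sketch of how the server might read these variables. The defaults follow the descriptions above; the actual code in server_bigquery_mcp.py may resolve them differently.

```python
import os

def load_config(env=os.environ):
    """Read the documented environment variables with their documented defaults.

    When BIGQUERY_PROJECT or BIGQUERY_LOCATION is unset, the value is None
    and the BigQuery client's own project/location defaults apply.
    """
    return {
        "project": env.get("BIGQUERY_PROJECT"),      # override project, else client default
        "location": env.get("BIGQUERY_LOCATION"),    # override location, else client default
        "port": int(env.get("PORT", "8000")),        # documented default: 8000
    }
```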
Architecture
┌─────────────────┐ ┌──────────────────┐ ┌─────────────────┐
│ MCP Client │────│ Cloud Run │────│ BigQuery │
│ (Claude, etc.) │ │ (MCP Server) │ │ (Data) │
└─────────────────┘ └──────────────────┘ └─────────────────┘
│
┌──────────────────┐
│ Service Account │
│ (Authentication) │
└──────────────────┘
Usage Examples
Once deployed, you can use the MCP server with any MCP-compatible client. The server will be available at the endpoint shown in the Terraform output.
Query BigQuery Table
{
"method": "tools/call",
"params": {
"name": "bigquery_query",
"arguments": {
"sql": "SELECT * FROM `your-project.public.cars` LIMIT 10"
}
}
}
List Available Datasets
{
"method": "tools/call",
"params": {
"name": "list_datasets"
}
}
Describe Table Schema
{
"method": "tools/call",
"params": {
"name": "describe_table",
"arguments": {
"dataset_id": "public",
"table_id": "cars"
}
}
}
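The requests above can also be sent from a short Python script. The sketch below builds the same JSON-RPC envelopes and posts them with the standard library; the endpoint URL and API key are placeholders, and the exact envelope/response handling expected by the Streamable HTTP transport may differ from this simplified version.

```python
import json
import urllib.request

def build_tool_call(name, arguments=None, request_id=1):
    """Build a JSON-RPC 2.0 envelope for an MCP tools/call request."""
    params = {"name": name}
    if arguments:
        params["arguments"] = arguments
    return {"jsonrpc": "2.0", "id": request_id, "method": "tools/call", "params": params}

def call_tool(endpoint, api_key, name, arguments=None):
    """POST a tools/call request to the deployed server (placeholder endpoint/key)."""
    body = json.dumps(build_tool_call(name, arguments)).encode()
    req = urllib.request.Request(
        endpoint,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Same payload as the "Query BigQuery Table" example above:
payload = build_tool_call(
    "bigquery_query",
    {"sql": "SELECT * FROM `your-project.public.cars` LIMIT 10"},
)
```

To run it for real, pass your Terraform-output endpoint (ending in /mcp) and your configured API key to call_tool.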
Security
The deployment follows security best practices:
- Service Account: Dedicated service account with minimal required permissions
- IAM: Granular BigQuery permissions (dataViewer, jobUser)
- Network: Cloud Run service can be configured for private access
- Container: Non-root user execution
- Authentication: Google Cloud authentication for BigQuery access
Monitoring
The deployment includes:
- Health Checks: Built-in health endpoint at /health
- Cloud Run Metrics: Automatic monitoring in Google Cloud Console
- Logging: Structured logging to Cloud Logging
Authentication
The MCP server supports API key authentication for security.
API Key Configuration
1. Set API Key (Required for Production):
# In terraform.tfvars
mcp_api_key = "your-secure-api-key-here"
2. Generate a Secure API Key:
python3 -c "import secrets; print('mcp_' + secrets.token_urlsafe(32))"
3. Disable Authentication (Development Only):
# In terraform.tfvars
mcp_api_key = ""
Authentication Methods
The server accepts API keys in multiple formats:
# Bearer token format
Authorization: Bearer mcp_your-api-key-here
# ApiKey format
Authorization: ApiKey mcp_your-api-key-here
# Direct format
Authorization: mcp_your-api-key-here
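Accepting all three formats amounts to stripping an optional scheme prefix from the Authorization header. The following is a sketch of that logic as an assumption about the server's behavior, not its actual implementation:

```python
def extract_api_key(authorization_header):
    """Extract the API key from an Authorization header, accepting the
    'Bearer <key>', 'ApiKey <key>', and bare '<key>' formats shown above."""
    if not authorization_header:
        return None
    value = authorization_header.strip()
    for scheme in ("Bearer ", "ApiKey "):
        if value.startswith(scheme):
            return value[len(scheme):].strip()
    return value  # direct format: the header value is the key itself
```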
Connect to Claude Desktop
Since Claude Desktop doesn't yet support HTTP MCP servers natively, you need to use the mcp-remote proxy package to connect to your deployed MCP server.
Add this to your Claude Desktop MCP configuration:
With Authentication (Recommended):
{
"mcpServers": {
"bigquery": {
"command": "npx",
"args": [
"-y", "mcp-remote",
"https://mcp-bigquery-mcp-service-5beo7r4imq-uc.a.run.app/mcp",
"--header", "Authorization: Bearer mcp_euzj2hYK0-MboHa45DucWq-CtJD-AAiRyvgelCwLSbQ"
]
}
}
}
Without Authentication (Development Only):
{
"mcpServers": {
"bigquery": {
"command": "npx",
"args": ["-y", "mcp-remote", "https://mcp-bigquery-mcp-service-5beo7r4imq-uc.a.run.app/mcp"]
}
}
}
Note: The mcp-remote package acts as a proxy between Claude Desktop (which expects stdio/local connections) and your HTTP-based MCP server. This is a temporary workaround until Claude Desktop adds native HTTP transport support.
Testing Authentication
# Test without API key (should fail)
curl -s "https://mcp-bigquery-mcp-service-5beo7r4imq-uc.a.run.app/mcp" \
-X POST -H "Content-Type: application/json"
# Test with API key (should work)
curl -s "https://mcp-bigquery-mcp-service-5beo7r4imq-uc.a.run.app/mcp" \
-X POST -H "Authorization: Bearer mcp_your-api-key-here" \
-H "Content-Type: application/json"
Troubleshooting
Common Issues
1. Authentication Errors:
gcloud auth application-default login
2. Permission Denied:
- Ensure your account has sufficient permissions
- Check service account IAM bindings
3. BigQuery Access Denied:
- Verify dataset permissions in the bigquery_datasets variable
- Check if datasets exist in the specified project
4. Container Build Failures:
- Ensure Docker is running
- Check Artifact Registry permissions
Debugging
View Cloud Run logs:
gcloud run services logs read mcp-bigquery-mcp-service --region=us-central1
Test the service locally:
python server_bigquery_mcp.py
curl http://localhost:8000/health
Cleanup
To remove all resources:
terraform destroy
This will remove:
- Cloud Run service
- Service account
- Artifact Registry repository
- BigQuery datasets (if created by Terraform)
Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Test the deployment
- Submit a pull request
License
This project is licensed under the MIT License - see the LICENSE file for details.
Support
For issues and questions:
- Check the troubleshooting section
- Review Cloud Run and BigQuery logs
- Open an issue in the repository