peter-palmer/bigquery-mcp-server
BigQuery MCP Server (Node.js)
A minimal Model Context Protocol (MCP) server for Google Cloud BigQuery, written in TypeScript/Node.js.
Features
- Exposes tools:
  - `listDatasets()`: returns available datasets for the current project
  - `listTables(datasetId: string)`: returns tables in a dataset
  - `getSchema(datasetId: string, tableId: string)`: returns columns and types
  - `runQuery(sql: string)`: executes read-only `SELECT` queries
- Uses `@google-cloud/bigquery`
- Auth via env vars: `BQ_SERVICE_ACCOUNT_JSON` and `GCP_PROJECT_ID`
- Defensive validation and clear error messages
Requirements
- Node.js >= 18.18
- IAM roles for the service account:
  - `roles/bigquery.dataViewer`
  - `roles/bigquery.jobUser`
Installation
npm install
Running locally
This server communicates over stdio per MCP conventions.
npm start
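As a hedged illustration, an MCP client that launches stdio servers could be pointed at this server with a config entry like the one below. The server name `bigquery` and the use of `npm start` as the entry point are assumptions for the sketch, not requirements of this repo:

```json
{
  "mcpServers": {
    "bigquery": {
      "command": "npm",
      "args": ["start"],
      "env": {
        "GCP_PROJECT_ID": "your-project-id",
        "BQ_SERVICE_ACCOUNT_JSON": "{\"type\":\"service_account\", ...}"
      }
    }
  }
}
```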
Ensure the following environment variables are set before running:
- `BQ_SERVICE_ACCOUNT_JSON`: Stringified JSON of the service account key for BigQuery access. Must include `client_email` and `private_key`.
- `GCP_PROJECT_ID`: Your Google Cloud project ID to target for BigQuery operations.

Example `.env` snippet (if you manage env separately):
GCP_PROJECT_ID=your-project-id
BQ_SERVICE_ACCOUNT_JSON={"type":"service_account","project_id":"your-project-id","private_key_id":"...","private_key":"-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n","client_email":"svc@your-project-id.iam.gserviceaccount.com","client_id":"...","auth_uri":"https://accounts.google.com/o/oauth2/auth","token_uri":"https://oauth2.googleapis.com/token","auth_provider_x509_cert_url":"https://www.googleapis.com/oauth2/v1/certs","client_x509_cert_url":"..."}
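The env-var checks described above can be sketched as a small helper. This is an illustrative sketch, not the repo's actual code; the `loadCredentials` name and the exact error messages are assumptions:

```typescript
// Sketch: validate the two required env vars before any BigQuery call.
// Hypothetical helper; names are illustrative, not from this repo.
interface ServiceAccountKey {
  client_email: string;
  private_key: string;
  project_id?: string;
}

function loadCredentials(env: Record<string, string | undefined>): {
  projectId: string;
  credentials: ServiceAccountKey;
} {
  const projectId = env.GCP_PROJECT_ID;
  if (!projectId) throw new Error("GCP_PROJECT_ID is not set");

  const raw = env.BQ_SERVICE_ACCOUNT_JSON;
  if (!raw) throw new Error("BQ_SERVICE_ACCOUNT_JSON is not set");

  let parsed: ServiceAccountKey;
  try {
    parsed = JSON.parse(raw);
  } catch {
    throw new Error("BQ_SERVICE_ACCOUNT_JSON is not valid JSON");
  }
  if (!parsed.client_email || !parsed.private_key) {
    throw new Error(
      "BQ_SERVICE_ACCOUNT_JSON must include client_email and private_key"
    );
  }
  return { projectId, credentials: parsed };
}
```

Failing fast here gives the "clear errors before attempting BigQuery calls" behavior the Notes section describes.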
Tools (Commands)
All tool responses return plain JSON objects.
- `listDatasets()`
  - Input: none
  - Output: `{ datasets: [{ datasetId, location }] }`
- `listTables(datasetId: string)`
  - Input: `{ datasetId }`
  - Output: `{ datasetId, tables: [{ tableId, type }] }`
- `getSchema(datasetId: string, tableId: string)`
  - Input: `{ datasetId, tableId }`
  - Output: `{ datasetId, tableId, columns: [{ name, type, mode, description }] }`
- `runQuery(sql: string)`
  - Input: `{ sql }` (must start with `SELECT`)
  - Output: `{ rows: [...] }`
  - Validation: If the SQL does not start with `SELECT` (case-insensitive, after trim), the server rejects with "Only SELECT queries are allowed".
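The `runQuery` guard can be sketched as a trim-and-prefix check, matching the validation rule stated above. The helper name is illustrative, not from this repo:

```typescript
// Sketch of the read-only guard: accept only statements that begin with
// SELECT after trimming, compared case-insensitively.
function assertSelectOnly(sql: string): void {
  if (!/^select\b/i.test(sql.trim())) {
    throw new Error("Only SELECT queries are allowed");
  }
}
```

Note that a prefix check alone does not reject multi-statement strings such as `SELECT 1; DROP TABLE t`, so a stricter server might also forbid semicolons or parse the statement.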
Notes
- The server initializes a BigQuery client on each call using credentials from env vars.
- For SQL, `useLegacySql` is set to `false`.
- Any missing env var or invalid credentials produce clear errors before attempting BigQuery calls.
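The per-call setup described in these notes can be sketched as follows. The option names match the `@google-cloud/bigquery` client, but the helper itself is illustrative; the library import is left as a comment so the sketch stays self-contained:

```typescript
// Illustrative sketch: build the options handed to the BigQuery client and
// to each query. In the real server these would feed a fresh client per call.
// import { BigQuery } from "@google-cloud/bigquery";

function buildQueryRequest(
  projectId: string,
  key: { client_email: string; private_key: string },
  sql: string
) {
  const clientOptions = {
    projectId,
    credentials: {
      client_email: key.client_email,
      private_key: key.private_key,
    },
  };
  // Standard SQL, matching the server's useLegacySql: false behavior.
  const queryOptions = { query: sql, useLegacySql: false };
  // const bigquery = new BigQuery(clientOptions);
  // const [rows] = await bigquery.query(queryOptions);
  return { clientOptions, queryOptions };
}
```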