# BigQuery MCP Server
## Overview
The BigQuery MCP Server is a custom Model Context Protocol (MCP) server that allows AI agents and clients to safely interact with Google BigQuery.
This server implements:
## Part A — BigQuery MCP Tools

- `bq.run_query`: Execute SQL queries against BigQuery.
- `bq.list_tables`: List all tables inside a dataset, with metadata.
- `bq.get_table_profile`: Return table statistics, including:
  - row count
  - bytes
  - partition details
  - clustering fields
  - schema
  - sample rows
  - last modified timestamp
## Part B — OAuth-style Client Authentication

The server supports an OAuth-like workflow:

- Clients send a `client_id` + `client_secret`
- The server validates the credentials
- A unique session token is generated
- All tool calls require a valid session token
- Rate limits are enforced per client
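The workflow above can be sketched as follows. This is an illustrative sketch, not the server's actual code: the function names `authenticate` and `validate` and the in-memory session store are assumptions, though the one-hour expiry matches the README.

```python
import secrets
from datetime import datetime, timedelta, timezone

# Illustrative registry; the real server loads clients from config.py
REGISTERED_CLIENTS = {"demo_client_id_123": "demo_secret_xyz789"}

SESSION_TTL = timedelta(hours=1)  # sessions expire after 1 hour
_sessions = {}  # token -> (client_id, expires_at)

def authenticate(client_id, client_secret):
    """Validate client credentials and issue a unique session token."""
    if REGISTERED_CLIENTS.get(client_id) != client_secret:
        raise PermissionError("invalid client credentials")
    token = secrets.token_hex(16)
    expires_at = datetime.now(timezone.utc) + SESSION_TTL
    _sessions[token] = (client_id, expires_at)
    return token, expires_at

def validate(token):
    """Return the client_id for a live session, or raise."""
    client_id, expires_at = _sessions.get(token, (None, None))
    if client_id is None or datetime.now(timezone.utc) >= expires_at:
        raise PermissionError("missing or expired session token")
    return client_id
```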
## Additional Production Enhancements
| Feature | Status |
|---|---|
| Logging | ✔ Yes |
| Rate Limiting | ✔ Yes |
| Secure credential loading | ✔ Yes |
| Proper session lifecycle | ✔ Yes |
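The per-client rate limiting in the table above could be implemented as a sliding window. The class name and default limits below are illustrative assumptions; the actual limiter lives in `bq_mcp_server.py`.

```python
import time
from collections import defaultdict, deque

class RateLimiter:
    """Sliding-window limiter: at most max_calls per window_s, per client."""

    def __init__(self, max_calls=10, window_s=60.0):
        self.max_calls = max_calls
        self.window_s = window_s
        self._calls = defaultdict(deque)  # client_id -> call timestamps

    def allow(self, client_id):
        now = time.monotonic()
        calls = self._calls[client_id]
        # Drop timestamps that have fallen out of the window
        while calls and now - calls[0] > self.window_s:
            calls.popleft()
        if len(calls) >= self.max_calls:
            return False
        calls.append(now)
        return True
```

Each client gets an independent window, so one noisy client cannot exhaust another client's quota.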
## Project Structure

```
bigquery-mcp/
│
├── bq_mcp_server.py             # Main MCP server with auth, tools, rate limiting
├── config.py                    # Project configuration & credentials path
├── bigquery-credentials.json    # Service account key (DO NOT COMMIT; export it from the GCP Console)
├── README.md
└── venv/                        # optional virtual environment
```
## Requirements

- Python 3.10+
- A Google Cloud project with BigQuery enabled
- A service account with the roles:
  - BigQuery Data Editor
  - BigQuery Job User
- A service account key file (`bigquery-credentials.json`)

Install dependencies:

```shell
pip install google-cloud-bigquery google-auth
```
## Setup Instructions

### 1️⃣ Create Service Account & Download Key

In the Google Cloud Console:

IAM & Admin → Service Accounts → Create Service Account

Grant the roles:

- BigQuery Data Editor
- BigQuery Job User

Generate a JSON key:

Keys → Add Key → JSON

Move it into your project directory:

```shell
mv ~/Downloads/bigquery-credentials.json ./bigquery-credentials.json
```
### 2️⃣ Configure config.py

```python
import os

PROJECT_NAME = "******"
PROJECT_ID = "******"
PROJECT_NUMBER = "2837********"

CREDENTIALS_PATH = os.path.join(os.path.dirname(__file__), "bigquery-credentials.json")

REGISTERED_CLIENTS = {
    "demo_client_id_123": "demo_secret_xyz789",
    "analytics_client_456": "secret_abc123def",
}
```
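Registered clients can also come from a `CLIENTS_JSON` environment variable, with a fallback to built-in demo clients. A minimal loader might look like this; the function name `load_registered_clients` is illustrative, not taken from the server code.

```python
import json
import os

# Illustrative fallback, mirroring the demo entries in config.py
DEMO_CLIENTS = {"demo_client_id_123": "demo_secret_xyz789"}

def load_registered_clients():
    """Read client credentials from CLIENTS_JSON, else fall back to demo clients."""
    raw = os.environ.get("CLIENTS_JSON")
    if not raw:
        return dict(DEMO_CLIENTS)
    return json.loads(raw)
```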
### 3️⃣ Create Sample Dataset & Table

Configuration can also be supplied through environment variables:

- `PROJECT_ID` – your GCP project ID
- `GOOGLE_APPLICATION_CREDENTIALS` – path to the BigQuery service account JSON key (or put the file next to the repo and set this to `./bigquery-credentials.json`)
- `CLIENTS_JSON` – JSON string of registered clients, e.g. `{"demo_client_id_123":"demo_secret_xyz789"}`

In BigQuery, create a dataset named `test_db`, then create a sample table:
```sql
CREATE OR REPLACE TABLE
  `bigquery-mcp-******.test_db.sample_table` AS
SELECT
  'User' || CAST(id AS STRING) AS user_id,
  RAND() * 1000 AS revenue,
  DATE_SUB(CURRENT_DATE(), INTERVAL CAST(RAND() * 365 AS INT64) DAY) AS date
FROM UNNEST(GENERATE_ARRAY(1, 100)) AS id;
```
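To preview locally what rows of this shape look like, here is a pure-Python equivalent of the SQL above. This is illustrative only; the server does not contain such a helper, and the seeded generator is an assumption for reproducibility.

```python
import random
from datetime import date, timedelta

def sample_rows(n=100, seed=42):
    """Generate rows shaped like the sample table: user_id, revenue, date."""
    rng = random.Random(seed)
    today = date.today()
    return [
        {
            "user_id": f"User{i}",
            "revenue": rng.random() * 1000,  # uniform in [0, 1000)
            "date": today - timedelta(days=int(rng.random() * 365)),
        }
        for i in range(1, n + 1)
    ]
```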
## Running the MCP Server

```shell
python bq_mcp_server.py
```

You should see output such as:

```
Client 1: Authenticating...
Client 1: Listing tables...
```
## MCP Tools

### bq.run_query

Example request:

```json
{
  "method": "tools/call",
  "params": {
    "name": "bq.run_query",
    "arguments": {
      "sql": "SELECT COUNT(*) FROM `project.dataset.table`"
    },
    "session_token": "<token>"
  }
}
```
### bq.list_tables

```json
{
  "method": "tools/call",
  "params": {
    "name": "bq.list_tables",
    "arguments": { "dataset_id": "test_db" },
    "session_token": "<token>"
  }
}
```
### bq.get_table_profile

```json
{
  "method": "tools/call",
  "params": {
    "name": "bq.get_table_profile",
    "arguments": {
      "dataset_id": "test_db",
      "table_id": "sample_table"
    },
    "session_token": "<token>"
  }
}
```
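The three requests above share one shape, so a client can build them with a small helper. The function `make_tool_call` is hypothetical, shown only to illustrate the request structure; it is not part of the server.

```python
import json

def make_tool_call(name, arguments, session_token):
    """Build a tools/call request matching the JSON examples above."""
    return {
        "method": "tools/call",
        "params": {
            "name": name,
            "arguments": arguments,
            "session_token": session_token,
        },
    }

# Example: serialize a bq.list_tables call for sending to the server
request = make_tool_call("bq.list_tables", {"dataset_id": "test_db"}, "<token>")
payload = json.dumps(request)
```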
## Authentication Workflow

### Step 1 — Authenticate

```json
{
  "method": "auth/authenticate",
  "params": {
    "client_id": "demo_client_id_123",
    "client_secret": "demo_secret_xyz789"
  }
}
```

Response:

```json
{
  "result": {
    "session_token": "abcdef123456...",
    "expires_at": "2025-01-01T12:00:00Z",
    "client_id": "demo_client_id_123"
  }
}
```
### Step 2 — Use tools with session token

All further requests must include:

```json
"session_token": "<your_token>"
```
## Security Features

### ✔ Rate Limiting (per client)

Prevents excessive BigQuery usage.

### ✔ Logging

Tracks all requests, failures, and warnings.

### ✔ Session Expiration

Sessions expire automatically after 1 hour.
## Architecture Diagram

```
┌────────────────────────────┐
│         MCP Client         │
│    (AI Agent, CLI, App)    │
└──────────────┬─────────────┘
               │ 1. client_id + secret
               ▼
┌────────────────────────────┐
│   Authentication Manager   │
│  - Validates credentials   │
│  - Issues session tokens   │
└──────────────┬─────────────┘
               │ 2. session_token
               ▼
┌──────────────────────────────┐
│     BigQuery MCP Server      │
│------------------------------│
│ Tools:                       │
│  • bq.run_query              │
│  • bq.list_tables            │
│  • bq.get_table_profile      │
│------------------------------│
│ Security Layers:             │
│  • Rate Limiter (per client) │
│  • Logging                   │
└──────────────┬───────────────┘
               │ 3. BigQuery API calls
               ▼
┌───────────────────────────────┐
│       Google BigQuery         │
│  Dataset, tables, metadata    │
└───────────────────────────────┘
```

### Note on Registered Clients

The server loads registered client credentials from `config.REGISTERED_CLIENTS` (which comes from the `CLIENTS_JSON` environment variable). If `CLIENTS_JSON` is not set, the server falls back to built-in demo clients defined inside `bq_mcp_server.py`.