Databricks Unity Catalog MCP Server
This MCP server provides LLMs with a set of read-only tools for interacting with Databricks workspaces via the Model Context Protocol.
Access your Databricks workspace through Claude and other LLMs. Query Unity Catalog tables, inspect jobs, and retrieve detailed metadata—all through the Model Context Protocol.
Built on the Databricks SDK to provide read-only access to your workspace through the Model Context Protocol. Powered by FastMCP with async/aiohttp for efficient parallel data retrieval.
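The parallel retrieval mentioned above can be pictured with a plain asyncio sketch. The coroutines below are stand-ins for illustration, not the server's real functions:

```python
import asyncio

async def fetch_table_metadata(full_name: str) -> dict:
    # Stand-in for an aiohttp request to the Databricks REST API.
    await asyncio.sleep(0)  # simulate network I/O
    return {"name": full_name, "columns": []}

async def fetch_all(full_names: list[str]) -> list[dict]:
    # Fire all requests concurrently instead of one at a time;
    # results come back in the same order as the input names.
    return await asyncio.gather(*(fetch_table_metadata(n) for n in full_names))

results = asyncio.run(fetch_all(["main.default.orders", "main.default.users"]))
```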
Read more about our vision and use cases.
Features
Capabilities
What you can do:
- Ask Claude to find tables in your Unity Catalog
- Inspect job configurations and recent runs
- Generate queries based on your schema
Limitations
What you can't do:
- Modify tables or jobs (read-only by design)
- Execute queries directly (retrieves metadata only)
Available Tools
Unity Catalog
| Tool | Description | Parameters |
|---|---|---|
| get-all-catalogs-schemas-tables | List all tables across catalogs and schemas | None |
| get-table-details | Retrieve table descriptions, columns, and metadata | full_table_names (list of catalog.schema.table) |
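The full_table_names parameter expects three-part names. A quick sketch of the catalog.schema.table format this parameter assumes (the helper name is ours, not part of the server):

```python
def split_full_table_name(full_name: str) -> tuple[str, str, str]:
    """Split 'catalog.schema.table' into its three parts."""
    parts = full_name.split(".")
    if len(parts) != 3:
        raise ValueError(f"expected catalog.schema.table, got {full_name!r}")
    catalog, schema, table = parts
    return catalog, schema, table
```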
Jobs
| Tool | Description | Parameters |
|---|---|---|
| get-jobs | List all workspace jobs with IDs and names | None |
| get-job-details | Get job settings, configurations, and tasks | job_ids (list of job IDs) |
| get-job-runs | Fetch recent run history with duration, parameters, and results | job_ids (list), n_recent (1-5, default: 1) |
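The n_recent parameter is bounded to 1-5 with a default of 1. One way that bound might be enforced is a simple clamp; the real server may instead reject out-of-range values, and this helper is purely illustrative:

```python
def clamp_n_recent(n_recent: int = 1) -> int:
    # get-job-runs returns between 1 and 5 recent runs per job (default 1);
    # clamp out-of-range input into that window.
    return max(1, min(5, n_recent))
```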
Quick Start
Prerequisites:
- Docker Desktop installed and running
- Databricks workspace access (host URL and access token)
Installation
Choose your editor and follow the configuration steps:
Cursor
Step 1: Add the following configuration to .cursor/mcp.json:
```json
{
  "mcpServers": {
    "databricks": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "DATABRICKS_HOST",
        "-e",
        "DATABRICKS_TOKEN",
        "ghcr.io/revodatanl/databricks-mcp-server:latest"
      ],
      "env": {
        "DATABRICKS_HOST": "${env:DATABRICKS_HOST}",
        "DATABRICKS_TOKEN": "${env:DATABRICKS_TOKEN}"
      }
    }
  }
}
```
Note: You can either use environment variable references (${env:VARIABLE}) or hardcode the values as strings directly in the configuration.
Step 2: Create a .env file in your project root with your credentials:
```
DATABRICKS_HOST=your-workspace-url
DATABRICKS_TOKEN=your-access-token
```
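If you want to sanity-check that your .env values parse the way you expect, here is a minimal stdlib-only reader. Your editor (or a library such as python-dotenv) normally does this for you; this sketch is purely illustrative:

```python
def load_dotenv(path: str = ".env") -> dict[str, str]:
    """Parse KEY=VALUE lines, skipping blanks and # comments."""
    values: dict[str, str] = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
    return values
```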
Step 3: Restart Cursor to load the MCP server.
Step 4: Use the available tools to enhance your Databricks development workflow.
Continue.dev
Step 1: Add the following configuration to .continue/mcpServers/databricks-mcp.yaml:
```yaml
name: databricks_mcp_server
version: 0.1.3
schema: v1
mcpServers:
  - name: databricks_mcp_server
    command: docker
    args:
      - run
      - -i
      - --rm
      - -e
      - DATABRICKS_HOST=${{ inputs.DATABRICKS_HOST }}
      - -e
      - DATABRICKS_TOKEN=${{ inputs.DATABRICKS_TOKEN }}
      - ghcr.io/revodatanl/databricks-mcp-server:latest
```
Step 2: Set your credentials either:
- On the Continue.dev website (recommended for security)
- Or in a .env file in your project root:

```
DATABRICKS_HOST=your-workspace-url
DATABRICKS_TOKEN=your-access-token
```
Step 3: Restart your editor to load the MCP server.
Step 4: Use the available tools to enhance your Databricks development workflow.
Local Development
For contributors and developers who want to run the server locally:
Setup
1. Install uv - a fast Python package installer (follow its installation guide).
2. Clone the repository:

```shell
git clone https://github.com/revodatanl/databricks-mcp-server.git
cd databricks-mcp-server
```

3. Install dependencies:

```shell
uv sync
```

4. Set environment variables:

```shell
export DATABRICKS_HOST=your-workspace-url
export DATABRICKS_TOKEN=your-access-token
```

5. Run the server:

```shell
uv run databricks-mcp
```
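Before starting the server, you can verify that the two required variables are set and roughly well-formed. This preflight helper is our own sketch, not something shipped with the repository:

```python
import os

def check_credentials() -> list[str]:
    """Return a list of problems with the two required environment variables."""
    problems = []
    host = os.environ.get("DATABRICKS_HOST", "")
    token = os.environ.get("DATABRICKS_TOKEN", "")
    if not host:
        problems.append("DATABRICKS_HOST is not set")
    elif not host.startswith("https://"):
        problems.append("DATABRICKS_HOST should be a full https:// workspace URL")
    if not token:
        problems.append("DATABRICKS_TOKEN is not set")
    return problems
```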
License
MIT License - see the LICENSE file for details.