Custom MCP Server on Databricks Apps
This repository provides a framework for creating Custom Model Context Protocol (MCP) servers using Databricks Apps. It enables customers to create MCP servers that connect to external REST APIs defined by OpenAPI specifications, with secure authentication through Databricks external connections.
Overview
This MCP server acts as a bridge between LLM agents and external REST APIs. It:
- Loads OpenAPI specifications from src/custom_server/spec.json
- Provides three main tools to LLM agents (sketched below):
  - list_databricks_api_endpoints - Discover available API endpoints
  - get_databricks_api_endpoint_schema - Get detailed schema for specific endpoints
  - invoke_databricks_api_endpoint - Execute API calls with proper authentication
- Uses Databricks UC external connections for secure authentication to external services
- Runs on Databricks Apps for scalable hosting
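For orientation, here is a minimal sketch of how the discovery tool could be registered using the MCP Python SDK's FastMCP helper. The function body, server name, and pagination behavior below are illustrative assumptions, not the repository's actual implementation:
import json
from pathlib import Path

from mcp.server.fastmcp import FastMCP

SPEC_PATH = Path("src/custom_server/spec.json")
mcp = FastMCP("custom-openapi-server")  # hypothetical server name

@mcp.tool()
def list_databricks_api_endpoints(search: str = "") -> list[dict]:
    """Discover available API endpoints, optionally filtered by a search term."""
    spec = json.loads(SPEC_PATH.read_text())
    endpoints = [
        {"path": path, "method": method.upper(), "summary": op.get("summary", "")}
        for path, ops in spec.get("paths", {}).items()
        for method, op in ops.items()
        if method.lower() in {"get", "post", "put", "delete", "patch"}
    ]
    if search:
        endpoints = [e for e in endpoints if search.lower() in str(e).lower()]
    return endpoints[:50]  # mirrors the 50-endpoint pagination limit described below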
Quick Start
1. Configure Your External API
Replace the OpenAPI specification in src/custom_server/spec.json with your external REST API's OpenAPI spec:
# Replace with your API's OpenAPI specification
cp your-api-spec.json src/custom_server/spec.json
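Before deploying, it is worth a quick sanity check that the file parses and looks like an OpenAPI document. This is an illustrative snippet, not part of the repository:
import json

with open("src/custom_server/spec.json") as f:
    spec = json.load(f)  # raises if the file is not valid JSON
assert "openapi" in spec and "paths" in spec, "not an OpenAPI document"
print(f"OpenAPI {spec['openapi']}: {len(spec['paths'])} paths")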
2. Create UC External Connection
Create a Unity Catalog external connection for authentication to your external service:
-- Example: Create connection with API key authentication
CREATE CONNECTION your_api_connection
TYPE http
OPTIONS (
'host' = 'https://your-api.example.com',
'authentication' = 'API_KEY',
'api_key' = secret('your-scope', 'your-api-key')
);
Learn more about external connections
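To confirm the connection exists and points at the right host before wiring it into the server, you can look it up with the Databricks Python SDK. A sketch, assuming the connection name from the SQL above:
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # uses your default Databricks auth profile
conn = w.connections.get("your_api_connection")
print(conn.connection_type, conn.options.get("host"))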
3. Update Connection Name
Update the connection name in src/custom_server/app.py:
# Line 236 in app.py
connection_name = "your_api_connection" # Replace "dogfood-databricks-api"
Prerequisites
- Databricks CLI installed and configured
- uv (Python package manager)
- Unity Catalog external connection configured for your target API
Local Development
- Run uv sync:
uv sync
- Start the server locally. Changes will trigger a reload:
uvicorn custom_server.app:app --reload
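Once the server is up, you can smoke-test the endpoint with a JSON-RPC initialize request. This sketch assumes the app serves MCP over streamable HTTP at /mcp/ on uvicorn's default port 8000:
import httpx

payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "smoke-test", "version": "0.0.1"},
    },
}
# Streamable HTTP servers expect both content types in the Accept header
resp = httpx.post(
    "http://127.0.0.1:8000/mcp/",
    json=payload,
    headers={"Accept": "application/json, text/event-stream"},
)
print(resp.status_code, resp.text[:200])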
Deploying a custom MCP server on Databricks Apps
There are two ways to deploy the server on Databricks Apps: the databricks apps CLI or the databricks bundle CLI. Choose whichever fits your workflow.
Both approaches require first configuring Databricks authentication:
export DATABRICKS_CONFIG_PROFILE=<your-profile-name> # e.g. custom-mcp-server
databricks auth login --profile "$DATABRICKS_CONFIG_PROFILE"
Using the databricks apps CLI
To deploy the server using the databricks apps CLI, follow these steps:
Create a Databricks app to host your MCP server:
databricks apps create mcp-custom-server
Upload the source code to Databricks and deploy the app:
DATABRICKS_USERNAME=$(databricks current-user me | jq -r .userName)
databricks sync . "/Users/$DATABRICKS_USERNAME/my-mcp-server"
databricks apps deploy mcp-custom-server --source-code-path "/Workspace/Users/$DATABRICKS_USERNAME/my-mcp-server"
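To confirm the app is running and retrieve its URL, you can query it with the Databricks Python SDK. A sketch only; the field names reflect the current databricks-sdk App model and may differ across SDK versions:
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # honors DATABRICKS_CONFIG_PROFILE
app = w.apps.get("mcp-custom-server")
print(app.url, app.app_status)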
Using the databricks bundle CLI
To deploy the server using the databricks bundle CLI, follow these steps:
- Update the app.yaml file in this directory to use the following command:
command: ["uvicorn", "custom_server.app:app"]
- In this directory, run the following commands to deploy and run the MCP server on Databricks Apps:
uv build --wheel
databricks bundle deploy
databricks bundle run custom-mcp-server
Using the MCP Server
Connecting to the MCP Server
Once deployed, your MCP server will be available at:
https://your-app-url.usually.ends.with.databricksapps.com/mcp/
Important: The URL must end with /mcp/ (including the trailing slash).
Authentication
Use a Bearer token from your Databricks profile:
databricks auth token -p <name-of-your-profile>
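You can then call the deployed endpoint with that token. A sketch reusing the initialize payload from the local smoke test above; the app URL and profile name are placeholders to replace with your own:
import json
import subprocess

import httpx

# `databricks auth token` prints JSON containing an access_token field
token = json.loads(
    subprocess.check_output(["databricks", "auth", "token", "-p", "your-profile"])
)["access_token"]
resp = httpx.post(
    "https://your-app-url.databricksapps.com/mcp/",
    json={"jsonrpc": "2.0", "id": 1, "method": "initialize",
          "params": {"protocolVersion": "2025-03-26", "capabilities": {},
                     "clientInfo": {"name": "smoke-test", "version": "0.0.1"}}},
    headers={"Authorization": f"Bearer {token}",
             "Accept": "application/json, text/event-stream"},
)
print(resp.status_code)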
Available MCP Tools
Your deployed server provides three tools to LLM agents:
- list_databricks_api_endpoints
  - Discovers available API endpoints from your OpenAPI spec
  - Optional search filtering by path, method, or description
- get_databricks_api_endpoint_schema
  - Gets detailed schema information for a specific endpoint
  - Includes request/response schemas, parameters, etc.
- invoke_databricks_api_endpoint
  - Executes API calls to your external service
  - Handles authentication via UC external connections
  - Supports all HTTP methods (GET, POST, PUT, DELETE, PATCH)
Tool Design Best Practices
This MCP server implements best practices for agent tool design based on Anthropic's engineering guidelines and Block's layered approach:
Layered Architecture
The three tools follow a progressive "discovery → planning → execution" pattern:
- Discovery Layer (list_databricks_api_endpoints)
  - Helps agents understand what APIs are available
  - Provides search/filtering to avoid overwhelming responses
  - Returns human-readable summaries and descriptions
- Planning Layer (get_databricks_api_endpoint_schema)
  - Gives detailed schema information for proper request formation
  - Shows required parameters, data types, and expected responses
  - Enables agents to understand how to structure API calls correctly
- Execution Layer (invoke_databricks_api_endpoint)
  - Performs the actual API requests with proper authentication
  - Handles different HTTP methods and parameter formats
  - Returns structured, meaningful responses
Key Design Principles
- Token Efficiency: Built-in pagination (50 endpoint limit) and search filtering
- Clear Naming: Tool names clearly indicate their purpose and scope
- Meaningful Context: Returns natural language descriptions alongside technical data
- Error Handling: Provides helpful error messages that guide agents toward successful requests
- Flexible Responses: Supports various parameter formats (JSON objects, strings, query params)
- Authentication Abstraction: Handles complex UC external connection authentication transparently
This layered approach prevents agents from being overwhelmed while providing the structured guidance needed for reliable API interactions.
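In practice, an agent or test client walks the three layers in order. A sketch using the MCP Python SDK's streamable HTTP client against a deployed server; the URL, token, and tool argument names ("path", "method", "search") are assumptions for illustration:
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

async def main() -> None:
    url = "https://your-app-url.databricksapps.com/mcp/"
    headers = {"Authorization": "Bearer <your-token>"}
    async with streamablehttp_client(url, headers=headers) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discovery: what endpoints exist?
            listing = await session.call_tool(
                "list_databricks_api_endpoints", {"search": "users"})
            # Planning: how is one of them shaped?
            schema = await session.call_tool(
                "get_databricks_api_endpoint_schema",
                {"path": "/your/endpoint", "method": "get"})
            # Execution: make the authenticated call
            result = await session.call_tool(
                "invoke_databricks_api_endpoint",
                {"path": "/your/endpoint", "method": "get"})
            print(result)

asyncio.run(main())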
Configuration Details
OpenAPI Specification Format
Your spec.json should be a valid OpenAPI 3.x specification. Example structure:
{
"openapi": "3.1.0",
"info": {
"title": "Your API",
"version": "1.0.0"
},
"servers": [
{
"url": "https://your-api.example.com"
}
],
"paths": {
"/your/endpoint": {
"get": {
"summary": "Your endpoint description",
"parameters": [...],
"responses": {...}
}
}
}
}
External Connection Authentication Types
Databricks external connections support multiple authentication methods:
- API Key: Store API keys in Databricks secrets
- OAuth: Configure OAuth 2.0 flows
- Bearer Token: Bearer token authentication
- Custom Headers: Any custom authentication headers
Example with OAuth:
CREATE CONNECTION your_api_connection
TYPE http
OPTIONS (
'host' = 'https://your-api.example.com',
'authentication' = 'OAUTH2',
'oauth2_client_id' = secret('your-scope', 'client-id'),
'oauth2_client_secret' = secret('your-scope', 'client-secret'),
'oauth2_token_url' = 'https://your-api.example.com/oauth/token'
);
Troubleshooting
Common Issues
- "Authentication token not found"
  - Ensure your UC external connection is properly configured
  - Verify the connection name in app.py matches your UC connection
- "OpenAPI spec file not found"
  - Check that src/custom_server/spec.json exists and is valid JSON
  - Validate your OpenAPI spec using tools like Swagger Editor
- "Endpoint not found in API specification"
  - Verify the endpoint path matches exactly what's in your OpenAPI spec
  - Check that the HTTP method is supported in your spec
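When chasing the endpoint-not-found error, it can help to print exactly which paths and methods your spec declares. An illustrative snippet:
import json

with open("src/custom_server/spec.json") as f:
    spec = json.load(f)
for path, ops in spec["paths"].items():
    methods = sorted(m.upper() for m in ops
                     if m.lower() in {"get", "post", "put", "delete", "patch"})
    print(path, methods)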
Debugging
Enable detailed logging by setting the log level in app.py:
logging.basicConfig(level=logging.DEBUG)
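If you want debug output from this server's modules without turning on debug logging for every library, you can scope the logger instead (an illustrative variant using only the standard library; "custom_server" is the package name from the paths above):
import logging

logging.basicConfig(level=logging.INFO)
logging.getLogger("custom_server").setLevel(logging.DEBUG)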