Snowflake Developer MCP Server
A powerful Model Context Protocol (MCP) server that provides comprehensive Snowflake database operations, Cortex AI services, and data management tools for AI assistants like Claude.
Features
- DDL Operations: Create and manage databases, schemas, tables, and other database objects
- DML Operations: Insert, update, delete, and query data with full SQL support
- Snowflake Operations: Manage warehouses, grants, roles, and show database objects
- Secure Authentication: Support for passwords and Programmatic Access Tokens (PAT)
- Simple Connection Pattern: Per-operation connections for reliability and simplicity
Quick Start
Prerequisites
- Python 3.11+
- UV package manager (install from https://github.com/astral-sh/uv)
- Node.js and npm (for MCP inspector)
- Snowflake account with appropriate permissions
- Snowflake credentials (account identifier, username, password/PAT)
Installation
1. Clone the repository

```bash
git clone https://github.com/mcp-tg/snowflake-developer.git
cd snowflake-developer
```

2. Set up environment

```bash
# Copy the environment template
cp .env.example .env
# Edit .env with your Snowflake credentials
# Required: SNOWFLAKE_ACCOUNT, SNOWFLAKE_USER, SNOWFLAKE_PAT (or SNOWFLAKE_PASSWORD)
```

3. Install UV (if not already installed)

```bash
# On macOS/Linux
curl -LsSf https://astral.sh/uv/install.sh | sh
# On Windows
powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
```
Testing with MCP Inspector
The easiest way to test your setup is using the MCP Inspector:
```bash
# Run the development inspector script
./dev-inspector.sh
```
This will:
- Create a virtual environment (if needed)
- Install all dependencies via UV
- Load your Snowflake credentials from .env
- Start the MCP Inspector web interface
- Open your browser to test tools interactively
Note: The script automatically handles UV package installation, so you don't need to manually install dependencies.
First Test: Verify Connection
- In the Inspector, go to the Tools tab
- Find `test_snowflake_connection` and click Run
- You should see your account details and confirmation that the connection works
Integration with AI Assistants
Claude Desktop
Option 1: Direct from GitHub (no local clone needed)
```json
{
  "mcpServers": {
    "snowflake-developer": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/mcp-tg/snowflake-developer.git",
        "main.py"
      ],
      "env": {
        "SNOWFLAKE_ACCOUNT": "your-account",
        "SNOWFLAKE_USER": "your-username",
        "SNOWFLAKE_PAT": "your-pat-token"
      }
    }
  }
}
```
Option 2: Local installation
```json
{
  "mcpServers": {
    "snowflake-developer": {
      "command": "uv",
      "args": [
        "run",
        "--directory",
        "/path/to/snowflake-developer",
        "python",
        "main.py"
      ],
      "env": {
        "SNOWFLAKE_ACCOUNT": "your-account",
        "SNOWFLAKE_USER": "your-username",
        "SNOWFLAKE_PAT": "your-pat-token"
      }
    }
  }
}
```
Setup Instructions:
- Clone the repository: `git clone https://github.com/mcp-tg/snowflake-developer.git`
- Create the Claude Desktop config file: `~/Library/Application Support/Claude/claude_desktop_config.json` (macOS)
- Add the configuration above, replacing `/path/to/snowflake-developer` with your actual path
- Replace credential placeholders with your actual Snowflake credentials
- Restart Claude Desktop
Cursor
Note: Cursor doesn't support environment variables in MCP configuration. You'll need to use the local installation option or set environment variables globally on your system.
Option 1: Direct from GitHub (requires global env vars)
```json
{
  "mcpServers": {
    "snowflake-developer": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/mcp-tg/snowflake-developer.git",
        "main.py"
      ]
    }
  }
}
```
Requires setting `SNOWFLAKE_ACCOUNT`, `SNOWFLAKE_USER`, and `SNOWFLAKE_PAT` as system environment variables.
Option 2: Local installation (recommended for Cursor)
```json
{
  "mcpServers": {
    "snowflake-developer": {
      "command": "uv",
      "args": ["run", "/path/to/snowflake-developer/main.py"]
    }
  }
}
```
Use a local `.env` file in the project directory with your credentials.
Available Tools (22 Total)
DDL Tools (8 Tools)
Tools for managing database structure:
Tool | Description | Example in Inspector | Natural Language Query |
---|---|---|---|
alter_database | Rename databases | database_name: OLD_DB new_name: NEW_DB | "Rename database OLD_DB to NEW_DB" |
alter_schema | Rename or move schemas | schema_name: TEST_DB.OLD_SCHEMA new_name: NEW_SCHEMA | "Rename OLD_SCHEMA to NEW_SCHEMA in TEST_DB" |
alter_table | Modify table structure | table_name: TEST_DB.PUBLIC.USERS alter_type: ADD column_name: created_at data_type: TIMESTAMP | "Add a created_at timestamp column to TEST_DB.PUBLIC.USERS table" |
create_database | Create a new database | database_name: TEST_DB | "Create a new database called TEST_DB" |
create_schema | Create a schema in a database | database_name: TEST_DB schema_name: ANALYTICS | "Create a schema named ANALYTICS in TEST_DB database" |
create_table | Create a table with columns | database_name: TEST_DB schema_name: PUBLIC table_name: USERS columns: [{"name": "id", "type": "INT"}, {"name": "email", "type": "VARCHAR(255)"}] | "Create a USERS table in TEST_DB.PUBLIC with id as INT and email as VARCHAR(255)" |
drop_database_object | Drop any database object | object_type: TABLE object_name: TEST_DB.PUBLIC.OLD_TABLE | "Drop the table TEST_DB.PUBLIC.OLD_TABLE" |
execute_ddl_statement | Run custom DDL SQL | ddl_statement: CREATE VIEW TEST_DB.PUBLIC.ACTIVE_USERS AS SELECT * FROM TEST_DB.PUBLIC.USERS WHERE status = 'active' | "Create a view called ACTIVE_USERS that shows only active users" |
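To illustrate how a structured parameter like `columns` maps to SQL, here is a hypothetical sketch of rendering create_table's inputs into a DDL statement. The function name and signature are illustrative only, not the server's actual code:

```python
# Hypothetical sketch: how a create_table-style tool could render its
# structured `columns` parameter into a CREATE TABLE statement.
def build_create_table(database: str, schema: str, table: str, columns: list) -> str:
    # Each column dict has "name" and "type" keys, as in the example above.
    column_defs = ", ".join(f"{col['name']} {col['type']}" for col in columns)
    return f"CREATE TABLE {database}.{schema}.{table} ({column_defs})"

sql = build_create_table(
    "TEST_DB", "PUBLIC", "USERS",
    [{"name": "id", "type": "INT"}, {"name": "email", "type": "VARCHAR(255)"}],
)
# sql == "CREATE TABLE TEST_DB.PUBLIC.USERS (id INT, email VARCHAR(255))"
```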
DML Tools (6 Tools)
Tools for working with data:
Tool | Description | Example in Inspector | Natural Language Query |
---|---|---|---|
delete_data | Delete rows from a table | table_name: TEST_DB.PUBLIC.USERS where_clause: status = 'deleted' | "Delete all users with status 'deleted'" |
execute_dml_statement | Run custom DML SQL | dml_statement: UPDATE TEST_DB.PUBLIC.USERS SET last_login = CURRENT_TIMESTAMP() WHERE id = 1 | "Update the last login timestamp for user with id 1" |
insert_data | Insert rows into a table | table_name: TEST_DB.PUBLIC.USERS data: {"id": 1, "email": "john@example.com", "name": "John Doe"} | "Insert a new user with id 1, email john@example.com, and name John Doe into the USERS table" |
merge_data | Synchronize data between tables | target_table: TEST_DB.PUBLIC.USERS source_table: TEST_DB.STAGING.NEW_USERS merge_condition: target.id = source.id match_actions: [{"action": "UPDATE", "columns": ["email", "name"], "values": ["source.email", "source.name"]}] not_match_actions: [{"action": "INSERT", "columns": ["id", "email", "name"], "values": ["source.id", "source.email", "source.name"]}] | "Merge new users from staging table into production users table, updating existing records and inserting new ones" |
query_data | Query data from tables | table_name: TEST_DB.PUBLIC.USERS columns: ["id", "email", "name"] where_clause: status = 'active' limit: 10 | "Show me the first 10 active users with their id, email, and name" |
update_data | Update existing rows | table_name: TEST_DB.PUBLIC.USERS data: {"status": "inactive"} where_clause: last_login < '2023-01-01' | "Set status to inactive for all users who haven't logged in since January 2023" |
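To show how query_data's optional parameters compose, here is a minimal sketch of a SELECT builder. This is assumed behavior for illustration, not the server's implementation:

```python
# Hypothetical sketch of how query_data's optional parameters might
# compose into a SELECT statement; names are illustrative.
def build_query(table: str, columns=None, where_clause=None, limit=None) -> str:
    select_list = ", ".join(columns) if columns else "*"
    sql = f"SELECT {select_list} FROM {table}"
    if where_clause:
        sql += f" WHERE {where_clause}"
    if limit:
        sql += f" LIMIT {limit}"
    return sql

sql = build_query(
    "TEST_DB.PUBLIC.USERS",
    columns=["id", "email", "name"],
    where_clause="status = 'active'",
    limit=10,
)
# sql == "SELECT id, email, name FROM TEST_DB.PUBLIC.USERS WHERE status = 'active' LIMIT 10"
```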
Snowflake Operations Tools (8 Tools)
Tools for Snowflake-specific operations:
Tool | Description | Example in Inspector | Natural Language Query |
---|---|---|---|
alter_warehouse | Modify warehouse settings | warehouse_name: COMPUTE_WH warehouse_size: MEDIUM auto_suspend: 300 | "Change COMPUTE_WH to MEDIUM size and auto-suspend after 5 minutes" |
describe_database_object | Get object details | object_name: TEST_DB.PUBLIC.USERS | "Describe the structure of TEST_DB.PUBLIC.USERS table" |
execute_sql_query | Run any SQL query | query: SELECT CURRENT_USER(), CURRENT_WAREHOUSE() | "Show me my current user and warehouse" |
grant_privileges | Grant permissions | privileges: ["SELECT", "INSERT"] on_type: TABLE on_name: TEST_DB.PUBLIC.USERS to_type: ROLE to_name: ANALYST_ROLE | "Grant SELECT and INSERT on TEST_DB.PUBLIC.USERS table to ANALYST_ROLE" |
revoke_privileges | Revoke permissions | privileges: ["SELECT"] on_type: TABLE on_name: TEST_DB.PUBLIC.USERS from_type: ROLE from_name: ANALYST_ROLE | "Revoke SELECT on TEST_DB.PUBLIC.USERS table from ANALYST_ROLE" |
set_context | Set database/schema/warehouse/role | context_type: DATABASE context_name: TEST_DB | "Use TEST_DB as the current database" |
show_database_objects | List database objects | object_type: DATABASES | "Show me all databases" |
test_snowflake_connection | Test connection to Snowflake | (no parameters) | "Test my Snowflake connection" |
Architecture
The server uses a simple per-operation connection pattern:
- Each tool/resource call creates a fresh Snowflake connection
- Connections are automatically closed after each operation
- No connection pooling or persistence required
- Credentials are read from environment variables
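The pattern above can be sketched as a context manager. Here `connect` stands in for `snowflake.connector.connect` (any callable returning an object with a `close()` method works), so this is an illustrative outline rather than the server's actual code:

```python
import os
from contextlib import contextmanager

@contextmanager
def per_operation_connection(connect):
    """Open a fresh connection for one operation and always close it afterwards."""
    # Credentials are read from environment variables, as described above.
    conn = connect(
        account=os.environ.get("SNOWFLAKE_ACCOUNT"),
        user=os.environ.get("SNOWFLAKE_USER"),
        password=os.environ.get("SNOWFLAKE_PAT"),
    )
    try:
        yield conn
    finally:
        conn.close()  # closed even if the operation raises
```

Because every tool call opens and closes its own connection, a failure in one operation cannot leak state into the next.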
Security Best Practices
- Use Programmatic Access Tokens (PAT) instead of passwords when possible
- Never commit `.env` files to version control
- Use least-privilege roles for your Snowflake user
- Rotate credentials regularly
- Consider using external secret management for production
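As a small, hypothetical startup check (not part of the server itself), you can fail fast when required credentials are missing rather than discovering it mid-operation:

```python
import os

# Variable names follow the .env requirements listed in Quick Start.
REQUIRED = ("SNOWFLAKE_ACCOUNT", "SNOWFLAKE_USER")
TOKEN_VARS = ("SNOWFLAKE_PAT", "SNOWFLAKE_PASSWORD")

def missing_credentials(env=None) -> list:
    """Return the names of required variables that are unset or empty."""
    env = os.environ if env is None else env
    missing = [name for name in REQUIRED if not env.get(name)]
    # At least one of PAT/password must be present; PAT is preferred.
    if not any(env.get(name) for name in TOKEN_VARS):
        missing.append("SNOWFLAKE_PAT (or SNOWFLAKE_PASSWORD)")
    return missing
```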
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
This project is licensed under the MIT License - see the LICENSE file for details.
Troubleshooting
Connection Issues
- Verify your account identifier format
- Check that your user has appropriate permissions
- Ensure your PAT token hasn't expired
- Test network connectivity to Snowflake
Tool Errors
- Check the error message in the Inspector console
- Verify required parameters are provided
- Ensure database objects exist before referencing them
- Check SQL syntax for custom statements
FastMCP Framework
This MCP server is built using FastMCP, a modern Python framework that simplifies building Model Context Protocol servers.
Why FastMCP?
- Simple API: Decorator-based tool and resource registration
- High Performance: Async/await support with efficient message handling
- Type Safety: Python type hints with automatic parameter validation
- Auto Documentation: Automatic tool/resource documentation generation
- Error Handling: Built-in exception handling and response formatting
- MCP Compliance: Full compatibility with the MCP protocol specification
FastMCP vs Traditional MCP
```python
# Traditional MCP server setup
class MyMCPServer:
    def __init__(self):
        self.tools = {}

    def register_tool(self, name, handler, schema):
        # Manual registration and validation
        pass
```

```python
# FastMCP - clean and simple
from fastmcp import FastMCP, Context

mcp = FastMCP("MyServer")

@mcp.tool()
def my_tool(param: str) -> str:
    """Tool with automatic type validation and documentation."""
    return f"Result: {param}"

@mcp.resource("my://resource/{id}")
async def my_resource(id: str, ctx: Context) -> dict:
    """Resource with built-in async support and context."""
    return {"data": f"Resource {id}"}
```
Key FastMCP Features Used
- Decorator Registration: Tools are registered using simple decorators
- Type Validation: Automatic parameter validation using Python type hints
- Context Management: Built-in context for progress reporting and logging
- Resource Patterns: URI template matching for dynamic resource endpoints
- Error Handling: Automatic exception catching and standardized error responses
FastMCP Installation
```bash
# Install FastMCP
pip install fastmcp
# Or with UV (recommended)
uv add fastmcp
```
Learning FastMCP
- Official Docs: FastMCP Documentation
- Examples: Browse FastMCP example servers in the repository
- TypeScript MCP SDK: MCP TypeScript SDK