
Snowflake Developer MCP Server 🚀

A powerful Model Context Protocol (MCP) server that provides comprehensive Snowflake database operations, Cortex AI services, and data management tools for AI assistants like Claude.

🌟 Features

  • 🔧 DDL Operations: Create and manage databases, schemas, tables, and other database objects
  • 📊 DML Operations: Insert, update, delete, and query data with full SQL support
  • ⚙️ Snowflake Operations: Manage warehouses, grants, and roles, and show database objects
  • 🔒 Secure Authentication: Support for passwords and Programmatic Access Tokens (PAT)
  • 🎯 Simple Connection Pattern: Per-operation connections for reliability and simplicity

🚀 Quick Start

Prerequisites

  • Python 3.11+
  • UV package manager (install from https://github.com/astral-sh/uv)
  • Node.js and npm (for MCP inspector)
  • Snowflake account with appropriate permissions
  • Snowflake credentials (account identifier, username, password/PAT)

Installation

  1. Clone the repository

    git clone https://github.com/mcp-tg/snowflake-developer.git
    cd snowflake-developer
    
  2. Set up environment

    # Copy environment template
    cp .env.example .env
    
    # Edit .env with your Snowflake credentials
    # Required: SNOWFLAKE_ACCOUNT, SNOWFLAKE_USER, SNOWFLAKE_PAT (or SNOWFLAKE_PASSWORD)
    
  3. Install UV (if not already installed)

    # On macOS/Linux
    curl -LsSf https://astral.sh/uv/install.sh | sh
    
    # On Windows
    powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
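The `.env` file created in step 2 needs only the credential keys listed above. A minimal example with placeholder values (adapt to your account; never commit this file):

```shell
# .env — placeholder values only
SNOWFLAKE_ACCOUNT=your-account-identifier
SNOWFLAKE_USER=your-username
SNOWFLAKE_PAT=your-pat-token
# Or, instead of a PAT:
# SNOWFLAKE_PASSWORD=your-password
```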
    

🧪 Testing with MCP Inspector

The easiest way to test your setup is using the MCP Inspector:

# Run the development inspector script
./dev-inspector.sh

This will:

  • ✅ Create a virtual environment (if needed)
  • ✅ Install all dependencies via UV
  • ✅ Load your Snowflake credentials from .env
  • ✅ Start the MCP Inspector web interface
  • ✅ Open your browser to test tools interactively

Note: The script automatically handles UV package installation, so you don't need to manually install dependencies.

First Test: Verify Connection

  1. In the Inspector, go to the Tools tab
  2. Find test_snowflake_connection and click Run
  3. You should see your account details and confirmation that the connection works

🔌 Integration with AI Assistants

Claude Desktop

Option 1: Direct from GitHub (no local clone needed)

{
  "mcpServers": {
    "snowflake-developer": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/mcp-tg/snowflake-developer.git",
        "main.py"
      ],
      "env": {
        "SNOWFLAKE_ACCOUNT": "your-account",
        "SNOWFLAKE_USER": "your-username",
        "SNOWFLAKE_PAT": "your-pat-token"
      }
    }
  }
}

Option 2: Local installation

{
  "mcpServers": {
    "snowflake-developer": {
      "command": "uv",
      "args": [
        "run",
        "--directory",
        "/path/to/snowflake-developer",
        "python",
        "main.py"
      ],
      "env": {
        "SNOWFLAKE_ACCOUNT": "your-account",
        "SNOWFLAKE_USER": "your-username",
        "SNOWFLAKE_PAT": "your-pat-token"
      }
    }
  }
}

Setup Instructions:

  1. Clone the repository: git clone https://github.com/mcp-tg/snowflake-developer.git
  2. Create the Claude Desktop config file: ~/Library/Application Support/Claude/claude_desktop_config.json (macOS)
  3. Add the configuration above, replacing /path/to/snowflake-developer with your actual path
  4. Replace credential placeholders with your actual Snowflake credentials
  5. Restart Claude Desktop

Cursor

Note: Cursor doesn't support environment variables in MCP configuration. You'll need to use the local installation option or set environment variables globally on your system.

Option 1: Direct from GitHub (requires global env vars)

{
  "mcpServers": {
    "snowflake-developer": {
      "command": "uvx",
      "args": [
        "--from",
        "git+https://github.com/mcp-tg/snowflake-developer.git",
        "main.py"
      ]
    }
  }
}

Requires setting SNOWFLAKE_ACCOUNT, SNOWFLAKE_USER, and SNOWFLAKE_PAT as system environment variables.
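For Option 1, the variables can be exported from your shell profile (for example ~/.zshrc or ~/.bashrc); placeholder values shown:

```shell
# Make Snowflake credentials available system-wide for apps launched from this shell
export SNOWFLAKE_ACCOUNT="your-account"
export SNOWFLAKE_USER="your-username"
export SNOWFLAKE_PAT="your-pat-token"
```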

Option 2: Local installation (recommended for Cursor)

{
  "mcpServers": {
    "snowflake-developer": {
      "command": "uv",
      "args": ["run", "/path/to/snowflake-developer/main.py"]
    }
  }
}

Use a local .env file in the project directory with your credentials.

📚 Available Tools (22 Total)

🔧 DDL Tools (8 Tools)

Tools for managing database structure:

| Tool | Description | Example in Inspector | Natural Language Query |
|------|-------------|----------------------|------------------------|
| `alter_database` | Rename databases | `database_name: OLD_DB`<br>`new_name: NEW_DB` | "Rename database OLD_DB to NEW_DB" |
| `alter_schema` | Rename or move schemas | `schema_name: TEST_DB.OLD_SCHEMA`<br>`new_name: NEW_SCHEMA` | "Rename OLD_SCHEMA to NEW_SCHEMA in TEST_DB" |
| `alter_table` | Modify table structure | `table_name: TEST_DB.PUBLIC.USERS`<br>`alter_type: ADD`<br>`column_name: created_at`<br>`data_type: TIMESTAMP` | "Add a created_at timestamp column to TEST_DB.PUBLIC.USERS table" |
| `create_database` | Create a new database | `database_name: TEST_DB` | "Create a new database called TEST_DB" |
| `create_schema` | Create a schema in a database | `database_name: TEST_DB`<br>`schema_name: ANALYTICS` | "Create a schema named ANALYTICS in TEST_DB database" |
| `create_table` | Create a table with columns | `database_name: TEST_DB`<br>`schema_name: PUBLIC`<br>`table_name: USERS`<br>`columns: [{"name": "id", "type": "INT"}, {"name": "email", "type": "VARCHAR(255)"}]` | "Create a USERS table in TEST_DB.PUBLIC with id as INT and email as VARCHAR(255)" |
| `drop_database_object` | Drop any database object | `object_type: TABLE`<br>`object_name: TEST_DB.PUBLIC.OLD_TABLE` | "Drop the table TEST_DB.PUBLIC.OLD_TABLE" |
| `execute_ddl_statement` | Run custom DDL SQL | `ddl_statement: CREATE VIEW TEST_DB.PUBLIC.ACTIVE_USERS AS SELECT * FROM TEST_DB.PUBLIC.USERS WHERE status = 'active'` | "Create a view called ACTIVE_USERS that shows only active users" |
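As a sketch of what the create_table example above implies, a tool like this would render its column spec into a DDL string roughly as follows (a hypothetical rendering for illustration; the server's actual SQL generation may differ):

```python
def build_create_table(database, schema, table, columns):
    """Render a CREATE TABLE statement from a create_table-style column spec (sketch)."""
    # Each column dict contributes "name type" to the column list.
    cols = ", ".join(f"{c['name']} {c['type']}" for c in columns)
    return f"CREATE TABLE {database}.{schema}.{table} ({cols})"

sql = build_create_table(
    "TEST_DB", "PUBLIC", "USERS",
    [{"name": "id", "type": "INT"}, {"name": "email", "type": "VARCHAR(255)"}],
)
print(sql)  # CREATE TABLE TEST_DB.PUBLIC.USERS (id INT, email VARCHAR(255))
```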

📊 DML Tools (6 Tools)

Tools for working with data:

| Tool | Description | Example in Inspector | Natural Language Query |
|------|-------------|----------------------|------------------------|
| `delete_data` | Delete rows from a table | `table_name: TEST_DB.PUBLIC.USERS`<br>`where_clause: status = 'deleted'` | "Delete all users with status 'deleted'" |
| `execute_dml_statement` | Run custom DML SQL | `dml_statement: UPDATE TEST_DB.PUBLIC.USERS SET last_login = CURRENT_TIMESTAMP() WHERE id = 1` | "Update the last login timestamp for user with id 1" |
| `insert_data` | Insert rows into a table | `table_name: TEST_DB.PUBLIC.USERS`<br>`data: {"id": 1, "email": "john@example.com", "name": "John Doe"}` | "Insert a new user with id 1, email john@example.com, and name John Doe into the USERS table" |
| `merge_data` | Synchronize data between tables | `target_table: TEST_DB.PUBLIC.USERS`<br>`source_table: TEST_DB.STAGING.NEW_USERS`<br>`merge_condition: target.id = source.id`<br>`match_actions: [{"action": "UPDATE", "columns": ["email", "name"], "values": ["source.email", "source.name"]}]`<br>`not_match_actions: [{"action": "INSERT", "columns": ["id", "email", "name"], "values": ["source.id", "source.email", "source.name"]}]` | "Merge new users from staging table into production users table, updating existing records and inserting new ones" |
| `query_data` | Query data from tables | `table_name: TEST_DB.PUBLIC.USERS`<br>`columns: ["id", "email", "name"]`<br>`where_clause: status = 'active'`<br>`limit: 10` | "Show me the first 10 active users with their id, email, and name" |
| `update_data` | Update existing rows | `table_name: TEST_DB.PUBLIC.USERS`<br>`data: {"status": "inactive"}`<br>`where_clause: last_login < '2023-01-01'` | "Set status to inactive for all users who haven't logged in since January 2023" |

โš™๏ธ Snowflake Operations Tools (8 Tools)

Tools for Snowflake-specific operations:

| Tool | Description | Example in Inspector | Natural Language Query |
|------|-------------|----------------------|------------------------|
| `alter_warehouse` | Modify warehouse settings | `warehouse_name: COMPUTE_WH`<br>`warehouse_size: MEDIUM`<br>`auto_suspend: 300` | "Change COMPUTE_WH to MEDIUM size and auto-suspend after 5 minutes" |
| `describe_database_object` | Get object details | `object_name: TEST_DB.PUBLIC.USERS` | "Describe the structure of TEST_DB.PUBLIC.USERS table" |
| `execute_sql_query` | Run any SQL query | `query: SELECT CURRENT_USER(), CURRENT_WAREHOUSE()` | "Show me my current user and warehouse" |
| `grant_privileges` | Grant permissions | `privileges: ["SELECT", "INSERT"]`<br>`on_type: TABLE`<br>`on_name: TEST_DB.PUBLIC.USERS`<br>`to_type: ROLE`<br>`to_name: ANALYST_ROLE` | "Grant SELECT and INSERT on TEST_DB.PUBLIC.USERS table to ANALYST_ROLE" |
| `revoke_privileges` | Revoke permissions | `privileges: ["SELECT"]`<br>`on_type: TABLE`<br>`on_name: TEST_DB.PUBLIC.USERS`<br>`from_type: ROLE`<br>`from_name: ANALYST_ROLE` | "Revoke SELECT on TEST_DB.PUBLIC.USERS table from ANALYST_ROLE" |
| `set_context` | Set database/schema/warehouse/role | `context_type: DATABASE`<br>`context_name: TEST_DB` | "Use TEST_DB as the current database" |
| `show_database_objects` | List database objects | `object_type: DATABASES` | "Show me all databases" |
| `test_snowflake_connection` | Test connection to Snowflake | (no parameters) | "Test my Snowflake connection" |

๐Ÿ—๏ธ Architecture

The server uses a simple per-operation connection pattern:

  • Each tool/resource call creates a fresh Snowflake connection
  • Connections are automatically closed after each operation
  • No connection pooling or persistence required
  • Credentials are read from environment variables
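The pattern above can be sketched as a small helper (hypothetical code for illustration; the real server's internals may differ, and the `connect` parameter exists only so the pattern can be exercised without a live Snowflake account):

```python
import os
from contextlib import contextmanager

@contextmanager
def snowflake_connection(connect=None):
    """Open a fresh connection for a single operation and always close it."""
    if connect is None:
        # Default to the official connector (assumed available at runtime).
        import snowflake.connector
        connect = snowflake.connector.connect
    conn = connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ.get("SNOWFLAKE_PAT") or os.environ.get("SNOWFLAKE_PASSWORD"),
    )
    try:
        yield conn
    finally:
        conn.close()  # no pooling: every call gets its own short-lived connection
```

Each tool call would wrap its work in `with snowflake_connection() as conn: ...`, so even a failed operation cannot leak a connection.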

๐Ÿ›ก๏ธ Security Best Practices

  1. Use Programmatic Access Tokens (PAT) instead of passwords when possible
  2. Never commit .env files to version control
  3. Use least-privilege roles for your Snowflake user
  4. Rotate credentials regularly
  5. Consider using external secret management for production

๐Ÿค Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🆘 Troubleshooting

Connection Issues

  • Verify your account identifier format
  • Check that your user has appropriate permissions
  • Ensure your PAT token hasn't expired
  • Test network connectivity to Snowflake

Tool Errors

  • Check the error message in the Inspector console
  • Verify required parameters are provided
  • Ensure database objects exist before referencing them
  • Check SQL syntax for custom statements

🚀 FastMCP Framework

This MCP server is built using FastMCP, a modern Python framework that simplifies building Model Context Protocol servers.

Why FastMCP?

  • 🎯 Simple API: Decorator-based tool and resource registration
  • ⚡ High Performance: Async/await support with efficient message handling
  • 🔧 Type Safety: Parameter validation driven by Python type hints
  • 📝 Auto Documentation: Automatic tool/resource documentation generation
  • 🛡️ Error Handling: Built-in exception handling and response formatting
  • 🔌 MCP Compliance: Full compatibility with the MCP protocol specification

FastMCP vs Traditional MCP

# Traditional MCP server setup
class MyMCPServer:
    def __init__(self):
        self.tools = {}
    
    def register_tool(self, name, handler, schema):
        # Manual registration and validation
        pass

# FastMCP - Clean and Simple
from fastmcp import FastMCP, Context

mcp = FastMCP("MyServer")

@mcp.tool()
def my_tool(param: str) -> str:
    """Tool with automatic type validation and documentation."""
    return f"Result: {param}"

@mcp.resource("my://resource/{id}")
async def my_resource(id: str, ctx: Context) -> dict:
    """Resource with built-in async support and context."""
    return {"data": f"Resource {id}"}

Key FastMCP Features Used

  1. Decorator Registration: Tools are registered using simple decorators
  2. Type Validation: Automatic parameter validation using Python type hints
  3. Context Management: Built-in context for progress reporting and logging
  4. Resource Patterns: URI template matching for dynamic resource endpoints
  5. Error Handling: Automatic exception catching and standardized error responses
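The decorator-registration idea in point 1 can be sketched with a toy registry (illustration only, not FastMCP's actual implementation; `ToyMCP` and `echo` are hypothetical names):

```python
import inspect

class ToyMCP:
    """Tiny stand-in showing decorator-based tool registration."""

    def __init__(self, name):
        self.name = name
        self.tools = {}

    def tool(self):
        def decorator(fn):
            # Record the handler plus a parameter map derived from type hints,
            # similar in spirit to the schema FastMCP generates automatically.
            params = {
                pname: str(p.annotation)
                for pname, p in inspect.signature(fn).parameters.items()
            }
            self.tools[fn.__name__] = {"handler": fn, "params": params, "doc": fn.__doc__}
            return fn
        return decorator

toy = ToyMCP("demo")

@toy.tool()
def echo(message: str) -> str:
    """Echo a message back."""
    return f"Result: {message}"
```

The decorator leaves the function usable as plain Python while the registry retains everything a server needs to expose it as a tool.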

FastMCP Installation

# Install FastMCP
pip install fastmcp

# Or with UV (recommended)
uv add fastmcp
