MagiciannnY/postgresql-mcp-server
The PostgreSQL MCP Server is a Model Context Protocol server designed to enhance PostgreSQL database management for AI assistants.
- Schema Management - Manage tables, columns, ENUMs, and constraints.
- User & Permissions - Create users and manage permissions.
- Query Performance - Analyze query performance and statistics.
- Index Management - Create and optimize indexes.
- Functions - Manage stored functions.
- Triggers - Manage database triggers.
- Constraints - Manage foreign keys and unique constraints.
- Row-Level Security - Manage RLS policies.
- Execute Query - Perform SELECT operations with support for count/exists.
- Execute Mutation - Perform INSERT/UPDATE/DELETE/UPSERT operations.
- Execute SQL - Execute arbitrary SQL with transaction support.
- Comments Management - Manage comments for database objects.
- Database Analysis - Analyze performance and configuration.
- Debug Database - Troubleshoot connection and performance issues.
- Data Export/Import - Migrate data in JSON/CSV format.
- Copy Between Databases - Transfer data between databases.
- Real-time Monitoring - Monitor live database metrics and alerts.
A lightly revised version of HenkDz's original project.
PostgreSQL MCP Server
A Model Context Protocol (MCP) server that provides comprehensive PostgreSQL database management capabilities for AI assistants.
What's New: This server has been completely redesigned from 46 individual tools to 17 intelligent tools through consolidation (34→8 meta-tools) and enhancement (+4 new tools), providing better AI discovery while adding powerful data manipulation and comment management capabilities.
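For example, operations that previously needed separate tools are now grouped under a single meta-tool and selected with an operation parameter. The sketch below is illustrative only: create_table appears in the Example Usage section further down, while add_column and its column parameter are assumed names, not the tool's confirmed schema.
// Two calls to the same schema-management meta-tool, replacing what used to be separate tools
{ "operation": "create_table", "tableName": "users", "columns": [{ "name": "id", "type": "SERIAL PRIMARY KEY" }] }
{ "operation": "add_column", "tableName": "users", "column": { "name": "age", "type": "INTEGER" } }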
Quick Start
Option 1: npm (Recommended)
# Install globally
npm install -g @henkey/postgres-mcp-server
# Or run directly with npx (no installation)
npx @henkey/postgres-mcp-server --connection-string "postgresql://user:pass@localhost:5432/db"
Add to your MCP client configuration:
{
"mcpServers": {
"postgresql-mcp": {
"command": "npx",
"args": [
"@henkey/postgres-mcp-server",
"--connection-string", "postgresql://user:password@host:port/database"
]
}
}
}
Option 2: Install via Smithery
npx -y @smithery/cli install @HenkDz/postgresql-mcp-server --client claude
Option 3: Manual Installation (Development)
git clone <repository-url>
cd postgresql-mcp-server
cp .env.example .env
Set the "connectionString" in the .env file so the server can reach your database.
Replace <your public ip> in src/index.ts with your server's public IP.
npm install
npm run build
Start the project:
npm run dev
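A minimal .env sketch, assuming the key is named as referenced above (check .env.example for the exact key your copy expects):
connectionString=postgresql://user:password@host:port/database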
Add to your MCP client configuration:
{
"mcpServers": {
"postgresql-mcp": {
"command": "node",
"args": [
"/path/to/postgresql-mcp-server/build/index.js",
"--connection-string", "postgresql://user:password@host:port/database"
]
}
}
}
What's Included
17 powerful tools organized into three categories:
- Consolidation: 34 original tools consolidated into 8 intelligent meta-tools
- Specialized: 5 tools kept separate for complex operations
- Enhancement: 4 brand new tools (not in original 46)
Consolidated Meta-Tools (8 tools)
- Schema Management - Tables, columns, ENUMs, constraints
- User & Permissions - Create users, grant/revoke permissions
- Query Performance - EXPLAIN plans, slow queries, statistics
- Index Management - Create, analyze, optimize indexes
- Functions - Create, modify, manage stored functions
- Triggers - Database trigger management
- Constraints - Foreign keys, checks, unique constraints
- Row-Level Security - RLS policies and management
Enhancement Tools (4 NEW tools)
Brand new capabilities not available in the original 46 tools
- Execute Query - SELECT operations with count/exists support
- Execute Mutation - INSERT/UPDATE/DELETE/UPSERT operations
- Execute SQL - Arbitrary SQL execution with transaction support
- Comments Management - Comprehensive comment management for all database objects
Specialized Tools (5 tools)
- Database Analysis - Performance and configuration analysis
- Debug Database - Troubleshoot connection, performance, locks
- Data Export/Import - JSON/CSV data migration
- Copy Between Databases - Cross-database data transfer
- Real-time Monitoring - Live database metrics and alerts
Example Usage
// Analyze database performance
{ "analysisType": "performance" }
// Create a table with constraints
{
"operation": "create_table",
"tableName": "users",
"columns": [
{ "name": "id", "type": "SERIAL PRIMARY KEY" },
{ "name": "email", "type": "VARCHAR(255) UNIQUE NOT NULL" }
]
}
// Query data with parameters
{
"operation": "select",
"query": "SELECT * FROM users WHERE created_at > $1",
"parameters": ["2024-01-01"],
"limit": 100
}
// Insert new data
{
"operation": "insert",
"table": "users",
"data": {"name": "John Doe", "email": "john@example.com"},
"returning": "*"
}
// Find slow queries
{
"operation": "get_slow_queries",
"limit": 5,
"minDuration": 100
}
// Manage database object comments
{
"operation": "set",
"objectType": "table",
"objectName": "users",
"comment": "Main user account information table"
}
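The Execute SQL tool's transaction support described above has no example of its own; the call below is a hypothetical sketch, and the sql and useTransaction parameter names are assumptions rather than the tool's confirmed schema.
// Run multiple statements atomically (illustrative parameter names)
{
  "operation": "execute",
  "sql": "UPDATE accounts SET balance = balance - 100 WHERE id = 1; UPDATE accounts SET balance = balance + 100 WHERE id = 2;",
  "useTransaction": true
}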
Documentation
All 18 tool parameters & examples in one place.
For additional information, see the documentation folder:
- Comprehensive tool usage and examples
- Setup and contribution guide
- Architecture and implementation
- API reference and advanced usage
- Complete documentation overview
Features Highlights
Consolidation Achievements
- 34→8 meta-tools - Intelligent consolidation for better AI discovery
- Multiple operations per tool - Unified schemas with operation parameters
- Smart parameter validation - Clear error messages and type safety
Enhanced Data Capabilities
- Complete CRUD operations - INSERT/UPDATE/DELETE/UPSERT with parameterized queries
- Flexible querying - SELECT with count/exists support and safety limits
- Arbitrary SQL execution - Transaction support for complex operations
Production Ready
- Flexible connection - CLI args, env vars, or per-tool configuration (see the sketch below)
- Security focused - SQL injection prevention, parameterized queries
- Robust architecture - Connection pooling, comprehensive error handling
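As a sketch of the per-tool configuration option mentioned above, a call could carry its own connection details. The connectionString field mirrors the .env setting used earlier, but its exact per-call name here is an assumption for illustration.
// Supplying a connection per call instead of via CLI args or env vars (illustrative)
{
  "operation": "select",
  "query": "SELECT 1",
  "connectionString": "postgresql://user:password@host:port/database"
}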
Prerequisites
- Node.js ≥ 18.0.0
- PostgreSQL server access
- Valid connection credentials
Contributing
- Fork the repository
- Create a feature branch
- Commit your changes
- Create a Pull Request
See the setup and contribution guide for detailed setup instructions.
License
AGPLv3 License - see the LICENSE file for details.