# mcp-db-server
An MCP (Model Context Protocol) server that exposes relational databases (PostgreSQL/MySQL) to AI agents with natural language query support. Transform natural language questions into SQL queries and get structured results.
## Features
- Multi-Database Support: Works with PostgreSQL and MySQL
- Natural Language to SQL: Convert plain English queries to SQL using HuggingFace transformers
- RESTful API: Clean FastAPI-based endpoints for database operations
- Safety First: Read-only operations with query validation and result limits
- Docker Ready: Complete containerization with Docker Compose
- Production Ready: Health checks, logging, and error handling
- AI Agent Friendly: Designed specifically for AI agent integration
## API Endpoints

| Endpoint | Method | Description |
|---|---|---|
| `/health` | GET | Health check and service status |
| `/mcp/list_tables` | GET | List all available tables with column counts |
| `/mcp/describe/{table_name}` | GET | Get detailed schema for a specific table |
| `/mcp/query` | POST | Execute natural language queries |
| `/mcp/tables/{table_name}/sample` | GET | Get sample data from a table |
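For programmatic access, a minimal Python sketch like the following can exercise the read-only endpoints. The exact response fields are not documented here, so the example simply prints the returned JSON; the `requests` library and the default `localhost:8000` address are assumptions.

```python
import requests

BASE_URL = "http://localhost:8000"  # assumes the server runs locally on the default port

# Check service health before issuing any queries
print(requests.get(f"{BASE_URL}/health", timeout=10).json())

# Discover the available tables, then inspect one of them
tables = requests.get(f"{BASE_URL}/mcp/list_tables", timeout=10).json()
print(tables)

schema = requests.get(f"{BASE_URL}/mcp/describe/customers", timeout=10).json()
print(schema)
```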
## Quick Start

### Option 1: Docker Compose (Recommended)

1. Clone and start the services:

   ```bash
   git clone https://github.com/Souhar-dya/mcp-db-server.git
   cd mcp-db-server
   docker-compose up --build
   ```

2. Test the endpoints:

   ```bash
   # Health check
   curl http://localhost:8000/health

   # List tables
   curl http://localhost:8000/mcp/list_tables

   # Describe a table
   curl http://localhost:8000/mcp/describe/customers

   # Natural language query
   curl -X POST "http://localhost:8000/mcp/query" \
     -H "Content-Type: application/json" \
     -d '{"nl_query": "show top 5 customers by total orders"}'
   ```
### Option 2: Local Development

1. Prerequisites:

   - Python 3.11+
   - PostgreSQL or MySQL database

2. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

3. Set environment variables:

   ```bash
   export DATABASE_URL="postgresql+asyncpg://user:password@localhost:5432/dbname"
   # or for MySQL:
   # export DATABASE_URL="mysql+aiomysql://user:password@localhost:3306/dbname"
   ```

4. Run the server:

   ```bash
   python -m app.server
   ```
## Sample Database
The project includes a sample database with realistic e-commerce data:
- customers: Customer information (10 sample customers)
- orders: Order records (17 sample orders)
- order_items: Individual items within orders
- order_summary: View combining order and customer data
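To peek at this data through the API rather than SQL, the sample-data endpoint from the table above can be used. A hedged sketch (response shape and the `requests` dependency are assumptions, with the server running locally):

```python
import requests

BASE_URL = "http://localhost:8000"  # assumed local deployment

# Pull a few sample rows from each of the bundled tables
for table in ("customers", "orders", "order_items", "order_summary"):
    resp = requests.get(f"{BASE_URL}/mcp/tables/{table}/sample", timeout=10)
    resp.raise_for_status()
    print(table, "->", resp.json())
```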
## Natural Language Query Examples

The server can understand various types of natural language queries:

```bash
# Get all customers
curl -X POST "http://localhost:8000/mcp/query" \
  -H "Content-Type: application/json" \
  -d '{"nl_query": "show all customers"}'

# Count orders by status
curl -X POST "http://localhost:8000/mcp/query" \
  -H "Content-Type: application/json" \
  -d '{"nl_query": "count orders by status"}'

# Top customers by order value
curl -X POST "http://localhost:8000/mcp/query" \
  -H "Content-Type: application/json" \
  -d '{"nl_query": "top 5 customers by total order amount"}'

# Recent orders
curl -X POST "http://localhost:8000/mcp/query" \
  -H "Content-Type: application/json" \
  -d '{"nl_query": "show recent orders from last week"}'
```
## Configuration

### Environment Variables

| Variable | Description | Default |
|---|---|---|
| `DATABASE_URL` | Full database connection URL | `postgresql+asyncpg://postgres:postgres@localhost:5432/postgres` |
| `DB_HOST` | Database host | `localhost` |
| `DB_PORT` | Database port | `5432` |
| `DB_USER` | Database username | `postgres` |
| `DB_PASSWORD` | Database password | `postgres` |
| `DB_NAME` | Database name | `postgres` |
| `HOST` | Server host | `0.0.0.0` |
| `PORT` | Server port | `8000` |
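As an illustration of how these variables compose (not the server's actual configuration code), a small sketch that assembles `DATABASE_URL` from the individual `DB_*` settings when the full URL is not provided, using the documented defaults:

```python
import os

# Fall back to the documented defaults when a variable is unset
db_user = os.getenv("DB_USER", "postgres")
db_password = os.getenv("DB_PASSWORD", "postgres")
db_host = os.getenv("DB_HOST", "localhost")
db_port = os.getenv("DB_PORT", "5432")
db_name = os.getenv("DB_NAME", "postgres")

# DATABASE_URL, if set, takes precedence over the individual DB_* variables
database_url = os.getenv(
    "DATABASE_URL",
    f"postgresql+asyncpg://{db_user}:{db_password}@{db_host}:{db_port}/{db_name}",
)
print(database_url)
```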
### Database Connection Examples

```bash
# PostgreSQL (local or cloud)
DATABASE_URL=postgresql+asyncpg://user:password@host:5432/dbname

# MySQL (local or cloud)
DATABASE_URL=mysql+aiomysql://user:password@host:3306/dbname

# PostgreSQL with SSL (cloud, e.g. Neon, Supabase, Aiven)
DATABASE_URL=postgresql+asyncpg://user:password@host:5432/dbname?sslmode=require

# MySQL with SSL (cloud, e.g. Aiven, PlanetScale)
DATABASE_URL=mysql+aiomysql://user:password@host:3306/dbname?ssl-mode=REQUIRED
```

**Note:**
- For MySQL cloud providers, the `ssl-mode` parameter in the URL is ignored by the driver, but SSL is always enabled in the MCP server for cloud connections.
- For PostgreSQL, use `sslmode=require` for cloud databases. For MySQL, just use the standard URL; SSL is handled automatically.
- If you see errors about `ssl-mode` or `sslmode`, check your URL and ensure you are using the correct driver prefix (`mysql+aiomysql` or `postgresql+asyncpg`).
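To sanity-check a connection URL before pointing the server at it, a standalone SQLAlchemy sketch such as the one below can help. This is not part of the project; it assumes `sqlalchemy` and the matching async driver are installed, and, as noted above, SSL query parameters may need adjusting depending on the driver.

```python
import asyncio
import os

from sqlalchemy import text
from sqlalchemy.ext.asyncio import create_async_engine

# e.g. postgresql+asyncpg://... or mysql+aiomysql://...
DATABASE_URL = os.environ["DATABASE_URL"]


async def check_connection() -> None:
    engine = create_async_engine(DATABASE_URL)
    try:
        async with engine.connect() as conn:
            result = await conn.execute(text("SELECT 1"))
            print("Connection OK:", result.scalar())
    finally:
        await engine.dispose()


asyncio.run(check_connection())
```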
### Cloud Database Examples

```bash
# Neon (PostgreSQL)
DATABASE_URL=postgresql+asyncpg://username:password@ep-xxxxxx-pooler.us-east-2.aws.neon.tech/dbname

# Aiven (MySQL)
DATABASE_URL=mysql+aiomysql://avnadmin:yourpassword@mysql-xxxxxx-username-xxxx.aivencloud.com:11079/defaultdb?ssl-mode=REQUIRED
```
### Docker Usage with Cloud DB

```bash
docker run -d \
  -p 8000:8000 \
  -e DATABASE_URL="<your_cloud_database_url>" \
  souhardyak/mcp-db-server:latest
```
### Troubleshooting

- If you get `connect() got an unexpected keyword argument 'ssl-mode'`, ignore it: SSL is still enabled.
- For network errors, check your firewall and database credentials.
- For MySQL, always use `mysql+aiomysql` in the URL for async support.
## Security Features
- **Read-Only Operations**: Only SELECT queries are allowed
- **Query Validation**: Automatic detection and blocking of dangerous SQL operations
- **Result Limiting**: Maximum 50 rows per query (configurable)
- **Input Sanitization**: Protection against SQL injection
- **Safe Defaults**: Secure configuration out of the box
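As a rough illustration of how a read-only guard with a result limit could work (a sketch under assumed behavior, not the project's actual validation code):

```python
import re

# Keywords that should never appear in a read-only query (illustrative list)
FORBIDDEN = re.compile(
    r"\b(insert|update|delete|drop|alter|truncate|create|grant|revoke)\b",
    re.IGNORECASE,
)


def is_safe_select(sql: str, max_rows: int = 50) -> str:
    """Return a bounded SELECT statement, or raise if the query looks unsafe."""
    stripped = sql.strip().rstrip(";")
    if ";" in stripped:
        raise ValueError("Multiple statements are not allowed")
    if not stripped.lower().startswith("select"):
        raise ValueError("Only SELECT queries are allowed")
    if FORBIDDEN.search(stripped):
        raise ValueError("Query contains a forbidden keyword")
    # Enforce the result limit if the query does not already set one
    if not re.search(r"\blimit\b", stripped, re.IGNORECASE):
        stripped = f"{stripped} LIMIT {max_rows}"
    return stripped


print(is_safe_select("SELECT * FROM customers"))
```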
## Architecture
```
mcp-db-server/
├── app/
│   ├── __init__.py          # Package initialization
│   ├── server.py            # FastAPI application and endpoints
│   ├── db.py                # Database connection and operations
│   └── nl_to_sql.py         # Natural language to SQL conversion
├── .github/workflows/
│   └── docker-publish.yml   # CI/CD pipeline
├── docker-compose.yml       # Docker Compose configuration
├── Dockerfile               # Container definition
├── init_db.sql              # Sample database schema and data
├── requirements.txt         # Python dependencies
└── README.md                # This file
```
## Model Context Protocol (MCP) Integration
This server is designed to work seamlessly with MCP-compatible AI agents:
1. **Standardized Endpoints**: RESTful API following MCP conventions
2. **Structured Responses**: JSON responses optimized for AI consumption
3. **Error Handling**: Consistent error messages and status codes
4. **Documentation**: OpenAPI/Swagger documentation available at `/docs`
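Since FastAPI serves a machine-readable schema alongside the Swagger UI, an agent can discover the endpoints programmatically. A small sketch (the `/openapi.json` path is the FastAPI default, and `requests` is assumed):

```python
import requests

BASE_URL = "http://localhost:8000"  # assumed local deployment

# FastAPI exposes the OpenAPI schema at /openapi.json by default
schema = requests.get(f"{BASE_URL}/openapi.json", timeout=10).json()

# Print every path, method, and summary the server advertises
for path, methods in schema.get("paths", {}).items():
    for method, spec in methods.items():
        print(f"{method.upper():6} {path}  {spec.get('summary', '')}")
```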
## Deployment
### Docker Hub
```bash
# Pull the latest image
docker pull souhardyak/mcp-db-server:latest

# Run with your database
docker run -d \
  -p 8000:8000 \
  -e DATABASE_URL="your_database_url_here" \
  souhardyak/mcp-db-server:latest
```
### Kubernetes

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: mcp-db-server
spec:
  replicas: 3
  selector:
    matchLabels:
      app: mcp-db-server
  template:
    metadata:
      labels:
        app: mcp-db-server
    spec:
      containers:
        - name: mcp-db-server
          image: souhardyak/mcp-db-server:latest
          ports:
            - containerPort: 8000
          env:
            - name: DATABASE_URL
              valueFrom:
                secretKeyRef:
                  name: db-secret
                  key: url
---
apiVersion: v1
kind: Service
metadata:
  name: mcp-db-server-service
spec:
  selector:
    app: mcp-db-server
  ports:
    - port: 80
      targetPort: 8000
  type: LoadBalancer
```
## Testing

### Run Tests Locally

```bash
# Start test database
docker-compose up postgres -d

# Wait for database to be ready
sleep 10

# Run tests
python -m pytest tests/ -v
```
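If you need a starting point for your own checks, a minimal smoke test along these lines can be dropped into `tests/` (a hypothetical file, not part of the repository; it assumes `pytest`, `requests`, and a server already running on `localhost:8000`):

```python
# tests/test_smoke.py - hypothetical smoke test, not shipped with the repository
import requests

BASE_URL = "http://localhost:8000"


def test_health_endpoint_reports_ok():
    # The /health endpoint should respond once the server is up
    resp = requests.get(f"{BASE_URL}/health", timeout=10)
    assert resp.status_code == 200


def test_list_tables_returns_json():
    resp = requests.get(f"{BASE_URL}/mcp/list_tables", timeout=10)
    assert resp.status_code == 200
    assert resp.headers["content-type"].startswith("application/json")
```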
Manual Testing
# Test health endpoint
curl http://localhost:8000/health
# Test table listing
curl http://localhost:8000/mcp/list_tables
# Test natural language query
curl -X POST "http://localhost:8000/mcp/query" \
-H "Content-Type: application/json" \
-d '{"nl_query": "show me all customers from California"}'
## Contributing

1. Fork the repository
2. Create your feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add some amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
## License

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
## 📝 Changelog

### v1.3.0 (2025-12-24) - Docker Path Fix
- **Fixed**: Resolved import path issues in the Docker container causing `from db import DatabaseManager` to fail
- **Fixed**: Changed relative paths to absolute paths in Dockerfile and docker-compose.yml healthchecks
- **Improved**: `mcp_server.py` now uses robust path resolution that works both locally and in Docker containers
- **Updated**: Docker image rebuilt and pushed with all path fixes

### v1.2.0 (2025-11-03) - MySQL Column Access Fix
- **Fixed**: Resolved `Could not locate column in row for column 'column_name'` error with MySQL databases
- **Fixed**: Changed the `describe_table` method to use index-based row access for better SQLAlchemy compatibility
- **Improved**: Enhanced cross-database compatibility for schema introspection
- **Resolved**: GitHub Issue #1

### v1.1.0 (2025-09-28) - Async Bug Fix
- **Fixed**: Resolved `str can't be used in 'await' expression` error in the MCP server
- **Improved**: NLP query processing now works correctly with Claude Desktop integration
- **Enhanced**: Added comprehensive test database setup scripts
- **Updated**: Docker image rebuilt with bug fixes and updated dependencies

### v1.0.0 (2025-09-25) - Initial Release
- **Initial**: Full MCP Database Server implementation
- **Added**: RESTful API with FastAPI
- **Added**: Natural language to SQL conversion
- **Added**: Docker containerization and deployment
- **Added**: Multi-database support (PostgreSQL, MySQL, SQLite)
## Acknowledgments
- FastAPI for the excellent web framework
- HuggingFace Transformers for NL to SQL capabilities
- SQLAlchemy for database abstraction
- The Model Context Protocol (MCP) community
## Support
⭐ If this project helped you, please consider giving it a star!