ashishvalentinealex/MCP_NLP2SQL
README for MCP Server
This project provides an MCP (Model Context Protocol) server that connects to a PostgreSQL database, auto-documents the schema, and enables natural language → SQL querying with minimal hallucination through strict schema validation.
This MCP server uses FastMCP, SQLAlchemy, and OpenAI models to:
- Inspect and document your PostgreSQL schema.
- Expose schema resources (tables, columns, samples) via MCP.
- Translate natural language queries into safe, validated SQL.
Requirements
- PostgreSQL 12+
- Python 3.10+
- Conda or venv for environment setup
Dependencies (requirements.txt):
fastmcp
sqlalchemy
psycopg2-binary
openai
uvicorn
Setup Instructions
1. Clone the repository
git clone https://github.com/<your-org>/MCP_NLP2SQL.git
cd MCP_NLP2SQL
2. Set up the environment
To create and activate a conda environment for this project (venv works as well):
# Create the environment
conda create -n env python=3.10 -y
# Activate the environment
conda activate env
3. Install dependencies
pip install -r requirements.txt
4. Set environment variables
export DATABASE_URL="postgresql://user:password@localhost:5432/db"
export OPENAI_API_KEY="sk-xxxxxxxxxxxxxxxxxxxx"
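Inside the server, these two variables can be read with a small fail-fast helper. This is a sketch, not the project's actual loading code — `load_config` is a hypothetical name:

```python
import os

def load_config():
    """Read the required settings from the environment, failing fast if any is missing."""
    missing = [name for name in ("DATABASE_URL", "OPENAI_API_KEY") if name not in os.environ]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return {
        "database_url": os.environ["DATABASE_URL"],
        "openai_api_key": os.environ["OPENAI_API_KEY"],
    }
```

Failing at startup rather than on the first query makes misconfiguration obvious immediately.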
Usage Instructions
Run the following commands to set up and use the MCP server:
# Generate schema documentation
python generate_schema.py
# Start MCP server
python server.py
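The core of a script like generate_schema.py can be sketched with SQLAlchemy's runtime inspector (`document_schema` is an illustrative name; the real script's output format may differ):

```python
from sqlalchemy import create_engine, inspect

def document_schema(database_url):
    """Return a dict mapping table name -> list of column descriptions.

    A minimal sketch of schema auto-documentation using SQLAlchemy's
    Inspector; works against any SQLAlchemy-supported database URL.
    """
    engine = create_engine(database_url)
    inspector = inspect(engine)
    schema = {}
    for table in inspector.get_table_names():
        schema[table] = [
            {"name": col["name"], "type": str(col["type"]), "nullable": col["nullable"]}
            for col in inspector.get_columns(table)
        ]
    return schema
```

The resulting dictionary is what the NL→SQL step can validate generated queries against.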
MCP Resources
warehouse://tables → List all tables
warehouse://{table}/columns → Show column details
warehouse://{table}/sample → First 3 rows of a table
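The handlers behind these three resources might look like the following sketch (function names are illustrative; the actual server registers them as FastMCP resources). Note the table-name check before the sample query: identifiers cannot be bound as SQL parameters, so unvalidated names must never be interpolated:

```python
from sqlalchemy import inspect, text

def list_tables(engine):
    """Handler for warehouse://tables — list all tables."""
    return inspect(engine).get_table_names()

def table_columns(engine, table):
    """Handler for warehouse://{table}/columns — column names and types."""
    return [
        {"name": c["name"], "type": str(c["type"])}
        for c in inspect(engine).get_columns(table)
    ]

def table_sample(engine, table, limit=3):
    """Handler for warehouse://{table}/sample — first rows of the table."""
    if table not in inspect(engine).get_table_names():
        raise ValueError(f"Unknown table: {table}")  # reject before interpolating the name
    with engine.connect() as conn:
        rows = conn.execute(text(f'SELECT * FROM "{table}" LIMIT :n'), {"n": limit})
        return [dict(r._mapping) for r in rows]
```

The `LIMIT` value is bound as a parameter; only the pre-validated table name is interpolated.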
MCP Tools
Use JSON-RPC requests to call tools:
nlp2sql
{
"method": "tools/call",
"params": {
"name": "nlp2sql",
"arguments": { "user_query": "Show all products with quantity > 50" }
}
}
table_info
{
"method": "tools/call",
"params": {
"name": "table_info",
"arguments": { "table": "products" }
}
}
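The examples above show only `method` and `params`; a full JSON-RPC 2.0 request also carries `jsonrpc` and `id` fields. A small helper to build a complete request (`make_tool_call` is illustrative, not part of this project):

```python
import json

def make_tool_call(name, arguments, request_id=1):
    """Serialize a JSON-RPC 2.0 tools/call request like the examples above."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })
```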
Example Workflow
python generate_schema.py
python server.py
Then query in natural language:
{
"method": "tools/call",
"params": {
"name": "nlp2sql",
"arguments": { "user_query": "List all suppliers and their contact emails" }
}
}
The SQL is generated, validated, and executed safely.
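What "validated" can mean in practice is sketched below: a guard that rejects write statements and references to tables outside the documented schema. This is a simplified illustration, not the server's actual validation logic, which is schema-aware and stricter:

```python
import re

WRITE_KEYWORDS = {"insert", "update", "delete", "drop", "alter", "create", "truncate", "grant"}

def validate_sql(sql, known_tables, allow_write=False):
    """Reject generated SQL that writes data or references unknown tables.

    known_tables is a set of lowercase table names from the documented schema.
    """
    tokens = set(re.findall(r"[a-z_]+", sql.lower()))
    if not allow_write and tokens & WRITE_KEYWORDS:
        raise ValueError("Write statements are disabled (ALLOW_WRITE_SQL=false)")
    # Every table mentioned after FROM/JOIN must exist in the documented schema.
    for table in re.findall(r"\b(?:from|join)\s+([a-zA-Z_]\w*)", sql, re.IGNORECASE):
        if table.lower() not in known_tables:
            raise ValueError(f"Unknown table in generated SQL: {table}")
    return sql
```

Checking generated SQL against the auto-documented schema is what keeps the model from hallucinating tables or columns that do not exist.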
Key Parameters
ALLOW_WRITE_SQL=false → Keeps queries read-only (default)
MAX_ROWS=1000 → Caps query result size
SQL_TIMEOUT_SECONDS=30 → Query timeout
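These parameters can be read from the environment with the documented defaults; a sketch (the project may parse them differently, and `runtime_limits` is a hypothetical name):

```python
import os

def runtime_limits():
    """Read the tunables above from the environment, with the documented defaults."""
    return {
        "allow_write_sql": os.environ.get("ALLOW_WRITE_SQL", "false").lower() == "true",
        "max_rows": int(os.environ.get("MAX_ROWS", "1000")),
        "sql_timeout_seconds": int(os.environ.get("SQL_TIMEOUT_SECONDS", "30")),
    }
```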