# MCP Server - Model Context Protocol Server with LLM Integration
A comprehensive Model Context Protocol (MCP) server implementation with vector database support, PostgreSQL integration, and LLM capabilities. This server provides a robust platform for building AI-powered applications with email management, document processing, and semantic search capabilities.
## Features

### Core Capabilities
- **Model Context Protocol (MCP)** - Full implementation with stdio transport
- **Vector Database** - ChromaDB integration for semantic search and embeddings
- **PostgreSQL Database** - Relational data storage with async SQLAlchemy
- **LLM Integration** - Support for OpenAI and Anthropic Claude APIs
- **REST API** - FastAPI-based API with authentication and rate limiting
- **Email Management** - Comprehensive email ingestion and organization system
### Email Management System
- Automatic person and project discovery from emails
- Thread tracking and conversation management
- Domain-based project assignment
- Rich metadata storage using PostgreSQL JSONB
- Full-text and semantic search capabilities
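The domain-based project assignment above can be sketched as a simple lookup from sender domain to project. The mapping and function names here are hypothetical illustrations, not the repository's actual implementation:

```python
from email.utils import parseaddr

# Assumed mapping of sender domains to project names (illustrative only)
DOMAIN_PROJECT_MAP = {
    "acme.com": "acme-rollout",
    "example.org": "example-migration",
}

def assign_project(from_header: str, default: str = "unassigned") -> str:
    """Derive a project name from the sender's email domain."""
    _, address = parseaddr(from_header)  # "Ann <ann@acme.com>" -> "ann@acme.com"
    domain = address.rpartition("@")[2].lower()
    return DOMAIN_PROJECT_MAP.get(domain, default)
```

Emails from an unknown domain fall through to a default bucket rather than failing, which keeps ingestion resilient.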
### Technical Features
- Async/await throughout for high performance
- Database migrations with Alembic
- Comprehensive logging and monitoring
- Docker support for easy deployment
- JWT-based authentication
- Rate limiting and request throttling
## Quick Start

### Prerequisites
- Python 3.12+
- PostgreSQL 16+
- Docker and Docker Compose (optional)
### Local Development Setup

1. Clone the repository:

   ```bash
   git clone https://github.com/yourusername/mcp-server.git
   cd mcp-server
   ```

2. Create a virtual environment:

   ```bash
   python -m venv .venv
   source .venv/bin/activate  # On Windows: .venv\Scripts\activate
   ```

3. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

4. Set up environment variables:

   ```bash
   cp .env.template .env
   # Edit .env with your configuration
   ```

5. Set up the PostgreSQL database:

   ```bash
   # Create database and user
   sudo -u postgres psql
   ```

   ```sql
   CREATE DATABASE mcp_db;
   CREATE USER mcp_user WITH PASSWORD 'your-secure-password';
   GRANT ALL PRIVILEGES ON DATABASE mcp_db TO mcp_user;
   \q
   ```

   ```bash
   # Run setup script
   sudo -u postgres psql -d mcp_db -f scripts/setup_database.sql
   ```

6. Run database migrations:

   ```bash
   alembic upgrade head
   ```

7. Start the server:

   ```bash
   python -m uvicorn src.api.app:app --reload
   ```

The API will be available at http://localhost:8000.
### Docker Setup

1. Start all services:

   ```bash
   docker-compose up -d
   ```

2. View logs:

   ```bash
   docker-compose logs -f mcp-server
   ```

3. Stop services:

   ```bash
   docker-compose down
   ```
## API Documentation

Once the server is running, you can access:

- Interactive API docs: http://localhost:8000/docs
- ReDoc documentation: http://localhost:8000/redoc
- OpenAPI schema: http://localhost:8000/openapi.json
### Key Endpoints

#### Email Management

- `POST /api/v1/emails/ingest` - Ingest a new email
- `GET /api/v1/emails/` - Search emails
- `GET /api/v1/emails/{email_id}` - Get email details
- `PATCH /api/v1/emails/{email_id}` - Update email properties
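A hedged example of calling the ingest endpoint with only the standard library. The payload field names (`subject`, `from_address`, and so on) are assumptions for illustration; the real schema is defined by the server's Pydantic models:

```python
import json
import urllib.request

# Illustrative payload; check the server's /docs page for the actual schema.
payload = {
    "subject": "Q3 planning",
    "from_address": "ann@acme.com",
    "to_addresses": ["bob@acme.com"],
    "body_text": "Draft agenda attached.",
}

req = urllib.request.Request(
    "http://localhost:8000/api/v1/emails/ingest",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer <your-jwt-token>",
    },
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment with the server running
```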
#### People Management

- `POST /api/v1/people/` - Create a person
- `GET /api/v1/people/` - Search people
- `GET /api/v1/people/{person_id}` - Get person details
- `POST /api/v1/people/{person_id}/projects/{project_id}` - Assign person to project
#### Project Management

- `POST /api/v1/projects/` - Create a project
- `GET /api/v1/projects/` - Search projects
- `GET /api/v1/projects/{project_id}` - Get project details
#### Vector Search

- `POST /api/v1/vectors/embed` - Generate embeddings
- `POST /api/v1/vectors/search` - Semantic search
- `POST /api/v1/vectors/index` - Index documents
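Under the hood, semantic search ranks documents by the similarity of their embedding vectors to the query's. ChromaDB handles this for you; the pure-Python sketch below only illustrates the idea with cosine similarity:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def search(query_vec, index, top_k=2):
    """Rank indexed documents by similarity to the query vector."""
    scored = [(cosine_similarity(query_vec, vec), doc) for doc, vec in index.items()]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for _, doc in scored[:top_k]]

# Toy index mapping document IDs to (tiny) embedding vectors
index = {"doc-a": [1.0, 0.0], "doc-b": [0.0, 1.0], "doc-c": [1.0, 1.0]}
```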
#### Authentication

- `POST /auth/register` - Register new user
- `POST /auth/login` - Login and get JWT token
- `GET /auth/me` - Get current user info
## Configuration

### Environment Variables

Key environment variables (see `.env.template` for the full list):

```bash
# Database
DATABASE_URL=postgresql+asyncpg://user:password@localhost:5432/mcp_db

# ChromaDB
CHROMA_HOST=localhost
CHROMA_PORT=8000
CHROMA_COLLECTION=mcp_vectors

# API Keys
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...

# JWT Settings
JWT_SECRET=your-secret-key
JWT_ALGORITHM=HS256
JWT_EXPIRATION_HOURS=24

# Server Settings
MCP_SERVER_HOST=0.0.0.0
MCP_SERVER_PORT=8000
DEBUG=false
LOG_LEVEL=INFO
```
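A minimal sketch of reading these variables into typed settings, assuming plain `os.environ` access; the project itself may use a settings library such as pydantic-settings instead:

```python
import os

def load_settings(env=os.environ) -> dict:
    """Read the environment variables above, with sensible defaults."""
    return {
        "database_url": env.get("DATABASE_URL", ""),
        "chroma_host": env.get("CHROMA_HOST", "localhost"),
        "chroma_port": int(env.get("CHROMA_PORT", "8000")),
        "jwt_secret": env.get("JWT_SECRET", ""),
        "jwt_expiration_hours": int(env.get("JWT_EXPIRATION_HOURS", "24")),
        "debug": env.get("DEBUG", "false").lower() == "true",
        "log_level": env.get("LOG_LEVEL", "INFO"),
    }
```

Coercing ports and hour counts to `int` (and `DEBUG` to a real boolean) at load time keeps type errors out of the rest of the codebase.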
## Project Structure

```
mcp-server/
├── src/
│   ├── mcp/                  # MCP protocol implementation
│   │   ├── server.py         # MCP server core
│   │   ├── tools.py          # Tool definitions
│   │   └── resources.py      # Resource management
│   ├── database/             # Database layer
│   │   ├── connection.py     # Database connection manager
│   │   ├── models.py         # SQLAlchemy models
│   │   ├── email_models.py   # Email-specific models
│   │   ├── repositories.py   # Repository pattern implementation
│   │   └── migrations/       # Alembic migrations
│   ├── api/                  # REST API
│   │   ├── app.py            # FastAPI application
│   │   ├── routes/           # API endpoints
│   │   ├── schemas/          # Pydantic schemas
│   │   └── middleware.py     # Middleware stack
│   ├── vector/               # Vector database
│   │   ├── database.py       # ChromaDB integration
│   │   └── embeddings.py     # Embedding generation
│   ├── llm/                  # LLM integration
│   │   ├── client.py         # LLM client abstraction
│   │   └── prompts.py        # Prompt management
│   └── utils/                # Utilities
├── scripts/                  # Utility scripts
├── tests/                    # Test suite
├── config/                   # Configuration files
├── docker-compose.yml        # Docker composition
├── Dockerfile                # Docker image definition
├── requirements.txt          # Python dependencies
└── README.md                 # This file
```
## Testing

### Run Tests

```bash
# Run all tests
pytest

# Run with coverage
pytest --cov=src --cov-report=html

# Run a specific test file
pytest tests/test_email_pipeline.py
```

### Test the Email Pipeline

```bash
python scripts/test_email_pipeline.py
```

### Check the Database Connection

```bash
python scripts/check_db.py
```
## Development

### Database Migrations

Create a new migration:

```bash
alembic revision --autogenerate -m "Description of changes"
```

Apply migrations:

```bash
alembic upgrade head
```

Roll back the latest migration:

```bash
alembic downgrade -1
```
### Adding New LLM Providers

1. Create a provider client in `src/llm/providers/`
2. Implement the `BaseLLMClient` interface
3. Register it in `src/llm/client.py`
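A sketch of what the provider interface and registration might look like. The `BaseLLMClient` name comes from this README, but the method names and the registry shown here are assumptions; consult `src/llm/client.py` for the real signatures:

```python
import asyncio
from abc import ABC, abstractmethod

class BaseLLMClient(ABC):
    """Assumed shape of the provider interface (illustrative only)."""

    @abstractmethod
    async def complete(self, prompt: str, **kwargs) -> str:
        """Return a completion for the given prompt."""

class EchoClient(BaseLLMClient):
    """Toy provider used here only to demonstrate the registration pattern."""

    async def complete(self, prompt: str, **kwargs) -> str:
        return f"echo: {prompt}"

# Hypothetical registry mapping provider names to client classes
PROVIDERS: dict[str, type[BaseLLMClient]] = {"echo": EchoClient}
```

Keeping provider selection behind a registry like this lets the rest of the codebase stay provider-agnostic.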
### Extending the Email System

1. Add new fields to the models in `src/database/email_models.py`
2. Create a migration: `alembic revision --autogenerate -m "Add new fields"`
3. Update the schemas in `src/api/schemas/email_schemas.py`
4. Add business logic to `src/database/email_repositories.py`
## Production Deployment

### Security Considerations

- **Environment Variables**: Never commit `.env` files
- **Database Passwords**: Use strong, unique passwords
- **JWT Secret**: Generate a secure random key
- **API Keys**: Store securely, rotate regularly
- **HTTPS**: Always use HTTPS in production
- **Rate Limiting**: Configure appropriate limits
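Rate limiting is commonly implemented as a token bucket: requests spend tokens, tokens refill at a fixed rate, and bursts are capped by the bucket's capacity. A minimal sketch (the server's actual middleware may differ):

```python
import time

class TokenBucket:
    """Allow `rate` requests per second, with bursts up to `capacity`."""

    def __init__(self, rate: float, capacity: int, clock=time.monotonic):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.clock = clock  # injectable for testing
        self.last = clock()

    def allow(self) -> bool:
        """Spend one token if available; refill based on elapsed time."""
        now = self.clock()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

Injecting the clock makes the limiter deterministic under test, and `time.monotonic` avoids surprises when the wall clock jumps.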
### Scaling
- Use PostgreSQL connection pooling
- Deploy multiple API server instances behind a load balancer
- Consider Redis for caching and session storage
- Use dedicated ChromaDB cluster for vector storage
- Implement horizontal scaling for compute-intensive operations
### Monitoring

- Health check endpoint: `/health`
- Metrics endpoint: `/metrics` (Prometheus format)
- Structured logging with correlation IDs
- Database query performance monitoring
- API response time tracking
## Troubleshooting

### Common Issues

1. **Database Connection Error**
   - Check that PostgreSQL is running
   - Verify credentials in `.env`
   - Run `scripts/check_db.py` to test the connection

2. **Permission Denied for Schema Public**
   - Run: `sudo -u postgres psql -d mcp_db -f scripts/setup_database.sql`

3. **ChromaDB Connection Failed**
   - Ensure ChromaDB is running
   - Check the `CHROMA_HOST` and `CHROMA_PORT` settings

4. **LLM API Errors**
   - Verify API keys are set correctly
   - Check rate limits and quotas
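When LLM API calls fail on rate limits or transient errors, retrying with exponential backoff and jitter usually resolves them. A generic sketch:

```python
import random
import time

def call_with_retries(fn, max_attempts=4, base_delay=0.5, sleep=time.sleep):
    """Retry a flaky call (e.g. an LLM API request) with exponential
    backoff plus jitter; re-raises after the final attempt."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise
            # Delay doubles each attempt; jitter avoids synchronized retries
            sleep(base_delay * (2 ** attempt) * (1 + random.random()))
```

In production you would catch only the provider's retryable exception types, not bare `Exception`.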
## Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests for new functionality
- Run tests and linting
- Submit a pull request
## License

MIT License - see the `LICENSE` file for details.
## Support

For issues and questions:

- GitHub Issues: github.com/yourusername/mcp-server/issues
- Documentation: docs.example.com
## Acknowledgments
- Model Context Protocol specification
- FastAPI framework
- ChromaDB vector database
- PostgreSQL database
- SQLAlchemy ORM
- Anthropic and OpenAI for LLM APIs