MCP Todo Server
A production-grade TODO server implementing the Model Context Protocol (MCP) with FastAPI, PostgreSQL, Redis, and JWT authentication.
Built with pair programming assistance from goose and GitHub Copilot 🤖
✨ Features
- 🔐 JWT Authentication with RS256 (auto-generates keys)
- 🗄️ PostgreSQL database with connection pooling
- ⚡ Redis rate limiting with in-memory fallback
- 🎯 MCP Protocol tools and resources
- 📊 Structured Logging with OpenTelemetry support
- 🐳 Docker Compose for easy local development
- 🔒 Rate Limiting per user with sliding window algorithm
🚀 Quick Start
Prerequisites
- Python 3.11+
- Docker & Docker Compose
- Git (optional)
1. Clone and Setup
git clone <your-repo>
cd mcp-todo-server
# Create virtual environment
python -m venv venv
# Activate virtual environment
# Windows PowerShell:
venv\Scripts\Activate.ps1
# Windows CMD:
venv\Scripts\activate.bat
# Linux/Mac:
source venv/bin/activate
# Install dependencies
pip install -r requirements.txt
2. Start Infrastructure
# Start PostgreSQL and Redis
docker-compose -f docker-compose.dev.yml up -d
# Verify services are healthy
docker-compose -f docker-compose.dev.yml ps
# You should see both postgres and redis with status "Up (healthy)"
3. Run the Server
python -m mcp_todo_server
Server starts on http://localhost:8000 🎉
Note:
- JWT keys are auto-generated in the keys/ directory on first run (see the sketch below)
- The database schema is created automatically from init.sql
- No manual migration or seeding needed!
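For reference, the key auto-generation amounts to writing a 2048-bit RSA key pair into keys/ on first run. A minimal sketch using the cryptography package (an assumption; the real logic lives in auth.py and may differ in details such as paths and key format):
from pathlib import Path
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

def ensure_keys(private_path="keys/private.pem", public_path="keys/public.pem"):
    # Skip generation if both PEM files already exist.
    private_file, public_file = Path(private_path), Path(public_path)
    if private_file.exists() and public_file.exists():
        return
    private_file.parent.mkdir(parents=True, exist_ok=True)
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    private_file.write_bytes(key.private_bytes(
        serialization.Encoding.PEM,
        serialization.PrivateFormat.PKCS8,
        serialization.NoEncryption(),
    ))
    public_file.write_bytes(key.public_key().public_bytes(
        serialization.Encoding.PEM,
        serialization.PublicFormat.SubjectPublicKeyInfo,
    ))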
🐳 Docker Deployment
Quick Start (Full Stack)
# Build and start everything
make prod
# Or without make:
docker-compose up -d --build
Access the API at:
- API: http://localhost:8000
- Health: http://localhost:8000/health
- Docs: http://localhost:8000/docs
Development Mode (Local Python)
# Start only PostgreSQL + Redis
make dev
# Or without make:
docker-compose -f docker-compose.dev.yml up -d
# Run Python locally (in another terminal)
python -m mcp_todo_server
Useful Commands
# View logs
make logs # All services
make logs-app # App only
docker-compose logs -f postgres # PostgreSQL only
# Check health
make health
# Access database
make db-shell
# Access Redis
make redis-shell
# Stop services
make down
# Clean everything (including data)
make clean
Environment Variables for Docker
Create .env file for production:
ENV=production
LOG_LEVEL=WARNING
POSTGRES_DSN=postgresql://postgres:strongpassword@postgres:5432/todos
REDIS_URL=redis://redis:6379/0
Then use it:
docker-compose --env-file .env up -d
🐳 Docker Quick Commands
Windows PowerShell
# Start full stack
docker-compose up -d
# View logs
docker-compose logs -f app
# Check health
Invoke-RestMethod -Uri http://localhost:8000/health
# Get token
$token = (Invoke-RestMethod -Uri "http://localhost:8000/auth/token?username=testuser" -Method Post).access_token
# Stop stack
docker-compose down
# Clean everything
docker-compose down -v
Linux/Mac (with Make)
make up # Start stack
make logs # View logs
make health # Check health
make down # Stop stack
make clean # Remove everything
📊 Resource Usage
The stack is lightweight; approximate memory usage:
- App: ~60 MB RAM
- PostgreSQL: ~24 MB RAM
- Redis: ~4 MB RAM
- Total: ~88 MB RAM
Perfect for development and small production deployments! 🚀
📖 API Usage
Health Check
curl http://localhost:8000/health
Expected response:
{
"status": "healthy",
"database": "connected",
"redis": "connected",
"version": "1.0.0"
}
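If you script against the server (for example in CI), you can poll the health endpoint before doing anything else. A small sketch using httpx (the library the Testing section installs); the URL and response shape are as documented above:
import time
import httpx

def wait_for_healthy(url="http://localhost:8000/health", timeout=30.0):
    # Poll /health until it reports "healthy" or the timeout expires.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            data = httpx.get(url, timeout=2.0).json()
            if data.get("status") == "healthy":
                return data
        except httpx.HTTPError:
            pass
        time.sleep(1.0)
    raise TimeoutError(f"{url} did not report healthy within {timeout}s")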
Authentication
Get a JWT token:
curl -X POST "http://localhost:8000/auth/token?username=testuser"
PowerShell:
$response = Invoke-RestMethod -Uri "http://localhost:8000/auth/token?username=testuser" -Method Post
$token = $response.access_token
$headers = @{Authorization = "Bearer $token"}
Response:
{
"access_token": "eyJhbGc...",
"token_type": "bearer"
}
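The same token flow from Python, using httpx (a sketch; it assumes /auth/token behaves exactly as shown above):
import httpx

BASE = "http://localhost:8000"

# Request a token for testuser and build the Authorization header.
resp = httpx.post(f"{BASE}/auth/token", params={"username": "testuser"})
resp.raise_for_status()
token = resp.json()["access_token"]
headers = {"Authorization": f"Bearer {token}"}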
Task Operations
Create a task:
# Bash
curl -X POST "http://localhost:8000/tasks?title=Buy%20milk&description=Organic" \
-H "Authorization: Bearer YOUR_TOKEN"
# PowerShell
Invoke-RestMethod -Uri "http://localhost:8000/tasks?title=Buy milk&description=Organic" `
-Method Post -Headers $headers
List tasks:
# Bash
curl "http://localhost:8000/tasks" \
-H "Authorization: Bearer YOUR_TOKEN"
# PowerShell
Invoke-RestMethod -Uri "http://localhost:8000/tasks" -Method Get -Headers $headers
Update a task:
# Bash
curl -X PATCH "http://localhost:8000/tasks/1?completed=true" \
-H "Authorization: Bearer YOUR_TOKEN"
# PowerShell
Invoke-RestMethod -Uri "http://localhost:8000/tasks/1?completed=true" `
-Method Patch -Headers $headers
Delete a task:
# Bash
curl -X DELETE "http://localhost:8000/tasks/1" \
-H "Authorization: Bearer YOUR_TOKEN"
# PowerShell
Invoke-RestMethod -Uri "http://localhost:8000/tasks/1" -Method Delete -Headers $headers
Delete all tasks:
curl -X DELETE "http://localhost:8000/tasks" \
-H "Authorization: Bearer YOUR_TOKEN"
MCP Protocol Endpoints
List todos (MCP tool):
curl -X POST "http://localhost:8000/mcp/tools/list_todos" \
-H "Authorization: Bearer YOUR_TOKEN"
Invoke-RestMethod -Uri "http://localhost:8000/mcp/tools/list_todos" `
-Method Post -Headers $headers | ConvertTo-Json -Depth 3
Add todo (MCP tool):
curl -X POST "http://localhost:8000/mcp/tools/add_todo?title=New%20Task" \
-H "Authorization: Bearer YOUR_TOKEN"
Get tasks resource:
curl "http://localhost:8000/mcp/resources/tasks" \
-H "Authorization: Bearer YOUR_TOKEN"
Export tasks as JSON:
curl "http://localhost:8000/export/tasks.json" \
-H "Authorization: Bearer YOUR_TOKEN"
🏗️ Project Structure
mcp-todo-server/
├── mcp_todo_server/
│ ├── __main__.py # Entry point (uvicorn runner)
│ ├── app.py # FastAPI app factory & routes
│ ├── auth.py # JWT authentication (RS256)
│ ├── config.py # Pydantic settings
│ ├── database.py # PostgreSQL connection pool
│ ├── rate_limiter.py # Redis rate limiting
│ ├── observability.py # Logging & OpenTelemetry
│ └── mcp/
│ ├── tools.py # MCP tool endpoints
│ └── resources.py # MCP resource endpoints
├── docker-compose.yml # PostgreSQL + Redis services
├── init.sql # Database schema (auto-loaded)
├── requirements.txt # Python dependencies
├── .env # Environment variables (optional)
└── keys/ # JWT keys (auto-generated)
├── private.pem
└── public.pem
🔧 Configuration
Create a .env file (optional, has sensible defaults):
# Application
APP_NAME=MCP Todo Server
ENV=development
HOST=0.0.0.0
PORT=8000
LOG_LEVEL=INFO
# Database
POSTGRES_DSN=postgresql://postgres:postgres@localhost:5432/todos
DB_POOL_MIN_SIZE=2
DB_POOL_MAX_SIZE=10
# Redis
REDIS_URL=redis://localhost:6379/0
# Rate Limiting
RATE_LIMIT_REQUESTS=100
RATE_LIMIT_WINDOW_SECONDS=60
# JWT (keys auto-generated if not specified)
JWT_ALGORITHM=RS256
JWT_ACCESS_TOKEN_EXPIRE_MINUTES=60
JWT_ISSUER=mcp-todo-server
JWT_AUDIENCE=mcp-client
PRIVATE_KEY_PATH=keys/private.pem
PUBLIC_KEY_PATH=keys/public.pem
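These variables map onto the Pydantic settings in config.py. A minimal sketch of what that mapping could look like with pydantic-settings (field names and defaults here are illustrative; the real config.py is authoritative):
from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    # Values are read from the environment / .env; names match the variables above.
    model_config = SettingsConfigDict(env_file=".env")

    app_name: str = "MCP Todo Server"
    env: str = "development"
    host: str = "0.0.0.0"
    port: int = 8000
    log_level: str = "INFO"
    postgres_dsn: str = "postgresql://postgres:postgres@localhost:5432/todos"
    redis_url: str = "redis://localhost:6379/0"
    rate_limit_requests: int = 100
    rate_limit_window_seconds: int = 60
    jwt_algorithm: str = "RS256"
    jwt_access_token_expire_minutes: int = 60

settings = Settings()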
🐳 Docker
Development (infrastructure only)
# Start PostgreSQL + Redis only
docker-compose -f docker-compose.dev.yml up -d
# Run Python server locally
python -m mcp_todo_server
Full Stack with Docker
# Build and start all services (app + DB + Redis)
docker-compose up -d --build
# Stop services
docker-compose down
# Reset everything (including data)
docker-compose down -v
Database Management
Access PostgreSQL:
docker exec -it mcp-todo-postgres psql -U postgres -d todos
# Inside psql:
\dt # List tables
\d tasks # Describe tasks table
SELECT * FROM users;
\q # Quit
Access Redis:
docker exec -it mcp-todo-redis redis-cli
# Inside redis-cli:
PING # Test connection
KEYS * # List all keys
GET rl:tasks:testuser # Check rate limit
EXIT
View logs:
docker-compose logs postgres
docker-compose logs redis
docker-compose logs -f # Follow all logs
🗄️ Database Schema
The schema in init.sql is automatically loaded on first run:
-- Users table
CREATE TABLE users (
id SERIAL PRIMARY KEY,
username VARCHAR(255) UNIQUE NOT NULL,
email VARCHAR(255) UNIQUE NOT NULL,
created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP
);
-- Tasks table
CREATE TABLE tasks (
id SERIAL PRIMARY KEY,
user_id INTEGER NOT NULL REFERENCES users(id),
title VARCHAR(500) NOT NULL,
description TEXT,
completed BOOLEAN DEFAULT FALSE,
priority INTEGER DEFAULT 0,
created_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
updated_at TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP,
due_date TIMESTAMP WITH TIME ZONE,
tags TEXT[] DEFAULT '{}'
);
-- Auto-update trigger for updated_at
CREATE TRIGGER update_tasks_updated_at
BEFORE UPDATE ON tasks
FOR EACH ROW
EXECUTE FUNCTION update_updated_at_column();
Default user: testuser (id=1) is created automatically.
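The parameterized queries and connection pooling mentioned under Security Features look roughly like this against the schema above. A sketch using asyncpg (an assumption; database.py is the authoritative implementation):
import asyncio
import asyncpg

async def main():
    pool = await asyncpg.create_pool(
        dsn="postgresql://postgres:postgres@localhost:5432/todos",
        min_size=2,
        max_size=10,
    )
    async with pool.acquire() as conn:
        # $1/$2/$3 placeholders keep user input out of the SQL string itself.
        row = await conn.fetchrow(
            "INSERT INTO tasks (user_id, title, description) "
            "VALUES ($1, $2, $3) RETURNING id, title, completed",
            1, "Buy milk", "Organic",
        )
        print(dict(row))
    await pool.close()

asyncio.run(main())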
🧪 Testing
# Install test dependencies
pip install pytest pytest-asyncio httpx
# Run tests
pytest
# With coverage
pytest --cov=mcp_todo_server
# Verbose output
pytest -v
Example test:
import pytest
from httpx import AsyncClient
from mcp_todo_server.app import create_app
@pytest.mark.asyncio
async def test_health_check():
    app = create_app()
    async with AsyncClient(app=app, base_url="http://test") as client:
        response = await client.get("/health")
        assert response.status_code == 200
        assert response.json()["status"] in ["healthy", "degraded"]
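A second test in the same style, sketching coverage of the token endpoint (it assumes /auth/token behaves exactly as documented above):
import pytest
from httpx import AsyncClient
from mcp_todo_server.app import create_app

@pytest.mark.asyncio
async def test_token_issuance():
    app = create_app()
    async with AsyncClient(app=app, base_url="http://test") as client:
        response = await client.post("/auth/token", params={"username": "testuser"})
        assert response.status_code == 200
        body = response.json()
        assert body["token_type"] == "bearer"
        assert body["access_token"]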
🔒 Security Features
- ✅ JWT Authentication with RS256 asymmetric signing
- ✅ Auto-generated RSA keys (2048-bit) on first run
- ✅ Rate Limiting per user (configurable, default: 100 req/min)
- ✅ SQL Injection Protection via parameterized queries
- ✅ Connection Pooling prevents resource exhaustion
- ✅ Password-less auth (suitable for internal tools)
- ✅ Token expiration (default: 60 minutes)
Production Recommendations:
- Use environment variables for secrets
- Enable HTTPS/TLS
- Implement proper user authentication
- Add CORS configuration (see the sketch below)
- Use secrets management (AWS Secrets Manager, HashiCorp Vault)
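For the CORS item, FastAPI ships a middleware that can be wired into the app factory. A sketch (the origins and headers below are placeholders; in this project the wiring would belong in app.py):
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()
app.add_middleware(
    CORSMiddleware,
    allow_origins=["https://your-frontend.example.com"],  # placeholder origin
    allow_credentials=True,
    allow_methods=["GET", "POST", "PATCH", "DELETE"],
    allow_headers=["Authorization", "Content-Type"],
)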
📊 Monitoring & Observability
Logs
Structured JSON-like logs with timestamps:
timestamp=2025-11-05T04:28:33.043998+00:00 | level=INFO | logger=mcp_todo_server.app | message=✅ Server startup complete
Health Endpoint
curl http://localhost:8000/health
Returns:
{
"status": "healthy", // or "degraded"
"database": "connected", // or "disconnected"
"redis": "connected", // or "disconnected"
"version": "1.0.0"
}
Rate Limit Headers
Responses include rate limit info:
X-RateLimit-Limit: 100
X-RateLimit-Remaining: 99
X-RateLimit-Reset: 1730851200
Retry-After: 60
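A sliding-window limiter of the kind rate_limiter.py implements can be sketched with a Redis sorted set. The key naming, the synchronous redis-py client, and the exact bookkeeping are assumptions; the real logic (including the in-memory fallback) lives in rate_limiter.py:
import time
import redis

r = redis.Redis.from_url("redis://localhost:6379/0")

def allow_request(user: str, limit: int = 100, window: int = 60) -> bool:
    key = f"rl:tasks:{user}"
    now = time.time()
    pipe = r.pipeline()
    pipe.zremrangebyscore(key, 0, now - window)  # drop requests outside the window
    pipe.zadd(key, {str(now): now})              # record this request
    pipe.zcard(key)                              # count requests still in the window
    pipe.expire(key, window)                     # let idle keys expire
    _, _, count, _ = pipe.execute()
    return count <= limit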
🚀 Production Deployment
Environment Variables
Set these in production:
ENV=production
LOG_LEVEL=WARNING
POSTGRES_DSN=postgresql://user:pass@prod-host:5432/todos
REDIS_URL=redis://prod-redis:6379/0
PRIVATE_KEY_PATH=keys/private.pem
PUBLIC_KEY_PATH=keys/public.pem
Dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["python", "-m", "mcp_todo_server"]
📚 API Documentation
Once running, visit:
- Swagger UI: http://localhost:8000/docs
- ReDoc: http://localhost:8000/redoc
- OpenAPI JSON: http://localhost:8000/openapi.json
🛠️ Development
Manual Key Generation
If auto-generation fails:
# Bash/Linux/Mac
mkdir -p keys
openssl genrsa -out keys/private.pem 2048
openssl rsa -in keys/private.pem -pubout -out keys/public.pem
# PowerShell
if (!(Test-Path keys)) { New-Item -ItemType Directory -Path keys }
openssl genrsa -out keys/private.pem 2048
openssl rsa -in keys/private.pem -pubout -out keys/public.pem
Hot Reload
Server auto-reloads on code changes in development mode:
ENV=development python -m mcp_todo_server
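Under the hood this is uvicorn's standard reloader. A minimal sketch of an entry point that toggles it based on ENV (the real __main__.py may differ, for example in how it builds the app):
import os
import uvicorn

if __name__ == "__main__":
    uvicorn.run(
        "mcp_todo_server.app:create_app",
        factory=True,                        # create_app() builds the FastAPI app
        host=os.getenv("HOST", "0.0.0.0"),
        port=int(os.getenv("PORT", "8000")),
        reload=os.getenv("ENV", "development") == "development",  # hot reload in dev
    )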
🤝 Contributing
This project was built with pair programming assistance from:
- goose - Architecture design and problem-solving
- GitHub Copilot - AI code completion and suggestions
How to Contribute
- Fork the repository
- Create a feature branch (git checkout -b feature/amazing-feature)
- Make your changes
- Add tests for new functionality
- Ensure all tests pass (pytest)
- Commit your changes (git commit -m 'Add amazing feature')
- Push to the branch (git push origin feature/amazing-feature)
- Open a Pull Request
📝 License
MIT License - see LICENSE file for details
📧 Support
- Issues:
- Discussions:
- MCP Protocol: Model Context Protocol Docs
- FastAPI: FastAPI Documentation
🙏 Acknowledgments
- Built with FastAPI - Modern Python web framework
- PostgreSQL - Robust relational database
- Redis - In-memory data store for rate limiting
- Model Context Protocol - Standardized AI context sharing
- goose & GitHub Copilot - AI pair programming assistants
Built with ❤️ using FastAPI, PostgreSQL, Redis, and AI assistance