wevertonsc/mcp_server
Fast MCP Server is a FastAPI-based server that utilizes Google Gemini LLM to generate interview questions from job descriptions.
Fast MCP Server – Interview Question Generator
This project is a FastAPI-based MCP (Model Context Protocol) server that uses the Google Gemini LLM to generate five interview questions from a given job description. It can run locally for testing or be deployed to Google Cloud Run using Terraform.
Features
- REST API built with FastAPI
- Uses the Gemini 2.5 Flash model via google-generativeai
- /generate_questions endpoint to generate interview questions
- Deployable to Google Cloud Run
- Secure Gemini API key stored in Secret Manager
- Infrastructure as code with Terraform
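The actual mcp_server.py is not reproduced in this README, but the core logic can be sketched in pure Python. The helper names (build_prompt, parse_questions) are hypothetical; in the real server, the prompt is sent to Gemini via google-generativeai and the raw response is parsed into a list.

```python
def build_prompt(description: str) -> str:
    """Construct the instruction sent to Gemini (hypothetical wording)."""
    return (
        "Generate exactly five interview questions for a candidate "
        "applying to the following role:\n\n" + description
    )

def parse_questions(raw: str) -> list[str]:
    """Split the model's numbered response into individual questions."""
    return [line.strip() for line in raw.splitlines() if line.strip()]

# Example with a canned model response:
raw = "1. Question A\n2. Question B"
print(parse_questions(raw))  # ['1. Question A', '2. Question B']
```

Keeping prompt construction and response parsing in small pure functions like these makes the server logic easy to unit test without calling the model.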
Project Structure
.
├── mcp_server.py # FastAPI app source code
├── requirements.txt # Python dependencies
├── docker-compose.yml # Docker Compose configuration
├── terraform/
│ ├── provider.tf
│ ├── variables.tf
│ ├── enable_apis.tf
│ ├── service_account.tf
│ ├── secret_manager.tf
│ ├── secret_iam.tf
│ ├── cloud_run.tf
│ ├── outputs.tf
│ └── terraform.tfvars.example
└── README.md
Prerequisites
Make sure you have the following installed locally:
- Python 3.11+
- Docker
- Terraform ≥ 1.2.0
- Google Cloud SDK (gcloud)
- A GCP project with billing enabled
- A Gemini API key from Google AI Studio
Local Setup and Testing
1. Clone the Repository
git clone https://github.com/your-username/fast-mcp-server.git
cd fast-mcp-server
2. Create and Activate a Virtual Environment
python -m venv .venv
source .venv/bin/activate # Linux / Mac
# or
.venv\Scripts\activate # Windows
3. Install Dependencies
pip install --upgrade pip
pip install -r requirements.txt
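The requirements.txt file is not reproduced in this README; based on the stack described (FastAPI, Uvicorn, the Gemini SDK), it likely contains at least:

```
fastapi
uvicorn
google-generativeai
```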
4. Set the Gemini API Key
export GEMINI_API_KEY="your_api_key_here" # Linux / Mac
# or (Windows PowerShell, current session)
$env:GEMINI_API_KEY = "your_api_key_here"
# or (Windows cmd; setx only takes effect in new shells)
setx GEMINI_API_KEY "your_api_key_here"
5. Run the Application
uvicorn mcp_server:app --reload
6. Test the Endpoint
curl -X POST http://127.0.0.1:8000/generate_questions \
-H "Content-Type: application/json" \
-d '{"description": "We are looking for a data engineer with experience in Python, SQL, ETL pipelines, and cloud tools like GCP or AWS."}'
Expected output:
{
"questions": [
"1. How would you design an ETL pipeline using Python and SQL?",
"2. What are the key differences between GCP and AWS for data workflows?",
"3. How would you handle failures in a production data pipeline?",
"4. What strategies do you use to optimize SQL queries for large datasets?",
"5. Describe a scenario where you automated a data process using Python."
]
}
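The same call can be scripted from Python using only the standard library (the endpoint and payload shape are taken from the curl example above; the server must be running locally for generate_questions to succeed):

```python
import json
import urllib.request

BASE_URL = "http://127.0.0.1:8000"  # local server from the step above

def build_payload(description: str) -> bytes:
    """Encode the request body exactly as the curl example does."""
    return json.dumps({"description": description}).encode("utf-8")

def generate_questions(description: str) -> list[str]:
    """POST to /generate_questions and return the questions list."""
    req = urllib.request.Request(
        f"{BASE_URL}/generate_questions",
        data=build_payload(description),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["questions"]
```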
Docker Build (Optional)
1. Build Docker Image
docker build -t mcp-server:latest .
2. Run Locally in Docker
docker run -p 8000:8000 -e GEMINI_API_KEY=$GEMINI_API_KEY mcp-server:latest
3. Test
curl http://127.0.0.1:8000/health
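The Dockerfile itself is not listed in the project tree above; a minimal one for this stack might look like the following (a sketch, not the repository's actual file):

```dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY mcp_server.py .
EXPOSE 8000
CMD ["uvicorn", "mcp_server:app", "--host", "0.0.0.0", "--port", "8000"]
```

Binding to 0.0.0.0 (rather than the default 127.0.0.1) is what makes the server reachable through the published container port.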
Deploy to Google Cloud Run with Terraform
All Terraform configuration files are in the terraform/ directory.
1. Authenticate and Configure GCP
gcloud auth login
gcloud config set project YOUR_PROJECT_ID
gcloud auth application-default login
2. Build and Push Docker Image to GCP
IMAGE="gcr.io/YOUR_PROJECT_ID/mcp-server:latest"
gcloud auth configure-docker   # one-time: allow Docker to push to gcr.io
docker build -t $IMAGE .
docker push $IMAGE
3. Configure Terraform Variables
Create a file named terraform.tfvars inside the terraform/ folder:
project = "your-gcp-project-id"
region = "us-central1"
image = "gcr.io/your-gcp-project-id/mcp-server:latest"
gemini_api_key = "YOUR_GEMINI_API_KEY"
(Do not commit this file with the real API key to version control.)
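The repository's cloud_run.tf is not shown here, but a typical Cloud Run service resource that consumes these variables and injects the secret might look like this (resource and secret names are assumptions, not the actual file):

```hcl
resource "google_cloud_run_v2_service" "mcp_server" {
  name     = "mcp-server"
  location = var.region

  template {
    containers {
      image = var.image
      env {
        name = "GEMINI_API_KEY"
        value_source {
          secret_key_ref {
            secret  = google_secret_manager_secret.gemini_api_key.secret_id
            version = "latest"
          }
        }
      }
    }
  }
}
```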
4. Initialize Terraform
cd terraform
terraform init
5. Deploy Infrastructure
terraform apply
Type yes to confirm.
Terraform will:
- Enable required GCP APIs
- Create a Cloud Run service
- Store your Gemini API key in Secret Manager
- Assign necessary IAM roles
- Expose a public endpoint
6. Get the Cloud Run URL
After deployment, Terraform will output the service URL:
Outputs:
cloud_run_url = "https://mcp-server-xxxxxx-uc.a.run.app"
7. Test the Deployed API
curl -X POST https://mcp-server-xxxxxx-uc.a.run.app/generate_questions \
-H "Content-Type: application/json" \
-d '{"description": "We are looking for a backend engineer with FastAPI and Google Cloud experience."}'
Troubleshooting
- 403 Permission Denied: check the IAM roles on the Cloud Run service account (it needs Secret Manager Secret Accessor).
- 500 Internal Server Error: make sure GEMINI_API_KEY is correctly stored in Secret Manager and accessible by the service.
- Invalid JSON Error: ensure you are sending proper JSON with double quotes, not single quotes.
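The Invalid JSON case is easy to reproduce: JSON requires double-quoted keys and strings, so a single-quoted body is rejected by any strict parser before it ever reaches the endpoint.

```python
import json

# Double-quoted JSON parses:
ok = json.loads('{"description": "backend engineer"}')
print(ok["description"])  # backend engineer

# Single-quoted "JSON" is rejected:
try:
    json.loads("{'description': 'backend engineer'}")
except json.JSONDecodeError:
    print("invalid JSON: single quotes are not allowed")
```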
Cleanup
To remove all deployed resources:
terraform destroy
Docker Compose
Run (production):
docker-compose up --build
Run (development):
docker-compose -f docker-compose.dev.yml up --build
Stop:
docker-compose down
Uvicorn Server
Uvicorn is an ASGI web server implementation for Python.
Run MCP Server
uvicorn mcp_server:app --reload