Modular MCP Server Template
This repository provides a batteries-included starting point for building Model Context Protocol (MCP) services with FastMCP. It ships with a production-style server, automatic module discovery, an opinionated configuration system, and a sample client that exercises the stack end to end.
Highlights
- FastMCP HTTP server powered by Starlette and Uvicorn.
- Automatic discovery of tools, resources, and prompts inside the `server` package (toggle with `AUTO_DISCOVER`).
- Environment-driven configuration backed by `server/config/settings.yaml` with sensible defaults.
- Graceful shutdown and optional JSON logging for container-friendly deployments.
- Sample Oracle tuning prompt and resources that demonstrate structured prompt construction.
- Client example (`client/check_query.py`) that calls the server, fetches resources, invokes prompts, and streams results through Groq.
- Docker ready: the compose stack brings up the server and demo client with a single command.
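The auto-discovery behaviour can be pictured with a short standalone sketch (a hypothetical helper, not the template's actual implementation): importing every submodule of a package lets decorator-registered components attach themselves to the shared FastMCP instance as a side effect of the import.

```python
import importlib
import pkgutil


def discover_modules(package_name: str) -> list[str]:
    """Import every submodule of a package so that decorator-registered
    tools, resources, and prompts attach themselves on import."""
    package = importlib.import_module(package_name)
    imported = []
    for info in pkgutil.iter_modules(package.__path__):
        importlib.import_module(f"{package_name}.{info.name}")
        imported.append(info.name)
    return sorted(imported)
```

Because registration happens at import time, no central registry file needs editing when a new module is dropped into the package.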
Project Layout
.
|-- client/
| |-- check_query.py # Example FastMCP client + Groq workflow
| |-- requirements.txt # Client dependencies
| |-- Dockerfile # Client container
| `-- query_reports/ # Generated analysis artifacts
|-- server/
| |-- server.py # Entrypoint, HTTP app, auto-discovery
| |-- mcp_app.py # Shared FastMCP instance
| |-- config.py # Settings loader (env + YAML)
| |-- config/settings.yaml # Default configuration template
| |-- tools/ # Example tools (hello, math, remote bridge)
| |-- resources/ # Example resources (server info, documents)
| |-- prompts/ # Example prompts (Oracle tuning)
| |-- requirements.txt # Server dependencies
| `-- Dockerfile # Server container
|-- docker-compose.yml # Local orchestration for server + client
|-- .env # Runtime configuration (not committed)
`-- .gitignore
Prerequisites
- Docker and Docker Compose (for the quickest start)
- Python 3.11+ if you plan to run the server or client directly on your machine
- Groq API access (only required for the demo client)
Getting Started with Docker
- Copy `.env.example` to `.env` (or create `.env`) and fill in the relevant values, notably `GROQ_API_KEY` if you plan to run the sample client.
- Build and launch the stack:

  ```bash
  docker-compose up --build
  ```

- Once both services are healthy, explore the server:

  ```bash
  curl http://localhost:9002/healthz
  curl http://localhost:9002/_info
  ```

- In another terminal, execute the demo client inside its container:

  ```bash
  docker-compose exec client python check_query.py
  ```

  The client fetches tuning rules from the MCP server, generates prompts, calls Groq for analysis, and writes summaries to `client/query_summary.csv` plus detailed markdown reports in `client/query_reports/`.
Stop the stack with `docker-compose down` when you are finished.
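For reference, the compose topology that the commands above assume can be sketched roughly as follows. This is an illustrative fragment based on the project layout, not the repository's actual `docker-compose.yml`:

```yaml
services:
  server:
    build: ./server
    ports:
      - "9002:9002"
    env_file: .env
  client:
    build: ./client
    depends_on:
      - server
    env_file: .env
```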
Running Locally (Optional)
If you prefer to run the components without Docker:
```bash
# Server
cd server
python -m venv .venv && .venv\Scripts\activate  # adjust for your shell
pip install -r requirements.txt
uvicorn server:app --host 0.0.0.0 --port 9002
```

```bash
# Client (new terminal)
cd client
python -m venv .venv && .venv\Scripts\activate
pip install -r requirements.txt
python check_query.py
```
Make sure the .env file is available in both directories so the scripts can read matching configuration.
Configuration
The server reads its settings from environment variables (optionally expanded through server/config/settings.yaml). Key values include:
| Variable | Description | Default |
|---|---|---|
| `MCP_SERVER_NAME` | Name advertised by the server | `ModularMCPServer` |
| `MCP_PORT` | Port exposed by the HTTP transport | `9002` |
| `MCP_SERVER_URL` | Public URL for clients (used by the sample client) | `http://server:9002/mcp/` |
| `AUTO_DISCOVER` | Enable automatic import of `tools/`, `resources/`, `prompts/` | `true` |
| `LOG_JSON` | Emit JSON logs when set to `1` | disabled |
| `CORS_ORIGINS` | Comma-separated list of allowed origins | `http://localhost,http://server` |
| `GROQ_API_KEY` | Groq key used by the client example | none (required for demo client) |
| `GROQ_MODEL` | Groq model name | `llama-3.1-8b-instant` |
| `REMOTE_SERVERS_ENABLED` | Enable remote server mounting/proxying | `false` |
Update `.env` (or your deployment secrets manager) before running the stack. The configuration loader falls back to environment values if the YAML file is missing.
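For example, a minimal `.env` for the demo stack could look like this (values are illustrative, and `your-groq-key` is a placeholder):

```bash
MCP_SERVER_NAME=ModularMCPServer
MCP_PORT=9002
MCP_SERVER_URL=http://server:9002/mcp/
AUTO_DISCOVER=true
LOG_JSON=1
GROQ_API_KEY=your-groq-key
GROQ_MODEL=llama-3.1-8b-instant
```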
Adding New MCP Components
- **Tools** – drop a new function in `server/tools/` and decorate it with `@mcp.tool()`. When `AUTO_DISCOVER` is enabled, the module is picked up automatically on restart.

  ```python
  # server/tools/hello_tool.py
  from mcp_app import mcp

  @mcp.tool()
  def hello(name: str) -> str:
      return f"Hello, {name}!"
  ```

- **Resources** – define read-only data sources with `@mcp.resource("protocol://path")` inside `server/resources/`.
- **Prompts** – return formatted strings (or message structures) using `@mcp.prompt()` in `server/prompts/`. The included Oracle tuning prompt is a good starting point for structured LLM interactions.
Restart the server (or let Uvicorn reload) and the new components are available immediately to connected clients.
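To illustrate the prompt pattern, here is a hypothetical, standalone sketch of a structured prompt builder. The function name and section headings are invented; in the template you would register a function like this with `@mcp.prompt()` using the shared `mcp_app` instance:

```python
def build_tuning_prompt(sql: str, rules: str) -> str:
    """Assemble a structured prompt: role, rules, the query under
    analysis, and the response format expected from the model."""
    sections = [
        "You are an Oracle performance tuning assistant.",
        "## Tuning rules\n" + rules,
        "## SQL to analyse\n```sql\n" + sql + "\n```",
        "## Respond with\n- Findings\n- Suggested rewrites\n- Index recommendations",
    ]
    return "\n\n".join(sections)
```

Keeping each concern in its own section makes the prompt easy to extend and keeps model output predictable enough to parse downstream.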
Demo Client Workflow
`client/check_query.py` illustrates how to:
- Connect to the server using `fastmcp.Client` over HTTP.
- Retrieve server resources (`oracle://tuning/rules`).
- Request prompts (`oracle_query_tuning_prompt`) and forward them to Groq.
- Parse model output, summarise key findings, and write reports.
Use it as a reference when wiring the server into your own applications or agent frameworks.
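The final report-writing step can be sketched in pure Python. The field names below are assumptions for illustration; the real client's CSV columns may differ:

```python
import csv
from pathlib import Path


def write_summary(rows: list[dict], path: str) -> int:
    """Write one CSV row per analysed query and return the row count."""
    fieldnames = ["query_id", "severity", "summary"]  # assumed columns
    with Path(path).open("w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows)
    return len(rows)
```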
Extending the Template
- Replace the sample prompt/resource logic with your own domain knowledge.
- Mount or proxy additional MCP services by enabling the remote server block in `settings.yaml`.
- Add authentication middleware to `server/server.py` to protect endpoints in production.
- Instrument logging/metrics exporters as needed for your deployment environment.
License
Released under the MIT License. See the `LICENSE` file for details (add one if your copy does not include it).