Self‑Hosted MCP Server Example
This repository contains a complete example of a self‑hosted Model Context Protocol (MCP) stack. It demonstrates how to expose a set of custom tools via FastAPI, secure them with OAuth2 and JSON Web Tokens (JWT), and proxy everything behind an HTTPS reverse proxy (Nginx).
⚠️ Disclaimer: The code in this example is designed for learning and demonstration. It omits many production concerns such as proper user authentication, persistent storage, error handling, rate limiting and automated certificate management. Use it as a starting point and adapt to your needs.
Contents
selfhosted-mcp-server-example/
│
├── fastapi-mcp/ # FastAPI MCP server implementation
│ ├── app/
│ │ └── main.py # Implements the MCP and JSON‑RPC endpoints
│ ├── keys/
│ │ └── public.pem # RSA public key used to verify JWTs
│ └── requirements.txt # Python dependencies
│
├── mcpauth-express/ # Express‑based OAuth2/OIDC authorization server
│ ├── mcpauth-server.js # Implements token issuance and JWKS
│ ├── package.json # Node.js dependencies
│ └── keys/
│ ├── private.pem # RSA private key for signing JWTs
│ └── public.pem # RSA public key (same as in fastapi-mcp)
│
├── nginx/ # Nginx reverse proxy configuration
│ ├── default.conf # Example virtual host config
│ ├── ai-plugin.json # Plugin manifest for ChatGPT connector
│ ├── openapi.json # Minimal OpenAPI spec for the MCP API (optional)
│ └── jwks.json # JWKS file (optional – server also serves JWKS)
│
└── README.md # This file
Prerequisites
- Python 3.10+ with pip for the FastAPI server.
- Node.js 18+ with npm for the OAuth2 server.
- Nginx (Docker or system installation) for the reverse proxy.
- OpenSSL for generating RSA key pairs.
Generating the RSA Key Pair
The authorization server signs access tokens with an RSA private key. The
corresponding public key is used by the MCP server to verify tokens and by
clients (e.g. ChatGPT) via the JWKS endpoint. Keys are generated in
mcpauth-express/keys:
mkdir -p mcpauth-express/keys
openssl genpkey -algorithm RSA -out mcpauth-express/keys/private.pem -pkeyopt rsa_keygen_bits:2048
openssl rsa -in mcpauth-express/keys/private.pem -pubout -out mcpauth-express/keys/public.pem
mkdir -p fastapi-mcp/keys
cp mcpauth-express/keys/public.pem fastapi-mcp/keys/public.pem
The included archive already contains a generated key pair. Feel free to regenerate your own keys; just be sure to update both locations.
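If you would rather generate the pair from Python than from the shell, the sketch below is a rough equivalent of the openssl commands above. It assumes the third-party cryptography package is installed; the file-writing lines are commented out so you can adapt the paths yourself:

```python
# Programmatic equivalent of the openssl commands above,
# using the third-party `cryptography` package.
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

# Generate a 2048-bit RSA private key (matches rsa_keygen_bits:2048).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# Serialize the private key as unencrypted PKCS#8 PEM, as openssl genpkey does.
private_pem = private_key.private_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PrivateFormat.PKCS8,
    encryption_algorithm=serialization.NoEncryption(),
)

# Derive and serialize the matching public key (SubjectPublicKeyInfo PEM).
public_pem = private_key.public_key().public_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PublicFormat.SubjectPublicKeyInfo,
)

# Write the same files the shell commands produce; remember to copy
# public.pem into fastapi-mcp/keys/ as well.
# pathlib.Path("mcpauth-express/keys/private.pem").write_bytes(private_pem)
# pathlib.Path("mcpauth-express/keys/public.pem").write_bytes(public_pem)
```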
Installing Dependencies
FastAPI MCP Server
Use a virtual environment (optional) and install the requirements:
cd fastapi-mcp
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
OAuth2 Authorization Server
Install Node.js dependencies:
cd mcpauth-express
npm install
Running the Services
Start the OAuth2/OIDC Server
The server listens on port 3000 by default and uses the issuer URL
http://localhost:7102. You can override these via environment variables:
cd mcpauth-express
export PORT=3000
export ISSUER=https://your-domain # external URL for the server
export CLIENT_ID=chatgpt-client
export CLIENT_SECRET=your-secret
npm start
Endpoints provided:
| Path | Description |
|---|---|
| /.well-known/openid-configuration | OIDC discovery document |
| /.well-known/jwks.json | JWKS containing the RSA public key |
| /oauth/authorize | Authorization endpoint (simulated user login and consent) |
| /oauth/token | Token endpoint (exchanges a code for a JWT and refresh token) |
| /oauth/ping | Health check for the auth server |
Start the FastAPI MCP Server
Run the server with Uvicorn. Bind to all interfaces so Nginx can proxy to it:
cd fastapi-mcp
uvicorn app.main:app --host 0.0.0.0 --port 7101
The MCP server exposes two categories of endpoints:
| Path | Description |
|---|---|
| /mcp/ping | Health check (no auth) |
| /mcp/screen_candidates | Returns a dummy list of stock tickers (requires Bearer token) |
| /mcp/verify_performance | Returns a dummy accuracy measure (requires Bearer token) |
| /mcp/record_outcome | Echoes back the outcome provided (requires Bearer token) |
| /tools/list | JSON-RPC list of available tools (requires Bearer token) |
| /tools/call | Dispatches JSON-RPC method calls to the appropriate MCP function (requires Bearer token) |
All protected routes require a valid JWT in the Authorization header. The
token must be signed with the private key in mcpauth-express/keys/private.pem.
Reverse Proxy with Nginx
In production you should place the MCP server and the OAuth2 server behind an
HTTPS reverse proxy. The nginx/default.conf file shows a sample virtual host
for a domain your-domain. It assumes you have obtained an SSL certificate
from Let’s Encrypt or another authority and placed it under
/etc/letsencrypt/live/your-domain/ on the host. The configuration performs
the following:
- Serves the static plugin manifest (/.well-known/ai-plugin.json) and, optionally, the OpenAPI specification and JWKS from the nginx directory.
- Proxies /oauth/* and /.well-known/* requests to the authorization server.
- Proxies /mcp/ and /tools/ requests to the FastAPI MCP server.
- Redirects all HTTP traffic to HTTPS.
To use this configuration with Docker Compose, mount the nginx directory
into the container at /etc/nginx/conf.d and ensure the certificate files are
present. See the comments in nginx/default.conf for details.
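One possible Compose layout along those lines is sketched below. The service names, build contexts and ports are illustrative assumptions, not files shipped in this repository:

```yaml
services:
  mcp:
    build: ./fastapi-mcp
    command: uvicorn app.main:app --host 0.0.0.0 --port 7101
  auth:
    build: ./mcpauth-express
    environment:
      - PORT=3000
      - ISSUER=https://your-domain
  nginx:
    image: nginx:stable
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./nginx:/etc/nginx/conf.d:ro
      - /etc/letsencrypt:/etc/letsencrypt:ro
    depends_on: [mcp, auth]
```

Inside such a Compose network, the proxy addresses in nginx/default.conf would point at mcp:7101 and auth:3000 rather than localhost.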
ChatGPT Connector Setup
When adding this self‑hosted MCP server as a ChatGPT connector (via Developer
Mode), ChatGPT will fetch the plugin manifest from
https://your-domain/.well-known/ai-plugin.json. Be sure to update the
placeholders (such as your-domain) in nginx/ai-plugin.json to your actual
domain and adjust the OAuth endpoints to match where the authorization server
is exposed (e.g. https://your-domain/oauth/authorize).
After enabling the connector, ChatGPT will automatically perform the OAuth
flow, retrieving a code via the /oauth/authorize endpoint and exchanging it
for a JWT on /oauth/token. It will then include the access token when
calling the MCP tool endpoints via JSON‑RPC.
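For reference, the code-for-token exchange is an ordinary OAuth2 authorization_code POST. The hypothetical client-side sketch below (standard library only; the placeholder values are not real credentials, and your mcpauth-express server may expect slightly different fields) shows the shape of that request:

```python
from urllib.parse import urlencode
from urllib.request import Request

# Standard authorization_code grant parameters, form-encoded as the
# token endpoint expects. CLIENT_ID/CLIENT_SECRET match the env vars above.
body = urlencode({
    "grant_type": "authorization_code",
    "code": "<authorization code from /oauth/authorize>",
    "client_id": "chatgpt-client",
    "client_secret": "your-secret",
    "redirect_uri": "<redirect URI registered with the auth server>",
}).encode()

req = Request(
    "https://your-domain/oauth/token",
    data=body,
    headers={"Content-Type": "application/x-www-form-urlencoded"},
    method="POST",
)
# Sending it with urllib.request.urlopen(req) would return a JSON body
# containing the access_token (a JWT) and a refresh token.
```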
Customising and Extending
- Adding new tools: Define a new route under /mcp/ and describe it in the list returned by /tools/list. In the JSON-RPC handler app.main.tools_call, add a new branch to dispatch your method.
- Implementing real logic: Replace the dummy implementations with calls to your data sources or business logic. Ensure proper error handling and input validation.
- User authentication: Integrate a real login system in the auth server. You might replace the stubbed demoUser with session-based authentication or delegate to an external identity provider.
- Production deployment: Store issued codes and tokens in a database, and implement refresh token rotation, rate limiting, logging and other production-grade features.
Troubleshooting
- 401 Unauthorized from MCP server: Ensure you are including an Authorization: Bearer <token> header with a valid JWT issued by your auth server. Check that the public key in fastapi-mcp/keys/public.pem matches the private key used to sign tokens.
- ChatGPT can’t complete the OAuth flow: Make sure the redirect URI and domain in ai-plugin.json match your deployment. Check the auth server’s logs for errors.
- Nginx can’t connect to a backend: Verify that the FastAPI server is listening on 0.0.0.0 and that the container network allows communication between Nginx and the MCP and auth servers. Update the proxy addresses in nginx/default.conf accordingly.
Feel free to adapt and improve this example. Contributions and suggestions are welcome!