hjoykim/selfhosted-MCP-Server-Example

Self‑Hosted MCP Server Example

This repository contains a complete example of a self‑hosted Model Context Protocol (MCP) stack. It demonstrates how to expose a set of custom tools via FastAPI, secure them with OAuth2 and JSON Web Tokens (JWT), and proxy everything behind an HTTPS reverse proxy (Nginx).

⚠️ Disclaimer: The code in this example is designed for learning and demonstration. It omits many production concerns such as proper user authentication, persistent storage, error handling, rate limiting, and automated certificate management. Use it as a starting point and adapt it to your needs.

Contents

selfhosted-mcp-server-example/
│
├── fastapi-mcp/             # FastAPI MCP server implementation
│   ├── app/
│   │   └── main.py         # Implements the MCP and JSON‑RPC endpoints
│   ├── keys/
│   │   └── public.pem      # RSA public key used to verify JWTs
│   └── requirements.txt    # Python dependencies
│
├── mcpauth-express/         # Express‑based OAuth2/OIDC authorization server
│   ├── mcpauth-server.js   # Implements token issuance and JWKS
│   ├── package.json        # Node.js dependencies
│   └── keys/
│       ├── private.pem     # RSA private key for signing JWTs
│       └── public.pem      # RSA public key (same as in fastapi-mcp)
│
├── nginx/                   # Nginx reverse proxy configuration
│   ├── default.conf        # Example virtual host config
│   ├── ai-plugin.json      # Plugin manifest for ChatGPT connector
│   ├── openapi.json        # Minimal OpenAPI spec for the MCP API (optional)
│   └── jwks.json           # JWKS file (optional – server also serves JWKS)
│
└── README.md               # This file

Prerequisites

  1. Python 3.10+ with pip for the FastAPI server.
  2. Node.js 18+ with npm for the OAuth2 server.
  3. Nginx (Docker or system installation) for the reverse proxy.
  4. OpenSSL for generating RSA key pairs.

Generating the RSA Key Pair

The authorization server signs access tokens with an RSA private key. The corresponding public key is used by the MCP server to verify tokens and by clients (e.g. ChatGPT) via the JWKS endpoint. Keys are generated in mcpauth-express/keys:

mkdir -p mcpauth-express/keys
openssl genpkey -algorithm RSA -out mcpauth-express/keys/private.pem -pkeyopt rsa_keygen_bits:2048
openssl rsa -in mcpauth-express/keys/private.pem -pubout -out mcpauth-express/keys/public.pem

mkdir -p fastapi-mcp/keys
cp mcpauth-express/keys/public.pem fastapi-mcp/keys/public.pem

The included archive already contains a generated key pair. Feel free to regenerate your own keys; just be sure to update both locations.

Installing Dependencies

FastAPI MCP Server

Use a virtual environment (optional) and install the requirements:

cd fastapi-mcp
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt

OAuth2 Authorization Server

Install Node.js dependencies:

cd mcpauth-express
npm install

Running the Services

Start the OAuth2/OIDC Server

The server listens on port 3000 by default and uses the issuer URL http://localhost:7102. You can override these via environment variables:

cd mcpauth-express
export PORT=3000
export ISSUER=https://your-domain   # external URL for the server
export CLIENT_ID=chatgpt-client
export CLIENT_SECRET=your-secret
npm start

Endpoints provided:

Path                                 Description
/.well-known/openid-configuration    OIDC discovery document
/.well-known/jwks.json               JWKS containing the RSA public key
/oauth/authorize                     Authorize endpoint (simulated user login and consent)
/oauth/token                         Token endpoint (exchanges code for JWT and refresh token)
/oauth/ping                          Health check for the auth server

Start the FastAPI MCP Server

Run the server with Uvicorn. Bind to all interfaces so Nginx can proxy to it:

cd fastapi-mcp
uvicorn app.main:app --host 0.0.0.0 --port 7101

The MCP server exposes two categories of endpoints:

Path                     Description
/mcp/ping                Health check (no auth)
/mcp/screen_candidates   Returns a dummy list of stock tickers (requires Bearer token)
/mcp/verify_performance  Returns a dummy accuracy measure (requires Bearer token)
/mcp/record_outcome      Echoes back the outcome provided (requires Bearer token)
/tools/list              JSON‑RPC list of available tools (requires Bearer token)
/tools/call              Dispatches JSON‑RPC method calls to the appropriate MCP function (requires auth)

All protected routes require a valid JWT in the Authorization header. The token must be signed with the private key in mcpauth-express/keys/private.pem.
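As a sketch, a JSON‑RPC request to /tools/call might look like the following. The method name matches the screen_candidates tool above, but the params shown (the limit field) are a hypothetical example, not the server's actual schema:

```python
import json

# Build a JSON-RPC 2.0 request for the /tools/call endpoint.
# The "limit" parameter is a hypothetical example, not part of the
# actual tool schema in this repository.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "screen_candidates",
    "params": {"limit": 5},
}

body = json.dumps(request)

# The request would be POSTed with the Bearer token, e.g.:
#   POST /tools/call
#   Authorization: Bearer <token>
#   Content-Type: application/json
print(body)
```

The exact envelope your server expects depends on how app/main.py parses the request, so treat this as a starting point rather than a wire-format guarantee.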

Reverse Proxy with Nginx

In production you should place the MCP server and the OAuth2 server behind an HTTPS reverse proxy. The nginx/default.conf file shows a sample virtual host for the placeholder domain your-domain. It assumes you have obtained an SSL certificate from Let’s Encrypt or another authority and placed it under /etc/letsencrypt/live/your-domain/ on the host. The configuration performs the following:

  • Serves the static plugin manifest (/.well-known/ai-plugin.json) and optionally the OpenAPI specification and JWKS from the nginx directory.
  • Proxies /oauth/* and /.well-known/* requests to the authorization server.
  • Proxies /mcp/ and /tools/ requests to the FastAPI MCP server.
  • Redirects all HTTP traffic to HTTPS.

To use this configuration with Docker Compose, mount the nginx directory into the container at /etc/nginx/conf.d and ensure the certificate files are present. See the comments in nginx/default.conf for details.

ChatGPT Connector Setup

When adding this self‑hosted MCP server as a ChatGPT connector (via Developer Mode), ChatGPT will fetch the plugin manifest from https://your-domain/.well-known/ai-plugin.json. Be sure to update the placeholders (such as your-domain) in nginx/ai-plugin.json to your actual domain and adjust the OAuth endpoints to match where the authorization server is exposed (e.g. https://your-domain/oauth/authorize).

After enabling the connector, ChatGPT will automatically perform the OAuth flow, retrieving a code via the /oauth/authorize endpoint and exchanging it for a JWT on /oauth/token. It will then include the access token when calling the MCP tool endpoints via JSON‑RPC.
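The two legs of that flow can be sketched with the standard library. The redirect URI and state value below are hypothetical placeholders, and no request is actually sent:

```python
from urllib.parse import urlencode

# Step 1: ChatGPT sends the user to the authorize endpoint.
# redirect_uri and state are hypothetical placeholders.
authorize_url = "https://your-domain/oauth/authorize?" + urlencode({
    "response_type": "code",
    "client_id": "chatgpt-client",
    "redirect_uri": "https://example.com/callback",
    "state": "random-state-value",
})

# Step 2: after the (simulated) login, the server redirects back with
# ?code=...; the client exchanges that code for a JWT at /oauth/token
# using a form-encoded POST body like this one.
token_request_body = urlencode({
    "grant_type": "authorization_code",
    "code": "code-from-redirect",
    "client_id": "chatgpt-client",
    "client_secret": "your-secret",
    "redirect_uri": "https://example.com/callback",
})

print(authorize_url)
print(token_request_body)
```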

Customising and Extending

  • Adding new tools: Define a new route under /mcp/ and describe it in the list returned by /tools/list. In the JSON‑RPC handler in app.main.tools_call, add a new branch to dispatch your method.
  • Implementing real logic: Replace the dummy implementations with calls to your data sources or business logic. Ensure proper error handling and input validation.
  • User authentication: Integrate a real login system in the auth server. You might replace the stubbed demoUser with session‑based authentication or delegate to an external identity provider.
  • Production deployment: Store issued codes and tokens in a database, implement refresh token rotation, rate limiting, logging and other production‑grade features.
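The dispatch step for a new tool can be sketched independently of FastAPI as a mapping from method name to handler. The handler names and return values here are illustrative, not the actual functions in app/main.py:

```python
# Minimal sketch of a JSON-RPC dispatch table, independent of FastAPI.
# Handler names and payloads are illustrative; the real handlers live
# in app/main.py.

def screen_candidates(params):
    # Dummy implementation, mirroring the example server's stub tools.
    return {"tickers": ["AAPL", "MSFT"]}

def my_new_tool(params):
    # A hypothetical new tool: echo its input back.
    return {"echo": params}

DISPATCH = {
    "screen_candidates": screen_candidates,
    "my_new_tool": my_new_tool,  # register the new branch here
}

def tools_call(method, params):
    handler = DISPATCH.get(method)
    if handler is None:
        # JSON-RPC "method not found" error code
        return {"error": {"code": -32601, "message": f"Unknown method: {method}"}}
    return {"result": handler(params)}
```

Remember to also describe the new tool in the list returned by /tools/list so clients can discover it.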

Troubleshooting

  • 401 Unauthorized from MCP server: Ensure you are including an Authorization: Bearer <token> header with a valid JWT issued by your auth server. Check that the public key in fastapi-mcp/keys/public.pem matches the private key used to sign tokens.
  • ChatGPT can’t complete OAuth flow: Make sure your redirect URI and domain in ai-plugin.json match your deployment. Check the logs of the auth server for errors.
  • Nginx can’t connect to backend: Verify that the FastAPI server is listening on 0.0.0.0 and that the container network allows communication between Nginx and the MCP and auth servers. Update the proxy addresses in nginx/default.conf accordingly.
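When debugging 401 errors it can help to inspect a token's claims without verifying it. The stdlib-only sketch below constructs a throwaway sample token in place; note that this decoding skips signature verification entirely, so it is for debugging issuer/expiry mismatches only, never for auth decisions:

```python
import base64
import json

def b64url_decode(segment: str) -> bytes:
    # JWT segments are base64url without padding; restore it before decoding.
    return base64.urlsafe_b64decode(segment + "=" * (-len(segment) % 4))

def peek_claims(token: str) -> dict:
    # Decode the payload (second segment) WITHOUT verifying the signature.
    payload = token.split(".")[1]
    return json.loads(b64url_decode(payload))

# Build a throwaway sample token so the sketch is self-contained.
header = base64.urlsafe_b64encode(
    json.dumps({"alg": "RS256", "typ": "JWT"}).encode()).rstrip(b"=").decode()
claims = base64.urlsafe_b64encode(
    json.dumps({"iss": "https://your-domain", "exp": 1700000000}).encode()).rstrip(b"=").decode()
sample_token = f"{header}.{claims}.fake-signature"

print(peek_claims(sample_token))
```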

Feel free to adapt and improve this example. Contributions and suggestions are welcome!