grzegorz-aniol/ultimate-mcp
Ultimate MCP Server
The Ultimate MCP Server is the only MCP server on the market that’s truly AGI‑ready out of the box—an ultra‑minimal, HTTP‑only FastMCP v2 endpoint that snaps into any MCP‑compatible client and cooperates with AGI from any provider. No stdio. No scaffolding. Point your client at /mcp, and retire every other server. One server. All providers. Zero doubt.
Playful MCP‑hype disclaimer: this repo is a tongue‑in‑cheek meta‑joke. Under the tuxedo it’s intentionally tiny (think “echo server with suits and badges”). Protocols are serious; this repo is not. Use responsibly, meme liberally. The answer is 42.
- Transport: HTTP (no stdio)
- Endpoint path: /mcp
- Server name: ultimate-mcp
Requirements
- Python 3.13+
- uv (recommended)
- macOS (tailored instructions; should work cross-platform)
Install
Sync dependencies (FastMCP + MCP SDK are already declared in pyproject.toml):
uv sync
Run
Start the server on localhost:8000 over HTTP:
uv run ultimate-mcp --host 127.0.0.1 --port 8000
This serves the MCP HTTP endpoint at: http://127.0.0.1:8000/mcp
Note: This is an MCP endpoint (JSON-RPC over HTTP/stream) intended for MCP clients, not a human-readable page.
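Since the endpoint speaks JSON-RPC rather than HTML, a quick sanity check is to POST an MCP initialize request to it. A minimal stdlib-only Python sketch (the payload shape follows the MCP handshake; the helper names and the protocol version string are our assumptions, not part of this repo):

```python
import json
import urllib.request

def build_initialize_request(request_id: int = 1) -> dict:
    """Build a JSON-RPC 2.0 'initialize' payload for the MCP handshake."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",  # assumed spec revision
            "capabilities": {},
            "clientInfo": {"name": "smoke-test", "version": "0.0.1"},
        },
    }

def post_initialize(url: str = "http://127.0.0.1:8000/mcp") -> bytes:
    """Send the handshake to a running server (server must be up)."""
    req = urllib.request.Request(
        url,
        data=json.dumps(build_initialize_request()).encode(),
        headers={
            "Content-Type": "application/json",
            "Accept": "application/json, text/event-stream",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()
```

Real clients perform this handshake for you; this is only useful for confirming the server is listening where you think it is.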
Docker
Build the image (multi-stage, uv-based, Python 3.13-slim runtime):
docker build -t ultimate-mcp:latest .
Run the server (exposes HTTP SSE on port 8000, path /mcp):
docker run --rm -p 8000:8000 ultimate-mcp:latest
# Endpoint: http://localhost:8000/mcp
Configure host/port via env or CLI flags:
# Using env (inside container defaults to host=0.0.0.0, port=8000)
docker run --rm -p 9000:9000 -e MCP_PORT=9000 ultimate-mcp:latest
# Or override command flags
docker run --rm -p 8000:8000 ultimate-mcp:latest --host 0.0.0.0 --port 8000
Notes:
- Runs as a non-root user
- Uses a vendored virtualenv at /opt/venv
- Reproducible installs via uv and uv.lock
Using from MCP-compatible clients
Cursor
Config file: ~/.cursor/mcp.json
{
"mcpServers": {
"ultimate-mcp": {
"url": "http://127.0.0.1:8000/mcp"
}
}
}
Claude Desktop
Config file (macOS): ~/Library/Application Support/Claude/claude_desktop_config.json
(Windows: %APPDATA%\Claude\claude_desktop_config.json)
Claude Desktop connects to remote HTTP/SSE servers via mcp-remote:
{
"mcpServers": {
"ultimate-mcp": {
"command": "npx",
"args": ["mcp-remote", "http://127.0.0.1:8000/mcp"]
}
}
}
Reference: modelcontextprotocol.io “Connect to Local MCP Servers”
Cline (VS Code extension)
Config file (macOS): ~/Library/Application Support/Code/User/globalStorage/saoudrizwan.claude-dev/settings/cline_mcp_settings.json
{
"mcpServers": {
"ultimate-mcp": {
"url": "http://127.0.0.1:8000/mcp",
"disabled": false,
"autoApprove": []
}
}
}
Reference: GitMCP README “Connecting Cline”
VS Code (Copilot agent mode)
Workspace config: .vscode/mcp.json
User config: use Command Palette “MCP: Open User Configuration”
{
"servers": {
"ultimate-mcp": {
"type": "http",
"url": "http://127.0.0.1:8000/mcp"
}
}
}
Reference: code.visualstudio.com “Use MCP servers in VS Code”
Windsurf (Codeium)
Config file: ~/.codeium/windsurf/mcp_config.json
{
"mcpServers": {
"ultimate-mcp": {
"serverUrl": "http://127.0.0.1:8000/mcp"
}
}
}
Reference: GitMCP README “Connecting Windsurf”
MCP Inspector (for testing)
npx @modelcontextprotocol/inspector
- Transport: SSE/HTTP
- URL: http://127.0.0.1:8000/mcp
Notes:
- This server exposes an HTTP SSE endpoint at /mcp, compatible with clients supporting MCP over HTTP/SSE.
- For Claude Desktop, use the mcp-remote wrapper as shown above.
- If your client requires auth headers, set them in its config (not required by this server by default).
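For reference, the text/event-stream responses mentioned above are ordinary SSE frames; parsing them by hand looks roughly like this (a simplified sketch for illustration only, real clients should rely on the MCP SDK):

```python
def parse_sse(stream_text: str) -> list[dict]:
    """Split an SSE stream into events with 'event' and 'data' fields."""
    events = []
    for block in stream_text.split("\n\n"):
        event = {"event": "message", "data": ""}
        for line in block.splitlines():
            if line.startswith("event:"):
                event["event"] = line[len("event:"):].strip()
            elif line.startswith("data:"):
                event["data"] += line[len("data:"):].strip()
        if event["data"]:
            events.append(event)
    return events
```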
Use over the Internet with OpenAI Platform via ngrok
Expose your local MCP server to a public HTTPS URL and connect it to ChatGPT.
- Run the server locally (port 8000):
uv run ultimate-mcp --host 127.0.0.1 --port 8000
# Local endpoint: http://127.0.0.1:8000/mcp
- Install and authenticate ngrok (macOS shown; see https://ngrok.com/download for others):
brew install ngrok
ngrok config add-authtoken <YOUR_NGROK_AUTHTOKEN>
- Start a tunnel to your local server:
ngrok http 8000
- Copy the HTTPS “Forwarding” URL from ngrok and append /mcp, e.g.:
https://abc123.ngrok-free.app/mcp
- Configure Chat on OpenAI Platform to use this MCP server:
- In OpenAI Platform, create a new Chat.
- Select "Add Tools" > "MCP Server"
- Choose "+ Server" and paste the ngrok URL (including /mcp)
- Save. Optionally enable auto-approve for the "always_ask" tool
Configured MCP Server:

Conversation:

Tasks for Cline:

Notes:
- ngrok URLs change each time unless you use a reserved domain. Update the ChatGPT MCP server URL when it changes.
- The endpoint is an MCP JSON-RPC-over-HTTP/SSE endpoint, not a human-facing webpage.
- Corporate proxies/firewalls may need to allow SSE/long-lived HTTP connections.
Notes
- This server exposes only HTTP transport; there is no stdio transport usage anywhere in the implementation.
- Host/port can be customized via environment variables (MCP_HOST / MCP_PORT) or CLI flags (--host / --port)
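The precedence between the two is not spelled out here; a plausible resolution order (CLI flag beats environment variable beats built-in default) can be sketched with a hypothetical helper, which is not the server's actual code:

```python
import argparse

def resolve_config(argv: list[str], env: dict) -> tuple[str, int]:
    """Resolve host/port: CLI flag > env var > built-in default."""
    parser = argparse.ArgumentParser(prog="ultimate-mcp")
    parser.add_argument("--host", default=env.get("MCP_HOST", "127.0.0.1"))
    parser.add_argument("--port", type=int,
                        default=int(env.get("MCP_PORT", "8000")))
    args = parser.parse_args(argv)
    return args.host, args.port
```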
Development
- Format and type hints are compatible with Pylance/pyright.
- To modify or add tools, decorate functions with @mcp.tool(). FastMCP auto-derives JSON schemas from type hints and docstrings.
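To illustrate what "auto-derives JSON schemas from type hints" means, here is a simplified stdlib-only sketch of the idea (this is not FastMCP's actual implementation, just the concept):

```python
import inspect
from typing import get_type_hints

# Rough Python-type-to-JSON-Schema mapping for illustration
TYPE_MAP = {int: "integer", float: "number", str: "string", bool: "boolean"}

def derive_schema(fn) -> dict:
    """Build a minimal JSON-Schema-like dict from a function's type hints."""
    hints = get_type_hints(fn)
    hints.pop("return", None)
    sig = inspect.signature(fn)
    return {
        "type": "object",
        "properties": {
            name: {"type": TYPE_MAP.get(hints.get(name), "string")}
            for name in sig.parameters
        },
        # Parameters without defaults are required
        "required": [n for n, p in sig.parameters.items()
                     if p.default is inspect.Parameter.empty],
    }

def add(a: int, b: int = 0) -> int:
    """Add two integers."""
    return a + b

schema = derive_schema(add)
```

FastMCP does this (plus docstring extraction and richer type support) automatically when you apply @mcp.tool().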