openapi-mcp-server

duplocloud/openapi-mcp-server

This Model Context Protocol (MCP) server facilitates communication between AI models and client applications, enabling efficient data exchange and model management.

FastMCP OpenAPI Proxy

This project provides a lightweight proxy MCP server built on FastMCP. It reads an OpenAPI 3.x specification (openAPI.json) and automatically exposes every operation in that spec as an MCP JSON-RPC tool served at the /mcp endpoint.
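The spec-to-tool mapping can be illustrated with a small sketch. Note that this is plain Python with no FastMCP dependency, and both the tiny inline spec and the operationId-based naming are illustrative assumptions, not the proxy's exact scheme:

```python
# Sketch: derive one MCP tool name per OpenAPI operation.
# The inline spec and the naming fallback are illustrative
# assumptions, not the proxy's actual implementation.
spec = {
    "openapi": "3.0.0",
    "paths": {
        "/users": {
            "get": {"operationId": "listUsers", "summary": "List users"},
            "post": {"operationId": "createUser", "summary": "Create a user"},
        },
        "/users/{id}": {
            "get": {"operationId": "getUser", "summary": "Fetch one user"},
        },
    },
}

HTTP_METHODS = {"get", "post", "put", "patch", "delete", "head", "options"}

def tool_names(spec: dict) -> list[str]:
    """One tool per operation: use operationId when present,
    otherwise fall back to a METHOD_path name."""
    names = []
    for path, ops in spec.get("paths", {}).items():
        for method, op in ops.items():
            if method not in HTTP_METHODS:
                continue  # skip non-operation keys like "parameters"
            fallback = f"{method.upper()}_{path.strip('/').replace('/', '_')}"
            names.append(op.get("operationId", fallback))
    return names

print(tool_names(spec))
# → ['listUsers', 'createUser', 'getUser']
```

Each of these names is what a connected MCP client would later see when listing the proxy's tools.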

🚀 Getting Started

Installation

git clone https://github.com/your-org/fastmcp-openapi-proxy.git
cd fastmcp-openapi-proxy

# (optional) create a virtual environment
python -m venv .venv
source .venv/bin/activate

# install dependencies
pip install -r requirements.txt

⚙️ Configuration

1. OpenAPI Spec

Place or regenerate openAPI.json:

python generateSpec.py --url http://localhost:8001/openapi.json
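generateSpec.py itself is not shown here. A minimal equivalent, assuming it simply downloads the upstream spec and writes it to openAPI.json (the --url flag matches the command above; the function names are assumptions), might look like:

```python
import json
import urllib.request

def fetch_spec(url: str) -> dict:
    """Download the upstream OpenAPI document as a dict.
    (Assumed behavior of generateSpec.py, not its actual code.)"""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def save_spec(spec: dict, path: str = "openAPI.json") -> None:
    """Pretty-print the spec to disk for the proxy to read."""
    with open(path, "w", encoding="utf-8") as f:
        json.dump(spec, f, indent=2)

# e.g. save_spec(fetch_spec("http://localhost:8001/openapi.json"))
```

Re-run this step whenever the upstream API changes so the proxy's tool list stays in sync.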

2. Server Settings (openAPIServer.py)

mcp.run(
    transport="http",
    host="0.0.0.0",
    port=8090,
    path="/mcp",
    log_level="info",
)

Adjust host / port / path to fit your environment.


▶️ Usage

# start the proxy
python openAPIServer.py
# → Serving HTTP on 0.0.0.0:8090 (transport=http)

Test with the provided client:

import asyncio
from fastmcp import Client

async def test_proxy():
    openapi_mcp_url = "http://localhost:8090/mcp"

    # Entering the context connects and performs the MCP
    # initialization handshake automatically.
    async with Client(openapi_mcp_url) as client:
        tools = await client.list_tools()  # returns a list of Tool objects
        print("Available tools:", [t.name for t in tools])

if __name__ == "__main__":
    asyncio.run(test_proxy())

📄 License

Released under the MIT License – see LICENSE for details.