OpenAPI-MCP
The OpenAPI-MCP proxy translates OpenAPI specs into MCP tools, enabling AI agents to access external APIs without custom wrappers.
The OpenAPI to Model Context Protocol (MCP) proxy server bridges the gap between AI agents and external APIs by dynamically translating OpenAPI specifications into standardized MCP tools, resources, and prompts, eliminating the need for custom API wrappers. The server uses stdio for fast transport and works out of the box with popular LLM orchestrators. It parses OpenAPI operations and registers them as callable tools, converts component schemas into resource objects with defined URIs, and generates contextual prompts that guide LLMs in using the API effectively. The server supports OAuth2 authentication, is fully compliant with JSON-RPC 2.0 request/response structures, and also provides automatic metadata derivation, sanitized tool names, and flexible parameter parsing and handling.
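For illustration, the following is a minimal, hand-written sketch of what the proxy generates automatically for a single operation and component schema. It assumes the FastMCP API from the MCP Python SDK and uses the public Swagger Petstore API as a stand-in; the real server derives the tool name, description, parameters, and resource URI from the spec itself, so exact names and URI schemes here are assumptions.

```python
import httpx
from mcp.server.fastmcp import FastMCP

# Upstream API details; the real proxy reads these from the OpenAPI document.
BASE_URL = "https://petstore3.swagger.io/api/v3"

mcp = FastMCP("petstore-proxy")

@mcp.tool(name="getPetById", description="Find pet by ID")
def get_pet_by_id(petId: int) -> dict:
    """Proxy the MCP tool call to GET /pet/{petId} on the upstream API."""
    resp = httpx.get(f"{BASE_URL}/pet/{petId}")
    resp.raise_for_status()
    return resp.json()

# Component schemas become resources with stable URIs ("schema://Pet" is an assumed scheme).
@mcp.resource("schema://Pet")
def pet_schema() -> dict:
    """Return the Pet component schema so an LLM can inspect the data model."""
    spec = httpx.get(f"{BASE_URL}/openapi.json").json()
    return spec["components"]["schemas"]["Pet"]

if __name__ == "__main__":
    # stdio transport lets MCP clients launch the proxy as a subprocess.
    mcp.run(transport="stdio")
```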
Features
- FastMCP Transport: Optimized for stdio, working out of the box with popular LLM orchestrators.
- OpenAPI Integration: Parses and registers OpenAPI operations as callable tools.
- Resource Registration: Automatically converts OpenAPI component schemas into resource objects with defined URIs.
- Prompt Generation: Generates contextual prompts based on API operations to guide LLMs in using the API.
- OAuth2 Support: Handles machine-to-machine authentication via the Client Credentials flow (see the sketch after this list).
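For the OAuth2 bullet above, the Client Credentials flow is the standard machine-to-machine grant: the server exchanges a client ID and secret at the token endpoint and attaches the resulting bearer token to upstream requests. A minimal sketch, assuming hypothetical OAUTH_TOKEN_URL, OAUTH_CLIENT_ID, and OAUTH_CLIENT_SECRET environment variables (the actual configuration names depend on the server's setup):

```python
import os
import httpx

def fetch_access_token() -> str:
    """OAuth2 Client Credentials grant: exchange client ID/secret for a bearer token."""
    resp = httpx.post(
        os.environ["OAUTH_TOKEN_URL"],  # e.g. https://auth.example.com/oauth/token (hypothetical)
        data={
            "grant_type": "client_credentials",
            "client_id": os.environ["OAUTH_CLIENT_ID"],
            "client_secret": os.environ["OAUTH_CLIENT_SECRET"],
        },
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

# The token is then attached to every proxied API call.
headers = {"Authorization": f"Bearer {fetch_access_token()}"}
```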