@abumalick/openapi-mcp
MCP (Model Context Protocol) server that enables LLMs to explore OpenAPI specifications. Load any OpenAPI spec and query its endpoints, parameters, request bodies, and response schemas through natural conversation.
Installation
npm install -g @abumalick/openapi-mcp
Configuration
You can pre-load OpenAPI specs at startup using the --spec (or -s) flag, in the format alias=url.
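For example, running the server directly from the command line with one or more specs pre-loaded (these are the same URLs used in the config examples below):

```shell
# Start the server with a spec pre-loaded under the alias "petstore"
npx -y @abumalick/openapi-mcp --spec petstore=https://petstore3.swagger.io/api/v3/openapi.yaml

# The short flag -s is equivalent; repeat the flag to load multiple specs
npx -y @abumalick/openapi-mcp \
  -s petstore=https://petstore3.swagger.io/api/v3/openapi.yaml \
  -s myapi=https://api.example.com/openapi.json
```

Each alias becomes the `alias` value the LLM passes to the query tools described below.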
OpenCode
Add to your OpenCode config (~/.opencode/config.json):
{
"$schema": "https://opencode.ai/config.json",
"mcp": {
"openapi": {
"type": "local",
"command": ["npx", "-y", "@abumalick/openapi-mcp"],
"enabled": true
}
}
}
With pre-loaded specs:
{
"$schema": "https://opencode.ai/config.json",
"mcp": {
"openapi": {
"type": "local",
"command": [
"npx", "-y", "@abumalick/openapi-mcp",
"--spec", "petstore=https://petstore3.swagger.io/api/v3/openapi.yaml",
"--spec", "myapi=https://api.example.com/openapi.json"
],
"enabled": true
}
}
}
Learn more about configuring MCP servers in OpenCode.
Claude Desktop
Add to your Claude Desktop config (~/Library/Application Support/Claude/claude_desktop_config.json on macOS):
{
"mcpServers": {
"openapi": {
"command": "npx",
"args": ["-y", "@abumalick/openapi-mcp"]
}
}
}
With pre-loaded specs:
{
"mcpServers": {
"openapi": {
"command": "npx",
"args": [
"-y", "@abumalick/openapi-mcp",
"--spec", "petstore=https://petstore3.swagger.io/api/v3/openapi.yaml"
]
}
}
}
Example Workflow
With pre-loaded specs, the LLM can immediately query them:
User: What specs are available?
Assistant: [calls openapi_list_specs] -> shows petstore is loaded
User: What endpoints are available for pets?
Assistant: [calls openapi_list_endpoints with alias="petstore", tag="pet"]
User: Show me details about the GET /pet/{petId} endpoint
Assistant: [calls openapi_get_endpoint with alias="petstore", method="GET", path="/pet/{petId}"]
Without pre-loading, first load the spec:
User: Load the Petstore API spec
Assistant: [calls openapi_load with source="https://petstore3.swagger.io/api/v3/openapi.yaml", alias="petstore"]
Tools
openapi_list_specs
List all currently loaded OpenAPI specs. Use this to see what's available.
{}
Returns:
{
"specs": [
{ "alias": "petstore", "title": "Swagger Petstore", "version": "1.0.0", "endpointCount": 19 }
]
}
openapi_load
Load an OpenAPI spec from a URL or file path. Only needed if the spec was not pre-loaded at startup.
{
"source": "https://api.example.com/openapi.yaml",
"alias": "example"
}
openapi_list_endpoints
List endpoints with optional filtering.
{
"alias": "example",
"tag": "users",
"search": "create"
}
openapi_get_endpoint
Get detailed endpoint information.
{
"alias": "example",
"method": "POST",
"path": "/users"
}
Supported OpenAPI Versions
- OpenAPI 3.0.x
- OpenAPI 3.1.x
Note: Swagger 2.0 specs should be converted to OpenAPI 3.x format first.
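One way to do the conversion (an external tool, not part of this package) is the swagger2openapi npm package:

```shell
# Convert a Swagger 2.0 spec to an OpenAPI 3.0 YAML file
# (swagger2openapi is a separate npm package; filenames are illustrative)
npx -y swagger2openapi --yaml --outfile openapi.yaml swagger.json
```

The resulting openapi.yaml can then be passed to --spec or to openapi_load like any other 3.x spec.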
Development
# Install dependencies
npm install
# Build
npm run build
# Run tests
npm test
# Development mode
npm run dev
License
MIT