
HTTP-API-To-MCP-Server

Transform any HTTP/REST API into a Model Context Protocol (MCP) server, enabling AI assistants like Cursor, VS Code Copilot, Windsurf, and Claude Desktop to interact with your existing APIs without any code modifications.

Deploy on Railway


🎯 What is This?

HTTP-API-To-MCP-Server is a lightweight protocol gateway that bridges the gap between traditional HTTP/REST APIs and the emerging Model Context Protocol (MCP) ecosystem. It acts as a transparent proxy that:

  • Converts HTTP requests/responses to MCP protocol format
  • Enables AI assistants to discover and call your APIs automatically
  • Requires zero modifications to your existing backend services
  • Supports both Server-Sent Events (SSE) and Streamable HTTP transports
  • Scales to handle production workloads with Cloudflare's Pingora engine

Built on Pingora, the ultra-high-performance proxy library that powers Cloudflare's infrastructure (40M+ requests per second).


🚀 Quick Start

Option 1: Deploy to Railway (Recommended)

The fastest way to get started: one-click deployment to Railway with automatic HTTPS and persistent storage.

  1. Click the Deploy on Railway button above
  2. Configure your environment variables (see Railway Deployment)
  3. Add your OpenAPI specification
  4. Connect your MCP client

Your MCP server will be live at: https://your-app.railway.app

Option 2: Run with Docker

# Pull the pre-built image
docker pull ghcr.io/arj999/http-api-to-mcp-server:latest

# Run with your configuration
docker run -d \
  --name mcp-server \
  -p 8080:8080 \
  -v $(pwd)/config.yaml:/app/config/config.yaml \
  ghcr.io/arj999/http-api-to-mcp-server:latest

Option 3: Run Locally

# Clone the repository
git clone https://github.com/ARJ999/HTTP-API-To-MCP-Server.git
cd HTTP-API-To-MCP-Server

# Edit configuration
cp config/config.example.yaml config/config.yaml
nano config/config.yaml

# Run the server
./mcp-access-point -c config/config.yaml

📋 Prerequisites

  • For Railway: Railway account (free tier available)
  • For Docker: Docker installed
  • For Local: Linux/macOS system (Ubuntu 20.04+ recommended)
  • OpenAPI Spec: Your API's OpenAPI/Swagger specification (JSON or YAML)

🎨 Architecture

graph LR
    A[AI Assistant<br/>Cursor/VS Code] -->|MCP Protocol<br/>SSE or HTTP| B[HTTP-API-To-MCP-Server<br/>Railway/Docker]
    B -->|Standard HTTP| C1[Your REST API<br/>api.example.com]
    B -->|Standard HTTP| C2[Another API<br/>internal-service:8080]
    B -->|Standard HTTP| C3[Third-party API<br/>api.weather.gov]
    
    style A fill:#e1f5ff,stroke:#01579b,stroke-width:2px
    style B fill:#fff9c4,stroke:#f57f17,stroke-width:2px
    style C1 fill:#c8e6c9,stroke:#2e7d32,stroke-width:2px
    style C2 fill:#c8e6c9,stroke:#2e7d32,stroke-width:2px
    style C3 fill:#c8e6c9,stroke:#2e7d32,stroke-width:2px

How it works:

  1. AI Assistant sends MCP-formatted requests via SSE or Streamable HTTP
  2. HTTP-API-To-MCP-Server receives the request, parses the MCP payload
  3. Server translates the request to standard HTTP based on OpenAPI spec
  4. Backend API processes the HTTP request normally (no changes needed)
  5. Server converts the HTTP response back to MCP format
  6. AI Assistant receives the structured MCP response
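
For example, a tools/call request for a hypothetical getWeather operation defined in your OpenAPI spec might look like the following; the gateway would translate it into a plain HTTP call against the configured upstream. The operation name, path, and parameters below are illustrative, not part of the shipped configuration:

# MCP client -> gateway: JSON-RPC "tools/call" over the Streamable HTTP endpoint
curl -X POST https://your-app.railway.app/mcp \
  -H "Content-Type: application/json" \
  -d '{
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {
          "name": "getWeather",
          "arguments": { "city": "Berlin" }
        }
      }'

# Gateway -> upstream: the equivalent plain HTTP request derived from the OpenAPI spec,
# e.g. GET https://api.example.com/weather?city=Berlin
# The upstream's JSON response is wrapped back into an MCP tool result for the client.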

βš™οΈ Configuration

Basic Configuration Structure

Create a config.yaml file with the following structure:

# Server Configuration
pingora:
  version: 1
  threads: 4
  pid_file: /tmp/pingora.pid
  upgrade_sock: /tmp/pingora_upgrade.sock

access_point:
  listeners:
    - address: 0.0.0.0:8080
  admin:
    address: "127.0.0.1:8081"
    api_key: "your-secure-api-key-here"

# MCP Services
mcps:
  - id: my-api-service
    upstream_id: 1
    path: config/openapi.json  # Local file or https:// URL

# Backend Services
upstreams:
  - id: 1
    nodes:
      "api.example.com": 1
    type: roundrobin
    scheme: https
    pass_host: rewrite
    upstream_host: api.example.com

Configuration Examples

The repository ships complete configuration examples covering:

  • A basic REST API proxy
  • Multiple APIs with different authentication
  • External API integration
  • Private network APIs

OpenAPI Specification

Your OpenAPI spec can be:

  • Local file: config/openapi.json or config/openapi.yaml
  • Remote URL: https://api.example.com/openapi.json
  • Format: Swagger 2.0 or OpenAPI 3.0+

The server automatically:

  • Parses operation IDs, paths, methods, and parameters
  • Generates MCP tool definitions from your API endpoints
  • Validates requests against your schema
  • Maps responses to MCP format
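
As a rough illustration, a single operation like the hypothetical one below would be exposed to MCP clients as a tool named after its operationId (getWeather), with its parameters becoming the tool's input schema:

# Excerpt from an OpenAPI spec (hypothetical endpoint, for illustration only)
openapi: 3.0.0
info:
  title: Weather API
  version: "1.0"
paths:
  /weather:
    get:
      operationId: getWeather      # becomes the MCP tool name
      summary: Get current weather for a city
      parameters:
        - name: city               # becomes a property in the tool's input schema
          in: query
          required: true
          schema:
            type: string
      responses:
        "200":
          description: Current weather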

🚂 Railway Deployment

Step 1: Prepare Your Repository

  1. Fork this repository to your GitHub account
  2. Add your config.yaml to the config/ directory
  3. Add your OpenAPI spec to config/openapi.json

Step 2: Deploy to Railway

  1. Go to Railway
  2. Click New Project → Deploy from GitHub repo
  3. Select your forked repository
  4. Railway will automatically detect the configuration

Step 3: Configure Environment Variables

In Railway dashboard, add these variables:

Variable        Description                     Example
PORT            Server listening port           8080
ADMIN_API_KEY   Admin API authentication key    your-secure-key-123
CONFIG_PATH     Path to the config file         /app/config/config.yaml

Step 4: Configure Networking

  1. In the Railway dashboard, go to Settings → Networking
  2. Click Generate Domain to get a public URL
  3. Your MCP server will be available at: https://your-app.railway.app

Step 5: Verify Deployment

# Test SSE endpoint
curl https://your-app.railway.app/sse

# Test Streamable HTTP endpoint
curl https://your-app.railway.app/mcp

Railway Configuration Files

The repository includes:

  • railway.toml – Railway build and deployment configuration
  • Dockerfile – Container image definition
  • docker-entrypoint.sh – Container startup script

🔌 Connecting MCP Clients

Cursor Desktop

  1. Open Cursor Settings (Cmd/Ctrl + ,)
  2. Navigate to Features → Model Context Protocol
  3. Add a new MCP server:
{
  "mcpServers": {
    "my-api": {
      "url": "https://your-app.railway.app/sse",
      "transport": "sse"
    }
  }
}
  4. Restart Cursor
  5. Your API tools will now appear in the AI chat context

VS Code (GitHub Copilot)

Add to .vscode/settings.json:

{
  "github.copilot.chat.mcpServers": {
    "my-api": {
      "url": "https://your-app.railway.app/mcp",
      "transport": "http"
    }
  }
}

Windsurf IDE

Configure in Windsurf settings:

{
  "cascade.mcp.servers": {
    "my-api": {
      "url": "https://your-app.railway.app/sse",
      "transport": "sse"
    }
  }
}

Claude Desktop

Add to ~/Library/Application Support/Claude/claude_desktop_config.json (macOS):

{
  "mcpServers": {
    "my-api": {
      "url": "https://your-app.railway.app/sse",
      "transport": "sse"
    }
  }
}

MCP Inspector (Testing)

npx @modelcontextprotocol/inspector
# Navigate to http://localhost:6274/
# Select transport: SSE or Streamable HTTP
# Enter URL: https://your-app.railway.app/sse

πŸ” Security Best Practices

1. API Key Management

  • Generate strong keys: openssl rand -hex 32
  • Use environment variables for sensitive values
  • Rotate keys regularly (every 90 days recommended)
  • Use different keys for dev/staging/production
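
A minimal sketch of that workflow, assuming you store the generated value in the ADMIN_API_KEY variable described in the Railway deployment section rather than in the config file:

# Generate a strong admin key locally
openssl rand -hex 32

# Store the output as the ADMIN_API_KEY environment variable
# (Railway dashboard or your local shell), never in config.yaml or git
export ADMIN_API_KEY="paste-the-generated-value-here"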

2. Network Security

  • Enable HTTPS (automatic with Railway)
  • Restrict Admin API to localhost only (127.0.0.1:8081)
  • Use firewall rules to limit access
  • Enable rate limiting in production

3. Configuration Security

# ✅ Good: Use environment variables
upstreams:
  - id: 1
    headers:
      Authorization: "${API_KEY}"  # Read from environment

# ❌ Bad: Hardcoded secrets
upstreams:
  - id: 1
    headers:
      Authorization: "Bearer secret-token-123"

4. Railway-Specific Security

  • Use Railway's Private Networking for internal services
  • Enable Deploy Protection for production environments
  • Use Environment Variables for all secrets
  • Enable Health Checks for automatic recovery

📊 Monitoring & Troubleshooting

Health Checks

# Check if server is running
curl https://your-app.railway.app/sse

# Expected response: 405 Method Not Allowed (normal for GET on SSE endpoint)

View Logs (Railway)

  1. Go to Railway dashboard
  2. Select your project
  3. Click Deployments → View Logs
  4. Filter by severity: Info, Warning, Error

Common Issues

Issue                    Cause                        Solution
Connection refused       Server not running           Check Railway logs and verify the deployment
404 Not Found            Wrong endpoint URL           Use /sse or /mcp, not the root /
401 Unauthorized         Missing or invalid API key   Check the upstream headers configuration
OpenAPI spec not found   Incorrect file path          Verify the path: value in config.yaml
Timeout errors           Slow upstream service        Increase the timeout in the config

Debug Mode

Enable verbose logging in config.yaml:

pingora:
  error_log: /tmp/error.log
  log_level: debug

🧪 Testing

Local Testing

# Start the server
./mcp-access-point -c config/config.yaml

# Test with MCP Inspector
npx @modelcontextprotocol/inspector

# Test with curl
curl -X POST http://localhost:8080/mcp \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","method":"tools/list","id":1}'

Integration Testing

The repository includes a Python MCP client example for end-to-end integration testing.
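
For reference, a minimal client sketch using the official MCP Python SDK (the mcp package) might look like the following; the server URL, tool name, and arguments are placeholders:

# pip install mcp
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

SERVER_URL = "https://your-app.railway.app/sse"  # placeholder: your deployed SSE endpoint

async def main() -> None:
    # Open an SSE connection to the gateway and run an MCP session over it
    async with sse_client(SERVER_URL) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()

            # List the tools generated from the OpenAPI spec
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description)

            # Call one of them (hypothetical tool name and arguments)
            result = await session.call_tool("getWeather", {"city": "Berlin"})
            print(result)

asyncio.run(main())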


📚 Documentation

  • Detailed Railway setup
  • Complete configuration options
  • Connecting various MCP clients
  • Admin API documentation
  • Common issues and solutions

🤝 Contributing

Contributions are welcome! Please read our contributing guidelines for details.

Development Setup

# Clone the repository
git clone https://github.com/ARJ999/HTTP-API-To-MCP-Server.git
cd HTTP-API-To-MCP-Server

# Build from source (requires Rust)
cargo build --release

# Run tests
cargo test

📄 License

This project is licensed under the MIT License; see the LICENSE file for details.


πŸ™ Acknowledgments


📞 Support


πŸ—ΊοΈ Roadmap

  • SSE transport support
  • Streamable HTTP transport support
  • Railway deployment template
  • WebSocket transport support
  • Built-in authentication plugins
  • Prometheus metrics export
  • GraphQL API support
  • gRPC protocol support

Made with ❤️ for the MCP community