HTTP-based MCP Server Implementation

This project demonstrates a complete implementation of an HTTP-based Model Context Protocol (MCP) server using FastAPI, including Server-Sent Events (SSE) for streaming responses.

Project Structure

MCP Server/
├── mcp_server_add.py          # Basic stdio-based MCP server
├── mcp_server_add_sse.py      # SSE-simulated version
├── mcp_server_add_http.py     # Full HTTP-based MCP server ⭐
├── mcp_client_http.py         # HTTP client example
ā”œā”€ā”€ requirements.txt           # Python dependencies
└── README.md                  # This file

Key Features of HTTP MCP Server

1. HTTP JSON-RPC Endpoint

  • Endpoint: POST /mcp
  • Protocol: JSON-RPC 2.0 over HTTP
  • Content-Type: application/json
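
Any standard HTTP client can talk to this single endpoint. As a quick illustration (assuming the default http://localhost:8000 address used throughout this README), the same request the curl examples below make can be sent from Python:

import requests

# Send a JSON-RPC 2.0 request to the /mcp endpoint and print the decoded reply.
response = requests.post(
    "http://localhost:8000/mcp",
    json={"jsonrpc": "2.0", "method": "tools/list", "id": "1"},
    timeout=10,
)
print(response.json())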

2. Server-Sent Events (SSE) Support

  • Endpoint: GET /mcp/sse/{client_id}
  • Real-time streaming of tool execution progress
  • Automatic client management and cleanup

3. Built-in Tools

  • add_numbers: Add two numbers with optional streaming
  • multiply_numbers: Multiply two numbers with optional streaming
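
The actual implementations live in mcp_server_add_http.py and are not reproduced here; the sketch below only illustrates the general shape such a handler takes, reusing the content/isError response format shown later in this README:

async def add_numbers_handler(self, request_id: str, arguments: dict) -> dict:
    # Sketch only: add the two operands and wrap the result in the
    # MCP tool-result structure (content list + isError flag).
    result = arguments["a"] + arguments["b"]
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "result": {
            "content": [{"type": "text", "text": str(result)}],
            "isError": False,
        },
    }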

4. Standard HTTP Features

  • CORS support for web clients
  • Health check endpoint (/health)
  • Auto-generated API documentation (/docs)
  • Proper error handling and logging
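
The CORS support mentioned above is plain FastAPI middleware. A minimal sketch of that setup (the app title and the wide-open allow_origins value are illustrative defaults, not recommendations):

from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI(title="HTTP MCP Server")

# Let browser-based clients reach the JSON-RPC and SSE endpoints.
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],   # illustrative; restrict to known origins in production
    allow_methods=["*"],
    allow_headers=["*"],
)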

Installation

  1. Install Python dependencies:

    pip install -r requirements.txt
    
  2. Start the HTTP server:

    python mcp_server_add_http.py
    
  3. The server will be available at http://localhost:8000, with interactive API docs at /docs and a health check at /health.

Usage Examples

1. Basic JSON-RPC Requests

Initialize the server:

curl -X POST http://localhost:8000/mcp \
  -H "Content-Type: application/json" \
  -d '{
    "jsonrpc": "2.0",
    "method": "initialize",
    "id": "1"
  }'

List available tools:

curl -X POST http://localhost:8000/mcp \
  -H "Content-Type: application/json" \
  -d '{
    "jsonrpc": "2.0",
    "method": "tools/list",
    "id": "2"
  }'

Call a tool:

curl -X POST http://localhost:8000/mcp \
  -H "Content-Type: application/json" \
  -d '{
    "jsonrpc": "2.0",
    "method": "tools/call",
    "params": {
      "name": "add_numbers",
      "arguments": {"a": 10, "b": 5}
    },
    "id": "3"
  }'
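
A successful call returns the result in the content/isError format described in the Extending the Server section; for the arguments above the response looks roughly like this (the exact text field may be worded differently):

{
  "jsonrpc": "2.0",
  "id": "3",
  "result": {
    "content": [{"type": "text", "text": "15"}],
    "isError": false
  }
}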

2. Streaming with Server-Sent Events

Call a tool with streaming:

curl -X POST http://localhost:8000/mcp \
  -H "Content-Type: application/json" \
  -d '{
    "jsonrpc": "2.0",
    "method": "tools/call",
    "params": {
      "name": "add_numbers",
      "arguments": {"a": 10, "b": 5, "stream": true}
    },
    "id": "4"
  }'

Listen to the SSE stream:

curl -N http://localhost:8000/mcp/sse/{client_id}
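
The {client_id} placeholder is the identifier associated with your streaming request. If you prefer Python over curl, the raw event stream can be read with requests; this sketch simply prints each SSE line as it arrives and assumes you already have a valid client_id:

import requests

client_id = "your-client-id"  # placeholder: use the id tied to your streaming call
url = f"http://localhost:8000/mcp/sse/{client_id}"

# Stream the response and print raw SSE lines ("data: ...") as they arrive.
with requests.get(url, stream=True, timeout=60) as resp:
    for line in resp.iter_lines(decode_unicode=True):
        if line:
            print(line)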

3. Using the Python Client

from mcp_client_http import MCPHTTPClient

# Create client
client = MCPHTTPClient("http://localhost:8000")

# Initialize connection
init_response = client.initialize()

# List tools
tools = client.list_tools()

# Call a tool
result = client.call_tool("add_numbers", {"a": 10, "b": 5})

# Call with streaming
stream_result = client.call_tool("add_numbers", {"a": 10, "b": 5, "stream": True})

Architecture Overview

HTTP MCP Server Components

  1. FastAPI Application

    • Handles HTTP requests/responses
    • Provides automatic API documentation
    • Manages CORS and middleware
  2. JSON-RPC Handler

    • Implements MCP protocol methods
    • Validates requests and formats responses
    • Routes tool calls to appropriate handlers
  3. Tool Implementations

    • Modular tool functions
    • Support both sync and async operations
    • Optional streaming via SSE
  4. SSE Manager

    • Manages client connections
    • Handles streaming data queues
    • Automatic cleanup and heartbeat
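
The actual SSE manager lives in mcp_server_add_http.py; the sketch below only illustrates the general pattern, assuming one asyncio queue per connected client that tool handlers push progress events into:

import asyncio
from typing import Dict

class SSEManager:
    """Sketch: track one event queue per connected SSE client."""

    def __init__(self) -> None:
        self.clients: Dict[str, asyncio.Queue] = {}

    def connect(self, client_id: str) -> asyncio.Queue:
        # Register a client and hand back the queue its SSE generator drains.
        queue: asyncio.Queue = asyncio.Queue()
        self.clients[client_id] = queue
        return queue

    async def publish(self, client_id: str, data: str) -> None:
        # Push a progress event to a client, if it is still connected.
        queue = self.clients.get(client_id)
        if queue is not None:
            await queue.put(data)

    def disconnect(self, client_id: str) -> None:
        # Drop state when the client closes the SSE connection.
        self.clients.pop(client_id, None)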

Request Flow

Client → HTTP POST /mcp → JSON-RPC Handler → Tool Function → Response
                          ↓ (if streaming)
                     SSE Generator → Client via GET /mcp/sse/{id}

Advantages of HTTP-based MCP

1. Web-Native

  • Works with any HTTP client
  • Browser-compatible (with CORS)
  • Standard REST patterns

2. Scalable

  • Horizontal scaling with load balancers
  • Stateless request handling
  • Connection pooling support

3. Observable

  • HTTP status codes
  • Standard logging patterns
  • Monitoring and metrics integration

4. Secure

  • HTTPS support
  • Authentication middleware
  • Rate limiting capabilities

5. Developer-Friendly

  • Auto-generated documentation
  • Standard debugging tools
  • Language-agnostic clients

Extending the Server

Adding New Tools

  1. Define the tool schema in handle_tools_list():
{
    "name": "new_tool",
    "description": "Description of the new tool",
    "inputSchema": {
        "type": "object",
        "properties": {
            "param1": {"type": "string", "description": "Parameter description"}
        },
        "required": ["param1"]
    }
}
  2. Add tool handler in handle_tools_call():
elif tool_name == "new_tool":
    return await self.new_tool_handler(request_id, arguments)
  3. Implement the tool function:
async def new_tool_handler(self, request_id: str, arguments: Dict[str, Any]) -> Dict[str, Any]:
    # Tool implementation
    result = process_arguments(arguments)
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "result": {
            "content": [{"type": "text", "text": str(result)}],
            "isError": False
        }
    }
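
Once those three pieces are in place, the hypothetical new_tool can be called exactly like the built-in tools, for example:

import requests

# Call the newly registered (hypothetical) new_tool through the JSON-RPC endpoint.
response = requests.post(
    "http://localhost:8000/mcp",
    json={
        "jsonrpc": "2.0",
        "method": "tools/call",
        "params": {"name": "new_tool", "arguments": {"param1": "value"}},
        "id": "5",
    },
    timeout=10,
)
print(response.json())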

Adding Authentication

from fastapi import Request
from fastapi.responses import JSONResponse

@app.middleware("http")
async def auth_middleware(request: Request, call_next):
    if request.url.path.startswith("/mcp"):
        # Placeholder check: require a bearer token; swap in real validation here.
        auth_header = request.headers.get("Authorization", "")
        if not auth_header.startswith("Bearer "):
            return JSONResponse(status_code=401, content={"detail": "Unauthorized"})
    return await call_next(request)

Adding Rate Limiting

from slowapi import Limiter, _rate_limit_exceeded_handler
from slowapi.errors import RateLimitExceeded
from slowapi.util import get_remote_address

limiter = Limiter(key_func=get_remote_address)
app.state.limiter = limiter
app.add_exception_handler(RateLimitExceeded, _rate_limit_exceeded_handler)

@app.post("/mcp")
@limiter.limit("10/minute")
async def mcp_endpoint(request: Request):
    # ... existing code

Performance Considerations

  1. Async/Await: Use async handlers for I/O operations
  2. Connection Pooling: Configure HTTP client pools appropriately
  3. SSE Cleanup: Implement proper client connection management
  4. Resource Limits: Set timeouts and memory limits
  5. Monitoring: Add metrics and health checks
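
For points 3 and 4, several of these limits can be set when launching the server with uvicorn; the numbers below are illustrative, not tuned recommendations:

import uvicorn

# Illustrative launch configuration with a keep-alive timeout and connection caps.
uvicorn.run(
    "mcp_server_add_http:app",
    host="0.0.0.0",
    port=8000,
    timeout_keep_alive=30,    # seconds an idle keep-alive connection is held open
    limit_concurrency=100,    # back-pressure: reject connections beyond this count
    limit_max_requests=10000, # recycle the worker process after this many requests
)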

Comparison with stdio MCP

Feature    | stdio MCP          | HTTP MCP
-----------|--------------------|--------------------
Transport  | stdin/stdout       | HTTP
Clients    | Same process       | Any HTTP client
Scaling    | Single process     | Horizontal scaling
Debugging  | Process logs       | HTTP logs + tools
Security   | Process isolation  | HTTP security
Streaming  | Line-based         | Server-Sent Events

Testing

Run the client example:

python mcp_client_http.py

Test with curl:

# Health check
curl http://localhost:8000/health

# Initialize
curl -X POST http://localhost:8000/mcp -H "Content-Type: application/json" -d '{"jsonrpc":"2.0","method":"initialize","id":"1"}'

Production Deployment

For production use:

  1. Use a production ASGI server:

    pip install gunicorn
    gunicorn -w 4 -k uvicorn.workers.UvicornWorker mcp_server_add_http:app
    
  2. Add reverse proxy (nginx/Apache)

  3. Configure HTTPS

  4. Add monitoring and logging

  5. Set up database for persistent state if needed

This HTTP-based MCP implementation provides a solid foundation for building scalable, web-native MCP servers that can integrate with modern application architectures.