
🤖 MCP Server

This is the backend MCP (Model Context Protocol) server for routing AI tools such as summarization, knowledge-base search, and agent-based responses.

The server is built with FastAPI and uses Server-Sent Events (SSE) to support real-time streaming.

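A minimal sketch of what the /sse endpoint in server.py could look like, assuming plain FastAPI with a StreamingResponse; the demo payloads below are placeholders rather than the server's real tool output:

    # Sketch of an SSE endpoint (assumed shape; the actual server.py may
    # route /sse through the MCP SDK's SSE transport instead).
    import asyncio
    import json

    from fastapi import FastAPI
    from fastapi.responses import StreamingResponse

    app = FastAPI()

    async def event_stream():
        # Each SSE frame is a "data: ..." line followed by a blank line.
        for i in range(3):
            yield f"data: {json.dumps({'tick': i})}\n\n"
            await asyncio.sleep(1)

    @app.get("/sse")
    async def sse():
        # text/event-stream keeps the connection open so events stream as they arrive.
        return StreamingResponse(event_stream(), media_type="text/event-stream")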

🔧 Features

  • 🧠 Modular tool execution via ClientSession (see the client sketch after this list)
  • 🔁 Live communication via SSE (/sse)
  • 🌐 Easily deployable to Render.com or Hugging Face Spaces
  • 🤖 Works with frontends like Gradio and ADK Agents

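As a rough example of how a frontend might drive that tool execution over SSE, here is a sketch assuming the mcp Python SDK and a placeholder server URL:

    # Rough client sketch (assumes the mcp Python SDK; the URL is a placeholder).
    import asyncio

    from mcp import ClientSession
    from mcp.client.sse import sse_client

    async def main():
        # Open an SSE connection to the server and start an MCP session over it.
        async with sse_client("https://your-app-name.onrender.com/sse") as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                tools = await session.list_tools()
                print([tool.name for tool in tools.tools])

    asyncio.run(main())
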
🚀 Deployment on Render

  1. Clone and push to GitHub:

    git clone https://github.com/YOUR_USERNAME/mcp-server.git
    cd mcp-server
    git push origin main
    
  2. Connect to Render.com:

    • Choose "New Web Service"

    • Link your GitHub repo

    • Use the following settings:

      Key             | Value
      ----------------|------------------------------------------------
      Environment     | Python
      Build Command   | pip install -r requirements.txt
      Start Command   | uvicorn server:app --host 0.0.0.0 --port 10000
      Instance Type   | Free
  3. Access the endpoint (see the connectivity check after these steps):
    Your server will be hosted at:
    https://your-app-name.onrender.com/sse

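To confirm the deployed stream is reachable, one option is to read a line or two from the /sse endpoint; the sketch below assumes httpx as the HTTP client and the placeholder URL above:

    # Quick connectivity check (httpx is an assumed dependency here).
    import asyncio

    import httpx

    async def check_sse(url: str) -> None:
        async with httpx.AsyncClient(timeout=None) as client:
            # Stream the response instead of buffering it, since an SSE stream never ends.
            async with client.stream("GET", url, headers={"Accept": "text/event-stream"}) as resp:
                resp.raise_for_status()
                async for line in resp.aiter_lines():
                    print(line)
                    break  # one line is enough to confirm the stream is open

    asyncio.run(check_sse("https://your-app-name.onrender.com/sse"))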

📂 Project Structure

mcp-server/
│
├── server.py           # FastAPI server with SSE endpoint
├── requirements.txt    # Python dependencies
├── Procfile            # Start command for Render
└── README.md           # You're here!
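
For reference, the Procfile would typically hold the same start command configured on Render above (shown here as an assumed example):

    web: uvicorn server:app --host 0.0.0.0 --port 10000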