# MCP Server
This is the backend MCP (Model Context Protocol) server for routing AI tools such as summarization, knowledgebase search, and agent-based responses. It is deployed with FastAPI and uses Server-Sent Events (SSE) for real-time streaming.
## Features

- Modular tool execution via `ClientSession`
- Live communication via SSE (`/sse`)
- Easily deployable to Render.com or Hugging Face Spaces
- Works with frontends like Gradio and ADK Agents
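The "modular tool execution" idea can be sketched as a registry that maps tool names to callables, with the server dispatching requests by name. This is an illustrative sketch only; the names `TOOLS`, `tool`, and `run_tool` are hypothetical and not the actual `server.py` API.

```python
# Hypothetical tool registry: names below are illustrative, not the real API.
TOOLS = {}

def tool(name):
    """Register a callable under a tool name."""
    def decorator(fn):
        TOOLS[name] = fn
        return fn
    return decorator

@tool("summarize")
def summarize(text: str) -> str:
    # Placeholder logic: a real tool would call a model or knowledgebase.
    return text[:40] + "..." if len(text) > 40 else text

def run_tool(name: str, **kwargs):
    """Dispatch a request to the named tool, raising on unknown names."""
    if name not in TOOLS:
        raise KeyError(f"unknown tool: {name}")
    return TOOLS[name](**kwargs)
```

New tools can then be added by decorating a function, without touching the dispatch code.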
## Deployment on Render

1. Clone and push to GitHub:

   ```bash
   git clone https://github.com/YOUR_USERNAME/mcp-server.git
   cd mcp-server
   git push origin main
   ```

2. Connect to Render.com:

   - Choose "New Web Service"
   - Link your GitHub repo
   - Use the following settings:

     | Key           | Value                                            |
     |---------------|--------------------------------------------------|
     | Environment   | Python                                           |
     | Build Command | `pip install -r requirements.txt`                |
     | Start Command | `uvicorn server:app --host 0.0.0.0 --port 10000` |
     | Instance Type | Free                                             |

3. Access the endpoint. Your server will be hosted at:

   ```
   https://your-app-name.onrender.com/sse
   ```
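The `/sse` endpoint speaks the Server-Sent Events wire format: each message is a block of `data:` lines (optionally preceded by an `event:` line) terminated by a blank line. A dependency-free sketch of that framing, with the helper name `sse_frame` chosen here for illustration:

```python
def sse_frame(data: str, event=None) -> str:
    """Build one Server-Sent Events text frame.

    A frame is an optional 'event:' line, one 'data:' line per line
    of payload, and a terminating blank line.
    """
    lines = []
    if event:
        lines.append(f"event: {event}")
    for chunk in data.splitlines() or [""]:
        lines.append(f"data: {chunk}")
    return "\n".join(lines) + "\n\n"
```

A quick way to watch the live stream is `curl -N https://your-app-name.onrender.com/sse`, which prints frames like these as they arrive.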
## Project Structure

```
mcp-server/
├── server.py          # FastAPI server with SSE endpoint
├── requirements.txt   # Python dependencies
├── Procfile           # Start command for Render
└── README.md          # You're here!
```
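For reference, the Procfile mentioned above would typically hold a single `web` process matching the start command used on Render. This is a plausible sketch, not necessarily the repo's actual file:

```
web: uvicorn server:app --host 0.0.0.0 --port 10000
```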