MCP Server with Remote LLM Integration
This project implements an agentic AI control server using a lightweight MCP (Multi-Agent Control Protocol) framework. It connects agents to powerful remote LLMs such as OpenAI's GPT-4 and Anthropic's Claude 3, and exposes an HTTP API for orchestrating tasks like planning, classification, code generation, and anomaly detection.
Features
- Agent routing to LLMs based on task type
- Remote model integration: OpenAI GPT-4, Anthropic Claude 3 (see the client sketch below)
- Flask-based HTTP server (/mcp endpoint)
- Optional agent memory module
- Small, modular codebase for easy customization
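The OpenAI and Claude wrappers live in llm_clients.py (see the project structure below). A minimal sketch of what they might look like, assuming the official openai and anthropic Python SDKs; the function names, model IDs, and token limit are illustrative assumptions, not the project's actual code:

```python
# llm_clients.py -- illustrative sketch; function names, model IDs, and
# max_tokens are assumptions, not the project's actual code.
import os

from openai import OpenAI
import anthropic

openai_client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
claude_client = anthropic.Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])

def call_openai(prompt: str, model: str = "gpt-4") -> str:
    """Send a single-turn prompt to an OpenAI chat model and return the reply text."""
    resp = openai_client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

def call_claude(prompt: str, model: str = "claude-3-opus-20240229") -> str:
    """Send a single-turn prompt to a Claude 3 model and return the reply text."""
    resp = claude_client.messages.create(
        model=model,
        max_tokens=1024,
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.content[0].text
```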
Project Structure
mcp-server/
- agent_router.py   # Routes requests to the appropriate agent/model (sketched below)
- llm_clients.py    # API wrappers for OpenAI & Claude
- memory.py         # Optional agent memory
- server.py         # Flask MCP server
- .gitignore        # Ignores venv and secrets
- .env              # API keys (not committed)
- README.md         # This file
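A rough sketch of how agent_router.py could map task types to the clients sketched above; the agent names and model assignments are assumptions:

```python
# agent_router.py -- illustrative sketch; agent names and model assignments
# are assumptions.
from llm_clients import call_openai, call_claude  # sketched above

# Assumed mapping from agent/task type to a model wrapper.
AGENT_TO_MODEL = {
    "planner": call_openai,     # planning
    "classifier": call_openai,  # classification
    "coder": call_claude,       # code generation
    "anomaly": call_claude,     # anomaly detection
}

def route(agent: str, user_input: str) -> str:
    """Dispatch the input to the model configured for this agent type."""
    handler = AGENT_TO_MODEL.get(agent)
    if handler is None:
        raise ValueError(f"Unknown agent type: {agent}")
    return handler(user_input)
```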
1. License
This project is released under the MIT License; see the LICENSE file.
2. Docker Deployment
Create a Dockerfile:
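A minimal Dockerfile sketch, assuming Python 3.11, a requirements.txt at the repo root, and the Flask server listening on port 5000:

```dockerfile
# Minimal sketch -- base image and layer layout are assumptions.
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first so this layer is cached between builds
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code
COPY . .

EXPOSE 5000
CMD ["python", "server.py"]
```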
Build and run locally:
docker build -t mcp-server .
docker run -p 5000:5000 --env-file .env mcp-server
3. Render Deployment (Free Hosting Option)
- Go to https://render.com
- Click "New Web Service"
- Connect your GitHub repo
- Set build settings:
  - Build Command: pip install -r requirements.txt
  - Start Command: python server.py (see the port note below)
  - Environment: Python 3.x
- Add environment variables in the "Environment" tab:
  - OPENAI_API_KEY = your-key
  - ANTHROPIC_API_KEY = your-key
- Hit deploy!
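Note: Render injects a PORT environment variable and expects the service to bind to 0.0.0.0. If server.py hard-codes port 5000, a small adjustment along these lines may be needed (the app variable name is an assumption):

```python
# Bottom of server.py (sketch): bind to Render's PORT, falling back to 5000 locally.
import os

from flask import Flask

app = Flask(__name__)  # assumed Flask app instance

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 5000)))
```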
4. Future Additions
- Tool calling / function execution
- Redis-backed long-term memory (see the sketch below)
- Async agent orchestration
- Slack or web front-end integration
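For the Redis-backed memory item, one possible shape using the redis-py client; the key scheme and function names are purely illustrative:

```python
# memory.py extension sketch -- key scheme and function names are assumptions.
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def remember(agent: str, message: str) -> None:
    """Append a message to the agent's persistent history list."""
    r.rpush(f"memory:{agent}", message)

def recall(agent: str, limit: int = 20) -> list[str]:
    """Return the most recent messages for this agent."""
    return r.lrange(f"memory:{agent}", -limit, -1)
```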
Setup Instructions
1. Clone the repo
git clone <your-repo-url>
cd mcp-server
2. Create and activate a virtual environment
python3 -m venv venv
source venv/bin/activate # Windows: venv\Scripts\activate
3. Install dependencies
pip install -r requirements.txt
# If requirements.txt doesn't exist yet, generate it with: pip freeze > requirements.txt
4. Add API keys
Create a .env file in the root directory:
OPENAI_API_KEY=your_openai_key
ANTHROPIC_API_KEY=your_claude_key
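Assuming the server loads these keys with python-dotenv (how server.py actually reads them is not shown here), the loading step looks roughly like:

```python
# Sketch: load .env into the process environment before creating the API clients.
import os

from dotenv import load_dotenv

load_dotenv()  # reads .env from the project root

openai_key = os.environ["OPENAI_API_KEY"]
anthropic_key = os.environ["ANTHROPIC_API_KEY"]
```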
5. Run the server
python server.py
The API will be live at http://localhost:5000/mcp
Example request (via curl):
curl -X POST http://localhost:5000/mcp \
-H "Content-Type: application/json" \
-d '{"input": "Plan my day", "agent": "planner"}'
.gitignore
Make sure your .gitignore includes:
venv/
.env
__pycache__/
This prevents local environments and secrets from being tracked.
To untrack venv/ if it was already committed:
git rm -r --cached venv
Credits
Built with inspiration from modern agentic architectures and the growing community around multi-agent AI systems.
---