RecruitCRM MCP Server
This project is an MCP (Model Context Protocol) server for interacting with the RecruitCRM API. It can be run locally for development or deployed as a remote web service for use with any LLM platform.
Local Development
To run the server on your local machine for testing and development:
- Create a virtual environment:
  `python -m venv venv`
  `source venv/bin/activate`
- Install dependencies:
  `pip install -r requirements.txt`
- Set up your environment variables: create a file named `.env` in the project root and add your RecruitCRM API token:
  `RCRM_TOKEN="your-api-token-here"`
- Run the server:
  `python recruitcrm_mcp.py`

The server will start on `http://0.0.0.0:8000`.
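
For orientation, here is a minimal sketch of how a server like `recruitcrm_mcp.py` could read `RCRM_TOKEN` from the `.env` file and expose a RecruitCRM call as an MCP tool over SSE. This is an illustration only, not this repository's actual code: the FastMCP usage, the `get_candidate` tool, and the RecruitCRM endpoint path are assumptions.

```python
# Illustrative sketch only -- not the repository's actual recruitcrm_mcp.py.
# Assumes the official MCP Python SDK (FastMCP), python-dotenv, and httpx.
import os

import httpx
from dotenv import load_dotenv
from mcp.server.fastmcp import FastMCP

load_dotenv()  # picks up RCRM_TOKEN from the .env file created above
RCRM_TOKEN = os.environ["RCRM_TOKEN"]

# Bind to all interfaces on port 8000, matching http://0.0.0.0:8000 above.
mcp = FastMCP("recruitcrm", host="0.0.0.0", port=8000)


@mcp.tool()
def get_candidate(candidate_slug: str) -> dict:
    """Fetch one candidate record from the RecruitCRM API (hypothetical tool)."""
    resp = httpx.get(
        f"https://api.recruitcrm.io/v1/candidates/{candidate_slug}",  # assumed endpoint
        headers={"Authorization": f"Bearer {RCRM_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    mcp.run(transport="sse")  # serve over Server-Sent Events
```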
Deployment to Render
This application is configured for production deployment on Render.
- Push your code to GitHub: Create a new repository on GitHub and push this project's code.
- Create a New Web Service on Render:
  - Go to your Render Dashboard and click New + > Web Service.
  - Connect your GitHub account and select your repository.
- Configure the Service:
  - Name: A name for your service (e.g., `recruitcrm-mcp-server`).
  - Runtime: Render should automatically detect `Python 3`.
  - Build Command: `pip install -r requirements.txt`
  - Start Command: `python3 recruitcrm_mcp.py`. Render will use the command from the `Procfile` (a matching sample is shown after these steps).
- Add Environment Variables:
  - Go to the Environment tab for your new service.
  - Add a Secret File:
    - Filename: `.env`
    - Contents: `RCRM_TOKEN="your-api-token-here"`
  - This keeps your API token secure.
- Deploy:
  - Click Create Web Service.
  - Render will build and deploy your application. Once complete, you will get a public URL (e.g., `https://your-app-name.onrender.com`).
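
The `Procfile` mentioned in the configuration step should agree with the start command above. A standard-format `Procfile` matching that command looks like this (shown here as a sketch, in case you need to recreate it):

```
web: python3 recruitcrm_mcp.py
```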
Using the Remote MCP Server
Once deployed, you can use your MCP server as a custom tool in any LLM platform that supports such tools (like OpenAI's Assistants API).
Use the following configuration:
- URL: `https://your-app-name.onrender.com/mcp`
- Transport: `sse` (Server-Sent Events)
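
To sanity-check the deployment outside an LLM platform, a small client built with the official MCP Python SDK can connect over SSE and list the exposed tools. This is a sketch that assumes the SSE endpoint is the `/mcp` URL given above; adjust the path if your server mounts its transport elsewhere.

```python
# Minimal sketch: connect to the deployed server over SSE and list its tools.
# The URL below is the placeholder from this README, not a real deployment.
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client


async def main() -> None:
    url = "https://your-app-name.onrender.com/mcp"  # your Render URL + MCP path
    async with sse_client(url) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


asyncio.run(main())
```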