fjsuarez/roche-mcp-server
Roche MCP Server
A Model Context Protocol (MCP) server that exposes API endpoints as tools for LLMs to use with Ollama.
Prerequisites
- uv (Python package manager)
- Ollama with the llama3.2 model
- An API server to expose (default: http://127.0.0.1:8000)
Installation
1. Install uv (if not already installed)
curl -LsSf https://astral.sh/uv/install.sh | sh
2. Clone and set up the project
git clone <repository-url>
cd roche-mcp-server
uv sync
This will create a virtual environment and install all dependencies from pyproject.toml.
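uv resolves everything declared in pyproject.toml. The project's actual file is not shown here, but a minimal pyproject.toml for a server like this might look as follows (the metadata and dependency list are assumptions):

```toml
[project]
name = "roche-mcp-server"
version = "0.1.0"
requires-python = ">=3.10"
# Hypothetical dependencies; the real pyproject.toml may list others.
dependencies = [
    "mcp",
]
```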
3. Install Ollama and llama3.2
# Install Ollama
curl -fsSL https://ollama.ai/install.sh | sh
# Pull llama3.2 model
ollama pull llama3.2
Usage
1. Start your API server
Make sure your API server is running on the configured URL (default: http://127.0.0.1:8000).
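If you just want to try the setup without a real backend, a stand-in API server can be run on the default URL. The sketch below uses only the standard library; the /health endpoint and its response shape are hypothetical, not part of the actual project:

```python
# Minimal stand-in API server for http://127.0.0.1:8000.
# The /health endpoint is a placeholder; the real API is not shown here.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class APIHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, fmt, *args):
        # Silence per-request logging to keep the terminal readable.
        pass

if __name__ == "__main__":
    server = HTTPServer(("127.0.0.1", 8000), APIHandler)
    print("Stand-in API server listening on http://127.0.0.1:8000")
    server.serve_forever()
```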
2. Start Ollama model
In a new terminal, start Ollama with llama3.2:
ollama run llama3.2
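Once the model is running, programs can also reach it through Ollama's local HTTP API (by default on port 11434). A minimal sketch of calling the standard /api/chat endpoint with llama3.2, using only the standard library:

```python
import json
import urllib.request

# Ollama's default local chat endpoint.
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_chat_request(prompt: str, model: str = "llama3.2") -> dict:
    """Build a non-streaming chat payload for Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # one complete response instead of chunks
    }

def chat(prompt: str) -> str:
    """Send a prompt to a locally running Ollama instance."""
    data = json.dumps(build_chat_request(prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]
```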
3. Start the MCP server
uv run api.py
Keep this running in a terminal.
License
MIT