# 🔭 EOSC Data Commons MCP server
A Model Context Protocol (MCP) server exposing an HTTP POST endpoint to access data from various open-access data publishers, developed for the EOSC Data Commons project.
It uses a search API and a Large Language Model (LLM) to help users find the datasets and tools they need.
## 🧩 Endpoints
The HTTP API comprises 2 main endpoints:

- `/mcp`: MCP server that searches for relevant data to answer a user question using the EOSC Data Commons OpenSearch service. Uses `rmcp` with Streamable HTTP transport. Available tools:
  - Search datasets
  - Search tools
  - Search citations related to datasets or tools
- `/chat`: HTTP POST endpoint (JSON) for chatting with the MCP server tools via an LLM provider (the API key is provided through an environment variable at deployment)
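As a sketch of what talking to the `/mcp` endpoint directly looks like: MCP over Streamable HTTP exchanges JSON-RPC messages, and a session always starts with an `initialize` request. The body below follows the MCP specification; the `protocolVersion` and `clientInfo` values are illustrative, and the tool names exposed by this server are not shown here.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2025-03-26",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "0.1.0" }
  }
}
```

POST it with `Content-Type: application/json` and `Accept: application/json, text/event-stream`; after the handshake, a `tools/list` request enumerates the search tools. In practice an MCP client library handles this exchange for you.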
## 🛠️ Development
> [!IMPORTANT]
> Requirements:
>
> - Rust
> - API key for an LLM provider (Mistral.ai or OpenAI); the free tier works, you just need to log in
>
> Recommended VSCode extension: `rust-analyzer`
### 📥 Install dev dependencies
```sh
rustup update
cargo install cargo-release cargo-deny cargo-watch git-cliff
```
Create a `.cargo/config.toml` file with your Mistral API key or OpenAI API key:

```toml
[env]
MISTRAL_API_KEY = "YOUR_API_KEY"
OPENAI_API_KEY = "YOUR_API_KEY"
EINFRACZ_API_KEY = "YOUR_API_KEY"
```
### ⚡️ Start dev server
Start the MCP server in dev at http://localhost:8000/mcp, with an OpenAPI UI at http://localhost:8000/docs:

```sh
cargo run
```
Customize the server configuration through CLI arguments:

```sh
cargo run -- --mcp-only -b 0.0.0.0:8004 --opensearch-url http://localhost:9200
```
Run and reload on changes to the code:

```sh
cargo watch -x run
```
> [!NOTE]
> Example `curl` request:
>
> ```sh
> curl -X POST http://localhost:8000/chat -H "Content-Type: application/json" -H "Authorization: SECRET_KEY" -d '{"messages": [{"role": "user", "content": "data insulin"}], "model": "mistralai/mistral-small-latest", "stream": true}'
> ```
Recommended model per supported provider:

- `openai/gpt-4.1`
- `mistralai/mistral-large-latest`
- `groq/moonshotai/kimi-k2-instruct`
- `einfracz/qwen3-coder`
To build the frontend web app and integrate it into the server, run from the frontend folder:

```sh
npm run build && rm -rf ../data-commons-mcp/src/webapp/ && cp -R dist/spa/ ../data-commons-mcp/src/webapp/
```
## Connect MCP client
Follow the instructions of your client, and use the `/mcp` URL of your deployed server (e.g. http://localhost:8000/mcp).
### VSCode GitHub Copilot
Add a new MCP server through the VSCode UI:

- Open the Command Palette (`ctrl+shift+p` or `cmd+shift+p`)
- Search for `MCP: Add Server...`
- Choose `HTTP`, and provide the MCP server URL http://localhost:8000/mcp
Your VSCode `mcp.json` should look like:

```json
{
  "servers": {
    "data-commons-mcp-server": {
      "url": "http://localhost:8000/mcp",
      "type": "http"
    }
  },
  "inputs": []
}
```
## 📦 Build for production
Build the binary in `target/release/`:

```sh
cargo build --release
```
> [!NOTE]
> Start the server with:
>
> ```sh
> ./target/release/data-commons-mcp
> ```
## 🐳 Deploy with Docker
Create a `keys.env` file with the API keys:

```sh
MISTRAL_API_KEY=YOUR_API_KEY
SEARCH_API_KEY=SECRET_KEY_YOU_CAN_USE_IN_FRONTEND_TO_AVOID_SPAM
```
> [!TIP]
> `SEARCH_API_KEY` can be used to add a layer of protection against bots that might spam the LLM. If it is not provided, no API key is needed to query the API.
You can use the prebuilt Docker image `ghcr.io/eosc-data-commons/data-commons-mcp:main`. Example `compose.yml`:
```yaml
services:
  mcp:
    image: ghcr.io/eosc-data-commons/data-commons-mcp:main
    ports:
      - "127.0.0.1:8000:8000"
    environment:
      RUST_LOG: info
      OPENSEARCH_URL: "http://opensearch:9200"
      MISTRAL_API_KEY: "${MISTRAL_API_KEY}"
```
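Note that `OPENSEARCH_URL` above points at a service named `opensearch` that is not defined in the snippet. As a sketch only, a minimal single-node companion service could look like the following; the image tag and security settings are assumptions for local development, not part of this project's deployment:

```yaml
services:
  opensearch:
    image: opensearchproject/opensearch:2.19.0
    environment:
      discovery.type: single-node
      DISABLE_SECURITY_PLUGIN: "true"  # dev only: disables TLS and auth
    ports:
      - "127.0.0.1:9200:9200"
```

For production, follow the OpenSearch documentation for securing the cluster instead of disabling the security plugin.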
Build and deploy the service:

```sh
docker compose up
```
## 🧼 Format & lint
Automatically format the codebase using `rustfmt`:

```sh
cargo fmt
```

Lint with `clippy`:

```sh
cargo clippy --all
```

Automatically apply possible fixes:

```sh
cargo clippy --fix
```
## Check supply chain
Check the dependency supply chain: licenses (only accept dependencies with OSI- or FSF-approved licenses) and vulnerabilities (CVE advisories):

```sh
cargo deny check
```
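`cargo deny` reads its policy from a `deny.toml` at the repository root. A minimal sketch of such a policy is shown below; the actual file in this repository may differ, and the license list here is illustrative:

```toml
[licenses]
# Accept only OSI/FSF-approved licenses, e.g.:
allow = ["MIT", "Apache-2.0", "BSD-3-Clause"]

[advisories]
# Fail the check on yanked crates
yanked = "deny"
```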
Update dependencies in `Cargo.lock`:

```sh
cargo update
```
## 🏷️ Release
Dry run:

```sh
cargo release patch
```

Or `minor`/`major`.

Create the release:

```sh
cargo release patch --execute
```